
HP CTO Lays Out HP's Vision For Future Computer Architecture

Martin Fink, HP CTO and director of HP Labs, has proposed replacing the basic computer architecture that has been in place for the last 60 years with technologies that enable automated data analytics on systems featuring high-density storage and memory while using 16,000 times less electricity per bit of data.

Martin Fink and HP's memristor wafer

Hewlett-Packard is looking at ways to completely rearchitect the idea of what a computer is in order to take advantage of several recent technological advances that increase storage density, speed data movement and cut power consumption.

That's the word from Martin Fink, HP CTO and director of HP Labs, who laid out his vision of the future of computing architecture, and what HP is doing to achieve that vision in the near future, during his keynote presentation at this week's Nth Symposium conference in Anaheim, Calif.

It's a move that is long overdue, Fink said.

[Related: The History Of The Hard Drive And Its Future]

"If you just take a little step back, what you actually figure out is that the fundamental design of computing has not changed since the 1950s," he said. "So for 60-plus years, we have been doing computing the very same way. We have the CPU. It does the basic math. We have some main memory. And we have some sort of IO, which is typically storage, and eventually became a lot of networking."

At the core, this situation has not really changed, Fink said.

"And guess what?" he said. "We, as an IT industry, we are so smart that we virtualize all this. And when we virtualize it, we've basically recreated the exact same thing in virtual form. Aren't we really, really smart."

It will be a welcome change, said Rich Baldwin, CIO and chief strategy officer at Nth Generation Computing, a San Diego-based solution provider, longtime HP partner and host of the Nth Symposium.

The role of IT people is changing rapidly, Baldwin said.

"They're becoming more like knowledge brokers and are getting less involved in making things work," he said. "IT has to get to the point where people can use the power of computers without knowing how they work."

Fink said the way computers are architected has to change given the fact that enterprises are now commonly handling petabytes and more of data.

"And, typically, once you reach the petabyte range, you are starting to deal with unstructured data," he said. "And when we think about where the world is going, we think about what problems are we trying to solve. My team actually starts thinking at the exabyte range."

Rethinking Computer Performance, Power Consumption


The amount of power consumed by IT is also becoming a huge problem, HP's Fink said.

"If we actually took the cloud and called it a country, on a power consumption basis, the cloud would be the fifth-largest country in the world. ... And, we are barely scratching the surface of the amount of data that's going to come at us," he said.

Managing the ever-increasing data center power consumption could be tackled by having a lot of smart people looking at how to better manage energy, but that would be the wrong approach, he said.

"We can continue down the path of how to become the best scientists on Planet Earth in terms of managing energy," he said. "Why don't we start thinking about the problem in terms of how we stop using energy in the first place."

HP, Fink said, is doing its part with the development of its Project Moonshot server, which takes advantage of mobile device technology to use 89 percent less energy than other servers.

HP.com is one of the busiest websites on the planet, and HP moved a significant part of that site to Moonshot servers, which Fink called "HP on HP."

"We can basically power hp.com on the equivalent power of 12 60-watt bulbs," he said. That works out to roughly 720 watts in total.

Nearly everyone in an IT department has been in a situation where the focus is on the technology itself, with power consumption treated as "someone else's problem," Fink said.

"As an industry, we have ... to look beyond the 'I'm measured on opex' or 'I'm measured on capex' or 'somebody else owns building my data center and I just put the stuff in it,'" Fink said. "And you really need to start forcing the conversation around looking at this holistically."

Another area where computer architecture is changing is in network connectivity, where new photonics cables can be used to replace the traditional copper cables currently in use.

Fink called thick copper cables the "energy-sucking devil," and said that the more copper one uses, the more energy it consumes.


"And our math right now suggests that we can get to a point where we will use 16,000 times less energy per bit to process [data] by moving from pure copper-based systems to photonics-enabled systems," he said.

Rethinking Computer Storage, Memory


It is also time to start moving away from the traditional storage hierarchy based on on-chip cache, main memory and mass storage. "Eighty percent, give or take, of what that computer is doing is actually moving data back and forth through this chain," Fink said.
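Fink's figure that roughly 80 percent of a computer's work is shuttling data is easy to make plausible with a back-of-envelope comparison. The sketch below uses generic, approximate latency numbers (our assumption for illustration, not HP data) to show how far each tier of the traditional hierarchy sits from the processor:

```python
# Back-of-envelope illustration of why data movement dominates in the
# traditional cache / main-memory / mass-storage hierarchy.
# Latency figures are generic textbook approximations, not HP measurements.

LATENCY_NS = {
    "on-chip cache": 1,           # ~1 ns for a cache hit
    "main memory (DRAM)": 100,    # ~100 ns for a DRAM access
    "disk storage": 10_000_000,   # ~10 ms for a rotating-disk seek
}

COMPUTE_NS = 1  # rough cost of the arithmetic performed on a fetched value

for tier, ns in LATENCY_NS.items():
    ratio = ns / COMPUTE_NS
    print(f"{tier}: {ns:,} ns per access, {ratio:,.0f}x the cost of computing on the data")
```

With even these coarse numbers, a single disk access costs millions of times more than the arithmetic done on the value it fetches, which is why so much of a conventional machine's effort goes into moving data rather than processing it.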

While HP recently introduced a flash-optimized version of its 3PAR storage solution, HP and its peers need to take storage technology to new levels to meet future demands for high scalability and low power consumption, Fink said.

Traditional computer DRAM and flash memory are getting more and more dense, but the industry is approaching the point at which it will be hard to tell if a particular bit is a zero or a one. The industry, Fink said, is responding with three new memory technologies.

The first is spin transfer torque, which Fink said can be thought of as a bar magnet that represents a "1" when spun one way and represents a "0" when spun the other way, with the flipping done by a pulse current. It has high performance but a density lower than that of DRAM.

The second is phase change memory, or PCRAM. Phase change works by heating up a material that becomes a glass when cooled quickly and a crystal when cooled slowly. It has some of the properties of flash, but it has performance that, in some cases, can be worse than flash, he said.

The third is memristor, which works by changing the resistance of the right material from 0 ohms to 25,000 ohms by moving the material the distance of only 1 nanometer.

"The cool thing about this is that we can actually scale this tremendously," Fink said. "So we can achieve levels of scale that are much more significant than we can reach with flash or DRAM. It is persistent, a non-volatile memory. And we can have a line of sight to performance that reaches the level of DRAM."
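The common idea behind all three technologies is storing a bit in a physical state (magnet orientation, glass-versus-crystal phase, or resistance) rather than in an electrical charge that must be refreshed. The toy model below illustrates the memristor case; the ohm values and read threshold are hypothetical, loosely inspired by the 0-to-25,000-ohm range Fink described, and are not HP's actual cell design:

```python
# Toy model of a resistance-based memory cell such as a memristor.
# A bit is stored as a resistance level: low resistance = 1, high = 0.
# The state persists with no refresh power (non-volatile).
# All constants are hypothetical, for illustration only.

LOW_OHMS = 100           # programmed "on" state (logical 1)
HIGH_OHMS = 25_000       # programmed "off" state (logical 0)
READ_THRESHOLD = 12_500  # midpoint used to discriminate states on read

class MemristorCell:
    def __init__(self):
        # Cells start in the high-resistance (logical 0) state.
        self.resistance = HIGH_OHMS

    def write(self, bit):
        # Writing drives the cell to a low- or high-resistance state.
        self.resistance = LOW_OHMS if bit else HIGH_OHMS

    def read(self):
        # Reading compares the measured resistance against the threshold.
        return 1 if self.resistance < READ_THRESHOLD else 0

cells = [MemristorCell() for _ in range(8)]
for cell, bit in zip(cells, [0, 1, 1, 0, 1, 0, 0, 1]):
    cell.write(bit)
print([c.read() for c in cells])  # -> [0, 1, 1, 0, 1, 0, 0, 1]
```

Because the stored state is a physical property of the material, nothing is lost when power is removed, which is what lets such a wafer hold its contents "without the need for energy to retain the memory states."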

Fink showed a wafer that he said is expected to offer a memory capacity of about 60 TB. Within another two years or so, such a wafer should reach a capacity of about 1.5 PB, all without the need for energy to retain the memory states.

Wrapping Up The Future Of Computing


Memristor could be the most exciting technology introduced yet, Nth's Baldwin said.

"The power requirement for memristor is almost nothing, yet it will scale to a couple petabytes of capacity in a few years," he said. "It will replace disk and solid-state storage."

Combining Project Moonshot servers, new massively scalable memory technologies and new photonic technology for moving mass amounts of data creates a new kind of computer architecture, HP's Fink said to applause from the audience.

"And now you know why I have one of the coolest jobs on Earth, because we get to solve these problems."

Even that is not yet enough.

Given the growing importance of big data and business analytics, HP is also looking at how to redesign applications to take the data scientist out of the loop and let users access data immediately without an intermediary, Fink said.

Removing that intermediary will be a big advance in taking advantage of new computer architectures, Baldwin said.

"I use HP Autonomy and IDOL for big data without understanding the math behind it," he said. "If we can get to the point where we can just ask the question and get the answer, imagine what we can do."

The next problem will be how to manage a million nodes, Fink said.

"That's not an exaggeration," he said. "When you think about that cloud infrastructure of the future, it's not a rack of 64 nodes. It's not a couple of racks of 1,000 nodes. It's a million cores. And how are you going to manage that?"

HP has a team looking at how to manage systems at that scale.

"The end goal is, we now have all the places to put the data, we can move the data, we can process the data," Fink said. "We now need the capabilities to access that data very, very easily in order to turn that data into value."

Baldwin said it is exciting to see a renewed push on R&D at HP under current CEO Meg Whitman.

"HP is really investing big in R&D now with Meg," he said. "She sees the value in that investment."

PUBLISHED AUGUST 2, 2013
