HP CTO Lays Out HP's Vision For Future Computer Architecture


 

Martin Fink and HP's memristor wafer

 

Hewlett-Packard is looking at ways to completely rearchitect the idea of what a computer is in order to take advantage of several recent technological advances that increase storage density and data movement speed while reducing power consumption.

That's the word from Martin Fink, HP CTO and director of HP Labs, who laid out his vision for the future of computing architecture, and what HP is doing to achieve it in the near term, during his keynote presentation at this week's Nth Symposium conference in Anaheim, Calif.

It's a move that is long overdue, Fink said.

 

[Related: The History Of The Hard Drive And Its Future]

"If you just take a little step back, what you actually figure out is that the fundamental design of computing has not changed since the 1950s," he said. "So for 60-plus years, we have been doing computing the very same way. We have the CPU. It does the basic math. We have some main memory. And we have some sort of IO, which is typically storage, and eventually became a lot of networking."

At the core, this situation has not really changed, Fink said.

"And guess what?" he said. "We, as an IT industry, we are so smart that we virtualize all this. And when we virtualize it, we've basically recreated the exact same thing in virtual form. Aren't we really, really smart."

It will be a welcome change, said Rich Baldwin, CIO and chief strategy officer at Nth Generation Computing, a San Diego-based solution provider, longtime HP partner and host of the Nth Symposium.

The role of IT people is changing rapidly, Baldwin said.

"They're becoming more like knowledge brokers and are getting less involved in making things work," he said. "IT has to get to the point where people can use the power of computers without knowing how they work."

Fink said the way computers are architected has to change given that enterprises are now commonly handling petabytes or more of data.

"And, typically, once you reach the petabyte range, you are starting to deal with unstructured data," he said. "And when we think about where the world is going, we think about what problems are we trying to solve. My team actually starts thinking at the exabyte range."

NEXT: Rethinking Computer Performance, Power Consumption