HP CTO Lays Out HP's Vision For Future Computer Architecture

By Joseph F. Kovar
August 02, 2013    6:22 PM ET


[Photo: Martin Fink and HP's memristor wafer]

Hewlett-Packard is looking at ways to completely rearchitect the computer in order to take advantage of several recent technological advances that increase storage density, speed up data movement and reduce power consumption.

That's the word from Martin Fink, HP CTO and director of HP Labs, who laid out his vision of the future of computing architecture, and what HP is doing to achieve that vision, during his keynote presentation at this week's Nth Symposium conference in Anaheim, Calif.

It's a move that is long overdue, Fink said.


"If you just take a little step back, what you actually figure out is that the fundamental design of computing has not changed since the 1950s," he said. "So for 60-plus years, we have been doing computing the very same way. We have the CPU. It does the basic math. We have some main memory. And we have some sort of IO, which is typically storage, and eventually became a lot of networking."

At the core, this situation has not really changed, Fink said.

"And guess what?" he said. "We, as an IT industry, we are so smart that we virtualize all this. And when we virtualize it, we've basically recreated the exact same thing in virtual form. Aren't we really, really smart."

It will be a welcome change, said Rich Baldwin, CIO and chief strategy officer at Nth Generation Computing, a San Diego-based solution provider, longtime HP partner and host of the Nth Symposium.

The role of IT people is changing rapidly, Baldwin said.

"They're becoming more like knowledge brokers and are getting less involved in making things work," he said. "IT has to get to the point where people can use the power of computers without knowing how they work."

Fink said the way computers are architected has to change now that enterprises commonly handle petabytes or more of data.

"And, typically, once you reach the petabyte range, you are starting to deal with unstructured data," he said. "And when we think about where the world is going, we think about what problems are we trying to solve. My team actually starts thinking at the exabyte range."
