
How Penguin Computing Is Fighting COVID-19 With Hybrid HPC

'We have several researchers that have joined in and are utilizing that environment, and at the moment, we're doing that at no cost for COVID-19 research,' Penguin Computing President Sid Mair says of the system integrator's cloud HPC service, which complements its on-premise offerings.


Do you see the pandemic impacting the balance in demand between cloud-based computing versus on-premise computing for HPC?

I think the jury's still out on that a little bit. When you build an on-premise HPC environment, in general, once you have the on-premise environment put together, you have a limited number of resources that are managing that on a daily basis. So you can practice social distancing and everything else in general for that environment itself. However, there's also a huge economic benefit for the cloud and for those researchers that want access to things and don't have the resources to do it. For instance, if you're doing cloud-based HPC, you're eliminating all that — the management side of it, the infrastructure, the cost of the power — all of those things get absorbed into the service that we provide. And then you're only paying for the compute cycles that you need in order to get your job done.

I believe we're the only high-speed, large, scale-out HPC capability in the marketplace in the cloud. There are many that do virtualized things, but to provide one high-speed interconnected HPC environment that looks and feels just like someone would use on their on-premise system, I believe we really are about the only company that does that.

What's happening today for the government systems, research systems, university systems, higher education systems and medical research systems is that [the research is] eating up all their on-premise cycles very quickly. They can't upgrade quickly enough to continue their research. So being able to walk in and move their workflow over into an HPC environment that works, acts and deploys just as it would on-premise, except in the cloud, is becoming very, very beneficial to our researchers. Because they can almost instantly move their codes over to our environment and not take three, four, five, six weeks to try to port them to a system that uses virtualization rather than a true high-performance computing software stack or build environment.
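Mair's porting point is easier to see with a concrete job. Below is a minimal sketch of the kind of batch script a researcher could move between an on-premise cluster and a bare-metal HPC cloud without modification, assuming a Slurm-style scheduler on both sides; the job name, node counts, module name and GROMACS input prefix are all hypothetical, and actual module names vary by site.

    #!/bin/bash
    #SBATCH --job-name=spike_md           # hypothetical COVID-19 simulation job
    #SBATCH --nodes=4                     # four bare-metal nodes on the high-speed interconnect
    #SBATCH --ntasks-per-node=32          # one MPI rank per core; assumes 32-core nodes
    #SBATCH --time=04:00:00               # wall-clock limit

    module load gromacs                   # hypothetical module name; site catalogs differ
    srun gmx_mpi mdrun -deffnm spike_md   # launch GROMACS molecular dynamics across all MPI ranks

Because the script targets a real scheduler and MPI fabric rather than a virtualization layer, nothing in it needs to change when the same workload runs in the cloud, which is the porting advantage described above.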

How typical is it for the DOE or another government agency to be concerned with accelerating the computing it is already doing?

High-performance computing is an essential capability that is just embedded in everything today, from the financial world to the biomedical world to anything for oil and gas, seismic [activity], weather, you name it. And, of course, on the government side, national security, etc. The exascale program, sponsored through the DOE primarily, is meant to maintain a competitive edge for the United States in many aspects of science and technology. I think they're always looking for ways in which they can continue to grow in their computing capabilities.

Now today, computing capabilities are a lot different than they were even 10 years ago, when most of your computing was concentrated on physics-based computing. Today, it's heavily concentrated in three major areas. The first is artificial intelligence, which is the big buzzword nowadays, and that's either machine learning, more of a training environment, or inference computing. Then there's high-performance computing, which is primarily physics-based, but not always today. And the last one is data analytics: how do you understand data. And in reality, the lines between all three of those environments are very gray today. All three require the ability to create high-performance computing capabilities, whether it's floating-point capabilities or other capabilities to be able to move or manipulate data.

 
 