Expand SANs' Capacity
That's because customers are looking to leverage existing storage resources before adding more. In many instances, only a small percentage of capacity is used on some storage arrays, while another application or workgroup may need more capacity. Virtualization, also known as volume management, or pooling of capacity on SANs into multiple logical units, effectively lets storage administrators utilize capacity from available resources on a network. That can reduce not only hardware costs but, more significantly, proponents say, management costs.
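The pooling idea can be sketched in a few lines of Python. This is a toy model, not any vendor's implementation: the array names and sizes are invented, and real volume managers allocate block-level extents rather than simple gigabyte counters.

```python
class StoragePool:
    """Toy model of SAN virtualization: aggregate free capacity
    from several physical arrays into one pool, then carve
    logical units (LUNs) out of it on demand."""

    def __init__(self):
        self.arrays = {}   # array name -> free GB
        self.luns = {}     # LUN name -> list of (array, GB) extents

    def add_array(self, name, free_gb):
        self.arrays[name] = free_gb

    def free_capacity(self):
        return sum(self.arrays.values())

    def create_lun(self, name, size_gb):
        # Satisfy the request from whichever arrays have spare space,
        # spanning arrays when no single one is large enough.
        if size_gb > self.free_capacity():
            raise ValueError("pool exhausted")
        extents, remaining = [], size_gb
        for array, free in self.arrays.items():
            take = min(free, remaining)
            if take:
                self.arrays[array] -= take
                extents.append((array, take))
                remaining -= take
            if remaining == 0:
                break
        self.luns[name] = extents
        return extents

# Two half-empty arrays together serve a request neither could alone.
pool = StoragePool()
pool.add_array("array-A", 120)     # 120 GB sitting unused on one array
pool.add_array("array-B", 200)     # 200 GB unused on another
pool.create_lun("db-volume", 250)  # one logical volume spanning both
```

The point the vendors make is visible in the last line: the 250 GB request is served entirely from capacity the customer already owns.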
While virtualization will offer solution providers an opportunity to enhance their SAN-integration-service offerings, it will also open the door to new business with accounts that may be looking to consolidate their storage hardware.
"A VAR trying to penetrate a new account that already has SANs can say the previous supplier has not fully harnessed what you've already bought," says Augie Gonzalez, director of product marketing at DataCore, a provider of storage-virtualization software.
There are three key forms of storage virtualization: host, storage array, and fabric or switch-based. While host-based virtualization has been around for some time, the latest crop of SAN virtualization wares lets administrators dynamically reconfigure distributed storage resources without bringing servers down, a key feature when mission-critical applications are involved. Virtualization is also viable for customers looking to back up their SANs with lower-end arrays to maintain business continuity should disaster strike, something that has become of great concern in the post-Sept. 11 climate.
But the reality is when it comes to SANs and multivendor storage management, virtualization is quite new to most IT organizations. "Virtualization is both the most hyped and misunderstood form of storage networking," says Derek Gamradt, CTO at StorNet, a national storage integrator based in Englewood, Colo. "The lack of definition of virtualization gives some debate as to who has what."
For example, vendors such as EMC, Hitachi and XIOtech already support virtualization within their own RAID devices, he says. In the long term, proponents say advances in virtualization will make it possible for customers to pool disparate SAN arrays by adding intelligence at the fabric, or switch, level. For example, Veritas, the leading supplier of host-based, volume-management software with its Volume Manager, recently announced plans to move up the stack. It has partnered with several key switch vendors, including Cisco, McData, Rhapsody Networks and XIOtech, through a program called Veritas Powered to develop virtualization engines that work in the storage-switching fabric.
"The ability to manage from the host down in Volume Manager is a great way to pool resources managed by the application," says Marty Ward, director of product marketing for availability products at Veritas. By moving toward network-based virtualization, "we're looking more at data-center management," Ward says.
Today, a plethora of software companies, many of them start-ups, including DataCore, FalconStor and StoreAge Networking Technologies, are promoting virtualization tools that let customers share disk resources in hardware-agnostic configurations. "These products have the ability to virtualize third-party hardware platforms, and that's what our customers want, one management interface across multiple storage platforms," StorNet's Gamradt says.
That's because customers that have SANs based on such infrastructures as EMC's Symmetrix, IBM's Enterprise Storage Server and other high-end systems don't want to spend several hundred thousand dollars to put those systems in other locations just for backup, say vendors promoting virtualization. "Customers that invest anywhere from $300,000 to upward of $1 million are saying they can't afford to double that expenditure at another site just in case something happens," DataCore's Gonzalez says.
Software-based virtualization lets customers buy low-end commodity hardware; the solution provider's margin comes from the integration services, while providing the infrastructure at a fraction of the cost to the customer. These products pool volumes of data and mirror them onto available disks, often via IP networks.
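The mirroring scheme described above can be sketched as follows. Again, this is an illustrative toy, assuming a simple block-map model; real products replicate at the block or volume level over Fibre Channel or IP, with journaling and consistency guarantees omitted here.

```python
class MirroredVolume:
    """Toy sketch of software mirroring: every write lands on a
    primary array and is replicated to a low-cost secondary,
    so the secondary can take over if the primary is lost."""

    def __init__(self):
        self.primary = {}   # block number -> data, on the high-end array
        self.replica = {}   # same blocks, on commodity hardware elsewhere
        self.replicated_writes = 0

    def write(self, block, data):
        self.primary[block] = data
        # Only the changed block crosses the (IP) link, not the whole
        # volume -- replication traffic tracks the rate of change.
        self.replica[block] = data
        self.replicated_writes += 1

    def failover_read(self, block):
        # After a primary failure, reads are served from the replica.
        return self.replica.get(block)

vol = MirroredVolume()
vol.write(0, b"superblock")
vol.write(7, b"customer records")
vol.write(7, b"customer records v2")  # an update replicates one block
```

This is also the economic argument in miniature: the replica dictionary could live on hardware costing a fraction of the primary, yet it holds an up-to-date copy of every written block.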
"Many customers are pretty well fed up with the way their vendors have locked them into proprietary corners," Gamradt says. "When you virtualize disks, you liberate that customer from being held hostage by any one hardware provider."
Beyond White-Board Status
Still, demand for virtualization has been slower to take off than vendors have promised, Gamradt says. "I wouldn't say that it has met expectations, but companies are at least giving it a serious look, as opposed to just letting us present it on a white board," he says.
Some VARs are less bullish. "Virtualization of storage is talked about and marketed, but it's not a realized product in the field today," says Marc Duvoisin, a regional services manager with solution provider Dimension Data, which Duvoisin says does talk to customers about virtualization at a strategic level.
According to research firm Gartner, the market for volume-management software, which includes virtualization, will reach only $197 million this year, and nearly 80 percent of that is based on the host-centric Veritas Volume Manager. The software-based virtualization market will grow to $404 million by 2006, Gartner maintains.
"There's mostly small folks out there making a big stir out of virtualization, but what they have is a variation on array controllers," says Gartner analyst Robert Passmore. "Then you have a lot of large players promising future capability."
Those larger players include IBM, which next year plans to release a virtualization appliance that runs on Intel-based hardware running Linux, and Hewlett-Packard, which acquired a virtualization appliance under development by Compaq called VersaStor, due out by year's end.
Indeed, relationships with hardware vendors may be key to the success of many of the pure-software-based suppliers of virtualization technology. For example, Veritas has relationships with just about all of the key hardware vendors. Last year, HP purchased StorageApps, a supplier of array-based virtualization software. For its part, DataCore has OEM relationships with Fujitsu, Hitachi and, most recently, IBM.
"This is still a largely hardware-oriented game," says Mike Kahn, chairman of The Clipper Group, an IT advisory firm. "The move is inevitable. The larger enterprises are still taking a close look; they may be exploring, or even deploying, it in limited ways, but it's not the norm yet."
Revving Up
Still, many solution providers and systems integrators are priming their clients on virtualization now. "We tell our clients, 'Once you acquire the SAN architecture that's best for you, you want it to embrace virtualization,'" says Joseph Kadiri, PwC Consulting's director for IT services.
Mike Piltoff, vice president of solutions marketing at Boca Raton, Fla.-based Champion Solutions Group, agrees, but says he has yet to see a solution he's comfortable recommending.
"We've looked at every virtualization solution that has been made available to look at, but we haven't found any that we believe has a suitable approach," Piltoff says. He is waiting for IBM to release its Linux-based virtualization engine, code-named LoadStone, which the company will offer as an appliance running on Intel hardware. Champion is IBM's largest seller of nonmainframe-based storage gear.
The problem with existing software-based solutions, Piltoff says, is they lack scalability, reliability and availability. Chris Gahagan, senior vice president of storage infrastructure software at EMC, which stands to see its market for Symmetrix erode if companies mirror those infrastructures with commodity hardware, shares that view.
"A lot of these vendors that are proposing virtualization say, 'You basically put JBOD behind the virtualization appliance.' But that doesn't work for a whole host of reasons," he says, a key reason being scalability. "If you look at the DataCore product, for example, it can't scale because it's based on off-the-shelf PC technology. If you look at pushing all the data through the PCI bus or the backplane, you are significantly limited. One-third of a port on a Symmetrix can saturate a single PCI bus. If you wanted to put DataCore on a fully loaded Symmetrix, you'd have to put two dozen DataCore boxes in front of the Symmetrix just to manage the I/O load. That doesn't seem like a good consolidation strategy."
DataCore's Gonzalez says SANsymphony addresses that issue through caching and memory management. "If you put [it] on a Symmetrix, Hitachi Lightning, IBM Shark and all the other big array guys, we will make them go faster," he says. "From a user perspective, those things will go faster because we cache the I/O upstream. We act as a caching front-end."
Veritas' Ward says his company has thousands of customers that replicate disk arrays to commodity hardware. "All we're sending over are changes to data," he says. "If it didn't work, no one would be buying it."
For its part, EMC says it has been virtualizing storage from the beginning in its own environments. Looking to a day when it's pooling data from different SANs, Gahagan says EMC is awaiting improvements in switching technology, something the company says will come through advances in chip technology.
Still, solution providers looking to deploy virtualization now say the existing wares do solve many customer problems. For example, it's not uncommon to need to reallocate several hundred gigabytes of storage on the fly from one array to another, says Pat Taylor, president of Proactive Technologies, a Carrollton, Texas-based solution provider that supplies software and white-box PCs and servers to companies in the graphics-, printing- and prepress-production industries.
Taylor's company has sold DataCore's SANsymphony software to several companies that have been able to virtualize multiple storage arrays without having to bring systems down, which can be costly from an operational point of view.
Virtualizing arrays means Taylor also doesn't have to sell customers more RAID when they typically will have excess capacity elsewhere. "Therein lies the beauty for the graphics business: I can make more efficient use of the space I've already purchased," he says.
Of course, that comes at the expense of selling more boxes to his clients. But given the commodity status of hardware and his belief that he can offer more value helping clients leverage existing resources, Taylor says virtualization is the way to go. "Our business model isn't about cranking out a bunch of boxes," Taylor says. "We're a small company,our business relies on really developing relationships."
Integrated Approach
While those products might be suitable for small enterprises or for backup, critics of the pure-software approach say they are not suitable for large backbones. For the best results, some say an integrated hardware approach will offer more reliability, even if it means less flexibility on the hardware front. "The difference is we are delivering it as an integrated solution, not just selling it as software on a CD," says Bruce Hillsberg, director of software strategy and technology in IBM's storage systems group.
Another differentiator IBM says it will offer is a significant amount of cache in LoadStone. "We can have up to 4 GB of cache in there," Hillsberg says. "We think that's what makes the in-band approach work well and gives high performance." In-band refers to the technique, promoted by IBM, DataCore and others, in which the virtualization engine sits in the network data path between servers and pooled storage. The other approach, called out-of-band, keeps that intelligence out of the data path: it exchanges control messages with the applications and storage systems, while the actual data travels separately (see glossary, below).
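The contrast between the two approaches can be made concrete with a small sketch. The names and data here are invented for illustration, not any vendor's API; the point is only where the data moves.

```python
# A logical block address resolves to a physical (array, block) location.
PHYSICAL = {("array-A", 0): b"payroll data"}   # what's actually on disk
MAPPING = {"lun-1:0": ("array-A", 0)}          # logical -> physical map

def inband_read(logical_block):
    """In-band: the appliance sits in the data path. It resolves the
    mapping AND moves the data itself -- which is why it can also
    cache the I/O, as IBM and DataCore emphasize."""
    return PHYSICAL[MAPPING[logical_block]]

def outofband_read(logical_block, read_from_array):
    """Out-of-band: the metadata server only answers 'where is it?';
    the host then reads the data directly from the array."""
    physical_location = MAPPING[logical_block]   # control message only
    return read_from_array(physical_location)    # data takes its own path

# The host supplies its own direct-read function in the out-of-band case.
data = outofband_read("lun-1:0", lambda loc: PHYSICAL[loc])
```

Both calls return the same bytes; the trade-off is that in-band can cache but must carry every I/O, while out-of-band stays out of the data stream at the cost of putting mapping logic on the host.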
Meanwhile, HP's virtualization engine, VersaStor, is currently in beta at Microsoft, says Mike Feinberg, HP's CTO for network and tape solutions. Among the features HP is touting in VersaStor is attribute-based virtualization. "You will be able to give attributes associated with policies," Feinberg says, so applications can be assigned the appropriate levels of service. Critics say it will require customers to use HP's host-bus adapters.
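The attribute-to-policy idea Feinberg describes can be sketched simply. The attribute names, tiers and policy table below are hypothetical, invented purely to show the mechanism of mapping an application's attributes to a storage service level.

```python
# Hypothetical policy table: an attribute names a service level,
# and the policy spells out what that level means in storage terms.
POLICIES = {
    "mission-critical": {"mirrored": True,  "tier": "high-end array"},
    "archive":          {"mirrored": False, "tier": "commodity disk"},
}

def provision(app_name, attribute):
    """Assign an application the storage behavior its attribute implies,
    instead of configuring arrays and mirrors by hand per application."""
    policy = POLICIES[attribute]
    return {"app": app_name, **policy}

payroll = provision("payroll", "mission-critical")
scans = provision("document-scans", "archive")
```

The administrator's job shifts from configuring individual volumes to maintaining the policy table; the virtualization layer applies it.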
While there's no shortage of debate over virtualization, it is clearly a technology that's evolving, Gamradt says, and solution providers should be looking at it as a way to extend their offerings in storage networking. "I wouldn't recommend a company enter the storage-management space through the virtualization door," he says. "If you're already doing SAN implementations, [then] this is clearly the next area [you] want to focus on."