Virtualization may be the most over-hyped word in technology lingo today, but the benefits are real for those who know how to deploy it, said one ISV at Microsoft's TechEd 2007 conference.
Ed Harnish, vice president of marketing at Acronis, a Burlington, Mass.-based storage management vendor, offered 10 tips for deploying virtualization software before hundreds of IT pros and solution providers gathered at the event, held this week in Orlando, Fla.
1. For best performance, virtualized servers should consume 60 percent to 80 percent of a system's resources. Utilization above or below that range will result in performance degradation of all virtual machines on that server.
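That sizing rule is simple enough to sketch in a few lines of Python. This is an illustration of the 60-to-80-percent band only; the function name and the example capacity figures are ours, not Acronis':

```python
def consolidation_fits(vm_loads, host_capacity, low=0.60, high=0.80):
    """Check whether a set of VM loads lands a host inside the
    60-80 percent utilization band the rule recommends.

    vm_loads and host_capacity can be in any common unit,
    e.g. GHz of CPU or Gbytes of RAM.
    """
    utilization = sum(vm_loads) / host_capacity
    return low <= utilization <= high

# A host with 16 units of capacity carrying 11 units of load sits
# at about 69 percent -- inside the target band.
print(consolidation_fits([4, 3, 2, 2], 16))          # True
# Packing on two more 3-unit VMs pushes it past 80 percent.
print(consolidation_fits([4, 3, 2, 2, 3, 3], 16))    # False
```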
2. Don't try to resurrect older servers stuffed in a corner. Virtualization on the x86 platform was enabled by the availability of more powerful VT-enabled processors and requires powerful processors plus lots of memory and disk space.
3. Carefully choose your virtual disk types. Dynamic, fixed and physically linked disks are functionally equivalent, but each offers a different performance profile for different workloads.
4. Dynamic is considered the best virtual disk format because it's easy to set up and offers the most flexibility. Still, because it allocates space only as it's needed, the disk grows over time, which results in a performance hit.
5. The fixed disk method is closest to what's common in the physical world, but the downside shows up in backup. If 250 Gbytes of space is assigned to a set of virtual machines, all 250 Gbytes get backed up even if only 50 Gbytes are in use, which leads to a lot of blank space being stored.
6. Using a physical disk or linked method provides the best performance because virtual machines can write directly to the drive without first staging data in a temporary virtual drive. That enables virtual machines and operating systems to write to the drive concurrently -- but it's not for the faint of heart. "You can get system corruption if you don't set it up properly," Harnish said.
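The trade-off between dynamic and fixed allocation can be seen with an ordinary sparse file, which behaves like a dynamic disk: its apparent size is large, but it only consumes blocks that have actually been written. The snippet below is a filesystem analogy, not virtualization tooling, and the 64-Mbyte figure is arbitrary:

```python
import os
import tempfile

MB = 1024 * 1024

def allocation(path):
    """Return (apparent_size, allocated_bytes) for a file.
    st_blocks is counted in 512-byte units on POSIX systems."""
    st = os.stat(path)
    return st.st_size, st.st_blocks * 512

with tempfile.TemporaryDirectory() as d:
    dynamic = os.path.join(d, "dynamic.img")
    fixed = os.path.join(d, "fixed.img")

    # "Dynamic": claim 64 Mbytes of apparent size, write nothing.
    with open(dynamic, "wb") as f:
        f.truncate(64 * MB)

    # "Fixed": actually write all 64 Mbytes up front.
    with open(fixed, "wb") as f:
        f.write(b"\0" * (64 * MB))

    for name in (dynamic, fixed):
        apparent, allocated = allocation(name)
        print(os.path.basename(name),
              "apparent:", apparent, "allocated:", allocated)
```

On most filesystems the sparse file reports 64 Mbytes apparent but nearly zero allocated, which is also why backing up a fixed disk at face value stores so much blank space.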
7. Because the industry lacks good predictive tools for sizing hosts and removing data, customers should evaluate host requirements carefully. Select a motherboard whose default components can be disabled; that frees expansion slots for a faster video card and other add-ins that boost virtualization performance.
8. Clean up before virtualizing. Run Windows Update and install current hardware drivers, delete unused applications and remove inappropriate applications from the tray. Instant messaging, for example, isn't a good candidate for virtualization. Dump or store what's in the recycle bin, and remove unneeded applications from the startup menu to enhance performance. Also, back up data to physical servers for quick access to historical archives.
9. Businesses shouldn't virtualize 100 percent of their environment or expect any physical-to-virtual (P-to-V) migration to be permanent. Be ready to switch workloads from virtual machines back to physical servers, and virtualize them again, as requirements in the organization shift. P-to-V can be done manually, but it's arduous. Use third-party P-to-V tools from ISVs such as PlateSpin's PowerConvert 6.6, Leostream's P-to-V 3.0 and Acronis' FullCircle, which was launched at TechEd.
10. Carefully consider whether to back Microsoft's virtual hard disk format or VMware's virtual machine disk format. Trying to support both isn't practical. Databases don't virtualize well. What if Microsoft comes out with a specialized version of SQL Server for its virtual server? Linux workloads may be supported on Microsoft's platform, but another product may support mixed workloads even better. Rule of thumb: The value of the data long outlives the value of the server, so the choice of the data format is more crucial than the choice of the platform.
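Whichever format a shop standardizes on, its tools will need to tell the two apart, and the published on-disk signatures make that cheap: VHD files carry a 512-byte footer whose cookie is "conectix" (dynamic VHDs also mirror it at the start of the file), while sparse VMDK extents open with the magic bytes "KDMV". The helper below is our own best-effort sketch based on those signatures, not reference tooling from either vendor:

```python
import os

VHD_COOKIE = b"conectix"   # cookie opening the VHD footer
VMDK_MAGIC = b"KDMV"       # little-endian magic of a sparse VMDK extent

def sniff_disk_format(path):
    """Best-effort guess at a virtual disk's format from its
    on-disk signatures. Returns 'vhd', 'vmdk' or 'unknown'."""
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        head = f.read(512)
        if head.startswith(VMDK_MAGIC):
            return "vmdk"
        if head.startswith(VHD_COOKIE):
            return "vhd"        # dynamic VHDs mirror the footer up front
        if size >= 512:
            f.seek(size - 512)  # fixed VHDs only carry the trailing footer
            if f.read(8) == VHD_COOKIE:
                return "vhd"
    return "unknown"
```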
VMware statistics find that customers typically spend $7 in services for every $1 spent on virtualization software, Harnish said, noting that virtualization software offers a big return on investment but isn't self-sustaining.
"Virtualization is overhyped as the end-all, be-all, and people think all IT staff can go home and it will run by itself," Harnish said. "It's not true."