How To Win with Optimized Virtual Server Protection In Five Steps


The benefits of server virtualization are many. Increasingly, solution providers are fielding requests to transition customers to large-scale virtual server deployments. Those benefits range from cost savings to business flexibility to application agility. Virtualization technologies are rapidly becoming the heart of the modern data center. Here, Phil Curran, senior product marketing manager at CommVault, offers five tips on implementing such solutions.—Jennifer Bosavage, editor

As virtualization has steadily moved out of test/development environments and into the mainstream data center to support business and tier-one applications, customers are faced with new challenges related to the protection and management of data on virtual servers – and virtual machines (VMs) themselves. Technology solution providers that understand these challenges are best positioned to become valued and trusted advisors for their customers.

Here are five guiding principles solution providers can use to establish a leading practice in virtual server data protection and help their customers deploy virtual servers more simply and quickly.




1. Choose a backup solution that is performance-optimized for the virtual platform.

The benefits of virtualized and cloud-based infrastructure, such as workload consolidation, cost savings, application agility and reduced physical footprint, are well documented. However, the demands on the remaining resources for data management and protection quickly multiply. Storage and backup teams are being asked to protect large and growing datastores with compute, network and storage resources that are often taxed to the limit. The amount of data stored on VMs is skyrocketing, compounding an already untenable situation. This can lead to broken backup windows, degraded performance of front-end applications and missed service levels.

End users need a backup solution that delivers the performance, manageability and scalability required to support their most mission-critical applications on virtual servers, with assurance that their data is fully protected and easily recoverable. Solution providers need to offer a modern solution that understands the unique and dynamic nature of their customers’ virtual environment and minimizes the performance impact on front-end applications. By deploying snapshot and replication tools, solution providers can implement a data protection strategy that reduces to minutes the time required to protect the largest datastores and the most demanding workloads while diminishing any impact on front-end applications. An ideal solution marries the efficiency of a snapshot-based approach with a backup catalog that dramatically simplifies recovery operations.

2. Ensure application-aware data protection.


As more critical applications are virtualized, it is necessary to apply the same level of protection and recovery they had in a purely physical server setting. The modern data center demands coordinated backup and restore capabilities that ensure maximum uptime of critical applications running on VMs. The key is selecting a data and information management solution that delivers application-aware, consistent data protection while staying within the constraints imposed by a highly consolidated, virtualized environment.

But not all virtual server backup solutions are equal, and the distinction is in the details. Application-aware protection is not simply a matter of saying “we have it” or “we leverage VSS” (Volume Shadow Copy Service) – that is not good enough for most deployments. Application-aware data protection must include capabilities such as transaction log truncation, seamless search and individual application item recovery to ensure a point-in-time-consistent image of the application running inside a VM. This provides the confidence to virtualize more applications, realizing the benefits of the virtual platform sooner. Solution providers need to understand their customers’ requirements for application protection and recovery and deliver an optimized solution to match.
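To make the workflow concrete, here is a rough sketch of the quiesce-snapshot-truncate sequence behind application-consistent protection. The `App` class and its fields are purely illustrative stand-ins, not any vendor's API; real solutions drive this through VSS writers or application agents inside the guest.

```python
# Hypothetical sketch of application-consistent VM protection: quiesce the
# application, capture a point-in-time image, then truncate transaction logs
# only after the snapshot succeeds. All names here are illustrative.

class App:
    def __init__(self):
        self.data = ["row1", "row2"]
        self.txn_log = ["begin", "insert row2", "commit"]
        self.quiesced = False

def protect(app: App) -> list:
    app.quiesced = True            # flush buffers and pause writes (quiesce)
    try:
        snapshot = list(app.data)  # point-in-time-consistent image
    finally:
        app.quiesced = False       # resume the application immediately
    app.txn_log.clear()            # truncate logs only after a good snapshot
    return snapshot

app = App()
snap = protect(app)
print(snap)         # the consistent image: ['row1', 'row2']
print(app.txn_log)  # [] -- truncation keeps logs from growing unbounded
```

The ordering is the point: crash-consistent snapshots skip the quiesce step, and truncating logs before a verified snapshot would risk losing committed transactions.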



3. Integrate data replication to meet critical application service levels.


Data grows and changes quickly; relying on a backup that’s 24 hours old is no longer sufficient. As organizations deploy more critical applications in the virtual server environment, they need to be able to recover in near-real time – not from a backup copy from the previous night.

For instance, many Exchange deployments have four-hour recovery point objectives (RPO) while many database applications have sub one-hour RPOs. Traditional backups have no hope of meeting these service level agreements (SLAs). Creating frequent recovery points without impacting production activity is a huge challenge.
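The arithmetic behind those SLAs is simple: with recovery points taken every N hours, the worst case is a failure just before the next copy, so the maximum data loss equals the interval. A minimal sketch (illustrative values only, not tied to any product):

```python
# Hypothetical sketch: can a given recovery-point interval meet an RPO?
# Worst-case data loss is one full interval (failure just before the next copy).

def worst_case_data_loss_hours(interval_hours: float) -> float:
    return interval_hours

def meets_rpo(interval_hours: float, rpo_hours: float) -> bool:
    return worst_case_data_loss_hours(interval_hours) <= rpo_hours

print(meets_rpo(24, 4))    # nightly backup vs. 4-hour Exchange RPO -> False
print(meets_rpo(1, 4))     # hourly snapshots vs. 4-hour RPO -> True
print(meets_rpo(0.25, 1))  # 15-minute replication vs. sub-one-hour RPO -> True
```

This is why a once-nightly backup cannot satisfy a four-hour RPO, no matter how fast it runs: only more frequent recovery points, via snapshots or replication, close the gap.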

Solution providers need to integrate VM snapshot and data replication technologies to enable customers to rapidly create recovery-ready copies of production data efficiently and cost-effectively. Advanced capabilities such as data encryption, compression and bandwidth throttling ensure data is protected and optimize the use of available network bandwidth. Copies of data can be immediately accessed to create multiple recovery points, to perform traditional backups without impacting server performance or to produce a second copy at another location for disaster recovery. As a result, customers can resume business as usual with minimal disruption and loss of data.

4. Restore data at a granular level.



To accelerate data recovery, customers need an integrated approach to restoring data granularly at the volume, file or application object level. Traditional approaches that require remounting an entire virtual machine datastore and searching through its contents to recover a single file, such as an individual email, are simply too time-consuming and resource-intensive. Newer approaches enable file- and object-level restore, but they may require a second pass to generate the granular catalog, which adds processing time and risk to the data protection process. More advanced solutions allow users to recover individual files or entire VM images from the same single-pass backup operation.
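The single-pass idea can be sketched as follows: build the per-file catalog while the image is being captured, so an individual object can later be located and restored without remounting the whole datastore or re-scanning the backup. The dict-based "disk" and function names are illustrative assumptions; real products walk the guest filesystem inside the VM image.

```python
# Hypothetical sketch of a single-pass backup that catalogs individual files
# as the VM image is captured. The VM "disk" is modeled as path -> bytes.

def single_pass_backup(vm_disk):
    image = {}    # full-image copy, for whole-VM recovery
    catalog = {}  # per-file index built in the same pass (no second scan)
    for path, data in vm_disk.items():
        image[path] = data
        catalog[path] = len(data)  # metadata enabling granular browse/restore
    return image, catalog

def restore_file(image, path):
    # Granular restore: fetch one object directly, no full datastore mount.
    return image[path]

disk = {"/mail/inbox/msg001.eml": b"hello", "/db/data.mdf": b"\x00" * 8}
image, catalog = single_pass_backup(disk)
print(sorted(catalog))                                # browseable file index
print(restore_file(image, "/mail/inbox/msg001.eml"))  # single-item recovery
```

Because the catalog and the image come from one operation, there is no second cataloging pass to schedule, and a single email restore costs one lookup rather than a full remount-and-search.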


5. Provide automated management tools to enhance virtual machine protection.


Administrators can now deploy new VMs in minutes. This ease of implementation can lead to virtual machine sprawl, making it time-consuming to keep track of new VMs and manually apply data protection policies. Important virtual machines may be created and never backed up, creating major risk for the business.

Capabilities that deliver deep hypervisor integration and advanced automation will help customers protect hundreds of VMs in minutes and easily scale to thousands of VMs. Advanced features like VM auto-protection automatically discover new VMs and transparently add them to data protection policies.
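At its core, auto-protection is a reconciliation loop: compare the hypervisor's VM inventory against the set of protected machines and attach anything unprotected to a default policy. A minimal sketch, where the inventory set, policy names and function are all hypothetical stand-ins for a real hypervisor API:

```python
# Hypothetical sketch of VM auto-protection: detect VMs that no backup
# policy covers and attach them to a default policy. In practice the
# inventory would come from a hypervisor API call, not a literal set.

def auto_protect(inventory, protected, default_policy="standard-daily"):
    newly_added = []
    for vm in sorted(inventory):
        if vm not in protected:          # discovered but unprotected VM
            protected[vm] = default_policy
            newly_added.append(vm)
    return newly_added

policies = {"vm-web-01": "standard-daily"}          # already protected
inventory = {"vm-web-01", "vm-sql-02", "vm-exch-03"}  # two new VMs appeared
added = auto_protect(inventory, policies)
print(added)  # ['vm-exch-03', 'vm-sql-02']
```

Run on a schedule or triggered by hypervisor events, a loop like this closes the sprawl gap: a VM spun up this morning is under policy by tonight's backup window without anyone filing a ticket.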

Virtualization is spreading like wildfire, and with it comes the shift to virtualized data centers and around-the-clock operations. To better manage and protect virtual data, end users are increasingly rethinking traditional backup techniques, and so must the solution providers who serve as their trusted advisors. These five guidelines allow solution providers to help customers modernize data protection and overcome virtual server deployment challenges. In doing so, customers can elevate their overall approach to data and information management to a level that allows them to adapt and scale quickly as their business requirements evolve.