What's Holding Up Utility Computing?

While vendors and integrators have long touted the virtues of utility computing, CIOs and end-user companies have taken a cautious approach to a full-scale IT utility. Some have rolled out streamlined computing operations through server consolidation and virtualization; others have relegated utility computing to the back burner.

Some progress can be found, however. In our November 2005 survey of more than 200 IT and business executives at large and midsize companies, 59% said they are planning some type of utility environment this year, with proof-of-concept projects to be completed in 2007. A majority expects to at least partially complete utility deployment by 2010. (See chart, below.)

For the most part, these are small, incremental projects that won't require a lot of human and financial resources. This doesn't discount the potential impact of small-scale deployments, which can result in immediate savings and help users gain experience.

For example, a company running 50 Windows servers at an average of 20% utilization could, through consolidation, virtualization, and workload management, shrink that footprint to as few as five servers while also reducing systems-management and operational staffing.
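
How such a target pencils out can be sketched with a little arithmetic. The sketch below is illustrative only: the capacity of the new hosts relative to the old servers and the utilization you are willing to run them at are assumptions, not figures from our survey.

```python
import math

def consolidation_target(old_servers, avg_utilization,
                         capacity_ratio, target_utilization):
    """Estimate how many consolidated hosts are needed.

    old_servers        -- number of existing physical servers
    avg_utilization    -- their average utilization (0 to 1)
    capacity_ratio     -- capacity of one new host relative to one
                          old server (assumption)
    target_utilization -- utilization the new hosts will run at
                          (assumption)
    """
    workload = old_servers * avg_utilization        # work in old-server equivalents
    per_host = capacity_ratio * target_utilization  # work one new host absorbs
    return math.ceil(workload / per_host)

# 50 servers at 20% utilization, moved onto hosts with 2.5x the
# capacity and run at 80% utilization, works out to 5 hosts.
print(consolidation_target(50, 0.20, 2.5, 0.80))
```

Whether five is actually achievable depends on how well workload peaks overlap and how much headroom the operations team insists on keeping.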

Two important considerations are behind survey participants' move to utility computing. First, nearly all regard its development and deployment as a strategic advantage. Second, they plan to implement utility environments through a series of targeted IT or business services. For example, a $200 billion financial-services firm is creating a utility environment for all its E-mail resources, services, and systems. We also found other financial-services companies, as well as telecommunications and health-care providers, developing IT utility environments for customer-facing print-service resources. The goal is to streamline development, production, privacy, and security for account information in printed customer communications. Such services tend to be managed centrally with a homogeneous set of resources—Windows on Intel server platforms—making them likely candidates for a utility environment.

Although small-scale deployments can deliver immediate short-term business benefits, they can also inflict additional management headaches and costs over the long haul. The best way to avoid these unintended consequences is to first identify inhibitors in the organization, then deploy services incrementally.

We've identified eight sets of inhibitors to utility-computing deployments in most large and midsize organizations. The first is the need to control resources. All business units and IT organizations must understand that utility-computing applications will operate in a shared-resource environment. Departments have to be willing to cede control and management to the utility-operations group. But how do you get them to cooperate?

Creating an internal selling and PR program is a simple, effective way to get executives on board with the plan, according to the corporate CFO for IT at one of the world's largest electronic-component manufacturers, a company with more than 20 IT resource and data centers. The program may include an analysis of the redundancies and costs that sharing resources could eliminate or minimize, along with presentations and other promotional materials that help sell the utility-computing concept.

A second inhibitor is the need for up-front planning. Deploying utility computing is a strategic investment. But many executives, faced with budget pressures, focus primarily on tactical business and IT decisions. Instead, they need to identify long-term, strategic benefits as well as the potential for short-term ROI.

Advance planning should entail building an inventory of all applications that will run within the utility. For each application, document its characteristics in detail: the CPU, network, and storage resources required for allocation; the applications, middleware, and operating systems needed for provisioning; all data requirements; and the operational characteristics and policies, such as availability, batch window, billing, reporting, and response time. With that information in hand, the organization can adjust its resources dynamically.
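
A minimal sketch of what one such inventory record might look like follows; the field names, defaults, and example application are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class UtilityAppProfile:
    """One application slated to run inside the utility (illustrative fields)."""
    name: str
    # Resources required for allocation
    cpu_cores: int
    network_mbps: int
    storage_gb: int
    # Software stack needed for provisioning
    operating_system: str
    middleware: list[str] = field(default_factory=list)
    # Data requirements
    datasets: list[str] = field(default_factory=list)
    # Operational characteristics and policies
    availability_pct: float = 99.5       # uptime target
    batch_window: str = "00:00-04:00"    # nightly batch window
    response_time_ms: int = 500          # interactive response-time SLA
    chargeback_code: str = ""            # hook for billing and reporting

inventory = [
    UtilityAppProfile(
        name="customer-statements",
        cpu_cores=8, network_mbps=100, storage_gb=500,
        operating_system="Windows Server",
        middleware=["print-rendering-engine"],
        datasets=["account-master"],
        chargeback_code="FIN-PRINT-01",
    ),
]
```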

Many enterprises have a mishmash of IT environments—Windows, Unix, Linux, and mainframes (IBM's MVS)—and not all utility-software vendors can support all environments. This is the third obstacle, and it could limit the number of applications deployed in the near term to one or two supported environments. Most vendors say they intend to support all common environments in the future. But standards for a truly heterogeneous environment won't be available until 2008. Companies should start deploying a utility in a homogeneous IT environment such as Windows or Linux on Intel server platforms, then extend it to other environments as technologies mature.

The fourth inhibitor is the presence of multiple systems-management frameworks, such as Computer Associates' Unicenter, Hewlett-Packard's OpenView, IBM's Tivoli, and Microsoft's SMS/WLM. Many enterprises have invested significantly in one or more of these, and the inability to manage a utility across them could seriously limit deployment. To start, enterprises should create a utility environment within each of their installed frameworks. Later, when so-called manager-of-managers technologies become more widely available and mature, organizations can merge the frameworks.

Preserving existing IT investments counts as the fifth inhibitor. Executives are wary of bringing their environments up to current levels just to deploy them in a utility. Clearly, this will increase the up-front cost. Organizations must determine the compatibility of all components they plan to deploy within the utility—servers, operating and storage systems, middleware, and independent software vendor applications—before they invest in new systems.

Another inhibitor involves budgeting, appropriation, and vendor disruption. Many businesses have a budget model that assumes a predetermined cost by month, quarter, or year. IT budgets and chargeback systems will likely have to adapt to the dynamic usage of resources within a utility. Old methods of capacity planning and costing will become obsolete within the utility—not only for the utility infrastructure as a whole, but also for applications' resource-usage charges.
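
As a rough illustration of that shift, the sketch below prices each application's metered consumption against per-unit rates instead of a flat budget line; the rates and usage figures are invented for the example.

```python
# Hypothetical per-unit rates for the shared utility (assumptions).
RATES = {
    "cpu_hours":  0.12,   # $ per CPU-hour
    "storage_gb": 0.05,   # $ per GB-month
    "network_gb": 0.02,   # $ per GB transferred
}

def monthly_charge(usage):
    """Price one application's metered usage for the month."""
    return sum(RATES[item] * amount for item, amount in usage.items())

# Metered usage varies month to month, so the charge does too,
# unlike a predetermined monthly budget figure.
email_jan = {"cpu_hours": 4_000, "storage_gb": 1_200, "network_gb": 900}
email_feb = {"cpu_hours": 5_500, "storage_gb": 1_250, "network_gb": 1_400}

print(f"January:  ${monthly_charge(email_jan):,.2f}")
print(f"February: ${monthly_charge(email_feb):,.2f}")
```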

Meanwhile, a utility-computing environment will tax IT vendors' ability to serve a dynamic market given to the kinds of peak and off-peak cycles seen in the energy and telecom industries. Many hardware vendors offer a capacity-on-demand option, whereby additional resources are billed to the customer as required. However, the dynamic capability of a utility will be especially disruptive to software vendors with pricing models based on the total capacity of the system to which their product is licensed. Some software vendors are beginning to respond to customer demand for pay-as-you-go licenses, and this will likely increase utility adoption.
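
A simple comparison shows why. Under a capacity-based license, the customer pays for every CPU in the pool the product could run on; under a pay-as-you-go license, it pays only for the capacity a workload actually consumed. The prices and usage figures below are invented for illustration and reflect no particular vendor's terms.

```python
PRICE_PER_LICENSED_CPU = 1_000   # capacity-based: per installed CPU (assumption)
PRICE_PER_CPU_HOUR     = 0.50    # pay-as-you-go: per CPU-hour used (assumption)

def capacity_based_cost(installed_cpus):
    """Pay for the total capacity the product is licensed to run on."""
    return installed_cpus * PRICE_PER_LICENSED_CPU

def usage_based_cost(cpu_hours_used):
    """Pay only for the capacity the workload actually consumed."""
    return cpu_hours_used * PRICE_PER_CPU_HOUR

# A 64-CPU utility pool in which this application keeps an average of
# 16 CPUs busy over a 720-hour month.
print(capacity_based_cost(64))        # charged on the whole pool
print(usage_based_cost(16 * 720))     # charged on actual consumption
```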

Limited employee skills are another inhibitor. Many organizations, especially those that haven't deployed a systems-management framework, may lack the expertise needed to deploy a utility environment. Even if the deployment plan calls for third-party consulting or outsourcing, in-house training is a must, especially when it comes to developing and managing related policies and service-level agreements—tasks involving both IT and the business units.

The final inhibitor to utility computing is ROI. Many organizations require a minimum project ROI within a specified time—for example, a 15% return within 24 months. Unfortunately for CIOs, the deployment of utility computing is a strategic investment that commands high up-front costs for planning, training, software, and hardware setup, and it requires a long-term outlook toward realizing a substantial return.

It generally takes at least two years for utility computing to yield a positive ROI. But what if your company or business unit requires every IT investment to achieve payback within 12 months? At one U.S. consumer-brokerage firm, IT is complying by segmenting its deployment plan into measurable, process-specific chunks that can satisfy the rule when treated individually.
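
A rough sketch of the arithmetic behind that approach: judged as one monolithic program, the heavy up-front planning, training, and setup costs push payback past a 12-month rule, but each targeted phase, judged on its own cost and its own savings stream, can clear the bar. Every figure below is an illustrative assumption.

```python
def payback_months(upfront_cost, monthly_saving):
    """Months of savings needed to recover the up-front cost."""
    return upfront_cost / monthly_saving

# One monolithic project: all the planning, training, and setup cost
# lands up front, and payback slips past a 12-month rule (illustrative figures).
print(f"full program: {payback_months(2_400_000, 150_000):.1f} months")

# The same effort split into phases that are funded and measured
# individually, each producing savings as soon as it goes live.
phases = [
    ("e-mail consolidation", 400_000, 45_000),
    ("print services",       600_000, 60_000),
    ("storage utility",      700_000, 65_000),
]
for name, cost, saving in phases:
    print(f"{name}: {payback_months(cost, saving):.1f} months")
```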

After identifying the obstacles to utility computing, organizations can embark on an action plan. The full deployment assumes the availability of a complex set of functions in a mature, operational environment. Standards for communication among these interrelated functions, particularly in a heterogeneous environment, are also required.

Businesses looking to start a project in the first quarter of this year should identify a small subset of the enterprise IT environment that is homogeneous. This reduces the planning period for the first phase of deployment, since fewer applications, storage/data resources, and technologies will have to be analyzed and accommodated.

Determine readiness by asking: