Seven reasons virtualization hasn't fully taken off in the enterprise

Virtualization is certainly not a new technology, but actual deployments have lagged behind predictions.

Those selling virtualization dream of markets with 90 percent penetration, but the reality is that enterprise adoption falls far short of that. This is despite the fact that server virtualization is nearly 50 years old: it originated in the mid-1960s, when IBM's CP-40 research project and its successor, CP/CMS, provided virtualization capability on System/360 mainframes.

So why is virtualization taking so long?

#1 Disappointing Returns on Investment (ROI)

Yes, you read that right. No one expects virtualization to fail on ROI grounds, because it seems a “no-brainer” that removing servers from the data center floor, conserving floor space and reducing energy consumption, will produce a return. And it does. The question is whether that return is enough.

According to a recent CA survey of 800 organizations, 44 percent of respondents were unable to declare their virtualization efforts a success. A central issue was not having sufficient IT metrics in place to assess whether virtualized machines were performing at optimal levels.

#2 An inability to assure that everything in a virtual environment works well together

It’s great to virtualize servers, but end-to-end applications also depend on storage and networks. If those resources aren’t virtualized, you don’t have a fully virtualized environment, and it is difficult to isolate what the virtualized portion is contributing to application performance. This lack of visibility is extremely frustrating for IT, and it is one of the reasons why virtualization ROI is relegated to reductions in floor space and energy consumption, and little else.

#3 Lack of deep-down virtualization know-how

Virtualization vendors have automated best practices into virtual machine setup, maintenance and tuning, but ultimately “right fitting” your applications and the resources they consume is still a job for IT. Most IT staff came from environments where applications (and application development) ran on dedicated physical servers, where larger operating system footprints were the norm. Unfortunately, staff tend to carry that dedicated-server mentality into the virtual environment, sizing operating systems much larger than they need to be and consuming more resources than their applications, which now share resources, actually require.

#4 Resistance from staff

Application developers who grew up with cheap distributed physical servers like the idea of having their “own” physical server dedicated to their development work. Overcoming that attachment requires managers and CIOs to work harder at effecting the long-term changes in IT culture that must accompany changes in process.

#5 Resistance from vendors

Every software vendor you buy from publishes a set of “best practices” designed to optimize the performance of its particular application. Unfortunately, many of these best practices assume physically dedicated servers, and are not intended for the shared environment that virtualization creates.

#6 The rise of Big Data

Big data, and the parallel computing it requires, is not a good candidate for virtualization and must run on its own iron.

#7 Other pressing projects

The rise of big data, together with the continuous stream of new projects that IT is asked to take on, rapidly fills up work schedules. In most cases, this turns virtualization into a kind of “plodding exercise” that gets done in pieces and in the background, whenever organizations find the time to address it.

Despite this assortment of challenges, virtualization is a key foundational piece of the data center's future evolution and of the ascent to the cloud, which is where IT is going. Some organizations will get there faster than others, but none will be without it. The key for IT managers and strategists is recognizing how virtualization can pay off once implemented, where the impediments to fully optimized results are likely to arise, and how to minimize those risks as you strive for optimized performance.


Mary E. Shacklett is president of Transworld Data, a technology research and market development firm. Prior to founding the company, Mary was Senior Vice President of Marketing and Technology at TCCU, Inc., a financial services firm; Vice President o...
