Virtualization

Seven reasons virtualization hasn't fully taken off in the enterprise

Virtualization is certainly not a new technology, but actual deployments have lagged behind predictions.

Those selling virtualization dream of markets with 90 percent penetration, but the reality is that enterprise virtualization is far below that, even though server virtualization is nearly 50 years old: it originated in the mid-1960s with IBM's CP-40 research project and the CP/CMS system that brought virtualization to IBM System/360-67 mainframes.

So why is virtualization taking so long?

#1 Disappointing Returns on Investment (ROI)

Yes, you heard that right. No one expects virtualization to fail on ROI, because it seems a "no-brainer" that removing servers from data center floors, thereby conserving floor space and reducing energy consumption, will deliver a return, and it does. The question is whether that return is enough.

According to a recent CA survey of 800 organizations, 44 percent of respondents were unable to declare their virtualization efforts a success. A central issue was not having sufficient IT metrics in place to assess whether virtualized machines were performing at optimal levels.
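A back-of-envelope sketch shows why the floor-space and energy return, while real, may not carry the business case on its own. Every figure below is a hypothetical assumption, not data from the article or the survey:

    # Back-of-envelope virtualization ROI sketch. Every figure here is a
    # hypothetical assumption for illustration only.

    physical_servers = 100          # servers before consolidation (assumed)
    consolidation_ratio = 10        # VMs per host after virtualization (assumed)
    hosts = physical_servers // consolidation_ratio

    watts_per_server = 400          # average draw per machine (assumed)
    kwh_cost = 0.10                 # USD per kWh (assumed)
    hours_per_year = 24 * 365

    def yearly_power_cost(machines, watts=watts_per_server):
        # machines drawing `watts` each, running all year
        return machines * watts / 1000 * hours_per_year * kwh_cost

    energy_savings = yearly_power_cost(physical_servers) - yearly_power_cost(hosts)

    # Costs that often erode the "no-brainer" return (both assumed):
    hypervisor_licensing = hosts * 5000     # per-host license and support
    shared_storage_upgrade = 40000          # year-one SAN investment

    net_year_one = energy_savings - hypervisor_licensing - shared_storage_upgrade
    print(f"Energy savings/yr: ${energy_savings:,.0f}")
    print(f"Net year-one ROI:  ${net_year_one:,.0f}")

With assumptions like these, year one comes out negative until the licensing and storage outlays are amortized, which is exactly the "is this enough?" question.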

#2 An inability to assure that everything in a virtual environment works well together

It’s great to virtualize servers, but end-to-end applications also require storage and networks. If those resources aren’t virtualized, you don’t have a fully virtualized environment, and it’s difficult to assess what the virtualized portion is actually contributing to application performance. This lack of visibility is extremely frustrating for IT, and it is one reason virtualization ROI gets relegated to reductions in floor space and energy consumption, but little else.

#3 Lack of deep-down virtualization know-how

Virtualization vendors have automated best practices into virtual machine setup, maintenance and tuning, but ultimately “right-sizing” your applications and the resources they consume is still a job for IT. Most IT staff members come from environments where applications (and app development) ran on dedicated physical servers, so they are used to larger operating system footprints than a virtual environment, where applications share resources, requires. Staff tend to carry that dedicated-server mentality over to the virtual environment, sizing operating systems much larger than they need to be and consuming more resources than necessary, as the sketch below illustrates.
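A minimal sketch of what right-sizing against observed peaks looks like. The inventory and the 25% headroom margin are invented for illustration; in practice the peak figures would come from your hypervisor's performance counters:

    # Flag VMs sized with a "dedicated server" mentality.
    # The inventory below is invented for illustration.

    vm_inventory = [
        # (name, vCPUs allocated, GB RAM allocated, peak vCPUs used, peak GB used)
        ("app01",  8, 16, 1.5,  3.2),
        ("db01",   8, 32, 6.0, 24.0),
        ("web01",  4,  8, 0.5,  1.1),
    ]

    HEADROOM = 1.25   # keep 25% above observed peak; an assumed safety margin

    for name, vcpus, ram, peak_cpu, peak_ram in vm_inventory:
        right_cpu = max(1, round(peak_cpu * HEADROOM))
        right_ram = max(1, round(peak_ram * HEADROOM))
        if vcpus > right_cpu or ram > right_ram:
            print(f"{name}: allocated {vcpus} vCPU/{ram} GB, "
                  f"right-sized ~{right_cpu} vCPU/{right_ram} GB")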

#4 Resistance from staff

Application developers who grew up with cheap, distributed physical servers like having their “own” physical server dedicated to their development work, and they are reluctant to give it up. This means their managers and CIOs must work harder at effecting the long-term changes in IT culture that must accompany changes in process.

#5 Resistance from vendors

Every software vendor you buy from publishes a set of “best practices” designed to optimize the performance of its particular application. Unfortunately, many of these best practices assume physically dedicated servers and are not intended for the shared environment that virtualization creates.

#6 The rise of Big Data

Big data, and the parallel computing it requires, is not a good virtualization candidate; it must run on its own iron.

#7 Other pressing projects

The rise of big data, and the continuous stream of new projects that IT is asked to do, rapidly fill up work schedules. In most cases, this renders virtualization a kind of “plodding exercise” that gets done in pieces and in the background, whenever organizations have the time to address it.

Despite these challenges, virtualization is a key foundational piece of, and driver for, the future evolution of the data center and the ascent to the cloud, which is where IT is going. Some organizations will get there faster than others, but none will be without it. The key for IT managers and strategists is recognizing how virtualization can pay off once you implement it, where the impediments to fully optimized results are likely to appear, and how to minimize those risks as you strive for optimized performance.


About

Mary E. Shacklett is president of Transworld Data, a technology research and market development firm. Prior to founding the company, Mary was Senior Vice President of Marketing and Technology at TCCU, Inc., a financial services firm; Vice President o...

5 comments
iulius.bidalach

The cloud is a concept... you can use your company's own servers as "private clouds" for your branch offices.

Security is a concern because many admins do not configure their cloud infrastructure properly... you can encrypt your data on the cloud server and back up server content somewhere else... you can also use certificate-based authentication, etc.

But I agree with you, it is a bad choice to hand your resources over to another company... you end up depending on them... not to mention the network bottlenecks caused by the slowest link on the route to the cloud server...

and the fact that all the hosts in your company will have to share this single link to reach the cloud data server... and the fact that you also depend on your ISP... one day without Internet and your company stops. All services gone... picnic day for everyone! Unless you use two ISPs for failover redundancy...

but even so, the problem may be somewhere between your country and the cloud server... so two ISPs may not guarantee you redundancy, because at a certain point both of them will use the same portion of the fastest route to reach the cloud server...

I see the cloud as an IT fashion... too bad that so many invest in it without looking long enough at its drawbacks... let the sheep be swallowed by the river... they deserve it...

Tink!

I am still baffled as to why IT is moving so steadily and quickly toward the cloud. I know there are security measures, but still, the cloud seems so unsafe to me.

TRgscratch

Wrt #5: As a software vendor, I have no problem with virtualized servers UNTIL the customer demands some performance metrics. Then I demand to spec all the hardware.

iulius.bidalach

#1 Disappointing Returns on Investment (ROI)

Wrong.

For the same infrastructure, virtualization can help isolate services that would otherwise be installed on a single physical server. Isolating services means dividing responsibilities among IT staff and avoiding failure propagation. It brings a huge return for the company because it keeps the other services running. In my case it was a success because the developer team providing technical assistance on a software solution kept blaming the sysadmin for the collateral issues they caused on the server, and asking him to solve them. Keeping their playground apart from other services such as DC, DNS and DHCP, which were otherwise installed on the same physical server, was a relief for me. It actually safeguarded my job, provided smooth operation for the company and therefore increased its efficiency. Because of that, yes, in my organization it was a 100% success.
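The arithmetic behind that isolation argument is simple enough to sketch; the service list and incident rate below are assumptions:

    # Sketch of the failure-isolation argument: co-located services all go
    # down together, isolated services fail alone. Figures are assumptions.

    services = ["DC", "DNS", "DHCP", "dev-playground"]
    incidents_per_year = 4          # assumed rate of service-crashing incidents

    # Co-located: every incident affects all services on the shared server.
    colocated_impact = incidents_per_year * len(services)

    # Isolated: each incident is contained to the one VM hosting the service.
    isolated_impact = incidents_per_year * 1

    print(f"service-outages/year, co-located: {colocated_impact}")
    print(f"service-outages/year, isolated:   {isolated_impact}")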

#2 An inability to assure that everything in a virtual environment works well together

Wrong.

Hypervisors such as Hyper-V Server allow pass-through direct access to a physical disk if needed. Furthermore, any hypervisor allows network configuration in bridged mode, which gives direct access to the LAN. You don’t need to virtualize those resources at all.

It is not a question of impossibility, but of the competence of whoever the company hires for the job.

I don’t find this frustrating at all, and I don’t see any visibility issue… your virtual server will be perfectly visible on the LAN if properly configured. As for performance, there is indeed a slight decrease in disk I/O, but you can use buffer memory to compensate. Also, if you use dedicated fixed-size VHDX disks and the system is prepared bottom-up from scratch, maximum performance will be achieved.
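One way to put a number on that "slight decrease" is to run the same timed sequential write inside the VM and on comparable physical hardware and compare the MB/s. A minimal sketch; the file size, block size and path are assumptions:

    # Minimal sequential-write benchmark sketch. Run it in the VM and on
    # physical hardware and compare. Size, block and path are assumptions.
    import os
    import time

    def sequential_write_mbps(path, total_mb=256, block_kb=1024):
        block = os.urandom(block_kb * 1024)
        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(total_mb * 1024 // block_kb):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())   # force data to disk, not just the cache
        elapsed = time.perf_counter() - start
        os.remove(path)
        return total_mb / elapsed

    print(f"{sequential_write_mbps('io_test.bin'):.1f} MB/s")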

#3 Lack of deep-down virtualization know-how

Lack of deep-down know-how, agreed. Resource allocation in a virtual environment can be managed more easily, and in much the same way as on a physical server. Hypervisors allow for resource management (CPU, memory, disks, NICs, etc.), and some of the settings can be altered in real time, with the server online.

The increased resource consumption of a virtual machine is insignificant for the physical server, falling below an extra 5% compared with a bare-metal installation of the OS. This increase is fully cancelled out by the better resource management the hypervisor provides: a physical server may have 4 GB of RAM fully and permanently dedicated, but a virtual server can be assigned variable memory, with 2 GB when idle and 4 GB under full load. The unused memory can be redirected to the other virtual machines on the hypervisor. A reduction in hardware cost? YES, without a doubt.
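Worked out with the commenter's own 2 GB/4 GB figures; the number of VMs and how many peak at once are assumptions:

    # Static vs. dynamic memory allocation, using the commenter's figures.
    # VM count and concurrency of peaks are assumptions for illustration.

    vms = 10
    static_gb_per_vm = 4            # fully and permanently dedicated
    dynamic_idle_gb = 2             # assigned when idle
    dynamic_peak_gb = 4             # assigned under full load
    vms_at_peak = 3                 # assume only a few VMs peak at once

    static_host_ram = vms * static_gb_per_vm
    dynamic_host_ram = (vms_at_peak * dynamic_peak_gb
                        + (vms - vms_at_peak) * dynamic_idle_gb)

    print(f"static allocation:  {static_host_ram} GB of host RAM")   # 40 GB
    print(f"dynamic allocation: {dynamic_host_ram} GB of host RAM")  # 26 GB

The gap between the two totals is the hardware saving the commenter is pointing at.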

#4 Resistance from staff

Wrong. Nowadays application developers access any server (physical or virtual) via Remote Desktop Services, because of its obvious work-from-home benefits. The development team can have its own virtual server.

#5 Resistance from vendors

Wrong. Vendors refer directly to the server OS and its resources, without explicitly saying whether it is physical or virtual. Some decrease in disk performance is to be expected, but it is successfully mitigated by current hypervisor solutions.

#6 The rise of Big Data

Hypervisors allow pass-through access to the physical disks that hold the big data without problems. This type of access brings no decrease in performance.

#7 Other pressing projects

Wrong. Virtualization decreases deployment time. You can pre-stage tens of virtual machines within minutes, without leaving the comfort of your office at all. There is no need to go to the datacenter room where the physical servers reside, so I really don’t understand how it can be more difficult than preparing a physical server.
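For example, pre-staging a batch of VMs from a single template can be a short loop. The sketch below uses the libvirt/KVM virt-clone CLI for illustration; on the commenter's Hyper-V setup the equivalent would be scripted with PowerShell cmdlets. The template name and naming scheme are assumptions:

    # Pre-stage 20 VMs by cloning a template with virt-clone (libvirt/KVM).
    # Template name and naming scheme are assumptions for illustration.
    import subprocess

    TEMPLATE = "golden-template"     # assumed pre-built template VM

    for i in range(1, 21):           # "tens of virtual machines within minutes"
        name = f"dev{i:02d}"
        subprocess.run(
            ["virt-clone", "--original", TEMPLATE,
             "--name", name, "--auto-clone"],
            check=True,
        )
        print(f"cloned {name}")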

eclypse

I think  #3 Lack of deep-down virtualization know-how is _really_ the key here. We are a very small shop - there are two of us that handle system administration and we do not put anything in "the cloud." Without virtualization, we would probably need at least two or three more people to do the same job we do today. We have VMware, PowerVM, SAN Volume Controller (Storage Virtualization), and we are starting to use Hyper-V as well. We also do all of our own stuff with almost no outside consulting. So, sometimes it takes us a couple of extra days to RTFM for new things, but once they're set up and running, they don't require much beyond basic attention and maintenance. We are very fortunate to have the setup that we have for the size shop that we are. 

The initial setup of our virtual environment was probably the most difficult part. I would even suggest that big data isn't an issue for most environments that could benefit from virtualization, and where it is, the choice of virtualization platform/hypervisor might be the thing to reconsider.

The benefits we have seen from this are well worth the investment. Business continuity and disaster recovery are so much more easily handled when your environment is virtualized. Just not having to deal with all those little pizza-box servers/PC servers/etc. is also a nice benefit. Creating a new server takes minutes compared to hours or days. Even if you use something like IBM's PowerVM environment, it is still relatively quick to provision new LPARs compared to buying a whole new server and setting it up. And if you do proper backups, restoring a virtual server is much easier than what we had to do previously, especially after a hard drive failure or when restoring on different hardware.

Now with options like IBM's Flex Systems, which have everything built into one chassis, even SMBs can get the benefits of server and storage virtualization.