Last week, I had the honor of again attending the VMworld conference in San Francisco. VMworld isn’t just about virtualization anymore. If you haven’t heard, many of the key messages revolve around cloud technologies.
I am quickly developing a preference for a collection of private cloud solutions. A private cloud solution will be more easily adopted by many organizations for the simple fact that data does not leave their infrastructure. There is truly an amazing array of options for the infrastructure administrator today when it comes to architecting private cloud solutions.
The one theme that seems to come back to me each time is bandwidth. The first of these challenges is site-to-site connectivity. If site-to-site bandwidth is high, many options become available with ease. These include robust off-site backups to another resource in your environment, replicated workloads, dual-production sites instead of production and disaster recovery sites, and more. The other challenge is Internet bandwidth. Most connections are adequate for download traffic, but the new grail is upload bandwidth. That’s key to utilizing a public cloud-based data protection solution, which can be a good entry-point technology. Some organizations and geographic locations are fortunate enough to have bandwidth that is readily available or inexpensive. Jason Hiner’s recent TechRepublic poll drew over 2,500 responses, and 66% reported that their home Internet connection was faster than their work connection. As an infrastructure administrator, I find that disturbing.
In talking with other administrators at VMworld, it was clear that I am not the only one dealing with these challenges. This is made more complicated as infrastructure teams may not be the same teams that manage (and pay for) these network connections within an organization. Anyone who has gone through a budget process knows that it can be difficult to get another group to increase their costs for your project.
The next step is figuring out how to achieve more bandwidth. One approach is to build a service catalog and engage application owners in supporting the case for more bandwidth. This can lend legitimacy to the request as well as surface concerns that might otherwise go unaddressed. Another strategy is to use better technologies to achieve these goals, such as compression and de-duplication applied before data crosses the network for large data migrations.
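To illustrate the idea of shrinking data before it crosses the wire, here is a minimal Python sketch of chunk-level de-duplication plus compression. The function names, fixed chunk size, and in-memory "store" are illustrative assumptions for this example, not a description of any specific product; real migration tools use far more sophisticated variable-length chunking and streaming.

```python
import hashlib
import zlib

def dedupe_and_compress(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks, keep one compressed copy of each
    unique chunk, and record the ordered chunk hashes needed to rebuild the
    original stream. Only unique compressed chunks (plus the small hash
    manifest) would need to be transferred."""
    store = {}      # chunk hash -> compressed unique chunk
    manifest = []   # ordered hashes to reconstruct the original data
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(chunk)
        manifest.append(digest)
    return store, manifest

def rebuild(store, manifest) -> bytes:
    """Reverse the process on the receiving side."""
    return b"".join(zlib.decompress(store[d]) for d in manifest)

# Highly repetitive data (common with many similar VM images) shrinks a lot.
payload = b"ABCD" * (4096 * 8)            # 128 KB of repeated content
store, manifest = dedupe_and_compress(payload)
wire_bytes = sum(len(c) for c in store.values())
print(f"original: {len(payload)} bytes, unique compressed: {wire_bytes} bytes")
assert rebuild(store, manifest) == payload
```

The point of the sketch is the ratio: repetitive payloads reduce to a handful of unique compressed chunks, so the same migration fits into far less upload bandwidth.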
How do you go about the battle for bandwidth? Please share your comments below.
Rick Vanover is a software strategy specialist for Veeam Software, based in Columbus, Ohio. Rick has years of IT experience and focuses on virtualization, Windows-based server administration, and system hardware.