Software and hardware fail. It's unavoidable.
To help limit failure, though, IT organizations can invest disproportionately more money to go from 99.9% availability to 99.99% availability, otherwise called three or four nines, respectively. Three nines availability translates to 8.76 hours of unplanned downtime a year vs. 52.56 minutes of downtime for four nines availability.
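The downtime figures above are simple arithmetic: yearly downtime is the unavailable fraction multiplied by the hours in a year. A quick sketch of the math:

```python
# Back-of-the-envelope downtime math:
# yearly downtime = (1 - availability) * hours in a year.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def downtime_per_year(availability):
    """Return (hours, minutes) of expected unplanned downtime per year."""
    hours = (1 - availability) * HOURS_PER_YEAR
    return hours, hours * 60

three_nines_hours, _ = downtime_per_year(0.999)   # three nines
_, four_nines_minutes = downtime_per_year(0.9999) # four nines
print(f"99.9%  -> {three_nines_hours:.2f} hours/year")   # ~8.76 hours
print(f"99.99% -> {four_nines_minutes:.2f} minutes/year") # ~52.56 minutes
```

Each extra nine cuts downtime by a factor of ten, which is why the cost of chasing it climbs so steeply.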
Amazon Web Services' (AWS) service commitment for S3 provides service credits when availability falls below 99.9%. S3 experienced an outage on February 28th that affected many web properties. While no one should be surprised that S3 experienced an outage, the event does highlight some common misconceptions about public cloud that should be addressed.
Here are three myths to keep in mind when considering migrating to public cloud.
1. Public cloud is highly available
The availability of public cloud versus private infrastructure is relative. Cloud providers such as AWS invest an incredible amount of money in the design and build of Tier 4 data centers to host services such as S3, and customers inherit the physical attributes of the Tier 4 data center such as highly redundant power, cooling, and network connectivity. While physically redundant, services such as S3 and EC2 are incredibly complex due to the scale of the services.
The scale and complexity of public cloud services reduce that availability. While an AWS, Google Cloud, or Microsoft Azure data center may have the availability of a Tier 4 facility, that availability doesn't automatically carry over to the hosted services. Customers must design and budget for four nines availability themselves, which may include distributing services across availability zones (AZs) or even across cloud providers.
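The reason distributing across zones helps is straightforward probability: a replicated service is down only when every zone is down at once. A minimal sketch, assuming zone failures are independent (real availability zones only approximate this):

```python
def combined_availability(per_zone, zones):
    """Availability of a service replicated across `zones` zones,
    assuming independent failures: the service is unavailable only
    when all zones are unavailable simultaneously."""
    return 1 - (1 - per_zone) ** zones

# Two zones at 99.9% each -> 99.9999%, far beyond four nines.
print(f"{combined_availability(0.999, 2):.6f}")
```

This is why a single-AZ deployment inherits the service's own availability at best, while even modest replication can exceed the four nines target, provided the failure modes really are independent.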
2. Public Cloud is cheap
Many stories highlight how small startups were able to disrupt larger incumbents by leveraging public cloud. The limited startup costs associated with public cloud are legendary—look no further than Snap, the maker of Snapchat. Snap filed for an IPO at somewhere around a $24 billion valuation. Snap's service was born in the public cloud and competes against Facebook's Instagram service.
There's no doubt that public cloud provides companies like Snap the agility and scale to compete with the likes of Facebook. However, Snap is reportedly committed to spending over $2 billion on cloud services between AWS and Google Cloud Platform (GCP). There are also examples of companies such as Dropbox leaving AWS to optimize for cost as their services matured. Public cloud provides agility and the ability to quickly scale or retire services as a project succeeds or fails. Cost savings are possible, but they are not a given.
3. Cloud is just someone else's computer
I see the meme on laptops during every tech conference. The public cloud isn't simply someone else's computer, and it's an oversimplification to call public cloud a hosted data center service. Public cloud, and cloud computing in general, is a different way to deliver computing services. Cloud computing customers can deploy highly redundant, geographically diverse infrastructure just by logging into a dashboard and clicking through a few wizards.
Compare that to deploying a multi-site application on physical compute infrastructure. It takes several months just to provision redundant network connectivity, not to mention the time needed to negotiate space in a hosted data center before building and deploying the physical infrastructure.
Like many revolutionary services, there are plenty of myths about cloud computing. What are some of the myths you've come across regarding cloud computing? Tell us in the comments.
Keith Townsend is a technology management consultant with more than 15 years of related experience designing, implementing, and managing data center technologies. His areas of expertise include virtualization, networking, and storage solutions for Fortune 500 organizations. He holds a BA in computing and an MS in information technology from DePaul University.