It’s the epitome of first-world IT problems: How do you select an enterprise data backup product when there are so many choices?
There’s no single right answer, but there are many factors that are easily overlooked when making the decision. I posed the question to Enterprise Strategy Group analyst Christophe Bertrand because ESG analysts have a reputation for candor in the storage industry.
The first thing Bertrand said is that companies need to understand that their data footprint is usually larger than it appears. “On average, for every 1 terabyte of production data, you’re going to have 4-5 terabytes of protection data,” which could be backups, clones, and snapshots, he noted. “There’s also another 5 terabytes of enablement data,” which is used for functions such as analytics and quality assurance, he explained.
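Bertrand’s rule of thumb can be turned into a quick back-of-the-envelope calculation. The sketch below is illustrative only; the ratios are midpoints drawn from his quote, not vendor figures, and the function name is ours.

```python
# Rough sizing sketch based on Bertrand's rule of thumb: each terabyte of
# production data tends to carry ~4-5 TB of protection copies (backups,
# clones, snapshots) plus ~5 TB of enablement data (analytics, QA).
# The default ratios below are illustrative midpoints, not vendor figures.

def estimate_total_footprint_tb(production_tb: float,
                                protection_ratio: float = 4.5,
                                enablement_ratio: float = 5.0) -> float:
    """Return the estimated total storage footprint in terabytes."""
    protection_tb = production_tb * protection_ratio
    enablement_tb = production_tb * enablement_ratio
    return production_tb + protection_tb + enablement_tb

# Example: 10 TB of production data implies roughly 105 TB overall.
print(estimate_total_footprint_tb(10))  # 105.0
```

In other words, a company that thinks of itself as a “10 terabyte shop” may need to budget for an order of magnitude more capacity.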
SEE: Power checklist: Managing backups (Tech Pro Research)
Bertrand said another critical factor is the size of your organization. Size affects whether your data will be stored entirely on-premises, at an off-site data center, in a cloud, or in some hybrid configuration. Size also determines who will administer the backups, ranging from storage specialists in large corporations to junior IT generalists in smaller companies.
Regulations and security are equally important. Data backups used to matter solely for just-in-case situations such as broken hard drives or corrupted bits, but now they’re also essential for keeping your business in compliance with laws ranging from industry-specific rules (HIPAA, for example) to broad privacy regulations (such as GDPR).
Then there are the perpetually underappreciated issues of what your backup data does for fun while it’s waiting there, and whether it’s ready to be recovered in urgent situations. Bertrand’s predecessor Jason Buffington spoke earlier this year about ways to make good use of backup data for other purposes. Bertrand added that it’s one thing to be good at making backup copies of your company data, but do you know how long it takes to reincarnate your servers and shared drives from those backups, or whether the backups work at all? Policies for regularly scheduled tests of backup recoverability matter just as much as policies for creating backups in the first place.
“The biggest mistake is to not do your homework in terms of recovery point objectives and recovery time objectives,” Bertrand said.
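That homework boils down to two numbers per system: the recovery point objective (how much data you can afford to lose) and the recovery time objective (how long you can afford to be down). A minimal sketch of the check, with hypothetical parameter names and sample figures of our own choosing:

```python
# Minimal sketch of the RPO/RTO homework Bertrand describes: compare a
# system's backup schedule and measured restore speed against its
# objectives. Parameter names and sample numbers are hypothetical.

def meets_objectives(backup_interval_hours: float,
                     measured_restore_hours: float,
                     rpo_hours: float,
                     rto_hours: float) -> bool:
    """True if the schedule and restore speed satisfy both objectives.

    Worst-case data loss equals the gap between backups, so the backup
    interval must not exceed the RPO; the tested restore time must not
    exceed the RTO.
    """
    return (backup_interval_hours <= rpo_hours
            and measured_restore_hours <= rto_hours)

# A nightly backup that takes 8 hours to restore fails a 4-hour RTO:
print(meets_objectives(24, 8, rpo_hours=24, rto_hours=4))  # False
# Hourly backups restoring in 2 hours pass a 4-hour RPO and RTO:
print(meets_objectives(1, 2, rpo_hours=4, rto_hours=4))    # True
```

Note that the restore time has to come from an actual recovery test, not a vendor brochure, which is exactly why recoverability testing belongs in the backup policy.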
SEE: Cloud v. data center decision (ZDNet special report) | Download the report as a PDF (TechRepublic)
After evaluating all these factors, companies can consider whether to buy traditional backup software, a converged backup appliance, or a cloud-based service. Traditional backup software is a good fit for large businesses that have storage specialists and for small businesses where the data quantity is easily digestible. Converged appliances are ideal for companies with many branch offices, thanks to the technology’s plug-and-play design and minimal need for additional hardware purchases. Cloud backup services (or self-hosted clouds in the form of off-site replication) may be right for companies whose data is created in regions prone to man-made or natural disasters.
“There are many vendors out there, and everyone does a pretty decent job at protecting data and recovering data,” Bertrand said. His parting advice: Don’t overprotect non-critical data, understand the gap between your desired and acceptable downtime, and plan ahead for five years from now, when IT workers may have grown up with touchscreens instead of command lines.