It’s no secret that humans have spent the last few decades using technology to automate as many manual tasks as possible.
However, the data center that powers much of this automation is still very much a manual operation in a shockingly large number of organizations. According to a new research report sponsored by Intel DCM, 43% of data centers use manual methods for tasks like capacity planning and forecasting.
This State of the Data Center report surveyed 200 data center managers operating in the US and UK. Jeff Klaus, the general manager of data center solutions at Intel, said that he was surprised by how high that number was.
While he knows a number of operators rely on manual methods, he assumed the figure would be closer to 25%. One potential explanation is that the operators simply do not know what automation capabilities are available to them.
“The other is that they know, and they just don’t like what they see: what options they have available to them are either too complicated, too expensive, or both,” Klaus said.
Among the manual methods mentioned above, Microsoft Excel was a popular tool for planning. Additionally, almost 10% of respondents said that they walk around the data center with a tape measure.
The percentage of those using manual methods remained steady regardless of the size of the data center. Almost half said they used manual methods because they thought the alternative approach would be too expensive, and 35% didn’t think they had the right resources to make the switch.
A little more than half of the respondents reported using data center infrastructure management (DCIM) tools. Still, about half of respondents said that manual processes took up 40-60% of their time. Klaus said he believes automation through a DCIM product could free up some of this time. The key would be putting that time to good use.
“You don’t want someone, if you give them more time, you don’t want them to do more email and make sure their inbox is clean,” Klaus said. “That doesn’t really get the company anything incremental that’s significant. You want them to look at new projects.”
Another issue the report looked at was cooling efficiency. According to the report, 63% of respondents were using DCIM analytics to optimize cooling efforts. Respondents also mentioned using rack sensors and hot-spot audits (physically using a thermal gun to measure temperature).
Of those surveyed, 48% performed hot-spot audits as part of their cooling strategy, while 7% relied on this method exclusively. The study also found that about 20% of data centers rely solely on rack-level thermal sensors and spreadsheets to manage cooling.
Outages keep nearly every data center manager up at night. Of the 200 respondents, 118 (59%) could put a cost on an outage in their organization, with the average cost hitting $28,900. The average time to recovery was seven hours and 53 minutes, almost an entire work day.
The two biggest challenges in the data center, as determined by respondents, were lack of space (75%) and power constraints (63%).
As noted, the report surveyed data center managers in the US and UK only, with 100 participating from each region. When asked how the inclusion of other countries, specifically in Asia and South America, would affect the results, Klaus said he believed the numbers would be worse and there would be more extremes in the data. Still, the numbers as they’re presented are surprising.
“Information like this should be a slap in the face, or a wake-up call, for the many companies that are in the DCIM space, in addition to investors in the space, the VCs that are investing in companies that are in the space,” Klaus said. “It tells me that there’s a lot more work to do, particularly around simplification.”