At first blush, Big Data's impact on the data center looks like just another high-powered application demanding more processing power, more storage, and higher-performance networking. In short, it's yet another burden on data centers that are likely operating near capacity. However, Big Data is not just a tool for business analysis; it can also be a helpful tool for improving the data center itself.
Big security
IT security has become a daily item in the general press and an obvious concern in IT management circles. One of the biggest problems with managing the threat posed by nefarious hackers is the sheer volume of data that must be sifted to identify and track an unwelcome intruder in your IT infrastructure.
Traditionally, countermeasures have focused on individual systems, at best combined with centralized monitoring meant to identify and correlate disparate threats. However, this quickly becomes untenable with tens of thousands of devices and applications, all of which are potential targets.
Big Data's ability to process massive volumes of data in near real time makes it a shoo-in for this requirement, and vendors ranging from the "usual suspects" to new niche players are quickly attempting to enter this space. The benefit is obvious: two or three events in unrelated systems might indicate a subtle attack, and a system that can consolidate, analyze, and identify these events could stop an attack while it's occurring.
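To make the idea concrete, here is a minimal sketch of that kind of cross-system correlation: group low-level events by their source and flag any source that touches several unrelated systems within a short window. The event fields, window size, and threshold are illustrative assumptions, not any vendor's product.

```python
# Hedged sketch of cross-system event correlation; event schema is hypothetical.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)   # correlation window
MIN_SYSTEMS = 3                  # distinct systems before we raise a flag

def correlate(events):
    """Flag sources whose events span several unrelated systems in a short window."""
    by_source = defaultdict(list)
    for e in events:
        by_source[e["source"]].append(e)

    alerts = []
    for source, evts in by_source.items():
        evts.sort(key=lambda e: e["timestamp"])
        for i, first in enumerate(evts):
            window = [e for e in evts[i:] if e["timestamp"] - first["timestamp"] <= WINDOW]
            systems = {e["system"] for e in window}
            if len(systems) >= MIN_SYSTEMS:
                alerts.append({"source": source, "systems": sorted(systems),
                               "start": first["timestamp"]})
                break
    return alerts

# Example: three quiet events from one address across unrelated systems
events = [
    {"source": "10.0.0.5", "system": "vpn",      "timestamp": datetime(2013, 1, 1, 9, 0)},
    {"source": "10.0.0.5", "system": "mail",     "timestamp": datetime(2013, 1, 1, 9, 4)},
    {"source": "10.0.0.5", "system": "database", "timestamp": datetime(2013, 1, 1, 9, 8)},
]
print(correlate(events))
```

Individually, each event above might be ignored; correlated, they look like one actor probing the infrastructure, which is exactly the pattern per-system countermeasures miss.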
Capacity management
Companies like eBay have already announced successful efforts to optimize their data centers using Big Data analysis. Just as virtualization has returned underutilized capacity to organizations, these efforts promise to do the same. Again, capacity management presents a perfect use case for Big Data, as a single piece of hardware might be running multiple virtual machines, with multiple disk images and myriad applications that would take months of digging to fully map.
The next step in this evolution would be combining the analytics with "active" capacity management, so that virtual machines could have resources redeployed and reallocated in real time, based on historical usage, predicted demand, and other metrics. Imagine your data center capacity intelligently reallocating itself based on a new product release or seasonal demand. While this technology is in its infancy, it will likely shape the next generation of data centers.
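A toy version of that "active" loop might look like the sketch below: forecast each virtual machine's demand from its history, scale it by a seasonal factor (a product launch, a holiday peak), and divide host capacity in proportion. The VM names, metrics, and the simple moving-average forecast are assumptions for illustration, not a real capacity-management API.

```python
# Hedged sketch of demand-driven reallocation; names and forecast model are illustrative.
HOST_CPU_CORES = 32  # total capacity to divide among the VMs on one host

def forecast(history, seasonal_factor=1.0):
    """Forecast next-period demand as the recent average scaled by a seasonal factor."""
    recent = history[-24:]                    # last 24 samples
    return (sum(recent) / len(recent)) * seasonal_factor

def reallocate(vm_histories, seasonal_factors):
    """Divide host capacity among VMs in proportion to forecast demand."""
    demand = {vm: forecast(h, seasonal_factors.get(vm, 1.0))
              for vm, h in vm_histories.items()}
    total = sum(demand.values()) or 1.0
    return {vm: round(HOST_CPU_CORES * d / total, 1) for vm, d in demand.items()}

# Example: the web tier expects a launch-day spike, the batch VM does not
histories = {"web-frontend": [6, 7, 8, 8, 9, 10] * 4, "batch-reports": [4] * 24}
print(reallocate(histories, {"web-frontend": 1.5}))
```

A production system would of course use far richer models and enforce minimum guarantees per VM, but the principle is the same: let the analytics, not a static configuration, decide where capacity goes.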
Big monitoring
Tying these technologies together is a new generation of monitoring tools powered by Big Data analytics. Traditional tools do a reasonable job of identifying faults as they occur, and in many cases automatically take corrective action, but most lack extensive predictive capabilities and rely on the user to configure alert thresholds and metrics.
If active monitoring were backed by Big Data, your data center monitoring tools might predict a hardware failure on a database server. The tools would then intelligently reallocate the affected application to another server, notify the appropriate personnel, and restore normal operations once the problem had been rectified.
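In skeleton form, that predict, migrate, notify, restore loop could look like the following. The health signals, thresholds, and the migrate/notify/restore helpers are all placeholders standing in for whatever a real monitoring product exposes.

```python
# Hedged sketch of predictive failover; metrics, thresholds, and helpers are hypothetical.
FAILURE_THRESHOLD = 0.8

def failure_risk(metrics):
    """Crude risk score from leading indicators of hardware trouble."""
    risk = 0.0
    if metrics["disk_reallocated_sectors"] > 50: risk += 0.4
    if metrics["ecc_memory_errors"] > 10:        risk += 0.3
    if metrics["temperature_c"] > 80:            risk += 0.3
    return risk

def handle(server, metrics, migrate, notify, restore):
    risk = failure_risk(metrics)
    if risk < FAILURE_THRESHOLD:
        return "healthy"
    migrate(server)                       # move the affected application away
    notify(f"{server}: predicted failure (risk {risk:.1f}), workload migrated")
    restore(server)                       # e.g. schedule repair, then rejoin the pool
    return "migrated"

# Example with stubbed actions for a database server showing early warning signs
print(handle("db-01",
             {"disk_reallocated_sectors": 120, "ecc_memory_errors": 15, "temperature_c": 85},
             migrate=lambda s: None,
             notify=print,
             restore=lambda s: None))
```

The point is that the analytics flag the failure before it happens, so the "corrective action" becomes a planned migration rather than an emergency recovery.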
Even security could be integrated into this "intelligent" data center, isolating applications or infrastructure that have been compromised, just as most anti-virus software "quarantines" an infected file.
While Big Data has only just begun to impact the data center, it’s worth following the development of the technology since it presents so many opportunities to protect, repair, and optimize the modern data center.