20 unique data centers
It is predicted that by 2025 data centers will consume one fifth of the Earth’s total power. From cooling to lights to servers, there’s no question that data centers eat up a lot of power.
Recent news that climate change may be happening faster–and more severely–than initially believed makes traditional data center design, and its massive consumption of power, something that needs to be addressed.
These 20 data centers are taking innovative approaches toward finding alternatives to traditional (and outdated) designs.
Project Natick

Project Natick is a Microsoft research endeavor that puts shipping container-sized pods packed with servers on the bottom of the ocean. The one active test machine currently in operation sits just off the coast of Scotland, where Microsoft plans to leave it for up to five years for study.
Project Natick servers require zero human interaction and are designed to remain in place for more than five years without maintenance or repair. They can be powered entirely by renewable resources and produce zero emissions. According to Microsoft, "no waste products, whether due to the power generation, computers, or human maintainers are emitted into the environment."
OVH's Roubaix 4

Designed and constructed by French web hosting company OVH, Roubaix 4 is a 35,000-server data center that uses no air conditioning at all.
Instead of spending money on AC, the building is constructed around a hollow core and the servers are water cooled. Waste heat escapes through that hollow core, which acts as a massive ventilation shaft.
Green Mountain DC1-Stavanger

Located in a former NATO ammunition bunker, Green Mountain's DC1-Stavanger has several unique things going for it.
First off, it uses gravity to passively pull in cold water from the bottom of a nearby fjord, cycle it through cooling systems, and pump it right back out into the same fjord, effectively cooling the entire data center for free.
Second, the data center itself is airtight, which lets Green Mountain suppress fire throughout the space by lowering the oxygen content to 15%, just below the level needed to sustain a flame.
Third, NATO built the facility to withstand a nuclear bomb, so it’s well protected from environmental or human-caused disasters.
Cloud&Heat

German data center firm Cloud&Heat has taken a unique approach to designing data centers and private servers: Its hardware recycles heat.
Cloud&Heat's server racks capture waste heat and funnel it into buildings to warm water and air, and the company can even distribute workloads between data centers based on heating needs. If, for example, the forecast is colder at one data center's location, computing work shifts to that data center so its servers can provide more heat.
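A scheduler in that spirit could be sketched as follows. This is an illustrative Python sketch, not Cloud&Heat's actual system; the function name, the per-site forecast inputs, and the round-robin placement policy are all assumptions.

```python
# Hypothetical sketch of forecast-driven workload placement: colder sites
# (which need the most heat) are favored when assigning compute jobs.

def place_workloads(jobs, site_forecasts_c):
    """Assign each job to a site, preferring the coldest forecasts.

    jobs: list of job identifiers.
    site_forecasts_c: dict of site name -> forecast temperature in Celsius.
    Returns a dict of job -> site, cycling from coldest to warmest site
    so no single location is overloaded.
    """
    # Order sites from coldest forecast to warmest.
    sites = sorted(site_forecasts_c, key=site_forecasts_c.get)
    # Hand out jobs round-robin over that cold-first ordering.
    return {job: sites[i % len(sites)] for i, job in enumerate(jobs)}

forecasts = {"dresden": -3.0, "frankfurt": 4.5, "berlin": 1.0}
print(place_workloads(["job-a", "job-b", "job-c"], forecasts))
```

A production scheduler would also weigh latency, rack capacity, and actual heating demand, but the core idea is the same: the weather forecast becomes an input to job placement.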
Facebook’s Clonee data center
Facebook’s massive data center in Clonee, Ireland uses a new kind of cooling, developed by Facebook and Nortek Air Solutions, that it calls StatePoint Liquid Cooling, or SPLC.
SPLC is a method of indirect evaporative cooling that Facebook claims is the first of its kind deployed in a large data center environment. SPLC uses a specialized membrane to absorb hot air and re-cool it, with the added benefit that the membrane keeps the recycled cooling air cleaner.
The Chicken Coop
While it's not a new concept, Yahoo's chicken coop-style data centers are a clever piece of design. Like a chicken coop, each facility passively vents hot air, keeping servers cool with almost no extra power consumption.
Yahoo put the chicken coop design up for sale in 2017, but with the company being purchased by Verizon, it's unclear whether an independent buyer was ever found.
A Raspberry Pi-powered data center?
Sure enough, a company called PCextreme has built a data center that offers not only traditional server architecture, but also Raspberry Pi colocation, which starts at just €3 ($3.50 USD) per month.
PCextreme has designed its own custom racks to hold Raspberry Pis, and the company notes that power consumption for the tiny PCs is negligible: One Pi only uses 3-5 watts of power, as opposed to 75-150 watts for a full machine. Companies that don’t need a full-capacity server for their online operations will find this simple innovation outstanding.
DeepMind-managed data centers
Since acquiring DeepMind, Google has used its powerful AI to drive a number of innovations, and the latest is in its data centers.
Google is now using DeepMind to manage energy use in several of its key data centers, giving the AI full control over heating and cooling, as well as any other minor tweaks that could reduce energy use.
SEE: Google DeepMind founder Demis Hassabis: Three truths about AI (TechRepublic)
Digiplex: Heating houses with data centers
Swedish data hosting firm Digiplex has turned its Stockholm data center into a provider of heat for nearby residents. Partnering with Exergi, the leading supplier of energy in Stockholm, Digiplex has added its exhaust heat to Exergi’s heating grid.
All in all, Digiplex expects to supply enough heat to warm the equivalent of 10,000 homes.
Verne Global Data Center
Verne Global's data center in Reykjavik, Iceland uses Iceland's abundant geothermal and hydroelectric power to meet all of its energy needs. Even more impressive, Verne Global uses only 10% of the electricity capacity available to it, leaving plenty of headroom before it would ever need to turn to outside, non-renewable energy.
Instant Data Centers
All the data centers mentioned so far are stationary, but data centers do not need to be stuck in one place–at least if Instant Data Centers (IDC) has anything to say about it.
IDC’s unique portable data centers are essentially rack enclosures with a twist: The data centers are completely self-contained and sealed from the outside environment.
IDC racks don’t even use external air to cool themselves, so they can be deployed anywhere, in any weather, and in almost any condition.
Pionen data center
Located 100 feet beneath Stockholm, Bahnhof’s Pionen data center is one of the most secure–and futuristic looking–data centers in the world.
Data stored in Pionen would be safe even in the event a nuclear war devastates the world, thanks to its location inside a decommissioned cold war-era bunker.
Pionen was also designed with its employees in mind, featuring touches like underground greenhouses that make the space feel less post-apocalyptic and more natural.
Citigroup data center, Frankfurt, Germany
Citigroup’s Frankfurt data center was the first to be given a LEED platinum certification, making it one of the first super-sustainable data centers in the world.
The data center has a green roof and a green wall on part of the structure, uses reverse osmosis water filtering to reduce sediment buildup in its cooling tanks, and its modular server layout saved more than 250 kilometers of cable.
Switch Tahoe Reno
Switch’s Tahoe Reno data center, located in Nevada, is the largest colocation data center in the world. The 2,000 acre site, which has 7.2 million square feet of data center space, is powered by 100% renewable energy, mainly from its onsite solar and wind farms.
Switch Tahoe Reno is still under construction, and it will take 12 installations in total to reach that 7.2 million square feet. As of now, Tahoe Reno 1 has been completed, and it alone offers 1.3 million square feet of usable space.
The IceCube laboratory
The IceCube isn't just a data center–it's a laboratory designed to detect neutrinos. Located near the South Pole, the IceCube's data center may not be its main draw, but it's still impressive: It has over 1,200 computing cores and the capacity to store three petabytes of data.
Surprisingly enough, cooling is a concern for the IceCube's data center, with some hardware at risk of overheating because it was designed for cold-weather operation, not the balmy 65-degree temperature the data center is kept at.
Opening server racks for maintenance can risk sudden temperature drops that destroy equipment, so IT teams have had to be inventive both in designing the IceCube's data center and in maintaining it at some of the most extreme temperatures on Earth.
Bubba

Bubba is the nickname for Down Under GeoSolutions' (DUG) Houston-based supercomputer. DUG and its accompanying data center use unique tech to keep their systems cool: Its servers are all submerged.
DUG developed a polyalphaolefin dielectric fluid to submerge standard high-performance computers in, which it said is non-toxic, has a clean MSDS, and is biodegradable. The fluid can keep servers running for longer than their normal lifespan too, as keeping them submerged eliminates oxidization, protects moving parts, and keeps the temperature surrounding servers consistently lower.
eBay’s South Jordan, UT data center
eBay's South Jordan, UT data center is unique in how it distributes its cooling. Instead of constantly cooling the entire data center, eBay designed a system that monitors temperatures throughout the facility and blasts cold air where it's needed most, saving the company 50% on energy costs.
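The monitor-and-target idea can be sketched as a simple control loop. This is an illustrative Python sketch, not eBay's actual control system; the setpoint value, function name, and proportional-allocation policy are assumptions for the example.

```python
# Hypothetical sketch: split a fixed cooling budget across zones in
# proportion to how far each zone's temperature exceeds a setpoint.

SETPOINT_C = 27.0  # assumed target inlet temperature, in Celsius

def cooling_plan(zone_temps_c, total_airflow_units):
    """Return airflow per zone; zones at or below the setpoint get nothing.

    zone_temps_c: dict of zone name -> measured temperature in Celsius.
    total_airflow_units: total cooling capacity to distribute.
    """
    # How far above the setpoint each zone is (zero if it's already cool).
    overshoot = {z: max(0.0, t - SETPOINT_C) for z, t in zone_temps_c.items()}
    total = sum(overshoot.values())
    if total == 0:
        # Every zone is at or below target: no cooling needed anywhere.
        return {z: 0.0 for z in zone_temps_c}
    # Hotter zones get a proportionally larger share of the airflow.
    return {z: total_airflow_units * o / total for z, o in overshoot.items()}

print(cooling_plan({"row-1": 31.0, "row-2": 27.0, "row-3": 29.0}, 100.0))
```

The savings come from the zero-allocation case: air that would have been blown at already-cool rows is simply never produced.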
The South Jordan data center also gets much of its power from an onsite solar plant, so it's doubly good for the environment.
Other World Computing
OWC’s Woodstock, IL data center holds a LEED platinum certification, the highest offered. Not only is the facility completely powered by wind, it’s also heated and cooled using geothermal systems, and OWC has a robust waste reduction program.
All in all, it's less a single innovative data center than a comprehensive effort to make as minimal an impact as possible while operating one.
Bahnhof Lajka Space Station
Located in Kista Science City in Sweden, Bahnhof’s Lajka Space Station is a modular data center with some unique aesthetics. Its main control center is an inflatable building, and the rest of the data center is constructed of steel. Because it’s modular, the Lajka can continue to grow into a sprawling science-fiction-like data center.
The CyberBunker

The CyberBunker, located in Goes, The Netherlands, is housed in an old NATO bunker designed to operate for up to 10 years without outside contact, so it's quite secure. So secure, in fact, that when local government officials tried to gain entrance they were unable to–all they managed to do was break the door, making entry impossible. A SWAT team also failed to gain entry. In both cases, settlements were paid to CyberBunker for the cost of repairs.
- Cloud v. data center decision (ZDNet special report) | Download the report as a PDF (TechRepublic)
- Photos: The world’s 25 fastest supercomputers (TechRepublic)
- Photos: The greenest data centers in the world (TechRepublic)
- Photos: Inside a $250m+ datacenter complex (TechRepublic)
- Photos: A datacenter in the heart of a city (TechRepublic)
- Photos: These data centers are insanely gorgeous (TechRepublic)