Data Centers

Liquid-cooled computing equipment in the data center: A game changer

Liquid-cooled computing equipment may be in your data center's future. Learn what 3M and LiquidCool Solutions are working on to make that happen.


Twin Cities Business, a provider of Minnesota business news, asked 3M whether it is working on anything that could have as deep an impact as Post-it Notes. Kevin Rhodes, chief intellectual property counsel for 3M, said yes. Rhodes then talked about 3M Novec Engineered Fluid: its potential uses, and how it could reshape current thinking on data-center cooling technology.

Two percent of the world's electricity supply

Why would an engineered fluid have that kind of impact? It starts with reports stating that data centers consume close to two percent of the world's electricity supply at any given time, and that 37% of that amount goes to keeping computing equipment cool. Figure A gives the breakdown of electricity usage as determined by Emerson Network Power (PDF).

Figure A


It is easy to understand why Greenpeace is actively monitoring data centers' power usage.
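
To put those percentages in perspective, here is a quick back-of-the-envelope calculation in Python. The 2% and 37% figures come from the reports cited above; the 1.2 MW facility size and the $0.10/kWh rate are assumptions I chose purely for illustration.

```python
# Rough arithmetic based on the figures cited above: data centers draw roughly
# 2% of the world's electricity, and about 37% of a data center's draw goes to
# cooling. The facility size and electricity rate are illustrative assumptions.

DATA_CENTER_SHARE = 0.02   # data centers' share of world electricity (cited)
COOLING_SHARE = 0.37       # cooling's share of a data center's draw (cited)

# Cooling's share of *world* electricity
world_cooling_share = DATA_CENTER_SHARE * COOLING_SHARE
print(f"Data-center cooling alone: ~{world_cooling_share:.2%} of world electricity")

# Hypothetical 1.2 MW facility: cooling load and annual energy cost
facility_kw = 1200                        # total facility draw (assumed)
rate_per_kwh = 0.10                       # electricity rate (assumed)
cooling_kw = facility_kw * COOLING_SHARE
annual_cooling_kwh = cooling_kw * 24 * 365

print(f"Cooling load: ~{cooling_kw:.0f} kW")
print(f"Annual cooling energy: ~{annual_cooling_kwh:,.0f} kWh "
      f"(~${annual_cooling_kwh * rate_per_kwh:,.0f} per year)")
```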

Enter Novec

Imagine a data center that is quiet, not abnormally cold, uses substantially less electricity, and requires significantly less floor space for the same amount of computing power. That utopian data center is the goal of a 3M, SGI, and Intel partnership that is counting on Novec, a nonflammable, noncorrosive hydrofluoroether with excellent heat-transfer characteristics.

Using Novec, 3M developed a two-phase immersion-cooling technology unlike any in existence. To test the new cooling concept, supercomputer manufacturer SGI built an ICE X, a proof-of-concept distributed-memory supercomputer, and Intel supplied Xeon E5-2600 processors. Simply put, the heat-sensitive components were placed directly into the Novec engineered fluid, as shown in Figure B.

Figure B

 Image courtesy of 3M
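
In two-phase immersion cooling, the fluid boils at the hot components and the vapor condenses and drains back into the bath, so each kilogram of circulating fluid carries its latent heat of vaporization rather than just a few degrees of sensible warming. The sketch below shows roughly what that buys; the property values are placeholder numbers of about the right order, not official 3M Novec specifications.

```python
# Why letting the coolant boil helps: a fluid absorbs far more heat per
# kilogram by vaporizing (latent heat) than by simply warming up (sensible
# heat). The values below are rough placeholders, not 3M Novec data.

latent_heat_kj_kg = 88.0    # assumed heat of vaporization, kJ/kg
cp_kj_kg_k = 1.1            # assumed liquid specific heat, kJ/(kg*K)
allowed_rise_k = 10.0       # assumed temperature rise in a single-phase loop

sensible_kj_kg = cp_kj_kg_k * allowed_rise_k
ratio = latent_heat_kj_kg / sensible_kj_kg

print(f"Heat absorbed per kg by boiling:        {latent_heat_kj_kg:.0f} kJ")
print(f"Heat absorbed per kg by a {allowed_rise_k:.0f} K warm-up: {sensible_kj_kg:.0f} kJ")
print(f"Two-phase advantage per kg of fluid:    ~{ratio:.0f}x")
```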

The results are astonishing. These are the conclusions drawn by 3M, SGI, and Intel:

  • Cooling energy costs were reduced by 95%;
  • Water consumption was reduced markedly; and
  • Equipment can be packed much more densely: 100 kW of computing power per m² vs. 10 kW per m² for air-cooled systems (see the sketch after this list).
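
To make those numbers concrete, here is a rough sketch for a hypothetical 1 MW IT load. The packing densities and the 95% reduction come straight from the list above; the baseline cooling overhead and the electricity rate are my own assumptions for illustration.

```python
# Rough comparison of floor space and cooling energy for a hypothetical 1 MW
# IT load, using the 10 kW/m² (air) and 100 kW/m² (immersion) densities and
# the 95% cooling-energy reduction cited above. The 37% baseline cooling
# overhead and $0.10/kWh rate are illustrative assumptions.

IT_LOAD_KW = 1000

AIR_DENSITY_KW_M2 = 10          # air-cooled packing density (cited)
IMMERSION_DENSITY_KW_M2 = 100   # immersion packing density (cited)
COOLING_REDUCTION = 0.95        # cooling energy cost reduction (cited)
BASELINE_COOLING_SHARE = 0.37   # assumed cooling power vs. IT load, air-cooled
RATE_PER_KWH = 0.10             # assumed electricity rate

air_floor_m2 = IT_LOAD_KW / AIR_DENSITY_KW_M2
immersion_floor_m2 = IT_LOAD_KW / IMMERSION_DENSITY_KW_M2

air_cooling_kw = IT_LOAD_KW * BASELINE_COOLING_SHARE
immersion_cooling_kw = air_cooling_kw * (1 - COOLING_REDUCTION)
annual_savings_kwh = (air_cooling_kw - immersion_cooling_kw) * 24 * 365

print(f"Floor space: {air_floor_m2:.0f} m² (air) vs. {immersion_floor_m2:.0f} m² (immersion)")
print(f"Cooling power: {air_cooling_kw:.0f} kW (air) vs. {immersion_cooling_kw:.1f} kW (immersion)")
print(f"Estimated annual savings: {annual_savings_kwh:,.0f} kWh "
      f"(~${annual_savings_kwh * RATE_PER_KWH:,.0f})")
```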

A different approach

Twin Cities Business also interviewed another Minnesota company, LiquidCool Solutions, which likewise works on liquid-cooled computing equipment. LiquidCool Solutions uses a different approach: a vertical rack that is liquid-cooled (Figure C).

Figure C

 Image courtesy of LiquidCool Solutions

Herb Zien, CEO of LiquidCool Solutions, brought up several interesting points that show how the company's approach differs. The biggest difference is that LiquidCool uses single-phase immersion-cooling technology, meaning the coolant always remains a liquid and a heat exchanger removes the excess heat.
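
The heat balance behind single-phase cooling is simple: the coolant warms up as it flows over the electronics (heat load = mass flow × specific heat × temperature rise) and gives that heat back at the heat exchanger. Here is a minimal sketch; the per-server power, flow rate, and fluid properties are assumptions of mine, not LiquidCool Solutions figures.

```python
# Single-phase immersion cooling in a nutshell: the coolant never boils; it
# simply warms up while flowing over the electronics (Q = m_dot * cp * dT)
# and rejects that heat at an external heat exchanger. All numbers below are
# illustrative assumptions, not LiquidCool Solutions specifications.

server_power_w = 400      # assumed heat load of one 1U server, watts
flow_lpm = 2.0            # assumed coolant flow through that server, L/min

# Rough mineral-oil-like properties (placeholder values)
density_kg_m3 = 850.0     # kg/m^3
cp_j_kg_k = 1900.0        # specific heat, J/(kg*K)

mass_flow_kg_s = (flow_lpm / 1000.0 / 60.0) * density_kg_m3
delta_t_k = server_power_w / (mass_flow_kg_s * cp_j_kg_k)

print(f"Mass flow per server: {mass_flow_kg_s:.3f} kg/s")
print(f"Coolant temperature rise across one server: ~{delta_t_k:.1f} K")
```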

Zien stressed that getting rid of each device's fan unit was key. Doing so allowed LiquidCool to take a typical 42U rack and fit 64 1U devices in the same space.

Zien also mentioned that technology by LiquidCool Solutions can:

  • Cool any electronics, in any shape or size: servers, switches, solid-state storage, electric-vehicle battery packs, inverters, power supplies, and anything else where heat must be dissipated.
  • Reduce data center energy use by 40% and space requirements by 50%, at an upfront capital cost lower than that of existing cooling systems.

Zien said that LiquidCool Solutions is working with 3M and has tested Novec, but the company prefers a proprietary liquid similar to mineral oil. Zien also feels that the cost of Novec outweighs any benefits it may have.

There is a LiquidCool Solutions system in the field, and it should come as no surprise that it, too, is in Minnesota. The University of Minnesota's Supercomputing Institute was looking for a hardware solution that could handle a variety of advanced scientific and medical analyses.

Jeffrey McDonald, assistant director for HPC operations at the Minnesota Supercomputing Institute, said, "The challenge was finding server solutions that provided the performance increases we needed while decreasing our energy consumption and data center footprint, which was on a path we could not sustain. We've deployed LiquidCool's Liquid Submerged Servers and its benchmark and energy usage returns were very positive."

Final thoughts

I put together a list that sums up the advantages of using liquid cooling in the data center:

  • No raised floor
  • No excessive fan noise
  • Habitable room temperature
  • More space for computing equipment
  • Significant savings on electricity costs

I'm sure there are downsides, but it seems the status quo is unsustainable.

Do you agree? How much thought do you give to your data center's footprint? Do you think liquid-cooled computing equipment is in your data center's future? Post your thoughts in the discussion.


16 comments
tsimmone

Over 40 years ago, there was a super-cooled computer that outperformed everything else at the time. It was cooled with liquid nitrogen, and connectors were mercury-wetted, which became solid at the low temperature. Prototypes were designed for outer space applications, since the temperature was easy to achieve.

Just wondering if anyone has tried super-cooling modern CPUs to enhance speed and performance?

DesD

I suspect more than a few IBMers (and I'm not one, just a long time user) would dispute your assertion that their TCMs were inefficient.

3M say Novec gives a 10 times space reduction, and IBM said one Thermal Conduction Module (.07 cu.ft) had the power of a midrange 370, say a 145 at over 100 cubic feet?

The TCM used helium filled aluminium pistons directly on the chips to dissipate up to 300 watts per module, and that was TTL, not CMOS. 30 odd layers of it!  

Imagine what you could do today with Novec plus in-house fabs for the chips plus in-house production facilities, because their chip packaging ability was at least as innovative as the cooling.

DesD

As others have noted, liquid cooling in the data centre isn't new. But in this case, both versions aren't using water, which may or may not be a game changer.


30 odd years ago, when virtual machine usage was driving up physical processor utilisation, IBM's first 3087 Coolant Distribution Unit was designed to connect to the existing chilled water supply already installed in every potential customer's building. 

A later model  (e.g. with the 3083 mainframes)  added a heat exchanger cooled by the data centre's existing aircons, thus reverting to what I see are now called "traditional" fans.  In either case, it allowed customers to lower costs by better use of current assets.


The next logical progression, as we're seeing with Amazon, Google, etc., is to build and optimise the whole building around its purpose.


In which case, the real game changer could come when either of these new dielectrics is used to cool the building, not just the processors.


And as always, progress builds on the shoulders of what went before....

Systems Guy

IBM was water-cooling mainframes starting in the early 1970s.

chrisbedford

I just wonder why it has taken so long to come to the data centre. Liquid-cooled computers have been around (unless memory fails) since before or at least about the same time the IBM PC debuted... gamers have been liquid-cooling their display adapters (and CPUs) for at least a decade...

JJr62

Didn't IBM use liquid cooling on their mainframes a few decades ago?

ajlane18

Consider the other LiquidCool Solutions-specific benefits -

The technology is cool but as a game changer consider .... 

Lower capital cost for facility build - eliminates CRAC units / air handling / chiller plant

Higher compute power / server / rack / cabinet

Higher compute = increased cloud application hosting implications / opportunities

Longer server life - LCS tests show almost no wear after running for 3 years

Maintenance & operations cost reduction

Modularity provides for Mobility - almost building agnostic

Harsh environments application - like Africa, Middle East, offshore, etc.

aokello

I look forward to more information on each installation being independent. I am interested in building a data centre in Africa, and this sounds like a very interesting innovation.

ScottCopus

And as soon as there's a leak or clog, there goes the cooling? Without traditional cooling fans, would you need to crank the environmental temps way down (if that's even possible) just to keep your servers running? The same goes for any maintenance on the liquid system: how do your systems stay running while being serviced? Can each rack's liquid cooling be self-sustainable?

Michael Kassner

@DesD  


Interesting. There is still the thermal inefficiency of the added material, and the fact that the TCM had to offload the heat to a water-coolant system. Cooling the processor directly with liquid avoids all of that. I am not sure what your numbers mean. How does the number I mentioned, "100 kW per m²," relate to yours?

Michael Kassner

@DesD  


It is a very different approach this time. The earlier technology used a thermal barrier, which was not that efficient. This has the coolant in direct contact with the electronics, which is much more efficient.


Another interesting point is that LiquidCool started out supporting the gaming industry with liquid cooling. 

Michael Kassner

@chrisbedford  


This approach is radically different. Those PCs still require fans and take up more space. The ones that I worked with were actually just as noisy, if not louder.

Herb Zien

@ScottCopus

A LiquidCool Solutions system is sealed and the flow path is large, so there is no chance of clogging. A standard 19-inch rack can hold 64 1U IT devices, and it is possible to hot-swap a server in less than two minutes without any loss of fluid or negative effect on other servers in the rack. It is worth noting that the fluid flow rate in a server is less than a half-gallon per minute, so the total fluid flow needed to dissipate heat from a 100 kW rack is 25 gpm. One final point: the entering dielectric fluid can be as warm as 45C, so an evaporative fluid cooler should be sufficient to transfer the heat from the dielectric to ambient.
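
Out of curiosity, the same sensible-heat balance sketched earlier in the article can be applied to the figures in this comment (100 kW per rack at 25 gpm). The density and specific heat below are generic mineral-oil-like placeholders, not LiquidCool Solutions fluid data, so the resulting temperature rise is only indicative.

```python
# What coolant temperature rise does a 100 kW rack imply at 25 gpm? The flow
# and heat load come from the comment above; the density and specific heat
# are generic placeholders, so the result depends entirely on the real fluid.

rack_power_w = 100_000    # 100 kW rack (from the comment above)
flow_gpm = 25.0           # total coolant flow (from the comment above)

GALLON_LITERS = 3.785
density_kg_m3 = 850.0     # assumed, kg/m^3
cp_j_kg_k = 1900.0        # assumed specific heat, J/(kg*K)

flow_m3_s = flow_gpm * GALLON_LITERS / 1000.0 / 60.0
mass_flow_kg_s = flow_m3_s * density_kg_m3
delta_t_k = rack_power_w / (mass_flow_kg_s * cp_j_kg_k)

print(f"Implied coolant temperature rise across the rack: ~{delta_t_k:.0f} K")
```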

Michael Kassner

@ScottCopus  


Good questions, Scott. With the 3M approach, I believe each assembly is independent, but that is not official. As for LiquidCool Solutions, I will ask Zien your question.
