Twin Cities Business, a provider of Minnesota business news, asked 3M whether it is working on anything that could have as deep an impact as Post-it Notes. Kevin Rhodes, chief intellectual property counsel for 3M, said yes. Rhodes then talked about 3M Novec Engineered Fluid: its potential uses, and how Novec could reshape current thinking on data-center cooling technology.

Two percent of the world’s electricity supply

Why would an engineered fluid have that impact? It starts with reports stating that data centers consume close to two percent of the world's electricity supply at any given time, and that 37% of that amount goes to keeping computing equipment cool. Figure A gives the breakdown of electricity usage as determined by Emerson Network Power (PDF).
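Taken together, those two figures imply that cooling alone accounts for roughly 0.7% of the world's electricity. The percentages come from the reports cited above; the arithmetic below is just an illustrative back-of-the-envelope check:

```python
# Rough share of world electricity consumed by data-center cooling,
# based on the figures cited above (illustrative arithmetic only).
datacenter_share = 0.02   # data centers: ~2% of world electricity
cooling_share = 0.37      # cooling: 37% of data-center electricity

world_share = datacenter_share * cooling_share
print(f"Cooling alone: {world_share:.1%} of world electricity")  # → 0.7%
```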

Figure A

It is easy to understand why Greenpeace is actively monitoring data centers’ power usage.

Enter Novec

Imagine a data center that is quiet, not abnormally cold, uses substantially less electricity, and requires significantly less floor space for the same amount of computing power. That utopian data center is the goal of a 3M, SGI, and Intel partnership that is counting on Novec, a nonflammable, noncorrosive hydrofluoroether with excellent heat-transfer characteristics.

Using Novec, 3M developed a two-phase immersion-cooling technology unlike any in existence. To test the new cooling concept, supercomputer manufacturer SGI built an ICE X, a proof-of-concept distributed-memory supercomputer, and Intel supplied Xeon E5-2600 processors. Simply put, the heat-sensitive components were placed directly into the Novec engineered fluid, as shown in Figure B.

Figure B

The results are astonishing. These are the conclusions reported by 3M, SGI, and Intel:

  • Cooling energy costs were reduced by 95%;
  • Water consumption was reduced markedly; and
  • Equipment can be packed tighter: 100 kW of computing power per square meter vs. 10 kW per square meter for air-cooled systems.
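Those density figures imply a tenfold reduction in floor space for the same compute load. A quick sketch of the math, using a hypothetical 1 MW load (the load is my own example; the densities are from the list above):

```python
# Floor space needed for a hypothetical 1 MW compute load at the
# densities quoted above (illustrative arithmetic only).
load_kw = 1000             # hypothetical 1 MW compute load
air_cooled_density = 10    # kW per square meter (air cooling)
immersion_density = 100    # kW per square meter (two-phase immersion)

air_floor = load_kw / air_cooled_density       # 100 square meters
immersion_floor = load_kw / immersion_density  # 10 square meters
print(f"Air-cooled: {air_floor:.0f} m^2, immersion: {immersion_floor:.0f} m^2")
```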

A different approach

Twin Cities Business interviewed another Minnesota company, LiquidCool Solutions, which is also involved in liquid-cooled computing equipment. LiquidCool Solutions takes a different approach: a vertical rack that is liquid-cooled (Figure C).

Herb Zien, CEO of LiquidCool Solutions, brought up several points that show how LiquidCool Solutions' approach differs. The biggest difference is that LiquidCool uses single-phase immersion-cooling technology: the coolant always remains a liquid, and a heat exchanger removes excess heat.

Zien stressed that removing each device's fan unit was key. Doing so allowed LiquidCool to fit 64 1U devices into the space of a typical 42U rack.

Zien also mentioned that technology by LiquidCool Solutions can:

  • Cool any electronics (servers, switches, solid-state storage, electric vehicle battery packs, inverters, and power supplies) wherever heat must be dissipated, and take any shape or size.
  • Reduce data center energy use by 40% and space requirements by 50%, at an upfront capital cost lower than that of existing cooling systems.

Zien said that LiquidCool Solutions is working with 3M and has tested Novec, but the company prefers a proprietary liquid similar to mineral oil. Zien feels that the cost of Novec outweighs any benefits it may have.

There is a LiquidCool Solutions system in the field, and it should not be a surprise that it, too, is in Minnesota. The University of Minnesota's Supercomputing Institute was looking for a hardware solution that would handle a variety of advanced scientific and medical analyses.

Jeffrey McDonald, assistant director for HPC operations at the Minnesota Supercomputing Institute, said, “The challenge was finding server solutions that provided the performance increases we needed while decreasing our energy consumption and data center footprint, which was on a path we could not sustain. We’ve deployed LiquidCool’s Liquid Submerged Servers and its benchmark and energy usage returns were very positive.”

Final thoughts

Here is a list that sums up the advantages of using liquid cooling in data centers:

  • No raised floor
  • No excessive fan noise
  • Habitable room temperature
  • More space for computing equipment
  • Significant savings on electricity costs

I’m sure there are downsides, but it seems the status quo is unsustainable.

Do you agree? How much thought do you give to your data center’s footprint? Do you think liquid-cooled computing equipment is in your data center’s future? Post your thoughts in the discussion.

See also