General view of the SuperMUC. Photo: Johannes Naumann.


How Lenovo cools supercomputers with warm water

As the power demands of high-performance computing increase, the cooling solution that keeps thousands of servers from overheating turns out to be warmer than you'd expect.

The Leibniz Supercomputing Centre (LRZ) in Munich, Germany, contains no ordinary supercomputer. Sure, it has thousands of servers, or nodes, stacked in rows in a windowless vault, with technicians working diligently on huge data-crunching conundrums for research organisations, running simulations to try to better predict natural disasters such as tsunamis and earthquakes. But it is eerily quiet. Almost too quiet.

The familiar whir of hot air being whooshed away from power-hungry computers is almost entirely absent. Where are all the fans?

Almost all gone, as it turns out. LRZ's SuperMUC-NG, which uses massive arrays of Lenovo's ThinkSystem SD650 servers, requires almost no fans at all – just those cooling the power supply units and the in-row chillers on every eighth row.

“The ambient noise in the datacentre is now lower than in a typical office space,” says Rick Koopman, EMEA Technical Leader for High Performance Computing at Lenovo. “We wanted to optimise what we put into a supercomputer and what comes out of it from an efficiency perspective.”  

Despite this, Lenovo has kept LRZ running all this time while cutting its energy consumption by 40 percent, greatly lowering the centre's electricity bill and environmental impact at the same time.

The secret? A focus on sustainability and using warm water to cool the datacentre, which admittedly sounds a bit like trying to fuel an F1 car using its own emissions.

A Green Giant  

Koopman says: “By having this emphasis on sustainability and a reduction in carbon footprint for their large general-purpose supercomputer, they now have a very efficient system, and the SuperMUC-NG is just one example. In fact, 177 of the systems on the Green500 list, which ranks the TOP500 supercomputers by energy efficiency, are Lenovo systems.”

When the company first began working on SuperMUC at LRZ in 2012, typical HPC compute nodes used processors drawing 100 to 120 watts (W) each. That figure is now typically over 200W and is expected to exceed 300W by 2021. With more wattage comes more heat, which ultimately needs to be removed: if processors are not kept in their optimal operating range, below 80 degrees Celsius, the silicon in the chips begins to degrade.

“If you have one server with two 300W processors, four accelerators using up to 500W each with additional memory, drives and network adapters, you’d be looking at more than 3000W per server,” says Koopman. “There are 36 of these servers in one standard 19-inch 42-unit rack.”  

Comparatively, a typical washing machine draws around 500W. One computer rack in this example therefore uses about 108,000W – the same power as 216 washing machines all running at the same time.
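That arithmetic is easy to check. Below is a minimal sketch in Python that multiplies out the figures quoted above; the 400W allowance for memory, drives and network adapters is an illustrative assumption that rounds the server out to the 3,000W Koopman describes.

```python
# Back-of-the-envelope rack power, using the figures quoted above.
PROCESSOR_W = 300        # watts per processor (two per server)
ACCELERATOR_W = 500      # watts per accelerator (four per server)
OVERHEAD_W = 400         # assumed allowance for memory, drives, adapters
SERVERS_PER_RACK = 36    # servers in a standard 19-inch 42-unit rack

server_w = 2 * PROCESSOR_W + 4 * ACCELERATOR_W + OVERHEAD_W  # 3,000 W
rack_w = SERVERS_PER_RACK * server_w                         # 108,000 W

WASHING_MACHINE_W = 500
print(f"Rack draw: {rack_w:,} W, "
      f"about {rack_w // WASHING_MACHINE_W} washing machines")
```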

Enter Warm Water Cooling  

“The old way was chilling the datacentre room by using fans to blow the hot air away. Hence all the noise. But air cooling is far from efficient for current and future HPC solutions, and because they use increasingly dense arrays of hardware, it is even less workable,” says Koopman. 

This is where the concept of warm water cooling comes in: pushing around water that feels warm to the touch but, at 45 to 50 degrees Celsius, is still cooler than processors running at peak performance. In this way, LRZ is able to remove approximately 90 percent of the heat energy from the SD650 nodes, cleanly and quietly.

Water stores roughly four times more heat energy than the same mass of air at a given temperature, and the water supply can be put in direct contact with all the elements that need to be cooled, making the process much more targeted.

“The heat transfer to water is just much more efficient,” says Koopman. 
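Both points are easy to put into numbers. The sketch below uses the textbook specific heat capacities of water and air to estimate how much coolant it takes to carry away one rack's worth of heat; the 10-degree coolant temperature rise is an assumed figure for illustration, not one quoted by Lenovo or LRZ.

```python
# Coolant mass flow needed to remove heat: Q = m_dot * c * delta_T.
RACK_HEAT_W = 108_000   # heat output of the example rack above, in watts
C_WATER = 4186          # J/(kg*K), specific heat capacity of water
C_AIR = 1005            # J/(kg*K), specific heat capacity of air
DELTA_T = 10            # K, assumed coolant temperature rise

water_kg_s = RACK_HEAT_W / (C_WATER * DELTA_T)  # ~2.6 kg/s of water
air_kg_s = RACK_HEAT_W / (C_AIR * DELTA_T)      # ~10.7 kg/s of air

print(f"Water: {water_kg_s:.1f} kg/s vs air: {air_kg_s:.1f} kg/s, "
      f"so air needs {air_kg_s / water_kg_s:.1f}x the mass flow")
```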

Since the water is also contained in a pipe system, it can be re-used. Depending on the location of the datacentre and the outdoor temperature, simply running it through heat exchanger equipment on the roof of the datacentre allows the excess heat from the hardware to radiate away. 

On top of the energy savings, green impact and lower electricity bill this solution delivers, the warm water can provide heating for a nearby agricultural greenhouse and can even be used as part of the campus heating for facilities such as LRZ.

Cooling Tech 

This, however, is just one element of Lenovo’s Neptune liquid cooling technology, which approaches datacentre energy efficiency in three ways: warm water cooling, software optimisation (which has delivered over 10 percent additional energy savings by throttling hardware when needed) and infrastructure advances. 
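The article doesn't describe how that software layer works internally, but the basic idea of throttling hardware against a thermal limit can be sketched like this; the sensor and power-cap functions are hypothetical stand-ins for illustration, not Lenovo's actual Neptune software.

```python
import random
import time

TEMP_CEILING_C = 80      # silicon should stay below roughly 80 degrees C
CHECK_INTERVAL_S = 1.0

def read_cpu_temp_c() -> float:
    # Hypothetical sensor read, simulated here; a real node would query
    # the BMC or the operating system's hardware-monitoring interface.
    return random.uniform(60.0, 85.0)

def set_power_cap_w(watts: int) -> None:
    # Hypothetical power cap, printed here; a real node would use a
    # platform mechanism such as a firmware or OS power-capping API.
    print(f"power cap -> {watts} W")

def throttle_loop(cycles: int = 5) -> None:
    # Back the power cap off as the chip nears its thermal ceiling,
    # trading a little peak performance for energy savings.
    for _ in range(cycles):
        temp = read_cpu_temp_c()
        set_power_cap_w(250 if temp > TEMP_CEILING_C - 5 else 300)
        time.sleep(CHECK_INTERVAL_S)

throttle_loop()
```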

This technology can be applied as a standard solution in many large datacentres, reducing the number of chillers needed to keep them cool – and fewer chillers producing cold water adds up to real savings at supercomputer scale. While the solutions are currently used in datacentres, the technology has potential applications elsewhere in IT.
