Liquid Cooling: The Next Big Trend in Data Center Design and Operations

Over the past few years, a growing number of generative AI solutions have entered the marketplace, and adoption of AI has increased across enterprises and industries. However, as we’ve discussed in previous articles on The Modern Data Center Journal, the emergence of the AI era has had a massive impact on data centers – increasing average rack power densities and the heat produced within the data center envelope.

With air cooling no longer adequate for AI data centers, operators are turning to liquid cooling solutions capable of removing heat from the rack and the white space more quickly and efficiently. As a result, many hyperscalers and colocation providers are now considering liquid cooling for their upcoming data center projects.

This surging demand for liquid cooling led white space integrator Compu Dynamics to bring in Ben Graham, an expert in precision cooling systems with deep experience in liquid cooling solutions.

We recently sat down with Ben to learn more about his new role, his previous experience, the trends driving increased adoption of liquid cooling, and the mechanical considerations involved in getting liquid cooling into the envelope.

Modern Data Center Journal (MDCJ): Can you tell our readers a bit about your previous experience and your new responsibilities at Compu Dynamics? How did your prior experience prepare you for your role at Compu Dynamics?

Ben Graham: I’ve accumulated extensive experience in precision cooling systems during the last two decades. My expertise is centered primarily in the data center environment, with a comprehensive understanding of all critical components such as chillers, cooling distribution units (CDUs), rear door heat exchangers, cold plates, and more.

I began my career in the Marine Corps, where I worked with refrigeration and radar array cooling units. When I transitioned from the military to civilian life, I began working with the cooling and heating systems of large industrial buildings. For the past several years, I have managed dozens of liquid cooling projects.

My new role as a Mechanical Project Manager and Estimator at Compu Dynamics enables me to use many of the engineering, project management, and equipment installation skills that I’ve developed over the course of my career.

MDCJ: Why is it important to have someone with significant liquid cooling experience at Compu Dynamics? Will liquid cooling see increased adoption in data centers in the near future? If so, what trends are driving that?

Ben Graham: The new generation of advanced AI applications requires incredible processing power, which results in a massive increase in rack power density and energy consumption per rack. As rack density increases, it is no longer viable to remove CPU and GPU heat with air cooling alone.

Compu Dynamics is one of the industry’s leading White Space Integration partners. Our collaboration with hyperscalers and colocation providers underscores the need to implement liquid cooling solutions for the applications of the future. These solutions are pivotal for mission-critical systems in the era of AI.

“Increased care and consideration are needed as the liquid for DLC is moved into the data hall envelope, away from the perimeter, and moved directly to the compute equipment.” – Ben Graham

When dealing with any mission-critical application, it’s essential to have an extensive background and understanding of the systems being installed. It’s potentially even more critical to have knowledge and expertise in liquid cooling distribution and associated heat rejection solutions.

MDCJ: If a data center may be used for artificial intelligence (AI) workloads and applications, when should the owner start to think about liquid cooling? Is it something that should be baked in during the design phase, or can it be added later during construction? Can operating data centers be retrofitted for it?

Ben Graham: The owner or operator should start thinking about liquid cooling during the build and design phase.

If the key personnel within the data center haven’t already had deep conversations about AI applications, power consumption, and cooling infrastructure, they’ll be left behind or have to tackle retrofits that could cost substantially more down the road.

We see a lot of legacy data centers utilizing perimeter CRAC units and air handlers, and these facilities are now looking to transition to liquid cooling.

For these legacy data centers, we are able to decommission some of those units and free up capacity on the existing chilled water loops. That capacity can then be diverted to feed the required cooling distribution units (CDUs), which supply the secondary loop fluid for direct liquid cooling (DLC) at each server rack. However, if the data center doesn’t have the available infrastructure, chillers, pumps, and piping loops can be retrofitted into most environments.
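To give a rough sense of the sizing involved, the secondary loop flow a CDU must deliver scales with the rack heat load and the design temperature rise across the cold plates. Below is a minimal back-of-envelope sketch using hypothetical numbers (an 80 kW rack and a 10 °C loop temperature rise), not figures from Compu Dynamics:

```python
# Back-of-envelope CDU secondary-loop sizing (hypothetical numbers).
# Heat balance: Q = m_dot * c_p * dT  ->  m_dot = Q / (c_p * dT)

RACK_HEAT_LOAD_KW = 80.0       # assumed DLC heat load per rack (kW)
LOOP_DELTA_T_C = 10.0          # assumed temperature rise across the cold plates (°C)
CP_WATER_KJ_PER_KG_C = 4.186   # specific heat of water (kJ/kg·°C)
WATER_DENSITY_KG_PER_L = 1.0   # approximate coolant density (kg/L)

# Required mass flow (kg/s), then converted to litres per minute.
mass_flow_kg_s = RACK_HEAT_LOAD_KW / (CP_WATER_KJ_PER_KG_C * LOOP_DELTA_T_C)
flow_lpm = mass_flow_kg_s / WATER_DENSITY_KG_PER_L * 60.0

print(f"Required secondary-loop flow: {flow_lpm:.0f} L/min per rack")
# Roughly 115 L/min for an 80 kW rack at a 10 °C rise. This assumes pure water;
# real DLC fluids (e.g., glycol blends) have a lower specific heat and need more flow.
```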

“The new generation of advanced AI applications requires incredible processing power, which results in a massive increase in rack power density and energy consumption per rack. As rack density increases, it is no longer viable to remove CPU and GPU heat with air cooling alone.” – Ben Graham

Data center operators also have to consider the power infrastructure necessary to operate liquid cooling equipment. Whether it’s a 5-megawatt facility or a 30-megawatt facility, a data center must have the electrical infrastructure in place to power whatever system is being installed.
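For a rough sense of scale, the pumps, CDUs, and heat rejection plant that serve the liquid loop draw power of their own. The sketch below assumes a 10 percent cooling overhead on the IT load, an illustrative figure rather than one from the interview:

```python
# Rough electrical overhead for the liquid-cooling plant (hypothetical fractions).
IT_LOAD_MW = 30.0                 # example facility IT load
COOLING_OVERHEAD_FRACTION = 0.10  # assumed: pumps, CDUs, dry coolers/chillers

cooling_power_mw = IT_LOAD_MW * COOLING_OVERHEAD_FRACTION
total_power_mw = IT_LOAD_MW + cooling_power_mw

print(f"Cooling plant draw: {cooling_power_mw:.1f} MW, total electrical feed: {total_power_mw:.1f} MW")
# At 30 MW of IT load, even a modest overhead fraction adds megawatts of
# electrical capacity that must be planned for alongside the mechanical work.
```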

MDCJ: What are the major mechanical or engineering considerations that are needed to get cool liquid into the envelope? What new systems or changes to the traditional data center design are needed?

Ben Graham: In the past, cooling demands were met with air cooling and the perimeter CRAC/CRAH units. As a result, data centers could keep the liquid either outside the white space or along the perimeter. Increased care and consideration are needed as the liquid for DLC is moved into the data hall envelope, away from the perimeter, and moved directly to the compute equipment.

Some data center owners and operators might consider forgoing environmental cooling within the data center when relying on DLC to manage equipment cooling. However, it remains crucial to uphold environmental conditions within the envelope, even if the entire data center is on DLC.

While a significant portion of the compute heat load can be transferred directly into the DLC medium or fluid, it is essential to consistently maintain temperature and humidity, and implement effective air filtration measures.
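To see why the air side still matters, consider the portion of rack heat that the cold plates do not capture. The sketch below uses assumed values (a 100 kW rack and an 80 percent liquid capture fraction) purely for illustration:

```python
# Residual air-side heat load when most of the rack heat goes to liquid.
# Hypothetical numbers for illustration only.

RACK_POWER_KW = 100.0        # assumed total rack power
DLC_CAPTURE_FRACTION = 0.80  # assumed share of heat removed by cold plates

liquid_load_kw = RACK_POWER_KW * DLC_CAPTURE_FRACTION
air_load_kw = RACK_POWER_KW - liquid_load_kw

print(f"Heat to DLC loop: {liquid_load_kw:.0f} kW, heat left to room air: {air_load_kw:.0f} kW")
# Even at an 80% capture rate, 20 kW per rack still lands in the room,
# which is why temperature, humidity, and filtration still have to be managed.
```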

In our next article in The Modern Data Center Journal, we will feature the second part of our conversation with Ben, which is focused on the challenges that data center owners and operators might face when embracing liquid cooling.

To learn more about how to keep cool in the age of AI, click HERE.
