This article was originally published on the Government Technology Insider.
It would be overly dramatic to say that everything changed on November 30, 2022. However, the day that marked the official launch of the generative artificial intelligence (AI) service, ChatGPT, certainly created a shockwave that is still reverberating around the world. Since the official launch of OpenAI’s large language model-based chatbot, the service has garnered more than 100 million users, and interest and investment in AI have skyrocketed across the IT industry.
Fast forward less than one year later, and new uses for AI are emerging daily. In fact, most venture capital-backed start-ups today are seeking their fortunes in some previously undiscovered or unexplored AI-oriented niche. And even established IT companies are trying to find a way to introduce the letters “A” and “I” into their messaging to help drive up their stock price.
It’s clear that ChatGPT flipped the switch on a floodlight that is spurring innovation and creativity around the world. Every day, that light burns more brightly. And the most amazing thing is that much of the potential for AI is still waiting to be discovered and commercialized.
But the proliferation of AI comes with questions and concerns that we need to be prepared for. There are ethical questions about the use of generative AI that need to be addressed. There are legal questions about deep fakes and intellectual property that we need to contemplate as a nation. And then there are the technical impacts of AI, and questions about how our existing infrastructure can support the increase in AI adoption.
And one of the largest impacts that AI will have in the short term is its impact on the data center.
More is less
Each IT cabinet that powers an AI application or solution requires an inordinate amount of power. Correspondingly, these incredibly dense, energy-hungry racks and cabinets produce a vast amount of heat. This creates unique challenges when it comes to powering and cooling data centers. And the changes being made to accommodate the increased power usage and cooling requirements are – in turn – impacting the actual design and construction of the data centers themselves.
Building a data center isn’t an undertaking that can be accomplished in a few weeks or even months. Years may pass between the time a data center provider or hyperscaler identifies a demand signal in a region and the day the facility opens, with site selection, permitting, regulatory approvals, and construction all along the way. This means that the data centers providers and hyperscalers are cutting the ribbon on today were designed years ago.
Because the designs of these data centers were created before the massive increase in AI adoption, many were designed without contemplating the impact of AI on the facility. Many of these data centers were designed with massive white spaces in which as many cabinets as possible could be installed. But that’s not necessarily ideal for a data center running AI applications.
When a highly dense collection of racks is placed in a legacy data center, the floor space is often sparsely populated. There will often be fewer cabinets than the white space can physically accommodate. This is because older data centers were designed to accommodate IT loads in a more traditional range of 5-15 kW per rack.
Today, it is not uncommon for AI and HPC cabinets to draw 50 kW to 75 kW per cabinet. With power densities increasing by a factor of roughly seven, 100 percent of a data center’s power capacity would be consumed by just 1/7 of the equipment it was built to house – and that equipment would occupy only 1/7 of the available floor space.
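The arithmetic behind that stranded floor space can be sketched in a few lines. The densities below are the figures cited in this article; the 1,000-rack hall is a hypothetical example, not a real facility.

```python
# Illustrative arithmetic only -- rack densities are the figures cited in
# the article; the hall size is a hypothetical example.

LEGACY_RACK_KW = 10   # midpoint of the traditional 5-15 kW per rack range
AI_RACK_KW = 70       # within the cited 50-75 kW range (roughly 7x denser)

# Assume a legacy hall provisioned for 1,000 traditional racks.
floor_positions = 1000
hall_power_budget_kw = floor_positions * LEGACY_RACK_KW  # 10 MW of IT load

# The same power budget supports far fewer AI racks...
ai_racks_supported = hall_power_budget_kw // AI_RACK_KW

# ...leaving most of the floor empty once the power is fully consumed.
floor_utilization = ai_racks_supported / floor_positions

print(f"AI racks the power budget supports: {ai_racks_supported}")
print(f"Floor positions actually occupied: {floor_utilization:.0%}")
```

Under these assumed numbers, the hall's entire 10 MW budget is exhausted by about 142 AI racks sitting on roughly one-seventh of the floor – the sparsely populated white space the article describes.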
Frankly, walking through a traditional data center space being used for AI and high-performance computing (HPC) equipment felt a bit strange.
Then, there’s the issue of cooling. These extremely dense AI cabinets dictate that we rethink the old approaches to data center cooling. Thermodynamics tells us that liquids like water transfer heat far more efficiently than air. To meet the cooling requirements of next-generation AI data centers, the industry may need to shift away from forced-air cooling entirely, toward one of the many liquid cooling options entering the market.
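A back-of-the-envelope comparison shows why liquid cooling is so attractive. Using standard room-temperature property values for water and air (illustrative only, not from the article):

```python
# Volumetric heat capacity = density * specific heat, in J/(m^3 * K).
# Property values are approximate room-temperature figures.

water_vol_heat = 1000.0 * 4186.0  # ~4.19 MJ/(m^3 * K) for liquid water
air_vol_heat = 1.2 * 1005.0       # ~1.2 kJ/(m^3 * K) for air

# Heat carried away per unit volume of coolant, per degree of temperature rise
ratio = water_vol_heat / air_vol_heat
print(f"Per unit volume, water absorbs ~{ratio:,.0f}x more heat than air")
```

Per unit volume, water absorbs on the order of a few thousand times more heat than air for the same temperature rise, which is why piping liquid to a 70 kW cabinet is far more practical than moving the equivalent volume of chilled air through it.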
We’ll explore the power, network, and cooling changes that AI will require in the data center much more in-depth in future articles. However, it’s important to mention them here because they could result in a change in how we design and construct the AI data center of the future. They could also raise interesting questions about whether the data centers of the future are built to accommodate legacy and AI workloads together, or whether those workloads are split into separate facilities optimized for their unique requirements.
Rethinking the data center design
It has been said for years that power and cooling systems account for approximately 80 percent of a data center’s cost. The other 20 percent is concrete, steel, and other architectural elements. Since the building itself accounts for only a small fraction of the overall construction cost, it has been easy to overlook the fact that building envelopes are oversized for the number of AI racks deployed. But the industry is being forced to pay attention as data center space becomes more limited and in higher demand.
The uncanny feeling that I got from walking through a virtually empty data center made me realize that using legacy data centers to power AI applications is simply not efficient or sustainable. The sheer amount of wasted space made it clear that a new approach is needed, and that we need to rethink how we’re designing and constructing data centers that will power AI applications.
One solution might be to incorporate AI cabinets into data centers with traditional cabinets – data centers that we often refer to as “production data centers.” This is ultimately a half-measure that doesn’t completely solve the problem. However, it could be an effective approach to maximize the space in existing legacy data centers while a new generation of AI-specific data centers can be designed and constructed.
And what would those new, AI-specific data centers look like? Chances are, they would feature much less white space.
As data center designers and engineers continue to get a better understanding of the power and cooling requirements of AI applications, they will take that information and work outwards from the racks to the building envelope.
What we can most likely anticipate is that space allocations will change. More of the data center footprint will be dedicated to back-end UPS systems, switchgear, and electrical distribution. Cooling systems will likely consume more space either inside the facility, outside the facility, or both. Ultimately, future data center white space will be a smaller percentage of the overall building size.
If the industry does standardize on separate, dedicated data center designs for AI and production workloads, we could be poised for a massive upswing in the construction of dedicated AI data centers. While the average hyperscaler and enterprise data center operator would need more production data centers than AI data centers, they already have production data centers – they don’t have any dedicated AI data centers. This could drive a new data center construction boom – but for data centers designed with less white space, specifically optimized for AI workloads and applications.