The Power Problem Facing Modern AI Data Centers

It’s impossible to escape news and conversations about artificial intelligence (AI) in 2023. Whether people are discussing its possible use cases and benefits, lamenting how it will inevitably take jobs away from human workers, or speculating on the regulations and legislation that will be introduced to govern its use in our society, AI is virtually everywhere.

However, one element of AI that few are discussing is its impact on the data center. That’s exactly the gap my colleagues Steve Altizer and Lee Piazza set out to fill in their recent series of articles.

In the first article in this series, Steve explored the rise of AI and its impact on data center design and construction. Lee then explored the ways in which cooling will need to evolve to deal with the incredible increase in heat produced by the extremely dense racks that power AI applications.

Now, I’d like to look at a major problem that impacts everyone – not just data center owners and operators – and that has no clear solution: how we can meet the massive power and connectivity requirements of today’s AI applications.

More Than We Can Handle

As Lee Piazza explained in his article about cooling, “The racks in production data centers draw – on average – 8 kW to 10 kW of energy. AI and high-performance computing (HPC) cabinets can draw 50 kW to 75 kW per cabinet or more.”

Lee isn’t wrong. But what he didn’t say is that those numbers are poised to increase. As AI racks grow denser, their power requirements will only continue to climb. And that’s a problem.
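To put that gap in perspective, here’s a minimal back-of-envelope sketch in Python using the per-rack figures Lee quoted. The 200-rack hall is a hypothetical example for illustration, not a figure from the series.

```python
# Back-of-envelope comparison of total facility draw, using the per-rack
# figures quoted above. The rack count (200) is a hypothetical example.

TRADITIONAL_KW_PER_RACK = (8, 10)  # average production racks: 8-10 kW
AI_HPC_KW_PER_RACK = (50, 75)      # AI/HPC cabinets: 50-75 kW (or more)

RACKS = 200  # hypothetical data hall


def total_draw_mw(kw_range, racks):
    """Return the (low, high) total draw in megawatts for a rack count."""
    low_kw, high_kw = kw_range
    return low_kw * racks / 1000, high_kw * racks / 1000


trad = total_draw_mw(TRADITIONAL_KW_PER_RACK, RACKS)
ai = total_draw_mw(AI_HPC_KW_PER_RACK, RACKS)

print(f"Traditional hall: {trad[0]:.1f}-{trad[1]:.1f} MW")  # 1.6-2.0 MW
print(f"AI/HPC hall:      {ai[0]:.1f}-{ai[1]:.1f} MW")      # 10.0-15.0 MW
```

The same footprint that once drew roughly 2 MW now asks the local grid for 10 MW to 15 MW – a five- to seven-fold jump.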


Today, the generation of power is out of the hands of the data center owner and operator. Power comes from local utilities and power companies, which generate and distribute energy over local power grids and infrastructure. And there simply isn’t going to be enough of it to power both increasingly dense AI data centers and the trappings of modern life for those living around them.

In fact, this isn’t even a “future problem.” It’s already an issue in many international data center markets. The immense power requirements of modern data centers have led local governments in places like Frankfurt, Singapore, and Ireland to explore limiting where data centers can be built – or flat-out capping how many can be built at all.

And now, even domestic data center hotspots could be facing the same scrutiny.

Northern Virginia has long been one of the world’s hottest data center markets – with data centers so numerous that it earned the moniker “Data Center Alley.” However, recent reports indicate that the increasing power requirements – including those from data centers in the area – could result in demand growing faster than projected increases in supply.

But this is where the problem becomes particularly challenging. Just increasing the amount of energy available in a region isn’t going to solve anything. The energy that’s being purchased or generated within a region still needs to be distributed to those who need it. Unfortunately, modern power grids simply weren’t constructed to deliver power in the quantities required by data centers running AI applications.

Rethinking Generation and Distribution

We’re effectively in a power arms race. As data center operators increase demand, power companies scramble to increase supply. But at some point, the grids that we rely on to distribute power to a region will be unable to keep up with the demand. This means that we need to get creative and break out of what has traditionally been the “status quo” of power generation and distribution.

If AI applications are going to continue to increase in use and adoption, then we need to potentially rethink the “where” and the “who” of power generation. What do I mean by that? We need to rethink “where” the power is generated and see if we can eliminate the need for long-distance distribution. We also need to rethink “who” is responsible for generating the power that runs these AI data centers.

One solution that we continue to hear discussed in data center circles is a renewed adoption and proliferation of nuclear power.


In the concepts being discussed within the industry, small, modular nuclear generators are deployed close to – or centrally among – a number of data centers, potentially in the middle of a data center campus. These generators then deliver the power necessary to run the energy-hungry AI data centers on that campus.

There are a few reasons why this is an attractive alternative for data center owners and operators. First, it delivers a large amount of energy without the need for long-distance transmission or distribution from energy companies. This practically eliminates the strain that data centers put on a local area’s power supply and grid.

Second, this is a massive step toward helping data center owners and operators meet their carbon neutrality and sustainability initiatives. Many data center providers and operators have committed to aggressive sustainability goals – some aiming to be carbon neutral as soon as 2030. Nuclear energy is technically considered carbon-neutral, as it does not directly release CO2 into the atmosphere.

The use of nuclear generators can effectively eliminate two of the largest carbon-production challenges facing data center owners and operators. The first comes from the diesel generators that many data centers rely on for backup power should there be a wide-area outage or failure in the local power grid. The second comes from the energy they currently use to operate, which – depending on the market – could include a large percentage of power generated by burning fossil fuels.

However, the proliferation of nuclear generators across data center campuses may face some very understandable and reasonable objections.

Nuclear energy has a significant public relations and public perception problem. Residents in areas around data centers may be hesitant to have small nuclear generators deployed practically in their backyards. There are also significant challenges – such as the generation of nuclear waste and the highly regulated nature of nuclear power – that need to be solved before this can become a reality.


Regardless of whether small, modular nuclear generators are the answer to AI data center energy challenges, there is a very real concern that “business as usual” will no longer suffice as AI applications increase in usage and adoption.

There is no simple, ready-to-implement solution to this problem. It will require rethinking the ways in which we approach energy generation and distribution, and possibly getting creative until power grids and sustainable energy sources can meet the power demands of AI data centers.

But what about connectivity?

Similar, but Different

When we discuss the things that modern AI applications require, power and connectivity are usually near the top of the list. That’s because training an AI model on the information it needs for inference – so it can deliver valuable insights – requires an incredible number of GPUs, housed across multiple racks, working together. Then, the trained AI engine needs to be hosted in a data center where a large ecosystem of users will want to access and utilize it.

Much like power, there is only so much connectivity available to each data center campus. It comes from local telecom giants and dark fiber providers, it’s limited in capacity, and – because it’s provided by third parties – it’s not completely in the control of the data center owner or operator.

As AI adoption increases, it’s essential that service providers keep pace. They need to continue to increase the capacity of their networks and invest in their services to meet the increased bandwidth demands of these AI solutions.

I believe that power grids and local fiber networks will need to continue to evolve to accommodate the power and bandwidth requirements of the AI boom. We’re already facing power- and bandwidth-related issues that could result in downtime or slowdowns should everyone try to access these services at once. Investment in infrastructure – both power and fiber – is necessary if AI is going to play a larger part in how we live and work.

To learn more about the impacts of AI on the data center and how to better prepare data centers for the AI revolution, click HERE.
