Power availability in the data center has always been a challenge, but with AI workloads sweeping through the industry at unprecedented speed and scale – requiring up to five times the power of traditional CPUs – the issue of how we effectively power these facilities grows ever more pressing.

AI tasks rely on high-density GPUs with a thermal design power (TDP) that can rapidly reach over 1,000 watts, a far cry from the 300-500 watts typically required for CPUs. It is therefore no surprise that, according to Gartner, by 2027, 40 percent of existing AI data centers will be operationally constrained by power availability. 

Compounding the issue, AI workloads require these GPUs to be packed together as closely as possible, creating clusters that can exceed 100,000 chips and draw up to 30 megawatts within a very small footprint.
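For a rough sense of those numbers, here is a back-of-the-envelope sketch; the per-chip average draw and rack configuration are assumptions for illustration (sustained draw typically sits well below peak TDP), not vendor figures.

```python
# Back-of-the-envelope cluster power arithmetic; all figures illustrative.

chips = 100_000        # cluster size cited in the article
avg_draw_w = 300       # assumed sustained average draw per chip (hypothetical)

cluster_mw = chips * avg_draw_w / 1e6
print(f"Cluster power at average draw: {cluster_mw:.0f} MW")   # -> 30 MW

# Per-rack IT load at full TDP, for the density picture:
gpu_tdp_w = 1_000      # high-density GPU TDP, per the article
gpus_per_rack = 72     # hypothetical rack configuration
rack_kw = gpu_tdp_w * gpus_per_rack / 1_000
print(f"Rack IT load at peak TDP: {rack_kw:.0f} kW")           # -> 72 kW
```

Even at a conservative average draw, the totals land in the tens of megawatts, and a single rack at full TDP already sits far above legacy densities.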

Not only does this demand more power for compute, but the energy required to cool these systems has also grown dramatically, with liquid cooling technologies becoming increasingly mainstream.

During a recent DCD>Broadcast, we spoke to experts from Eaton, a global leader in intelligent power management, and 451 Research, a firm specializing in tech insights and digital transformation, to find out more about how we can more effectively power data centers for the AI era.

“Over the last decade, data center rack density has steadily increased from 2-4 kilowatts (kW) per rack to 8-12kW. But in the last two years, driven by AI demand, we’ve seen densities spike to over 50kW per rack, with some even exceeding 100kW,” says Perkins Liu, senior manager and research analyst at 451 Research.

Modernization and purpose-built data centers

Traditional data centers simply aren’t built to handle the colossal power and density demands AI workloads require. As a result, purpose-built AI data centers, designed specifically to support high power densities, specialized GPU/TPU clusters, and all the unique infrastructure AI requires, are gaining traction. Meanwhile, many existing data centers, some 10 to 15 years old, are struggling to keep pace. 

Aging infrastructure, encompassing power distribution and IT systems, needs updates to stay resilient, a reality that has seemingly split the industry into two camps. On the one hand, we’re seeing the rapid proliferation of new AI-enabled data centers; on the other, we have older legacy facilities in need of modernization.

When it comes to redesigning or retrofitting data centers for AI, defining the boundaries of constrained resources is critical. A data center is normally a balanced combination of space and power at any given time, depending on the workload, and while traditional legacy facilities typically juggle both space and power constraints, AI-enabled data centers tend to be power-constrained due to their high energy consumption.

“In an AI-enabled data center, you don't necessarily need that much space for the number of racks,” says Liu. “For example, five racks might consume the same amount of power that used to be taken by 40 or 50.”

This is why, according to our panel, creating flexible ‘high-density’ zones alongside low-density areas is an important piece of the power puzzle, combining new technologies such as high-power distribution units and liquid cooling to help meet the unique needs of AI workloads while still integrating with existing systems. That said, assessing which solutions are most viable will vary from case to case, depending on the unique requirements of an individual facility.
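As a toy illustration of the zoning trade-off (every number below is invented), a fixed site power budget can be split between a high-density AI zone and a low-density zone; shifting power toward the high-density zone shrinks the floor space actually needed:

```python
# Toy zoning check: split a fixed power budget between a high-density (HD)
# AI zone and a low-density (LD) zone. All figures are invented.

SITE_POWER_KW = 5_000
HD_RACK_KW, LD_RACK_KW = 100, 8   # per-rack draw in each zone (hypothetical)
RACK_FOOTPRINT_M2 = 2.5           # per rack, including aisle space (assumed)

def zone_plan(hd_share):
    """hd_share: fraction of the power budget given to the HD zone."""
    hd_racks = int(SITE_POWER_KW * hd_share // HD_RACK_KW)
    ld_racks = int(SITE_POWER_KW * (1 - hd_share) // LD_RACK_KW)
    return hd_racks, ld_racks, (hd_racks + ld_racks) * RACK_FOOTPRINT_M2

for share in (0.2, 0.5, 0.8):
    hd, ld, area = zone_plan(share)
    print(f"{share:.0%} HD power: {hd} HD racks, {ld} LD racks, ~{area:.0f} m²")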

Managing power fluctuations for the benefit of all

When it comes to power demand and fluctuation, AI workloads bring with them unique challenges. Traditional data center workloads typically have a steady power demand of around 1MW, with minor variations of about 100kW that are easy to manage. 

In contrast, AI workloads – especially during intense tasks like model training – can cause hefty and unpredictable power swings. These fluctuations can be extreme, causing utilities to (understandably) push back on operators looking to deploy or expand high-density load profiles at scale. This of course creates a further hurdle for operators, who need to figure out where they can build and secure reliable power within practical timelines.

“These extreme fluctuations are unsustainable, posing challenges for utilities and operators, i.e. higher fees and difficulties securing additional power,” explains Jason Scimeca, product line manager at Eaton. “Data center operators are working with chip manufacturers and utility providers to understand and manage these power fluctuations across the powertrain.”

“Collaboration, sharing information and planning together is how you execute on this challenge,” echoes Janne Paananen, technology manager, Critical Power Solutions, Eaton. “Not just the suppliers and data centers, but communities as well, you can no longer work in isolation.”

Another way operators are looking to tackle the power problem is through the deployment of distributed energy resources. Coupled with ‘peak shaving’ (reducing grid draw during AI demand bursts, for example), this not only helps augment utility power but can reduce energy costs, increase network reliability, and promote sustainable power consumption.
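The peak-shaving idea can be sketched in a few lines; the grid cap, battery size, and load profile below are all hypothetical:

```python
# Toy peak-shaving loop: cap grid draw at a contracted limit and cover
# bursts above it from a battery. All figures are illustrative.

def shave(load_mw, cap_mw=20.0, batt_mwh=10.0, dt_h=0.25):
    """Return grid draw per interval, discharging the battery above cap_mw."""
    soc = batt_mwh                            # state of charge, start full
    grid = []
    for load in load_mw:
        excess = max(load - cap_mw, 0.0)      # burst above the grid cap
        discharge = min(excess, soc / dt_h)   # limited by remaining energy
        soc -= discharge * dt_h
        grid.append(load - discharge)
    return grid

# A bursty AI-training-style profile (MW per 15-minute interval):
print(shave([12, 26, 30, 15]))   # grid draw never exceeds 20 MW
```

In practice the battery would also need a recharge strategy and inverter limits, and operators would coordinate these parameters with their utility.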

A further strategy sees operators looking within their own facilities to utilize traditional assets, such as UPS systems, in new ways to help support the grid. Historically, a UPS was little more than an insurance policy, an idle asset reserved for the worst-case scenario. Now, operators are leveraging this technology alongside battery energy storage to provide ancillary services for the grid, helping to stabilize frequency and, in some cases, integrate renewable energy.
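Frequency-stabilization services of this kind are often implemented as a droop rule: inject power when grid frequency sags, absorb when it rises. A minimal sketch, with a deadband and slope chosen purely for illustration (real ancillary-service programs specify their own parameters):

```python
# Sketch of a grid frequency-response (droop) rule a UPS/battery might follow.
# All parameters are hypothetical, not from any real program.

NOMINAL_HZ = 50.0      # European grid; 60.0 in North America
DEADBAND_HZ = 0.015    # ignore tiny deviations (assumed)
DROOP_MW_PER_HZ = 20   # response slope (assumed)
MAX_MW = 5.0           # inverter power limit (assumed)

def frequency_response_mw(freq_hz):
    """Positive = inject power to the grid, negative = absorb (charge)."""
    dev = NOMINAL_HZ - freq_hz
    if abs(dev) <= DEADBAND_HZ:
        return 0.0
    magnitude = (abs(dev) - DEADBAND_HZ) * DROOP_MW_PER_HZ
    return min(magnitude, MAX_MW) * (1 if dev > 0 else -1)

print(frequency_response_mw(49.9))    # under-frequency: inject power
print(frequency_response_mw(50.05))   # over-frequency: absorb power
```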

“We're starting to hear a lot more customers asking us, ‘How can we use these batteries to do something different, and operate differently than we have in the past,’ in the sense of grid participation programs and things of that nature. It's changing the game,” affirms Scimeca.

This opportunity for grid stabilization also puts data centers in good stead when it comes to attaining power, incentivizing a symbiotic relationship between operator and utility. For instance, imagine you are a utility provider with two data centers vying for power. One has on-site energy generation and battery storage systems that could help balance the grid, and the other doesn’t. Which would you prioritize?

This shift from energy consumer to prosumer can also benefit the communities in which these data centers reside, in the form of waste heat utilization for district heating systems. This kind of contribution to the community not only helps appease any data center NIMBYs, but depending on location, can help ensure operators remain in line with government sustainability regulations. 

In Germany for example, as of June 2026, data centers will require a PUE of 1.2 or lower, with operators having to declare and report on how much of their energy comes from renewable sources, as well as the reuse of waste heat. 
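For reference, PUE is simply total facility energy divided by IT equipment energy, so the 1.2 figure cited above means non-IT overhead (cooling, power conversion losses, lighting) must stay within 20 percent of the IT load. A minimal sketch with made-up figures:

```python
# PUE = total facility energy / IT equipment energy. Figures illustrative.

def pue(it_mwh, overhead_mwh):
    """overhead_mwh: cooling, distribution losses, lighting, etc."""
    return (it_mwh + overhead_mwh) / it_mwh

print(pue(100.0, 35.0))   # 1.35 -> misses a 1.2 target
print(pue(100.0, 18.0))   # 1.18 -> meets it
```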

“According to the EU Energy Directive, if you have a data center larger than 500kW, if technically possible, you really should be reusing waste heat,” says Paananen. “Going forward, data centers will have such a big role in the energy system, stricter regulations are coming, it’s as simple as that.” 

Safety first

With this increased demand for power, cabinets are drawing up to 100kW. A single rack row can demand 1.5-2MW, requiring 3,000-amp bus bars and large transformers in place of traditional 100-amp systems. The result is low-impedance connections and high prospective fault currents, centering the challenge not only on how much power we deliver, but on how we power these facilities safely.
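The 3,000-amp figure can be sanity-checked with the standard three-phase current formula I = P / (√3 · V_LL · PF); the 415V line-to-line voltage and 0.95 power factor below are assumptions, not figures from the article:

```python
import math

# Sanity-check the bus-bar figure with the three-phase current formula
# I = P / (sqrt(3) * V_LL * PF). Voltage and power factor are assumed.

def line_current_a(power_w, v_ll=415.0, pf=0.95):
    return power_w / (math.sqrt(3) * v_ll * pf)

row_power_w = 2_000_000   # 2 MW rack row, per the article
print(f"{line_current_a(row_power_w):,.0f} A")   # on the order of 3,000 A
```

At these currents, even small connection impedances dissipate significant heat, which is why fault management and arc-flash safety become first-order design concerns.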


“A more thoughtful approach is needed to ensure the environment remains safe in the event of faults in the electrical system,” says Paananen. “Proper technologies and design can manage these risks, but this safety aspect should not be overlooked.” 

That said, technology is not the only factor at play when it comes to ensuring safety. The right skills to do so are also integral. For instance, dealing with higher power densities may involve medium voltage loads that require moving equipment from inside to outside a building. This requires specialized knowledge and a safety-first mindset to ensure that as the infrastructure continues to evolve, risks are mitigated. 

“Everything we're talking about connects to talent and the labor market,” says Paul Connors, director of marketing, Electrical Engineering Services and Systems, Eaton. “What we're finding from an engineering services perspective is a lot more partnering taking place, where our customers are sharing build schedules going out multiple years to correspond with our hiring activity. This ensures we've got the right skills to help commission and maintain the uptime of this critical infrastructure.”

Better together

When it comes to powering the data center of the future, it’s no secret that flexibility will be a key driver of success, but this is not something that can be achieved alone. As the demand for data at speed and scale continues to grow, we’re starting to see the data center industry shape-shift into something wholly different.

What was once a highly secretive, siloed sector is now leaning whole-heartedly into the benefits of cross-industry collaboration. “This is to be a confluence of multiple industries, utilities, and data centers, coming together to solve complex problems,” emphasizes Connors, as we wrap up our conversation.

“We’re also seeing data centers team up with industrial customers, much like the Industrial Revolution, by building near natural gas pipelines and using cleaner energy sources like fuel cells,” he adds. “These collaborations make use of existing energy infrastructure, with Eaton playing a key role in integrating these systems for efficiency.”

Without this cultural shift, realizing the full potential of AI simply wouldn’t be possible. Digitization is everybody’s business, and together, we have the tools needed to not only define the AI evolution but redefine how our industry operates for the better.

To find out more about powering data centers for the AI era, watch the original DCD>Broadcast, ‘Powering the data center of the future’, in full here.