I have spent my entire career working at the intersection of infrastructure and power. Collaborating with colleagues in the utility industry has been an enormous part of my job for almost three decades. So much so that I have been humbled by how many familiar faces have come up to me at recent power-focused conferences like POWERGEN, the EPRI Summer Series, and DCW POWER, where I have spoken about the power needs of data centers.
The theme running through nearly every conversation is this: artificial intelligence (AI)-focused data centers may require power at a magnitude beyond anything we’ve built before, but the blueprint for solving that challenge already exists.
It comes down to three principles we’ve relied on before, and need to recommit to now:
- Build together. Deliver reliable, resilient power for data centers that serve as critical infrastructure and engines of economic growth.
- Invest together. Modernize the grid in ways that benefit not just data centers, but communities and the broader economy.
- Innovate together. Use the very technologies being powered by data centers to solve the power challenges those technologies create.
This last point is easy to overlook—but it may be the most important.
Consider what happened during the cloud and hyperscale era of the 2010s. The shift from fragmented on-premises information technology (IT) to large-scale cloud infrastructure created what felt, at the time, like an unsustainable surge in power demand. Critics warned that data centers would consume an ever-growing share of the national grid with no end in sight.
What happened instead was a case study in technology and collaboration outpacing the problem. Industry-wide power usage effectiveness (PUE) dropped significantly as economizer cooling, advanced power distribution, and smarter infrastructure management became standard practice. Hyperscalers began signing large-scale renewable power purchase agreements (PPAs), and those contracts pulled billions in wind and solar investment onto the grid, adding generation that benefited every ratepayer. And early machine learning tools, running inside those same data centers, began optimizing cooling and energy use dynamically—Google’s DeepMind famously reduced cooling energy by roughly 40%. The technologies consuming power were simultaneously being used to consume it more efficiently.
The grid didn’t buckle under the weight of cloud computing. It got stronger. And the data center industry—through partnership with utilities, investment in grid infrastructure, and the application of emerging technology—was a meaningful reason why.
I am providing that historical context because I think it’s an important lens for discussing the power needs of AI computing. Yes, AI is unprecedented in its potential—to cure diseases, drive innovation, and reshape industries. And yes, its power needs are also unprecedented. But AI also has the potential to solve the very challenge it is creating, just as prior technological innovations have done.
The timing is striking: AI’s power demands are arriving just as the national grid reaches a critical inflection point. Most of the grid was built between the 1950s and 1970s, and today approximately 70% of it is approaching the end of its life cycle—never designed for the scale or speed of modern digital demand. The grid doesn’t just need maintenance; it needs fundamental modernization and significant investment. The data center boom is now acting as a catalyst, surfacing long-standing deficiencies the system could previously absorb quietly. But if history is any guide, the same forces creating this pressure may be precisely what drives us to finally address it.
We need infrastructure built for the next 50 to 100 years. AI can help get us there—enabling smarter power use, grid stabilization, and capacity we already have but haven’t unlocked. That includes:
- Demand forecasting that prevents overbuilding and stranded capacity.
- Predictive maintenance that eliminates unplanned outages before they happen.
- Grid management that identifies and releases trapped capacity.
- Cybersecurity monitoring that protects critical grid infrastructure from emerging threats.
But AI’s role doesn’t stop at the grid’s edge. Inside the data center itself, intelligent systems are already redefining what efficient, resilient infrastructure looks like:
- Dynamic cooling and energy management that respond in real time to workload and climate.
- Data center efficiency optimization that drives meaningful gains in metrics like tokens per watt.
The pace of innovation in these areas is remarkable. At Compass, we’ve been proud to be at the forefront of many of these initiatives, often in close collaboration with utility partners and EPRI. However, technology alone won’t close the gap.
People and policy must enable it. Progress requires new ways of working among utilities, data center companies, regulators, operators, and government agencies to modernize the grid and get the most out of our current resources.
Utilities, regulators, data center companies, and government all want the same outcome: reliable, resilient power that strengthens communities, supports economic growth, and maintains energy affordability. But historically we have worked in parallel instead of in partnership. What’s needed now is true partnership—built on transparent planning, aligned incentives, shared data, and a willingness to solve problems at the same table.
The good news: we don’t need a new playbook. We have a proven one. Through genuine collaboration, shared investment in infrastructure, and purposeful application of cutting-edge technology, we can meet the power demands of AI and build the foundation the next century of innovation requires.
—Chris Crosby is the CEO of Compass Datacenters, which designs and constructs data centers for some of the world’s largest hyperscalers and cloud providers on campuses across the globe.