The rise of artificial intelligence (AI) has revolutionized how industries operate, and data centers have become the nerve center of this transformation. From streaming services to real-time collaboration tools, the data centers that house and manage our digital lives are expanding rapidly to meet the surging demand. Yet, as AI’s impact and importance grow, so do its energy demands, creating a new set of challenges for an already resource-intensive industry.
Data centers are becoming larger, denser, and more power-hungry. AI applications like generative AI (GenAI), high-performance computing (HPC), and machine learning have dramatically increased processing power requirements. As a result, rack densities in data centers have surged from an average of 7 kW in 2021 to 12 kW today, with some racks exceeding 50 kW and recent designs reaching 100 kW or more. This shift has implications not just for data centers themselves but for the power grids that support them.
Historically, fiber networks were the top priority when building data centers. Today, energy has taken the lead. As data centers grow larger and denser to meet the demand for AI applications, their power consumption is skyrocketing. Across North America, hyperscale data centers operated by tech giants like Amazon, Google, Microsoft, and Meta can consume hundreds of megawatts (MW) of power, with plans for multi-gigawatt (GW) campuses already underway across the U.S.
While the demand is most acute in the U.S., Canada and Mexico are also emerging as attractive locations: Canada thanks to abundant hydroelectric power, which helps operators meet sustainability targets, and Mexico because of its near-shoring advantages. However, continued growth in these regions could strain local power grids, especially as data centers continue to increase in size and energy requirements.
Unlike traditional server operations, which experience relatively stable demand, AI data centers are characterized by volatile power usage, with intense spikes driven by AI workloads. These spikes can significantly disrupt the grid, especially as more renewable energy sources like wind and solar—which are inherently variable—become part of the energy mix. This unpredictability is compounded by the fact that many utilities are already struggling to meet the growing power demands of data centers.
Omdia forecasts that data centers will require an additional 100 GW of power capacity to meet AI demand between 2024 and 2030. However, with long lead times for power generation and transmission infrastructure, coupled with regulatory hurdles, it will be challenging to develop new power sources in the needed timeframe at the scale required.
Data center operators are embracing AI to manage the demand for real-time processing, yet the energy appetite of AI-driven data centers presents a paradox: operators are under increasing pressure to reduce their carbon footprints and improve energy efficiency even as AI's consumption accelerates. Many operators have set ambitious sustainability goals, such as sourcing energy from renewable resources and improving their power usage effectiveness (PUE), the ratio of a facility's total energy use to the energy consumed by its IT equipment.
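For readers unfamiliar with the metric, a minimal sketch of the PUE calculation, using hypothetical power figures:

    # PUE compares everything a facility draws (IT load plus cooling, power conversion,
    # lighting, and other overhead) to the IT load alone; 1.0 is the theoretical ideal.
    def power_usage_effectiveness(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Return PUE given total facility power and IT equipment power, both in kW."""
        if it_equipment_kw <= 0:
            raise ValueError("IT equipment power must be positive")
        return total_facility_kw / it_equipment_kw

    # Example: a facility drawing 30 MW in total to support a 20 MW IT load.
    print(power_usage_effectiveness(30_000, 20_000))  # 1.5, i.e. 0.5 W of overhead per watt of IT load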
Yet, given the current state of power infrastructure, these goals will be difficult to achieve. AI applications are projected to drive a tenfold increase in data center electricity consumption by 2030. Moreover, many regions with high data center growth are facing challenges in building new power generation capacity quickly enough to meet demand. U.S. utilities are expected to invest $50 billion in new generation capacity, but there remains a significant gap between projected and actual electricity generation. As the energy consumption of AI-driven data centers continues to rise, this gap will only widen, making it crucial to modernize the power grid.
Despite these challenges, there are solutions on the horizon. One of the most promising avenues for addressing the energy challenge posed by AI is to use AI itself to manage energy consumption. AI has the potential to enhance power infrastructure in several ways, from improving HVAC (heating, ventilation, and air conditioning) controls to optimizing power distribution across the data center. For example, Google reported a 40% reduction in cooling energy by using AI to monitor and adjust cooling systems. Similarly, AI-driven power management systems can regulate energy flow more efficiently, even as rack densities continue to climb.
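As an illustration of the control-loop idea only, here is a minimal sketch with hypothetical model and sensor interfaces; it is not a description of Google's system, just one way a learned model's predictions could drive setpoint selection.

    # Minimal sketch of an AI-assisted cooling adjustment loop (hypothetical interfaces).
    # A trained model predicts cooling power and server inlet temperature for each candidate
    # supply-air setpoint; the controller picks the most efficient setpoint that stays safe.
    from typing import Callable, Sequence, Tuple

    def choose_cooling_setpoint(
        candidate_setpoints_c: Sequence[float],
        predict: Callable[[float], Tuple[float, float]],  # setpoint -> (cooling_kw, inlet_temp_c)
        max_inlet_temp_c: float = 27.0,
    ) -> float:
        best_setpoint, best_kw = None, float("inf")
        for setpoint in candidate_setpoints_c:
            cooling_kw, inlet_temp_c = predict(setpoint)
            # Skip setpoints the model expects to push inlet temperatures out of range.
            if inlet_temp_c <= max_inlet_temp_c and cooling_kw < best_kw:
                best_setpoint, best_kw = setpoint, cooling_kw
        # Fall back to the coldest candidate if the model deems none safe.
        return best_setpoint if best_setpoint is not None else min(candidate_setpoints_c)

    # Toy stand-in for a learned model: warmer setpoints need less cooling power but
    # raise inlet temperatures (purely illustrative numbers).
    def toy_predict(setpoint_c: float) -> Tuple[float, float]:
        return 900 - 25 * setpoint_c, setpoint_c + 4.0

    print(choose_cooling_setpoint([18.0, 20.0, 22.0, 24.0], toy_predict))  # 22.0

In practice such a loop would also weigh humidity, equipment ramp rates, and the model's own uncertainty before acting.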
Beyond the data center walls, the power grid must become more resilient and flexible to accommodate the fluctuating energy demands of AI data centers. AI-enabled interconnections between data centers and substations can help balance energy supply and demand more effectively, allowing operators to tap into renewable energy sources when available. Data centers can also play a more active role in grid stabilization by acting as “prosumers,” generating their own energy through renewable sources and selling excess energy back to the grid during periods of low demand.
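As a rough illustration of the prosumer idea, here is a minimal sketch with hypothetical figures and a deliberately simple rule (export whatever on-site generation exceeds the facility load); real export decisions also depend on tariffs, grid signals, and storage strategy.

    # Minimal prosumer sketch (hypothetical figures). When the facility's own load is low,
    # on-site generation exceeds consumption and the surplus can be sold to the grid;
    # when load is high, the shortfall is imported.
    def grid_exchange_kw(onsite_generation_kw: float, facility_load_kw: float) -> float:
        """Positive result = kW exported to the grid; negative = kW imported."""
        return onsite_generation_kw - facility_load_kw

    # Example: 12 MW of on-site generation against an 8 MW overnight load -> 4 MW exported.
    print(grid_exchange_kw(12_000, 8_000))   # 4000.0
    # Example: the same 12 MW against a 30 MW peak AI-training load -> 18 MW imported.
    print(grid_exchange_kw(12_000, 30_000))  # -18000.0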
Emerging energy hubs, which can integrate multiple power sources such as solar, wind, and batteries, offer another solution. Managed by AI systems, these hubs can switch seamlessly between energy sources, ensuring a stable and reliable power supply even during periods of peak demand. By adopting these advanced technologies, data centers can reduce their reliance on fossil fuels and improve their overall sustainability.
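To make the switching idea concrete, here is a minimal dispatch sketch under simple assumptions: a fixed priority order of solar, wind, battery, then grid, with hypothetical capacity figures. Real hub controllers would also weigh forecasts, prices, and battery state of charge.

    # Minimal energy-hub dispatch sketch with a fixed priority order (hypothetical figures).
    # Available solar and wind are used first, a battery covers part of the remainder,
    # and the grid supplies whatever is left.
    def dispatch(load_kw: float, solar_kw: float, wind_kw: float, battery_kw: float) -> dict:
        """Allocate the load across sources in priority order; returns kW drawn from each."""
        plan = {}
        remaining = load_kw
        for name, available in (("solar", solar_kw), ("wind", wind_kw), ("battery", battery_kw)):
            draw = min(remaining, available)
            plan[name] = draw
            remaining -= draw
        plan["grid"] = remaining  # whatever the hub's own sources cannot cover
        return plan

    # Example: a 50 MW load with 20 MW of solar, 10 MW of wind, and 15 MW of battery discharge.
    print(dispatch(50_000, 20_000, 10_000, 15_000))
    # {'solar': 20000, 'wind': 10000, 'battery': 15000, 'grid': 5000}

The residual grid term is where the interconnection discussion above comes in: the smaller and smoother that term, the easier the hub is on the surrounding grid.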
As AI continues to reshape industries, the power demands of data centers will only grow. To keep pace, utilities, data center operators, and technology providers must work together to modernize power infrastructure, embrace renewable energy sources, and leverage AI to create smarter, more sustainable power systems.
The energy challenges facing hyperscale data centers are significant, but they are not insurmountable. With the right combination of innovation, investment, and collaboration, we can meet AI’s energy demands while supporting a greener, more efficient future.
—Dave Sterlace is Strategic Account Director for Global Hyperscale Data Centers at Hitachi Energy.