It’s no secret that artificial intelligence (AI) has quickly become a part of the daily routine for most of us. Whether it’s to help draft an email, plan a trip itinerary or answer a quick question, it feels so simple—just type and get an instant response. But behind even the smallest convenience powered by AI is a massive surge of computing power for training models and inference.
All that computing power requires energy. Data centers, specialized facilities that house the computer systems and equipment needed to store, manage and process data, draw power from local electricity grids. Most grids worldwide are still heavily dependent on fossil fuels—coal, natural gas, oil—for electricity generation, making them major drivers of emissions.
Electricity use at national or global scale is measured in terawatt hours (TWh); one TWh is roughly enough to power a mid-size town of about 100,000 people for a year. In 2024, U.S. data centers consumed 183 TWh of electricity—an amount comparable to the annual electricity use of the entire state of Arizona or Washington. And that total is projected to grow 133% by 2030.
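To make those magnitudes concrete, here is a minimal back-of-envelope sketch in Python. The per-capita figure of roughly 10,000 kWh per year is our own rounded assumption for illustration; the 183 TWh and 133% figures come from the estimates above.

```python
# Back-of-envelope check of the figures above (illustrative, not a forecast model).
US_DATA_CENTER_TWH_2024 = 183      # reported 2024 U.S. data center consumption (TWh)
PROJECTED_GROWTH_PCT = 133         # projected growth by 2030 (%)
KWH_PER_PERSON_PER_YEAR = 10_000   # rough U.S. per-capita electricity use (assumption)

# One TWh expressed as "a town of 100,000 people powered for a year"
town_year_twh = 100_000 * KWH_PER_PERSON_PER_YEAR / 1e9
print(f"Electricity for a town of 100,000 for one year: about {town_year_twh:.1f} TWh")

# 2030 demand if the projected 133% growth materializes
projected_2030_twh = US_DATA_CENTER_TWH_2024 * (1 + PROJECTED_GROWTH_PCT / 100)
print(f"Projected 2030 U.S. data center demand: about {projected_2030_twh:.0f} TWh")
```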
AI adoption is accelerating across nearly every sector, and each new application adds incremental energy demand. Companies are working to leverage AI to make operations more efficient and sustainable, but efficiency gains alone do not offset the carbon challenges AI creates. Without access to low-carbon energy solutions, scaling AI could derail global sustainability efforts rather than support them.

Fortunately, there’s an opportunity to meet the growing energy needs of AI in a way that aligns with long-term carbon constraints. That means looking beyond the walls of data centers and focusing on reducing the carbon intensity of the energy grids that power them and the broader infrastructure society depends on.
A responsible path forward begins with better visibility into how different AI applications drive energy use and environmental impact. By seeing where the biggest pressures come from and what tradeoffs they create, organizations can make more informed choices about how to scale AI sustainably. Doing so will foster continued AI adoption across nearly every global industry while mitigating environmental impact and, importantly, making business operations more efficient and cost-effective.
Why AI Demand Will Continue to Grow, Despite Environmental Concerns
AI is becoming embedded in nearly every sector and field. In healthcare, AI is enabling personalized medicine, improving diagnostic accuracy and accelerating drug discovery. In supply chains, it is supporting demand forecasting, inventory optimization and more efficient logistics planning. Autonomous vehicles and AI‑driven traffic management systems are offering safer and more efficient transportation. In agriculture, AI‑powered precision farming is enhancing crop yields and optimizing irrigation and fertilization practices.
These applications of AI just scratch the surface. Each new use case adds incremental energy requirements, and as adoption scales globally, the cumulative impact on electricity consumption will be significant. Even if individual models become more efficient and require less computing power, the overall demand curve points upward as AI becomes more accessible and widespread. This is known as the “rebound effect.”

The rebound effect occurs when increased efficiency makes a technology more accessible and affordable, leading to higher overall usage. In the context of AI, even if cooling systems become more efficient or models evolve to require less energy to run, total energy consumption can still rise because usage grows faster than the per-task energy falls.
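A toy calculation makes the rebound effect concrete. All numbers below are hypothetical, chosen only to show how a halving of per-query energy can coexist with rising total consumption when usage triples:

```python
# Hypothetical illustration of the rebound effect (all numbers are made up).
energy_per_query_wh = 3.0        # assumed energy per AI query today (Wh)
queries_per_day = 1_000_000_000  # assumed query volume today

# Suppose per-query efficiency improves 50% while adoption triples.
new_energy_per_query_wh = energy_per_query_wh * 0.5
new_queries_per_day = queries_per_day * 3

today_mwh = energy_per_query_wh * queries_per_day / 1e6
later_mwh = new_energy_per_query_wh * new_queries_per_day / 1e6

print(f"Daily energy today: {today_mwh:,.0f} MWh")
print(f"Daily energy later: {later_mwh:,.0f} MWh (even though each query uses half the energy)")
```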
The Sustainability Challenge Behind AI
Training large language models (LLMs) and running inference for even those quick questions you ask ChatGPT require substantial energy and drive other resource demands, both direct and indirect.
These processes demand enormous computing power, which translates into high electricity consumption—the direct impact. Cooling systems for data centers add another layer of resource use, consuming significant volumes of water to maintain optimal temperatures. In regions already facing water scarcity, this creates additional stress on local supplies.
Indirect impacts extend beyond electricity and water. Hardware manufacturing depends on mining rare earth minerals and other critical resources, which disrupts ecosystems and generates pollution. Rapid hardware turnover compounds the problem, producing e-waste that is difficult to recycle and often contains hazardous materials that can leach into soil and water if not managed responsibly. Data centers also strain local power grids, increasing reliance on fossil-fuel plants and amplifying pollution in surrounding communities, which can lead to public health crises like those we’ve seen with oil and gas.
While AI won’t spill oil, unchecked energy demands could trigger similar systemic harm: pollution, resource depletion and public health challenges concentrated in vulnerable regions. These parallels remind us that technological progress without sustainability planning can repeat old mistakes that have long-term effects on people and places. To avoid this trajectory, we need solutions that reduce the rising energy needs of AI and the waste it produces without slowing innovation.
The Building Blocks of Responsible AI Growth
AI’s sustainability challenges aren’t something technology alone can solve. As AI adoption grows, more people across industries are starting to look at its environmental impact and how to better measure it. This includes things like understanding how much energy AI systems use or finding ways to report emissions more consistently. Standard ways of tracking these impacts would make it easier for organizations to compare their progress, understand where improvements are needed and make better decisions. Without clearer methods for measuring carbon impact, it becomes harder to evaluate whether new efficiencies are actually working.
Progress will also depend on how well different groups work together. When approaches vary widely by region or industry, it becomes harder to scale AI responsibly. But when researchers, companies and public institutions share what they’ve learned, it helps create better practices for understanding and reducing AI’s environmental footprint. Over time, this kind of collaboration can also support work on renewable energy integration, carbon‑aware computing and more efficient hardware.
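Carbon-aware computing, mentioned above, is conceptually simple: deferrable workloads are shifted toward hours (or regions) where the grid's forecast carbon intensity is lowest. A minimal sketch, using hypothetical forecast values:

```python
# Minimal sketch of carbon-aware scheduling (forecast values are hypothetical).
hourly_carbon_intensity = {   # grams of CO2 per kWh, by hour
    "02:00": 320,
    "08:00": 410,
    "13:00": 180,   # midday solar pushes the grid's carbon intensity down
    "20:00": 390,
}

# Pick the cleanest hour for a deferrable job such as a model training run.
best_hour = min(hourly_carbon_intensity, key=hourly_carbon_intensity.get)
print(f"Schedule deferrable workload at {best_hour} "
      f"({hourly_carbon_intensity[best_hour]} gCO2/kWh)")
```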
Sustainability is also becoming part of a broader conversation about ethical and trustworthy AI. Environmental impacts are now being considered alongside fairness, accountability and reliability. This shift is encouraging clearer communication about the energy demands of AI systems, as well as more attention to the hardware and software choices that influence waste and resource use.
Education ties all of this together. Developers can benefit from learning how to build and deploy models that use fewer resources. Business leaders and policymakers need a clearer understanding of how AI adoption affects energy use and emissions. And as the public becomes more aware of these impacts, communities are better prepared to participate in conversations about how digital technologies can support more sustainable infrastructure.
The Reality of Today—And What It Means for Tomorrow
The rapid rise of AI use is making it harder for many technology companies to meet their carbon reduction goals. AI can help lower emissions in other industries, but its own environmental impact is growing quickly. If data centers continue to rely on fossil‑fuel‑based power grids, emissions will keep increasing and the climate effects will last for decades.
Understanding AI’s footprint requires looking at more than daily electricity use. A Life Cycle Assessment provides a fuller picture by examining every stage of the process, including how raw materials are extracted, how hardware is manufactured, how much power and cooling the systems require during operation and what happens to equipment when it reaches the end of its life. Without this wider view, improvements in AI model efficiency or data center performance can hide impacts that occur earlier or later in the chain. A full‑system perspective makes sustainability claims more credible and helps organizations make informed decisions as AI adoption grows.
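As a rough illustration of that life cycle framing, total impact can be treated as a sum over stages rather than operational electricity alone. The stage names and figures below are placeholders, not measured data:

```python
# Illustrative life cycle accounting for a hardware deployment (placeholder numbers).
lifecycle_stages_tco2e = {
    "raw material extraction":      40,
    "hardware manufacturing":      120,
    "transport":                    10,
    "operation (power + cooling)": 600,
    "end of life / recycling":      15,
}

total = sum(lifecycle_stages_tco2e.values())
for stage, tonnes in lifecycle_stages_tco2e.items():
    print(f"{stage:<30} {tonnes:>5} tCO2e  ({tonnes / total:.0%})")
print(f"{'total':<30} {total:>5} tCO2e")
```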
Over time, the sustainability of AI will depend on lowering the carbon intensity of the energy that powers it. Thus, expanding energy generation, improving storage capacity and modernizing transmission infrastructure must advance in parallel with operational efficiencies, better reporting standards and smarter system design.
With steady investment and cooperation across sectors, AI can advance in a way that supports long‑term climate goals and creates benefits that extend beyond the technology industry.
—Dr. Anastasia Behr is senior director, sustainability science and technologies, for UL Solutions. Dr. Young Lee is principal engineer, artificial intelligence, for UL Solutions.