
AI Data Centers Boost Emissions, Seek Solutions

New research explores ways to curb the rising carbon footprint of AI data centers, whose global electricity demand is projected to more than double by 2030, with about 60% of the increase met by fossil fuels.

By Eleanor Hayes

Eleanor Hayes is a science and technology correspondent for Neurozzio, focusing on the application of artificial intelligence and data science to address environmental and urban challenges. She reports on academic research, climate technology, and sustainable urban development.


The rapid expansion of artificial intelligence (AI) data centers is projected to significantly increase global greenhouse gas emissions. Researchers worldwide, including those at MIT, are actively seeking innovative solutions to lessen these environmental impacts. This effort focuses on reducing the carbon footprint across the entire lifecycle of AI infrastructure, from construction to daily operations.

Key Takeaways

  • Electricity demand from data centers is expected to more than double globally by 2030, driven largely by AI.
  • Fossil fuels may meet 60% of this increased demand, adding 220 million tons of carbon.
  • Solutions target both operational and embodied carbon emissions.
  • Efficiency gains from algorithms and hardware are crucial.
  • Leveraging renewable energy and smart data center scheduling can reduce impact.
  • AI itself is being explored to accelerate clean energy development.

Growing Energy Demand from AI

Generative AI models require substantial computing power. This demand is set to rise sharply over the next decade. Data centers, which house the necessary computing infrastructure, are at the core of this growth.

An April 2025 report from the International Energy Agency forecasts that global electricity demand from data centers will more than double by 2030, reaching approximately 945 terawatt-hours.

Energy Demand Facts

  • Projected 2030 data center electricity demand: 945 terawatt-hours.
  • This is slightly more than the total annual electricity consumption of Japan.

This projected increase in electricity use raises concerns about carbon emissions. An August 2025 analysis by Goldman Sachs Research highlighted this issue. It estimates that fossil fuels will power about 60% of the rising electricity demand from data centers.

This reliance on fossil fuels could add around 220 million tons to global carbon emissions. To put this in perspective, driving a gas-powered car for 5,000 miles produces about 1 ton of carbon dioxide, so the projected increase is equivalent to roughly 220 million such trips.

Addressing Operational and Embodied Carbon

Efforts to reduce AI's carbon footprint typically focus on "operational carbon": the emissions produced by the powerful processors, known as GPUs, that run inside data centers every day. However, another significant factor is often overlooked: "embodied carbon."

Vijay Gadepally, a senior scientist at MIT Lincoln Laboratory, leads research projects in supercomputing. He explains that embodied carbon refers to emissions generated during the initial construction of a data center. Building these facilities involves tons of steel and concrete. They are also filled with air conditioning units, computing hardware, and miles of cabling, all of which contribute to a large carbon footprint.

"The operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon, too, but we need to do more on that front in the future," says Vijay Gadepally.

Data Center Construction Impact

Companies like Meta and Google are exploring more sustainable building materials for data centers, partly because of the high environmental impact of construction. Data centers are enormous structures, often 10 to 50 times more energy-dense than typical office buildings. The world's largest, the China Telecom-Inner Mongolia Information Park, spans about 10 million square feet.

Recognizing both operational and embodied carbon is crucial for a complete solution. Researchers are looking at ways to reduce energy consumption across the board. This includes improving the efficiency of algorithms and redesigning data centers themselves.

Strategies for Reducing Operational Carbon

Reducing the operational carbon emissions of AI data centers shares similarities with household energy-saving methods. One simple analogy is turning off lights when not needed.

Gadepally notes that even inefficient lightbulbs use less energy when turned off or dimmed. The same principle applies to data center hardware. Research from the Lincoln Laboratory Supercomputing Center shows that "turning down" GPUs so they consume about three-tenths of their usual energy has minimal impact on AI model performance. Lower-power hardware is also easier to cool, saving even more energy.
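As a concrete illustration, here is a minimal Python sketch of how an operator might apply such a cap using NVIDIA's nvidia-smi utility. The 30% fraction mirrors the figure above; the GPU index is an arbitrary example, and setting power limits typically requires administrator privileges.

```python
import subprocess

def query_default_power(gpu: int = 0) -> float:
    """Read the GPU's default board power limit in watts via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.default_limit",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip())

def cap_power(gpu: int = 0, fraction: float = 0.3) -> None:
    """Cap the GPU at a fraction of its default power limit (needs root)."""
    watts = query_default_power(gpu) * fraction
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", f"{watts:.0f}"],
                   check=True)

if __name__ == "__main__":
    cap_power(gpu=0, fraction=0.3)  # "turn down" the GPU to ~30% power
```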

Hardware and Algorithm Optimization

Another approach is to use less energy-intensive computing hardware. Training new, complex generative AI models often requires many GPUs working in parallel. The Goldman Sachs analysis suggests that a state-of-the-art system could soon use as many as 576 connected GPUs simultaneously.

However, engineers can sometimes achieve similar results by reducing the precision of computing hardware. This might involve switching to less powerful processors tuned for specific AI workloads. This careful selection of hardware can significantly cut energy use.
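As an illustration of trading precision for energy, the sketch below uses PyTorch's mixed-precision utilities to run parts of a training step in 16-bit rather than 32-bit arithmetic. The model and data are stand-ins, a CUDA GPU is assumed, and this is one common precision-reduction technique rather than the specific approach any particular team uses.

```python
import torch
from torch import nn

# Stand-in model and batch; any network and data would do.
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()   # rescales gradients so fp16 stays stable
x = torch.randn(64, 512, device="cuda")
y = torch.randint(0, 10, (64,), device="cuda")

optimizer.zero_grad()
with torch.cuda.amp.autocast():        # run the forward pass in float16 where safe
    loss = nn.functional.cross_entropy(model(x), y)
scaler.scale(loss).backward()          # backward pass on the scaled loss
scaler.step(optimizer)
scaler.update()
```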

Training Efficiency

  • About half the electricity used to train an AI model goes toward the final 2 to 3 percentage points of accuracy.
  • Stopping training early can save substantial energy if lower accuracy is acceptable for an application.

Efficiency measures also extend to the training process itself. Gadepally's team discovered that roughly half the electricity used to train an AI model goes into getting the last 2 or 3 percentage points of accuracy. Stopping training earlier can save much of that energy, and many applications can tolerate the slightly lower accuracy.

For example, 70% accuracy might be sufficient for a recommender system in e-commerce. In such cases, continuing to train for marginal gains becomes energy-inefficient. Researchers are also developing tools to avoid wasted computing cycles. One postdoc at the Supercomputing Center created a tool that cut 80% of unnecessary simulations during training. This dramatically reduced energy demands while maintaining model accuracy.
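A minimal sketch of this "good enough" stopping idea follows. The 70% threshold echoes the recommender-system example above; the training and evaluation functions are hypothetical placeholders, and this is not the Supercomputing Center's actual tool.

```python
TARGET_ACCURACY = 0.70   # e.g., sufficient for an e-commerce recommender system

def train_until_good_enough(model, train_one_epoch, evaluate, max_epochs=100):
    """Stop training once accuracy clears an application-defined bar,
    rather than burning energy chasing the last few percentage points."""
    for epoch in range(max_epochs):
        train_one_epoch(model)
        acc = evaluate(model)
        if acc >= TARGET_ACCURACY:
            print(f"Stopping at epoch {epoch}: accuracy {acc:.2%} is sufficient")
            break
    return model
```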

Leveraging Efficiency Improvements

Ongoing innovation in computing hardware continues to improve the energy efficiency of AI models. Neil Thompson, director of the FutureTech Research Project at MIT's Computer Science and Artificial Intelligence Laboratory, points to advancements like denser transistor arrays on semiconductor chips.

While energy-efficiency gains for chips overall have slowed since 2005, GPUs keep improving remarkably: the amount of computation they can perform per joule of energy has been rising by 50% to 60% annually. At 50% a year, that compounds to roughly a 7.6-fold efficiency gain over five years. Thompson explains that running operations in parallel remains key to improving efficiency in AI systems, continuing the "Moore's Law" trend.

Algorithmic Gains and "Negaflops"

Even more significant are efficiency gains from new model architectures. Thompson's research indicates that these improvements, which allow complex problems to be solved faster with less energy, are doubling every eight or nine months. He coined the term "negaflop" to describe this effect.

Similar to how a "negawatt" represents saved electricity, a "negaflop" signifies a computing operation that is avoided due to algorithmic improvements. This can involve "pruning" unnecessary parts of a neural network or using compression techniques. These methods enable more to be done with less computation.
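The sketch below shows one standard pruning recipe using PyTorch's pruning utilities: zeroing out the smallest-magnitude half of a layer's weights. The layer and pruning fraction are illustrative, and turning the skipped operations into real energy savings also requires hardware or kernels that exploit sparsity.

```python
import torch
from torch import nn
from torch.nn.utils import prune

# Stand-in layer; in practice one would prune a trained network.
layer = nn.Linear(1024, 1024)

# Zero out the 50% of weights with the smallest magnitude ("pruning").
prune.l1_unstructured(layer, name="weight", amount=0.5)
prune.remove(layer, "weight")   # make the sparsity permanent

sparsity = (layer.weight == 0).float().mean().item()
print(f"{sparsity:.0%} of weights removed; those multiply-adds become 'negaflops'")
```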

"Making these models more efficient is the single-most important thing you can do to reduce the environmental costs of AI," states Neil Thompson. He adds that a powerful model needed today might be replaced by a significantly smaller, more efficient one in just a few years.

Maximizing Energy Savings and Renewable Sources

Reducing overall energy use is vital, but the type of energy also matters. Vijay Gadepally highlights that the carbon emissions associated with 1 kilowatt-hour of electricity vary significantly. This variation occurs throughout the day, month, and year.

Engineers can capitalize on these fluctuations by making AI workloads and data center operations more flexible. Some generative AI tasks do not need to be completed all at once. By splitting computing operations, some can be performed later when more electricity comes from renewable sources like solar and wind power. This strategic scheduling can greatly reduce a data center's carbon footprint.
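A toy example of such carbon-aware deferral appears below. The hourly carbon-intensity forecast is invented for illustration; a production scheduler would pull live grid data and juggle many competing jobs.

```python
# Hypothetical day-ahead forecast of grid carbon intensity (gCO2/kWh), one per hour.
intensity = [420, 410, 400, 390, 360, 330, 310, 260, 210, 180, 150, 130,
             120, 125, 140, 170, 230, 300, 350, 380, 400, 410, 415, 420]

def best_start_hour(duration: int, deadline: int) -> int:
    """Pick the start hour that minimizes total emissions for a deferrable job
    lasting `duration` hours that must finish by `deadline` (hour of day)."""
    candidates = range(0, deadline - duration + 1)
    return min(candidates, key=lambda h: sum(intensity[h:h + duration]))

start = best_start_hour(duration=4, deadline=24)
print(f"Run the 4-hour job at hour {start}, "
      f"avg {sum(intensity[start:start + 4]) / 4:.0f} gCO2/kWh")
```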

Smarter Data Centers and Energy Storage

Deepjyoti Deka, a research scientist at the MIT Energy Initiative, and his team are studying "smarter" data centers. These centers would flexibly adjust AI workloads from multiple companies using the same equipment. The goal is to improve overall energy efficiency.

"By looking at the system as a whole, our hope is to minimize energy use as well as dependence on fossil fuels, while still maintaining reliability standards for AI companies and users," says Deepjyoti Deka.

Deka's team is developing a flexibility model for data centers. This model considers the different energy demands of training versus deploying deep-learning models. Their research aims to identify optimal strategies for scheduling and streamlining computing operations to boost energy efficiency.

The researchers are also exploring long-duration energy storage units at data centers. These units would store excess energy for periods of high demand. This would allow data centers to use stored renewable energy during peak times or avoid relying on diesel backup generators during grid fluctuations.
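The toy dispatch rule below illustrates the idea: charge the storage unit from surplus renewables, discharge it to cover deficits, and fall back on the grid or diesel backup only when the battery runs dry. All capacities and power figures are invented for illustration.

```python
def dispatch(battery_kwh: float, capacity_kwh: float, demand_kw: float,
             renewable_kw: float, hours: float = 1.0):
    """Toy dispatch: store surplus renewables, discharge to cover deficits,
    and draw from the grid only when the battery is empty."""
    surplus = renewable_kw - demand_kw
    if surplus >= 0:
        # Store excess renewable energy instead of curtailing it.
        battery_kwh = min(capacity_kwh, battery_kwh + surplus * hours)
        grid_kwh = 0.0
    else:
        need = -surplus * hours
        from_battery = min(battery_kwh, need)
        battery_kwh -= from_battery
        grid_kwh = need - from_battery   # grid (or diesel backup) covers the rest
    return battery_kwh, grid_kwh

soc, grid = dispatch(battery_kwh=200, capacity_kwh=500,
                     demand_kw=300, renewable_kw=120)
print(f"State of charge: {soc} kWh; drawn from grid: {grid} kWh")
```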

Location Matters

  • Locating data centers in cooler climates, like Meta's facility in Luleå, Sweden, reduces the electricity needed for cooling.
  • Some governments are even considering lunar data centers for nearly all-renewable energy operation.

Deka believes long-duration energy storage could be transformative. It would enable operations designed to shift the system's emission mix towards greater reliance on renewable energy. Additionally, MIT and Princeton University researchers are developing GenX, a software tool for power sector investment planning. This tool could help companies choose ideal data center locations to minimize environmental impact and costs.

AI-Based Solutions for Clean Energy

Despite these innovations, the growth of renewable energy generation on Earth is not keeping pace with the rapid expansion of AI. This creates a major obstacle to reducing AI's carbon footprint, according to Jennifer Turliuk MBA ’25, a lecturer and former practice leader of climate and energy AI at the Martin Trust Center for MIT Entrepreneurship.

The review processes for new renewable energy projects at local, state, and federal levels can take years. Researchers are investigating how AI could accelerate these processes. For instance, generative AI models could streamline interconnection studies. These studies determine how new projects affect the power grid, a step that often takes years to complete.

AI's Role in Green Energy

AI is well-suited for tackling complex situations, such as optimizing the electrical grid, which is considered one of the world's largest and most intricate machines. AI can help predict solar and wind energy generation, identify ideal locations for new renewable facilities, and perform predictive maintenance on green energy infrastructure.
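As a small illustration of the forecasting idea, the sketch below fits an off-the-shelf regressor to synthetic weather features. Real forecasting systems draw on numerical weather prediction feeds and far richer models; everything here, from the features to the data, is a stand-in.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: [irradiance W/m^2, cloud cover %, temperature C].
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(0, 1000, 2000),
                     rng.uniform(0, 100, 2000),
                     rng.uniform(-5, 35, 2000)])
# Toy ground truth: output scales with irradiance, dampened by cloud cover.
y = X[:, 0] * (1 - X[:, 1] / 100) * 0.2 + rng.normal(0, 5, 2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"R^2 on held-out weather data: {model.score(X_test, y_test):.2f}")
```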

AI could also help gather and analyze vast amounts of data that can then inform targeted policy interventions aimed at maximizing the impact of investments in areas like renewable energy, Turliuk explains.

To assist policymakers and enterprises in evaluating AI's environmental impact, Turliuk and her team developed the Net Climate Impact Score. This framework helps determine the net climate impact of AI projects. It considers both emissions and other environmental costs, along with potential future environmental benefits.
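The article does not spell out how the score is computed, so the sketch below is only a hypothetical illustration of netting the inputs the framework considers: emissions and other environmental costs against projected future benefits.

```python
def net_climate_impact(emissions_tons: float, other_env_costs_tons: float,
                       projected_benefits_tons: float) -> float:
    """Hypothetical netting of the framework's inputs: emissions and other
    environmental costs weighed against projected benefits, all expressed
    here in tons of CO2-equivalent. Not the actual scoring rule."""
    return projected_benefits_tons - (emissions_tons + other_env_costs_tons)

# Illustrative grid-optimization AI project: training and operation emit
# 1,200 t CO2e, with 300 t of other costs, but the project is projected
# to avoid 15,000 t through better renewable utilization.
score = net_climate_impact(1_200, 300, 15_000)
print(f"Net climate impact: {score:+,.0f} t CO2e")
```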

"Every day counts. We are on a path where the effects of climate change won’t be fully known until it is too late to do anything about it. This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intense," emphasizes Jennifer Turliuk.

Ultimately, the most effective solutions will likely emerge from strong collaborations. These partnerships will involve companies, regulators, and researchers, with academia playing a leading role in driving innovation.