The rapid expansion of artificial intelligence infrastructure across the United States is driving a significant increase in electricity consumption and a renewed reliance on coal-fired power plants. A recent analysis indicates that coal generation has risen nearly 20%, not because coal directly powers AI data centers, but because it helps stabilize the national power grid against the unique and fluctuating energy demands of AI operations.
Key Takeaways
- Electricity generation from coal has risen by nearly 20% to meet growing energy needs, and demand is expected to remain elevated through 2027.
- The surge is driven by the massive power requirements of AI data centers and less favorable pricing for natural gas.
- Coal is not the primary power source for AI; instead, it provides crucial grid stability to manage fluctuations caused by AI workloads.
- Nuclear power offers a steady baseload ideal for AI training, while coal and renewables handle the sharp, unpredictable spikes from AI inference.
AI Growth and Rising Energy Needs
The proliferation of AI technologies has created an unprecedented demand for computational power, which in turn requires vast amounts of electricity. Data centers dedicated to training and running AI models are becoming one of the fastest-growing sources of new energy consumption in the U.S.
This increased demand, combined with changes in the energy market, has led power grid operators to turn to older fuel sources. According to a report from financial services company Jefferies, electricity generation from coal has increased by almost 20%. Projections suggest this elevated level of coal use will likely continue for the next several years.
Why the Shift Back to Coal?
The primary drivers for the resurgence in coal use are twofold: the sheer scale of new electricity demand from the tech sector and unfavorable market conditions for natural gas. As AI data centers come online, they place immediate and significant strain on the existing power grid, forcing operators to utilize all available generation sources to maintain a reliable supply.
However, it is a misconception that these advanced data centers are 'running on coal.' The reality is more complex, involving a mix of energy sources where each plays a specific role in maintaining a stable and responsive power grid for all consumers, not just tech giants.
The Two Distinct Power Profiles of AI
To understand the grid's energy-sourcing strategy, it is essential to differentiate between the two main types of AI workloads: training and inference. Each has a unique power consumption pattern that presents different challenges for electricity providers.
AI Training: A Steady and Predictable Load
AI model training is a long and resource-intensive process. It can involve running powerful graphics processing units (GPUs) at maximum capacity for weeks or even months at a time. This creates a continuous, high-capacity, and largely predictable demand for electricity.
This steady power profile makes nuclear energy an almost perfect match. Nuclear power plants are designed to provide a constant, high-volume output of electricity with minimal fluctuation, aligning well with the uninterrupted needs of a multi-week training run. The plant provides the baseload power that keeps the AI models learning without interruption.
Even so, minor fluctuations occur. During training, models are periodically saved to storage in a process called checkpointing. During these saves, GPU utilization drops significantly as data is written and synchronized, causing a temporary dip in power consumption. Because a nuclear plant cannot quickly reduce its output, other energy sources must be adjusted to keep the grid in balance.
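To see the shape of that load, here is a minimal Python sketch; the cluster size, checkpoint timing, and power figures are illustrative assumptions, not measurements from any real facility.

```python
# Hypothetical sketch of an AI-training power profile with checkpoint dips.
# Every figure below is an assumption chosen for illustration.

TRAIN_POWER_MW = 100.0       # assumed steady draw of a large training cluster
CHECKPOINT_POWER_MW = 40.0   # assumed draw while GPUs idle during a save
CHECKPOINT_EVERY_MIN = 30    # assumed minutes between checkpoints
CHECKPOINT_LENGTH_MIN = 3    # assumed minutes each checkpoint takes to write

def training_load(minute: int) -> float:
    """Cluster power draw (MW) at a given minute of a training run."""
    in_checkpoint = minute % CHECKPOINT_EVERY_MIN < CHECKPOINT_LENGTH_MIN
    return CHECKPOINT_POWER_MW if in_checkpoint else TRAIN_POWER_MW

if __name__ == "__main__":
    profile = [training_load(m) for m in range(120)]   # two simulated hours
    dip_minutes = sum(p < TRAIN_POWER_MW for p in profile)
    print(f"average load: {sum(profile) / len(profile):.1f} MW")
    print(f"minutes spent in checkpoint dips: {dip_minutes} of {len(profile)}")
```

The result is a nearly flat line with short, regular notches: the flat part is what nuclear baseload covers, and the notches are what the rest of the grid has to absorb.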
AI Inference: Short, Intense Bursts of Demand
Inference is the stage where a trained AI model is put to use, performing tasks like generating text, recognizing images, or providing recommendations. Unlike the steady marathon of training, inference is a series of short sprints. Each user request triggers a massive number of calculations that draw a sudden surge of electricity for just a fraction of a second.
The Scale of Inference Demand
When millions of users interact with an AI service simultaneously, the combined power load can spike dramatically. This creates a volatile and unpredictable demand pattern that stresses local power distribution networks and cooling systems within the data center.
These spiky, rapidly changing power needs cannot be met by a steady baseload source like nuclear alone. The grid requires highly responsive energy sources that can be ramped up or down in an instant to meet these sudden peaks.
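As a rough sketch of why this load is so spiky, the toy simulation below aggregates fluctuating user traffic into a per-second power draw. The traffic volume, per-request power, and surge behavior are assumptions chosen only to make the effect visible.

```python
# Hypothetical sketch of aggregate power demand from AI inference.
# Traffic volumes and per-request power are assumptions, not measurements.
import random

BASE_CONCURRENT_REQUESTS = 50_000   # assumed typical in-flight requests
KW_PER_REQUEST = 0.3                # assumed accelerator power per request (300 W)

def inference_load_mw(rng: random.Random) -> float:
    """Approximate one second of aggregate inference load, in MW."""
    # Traffic fluctuates around the baseline, with occasional sharp surges
    # (a rush of simultaneous users) modeled here as a 3x multiplier.
    requests = rng.gauss(BASE_CONCURRENT_REQUESTS, BASE_CONCURRENT_REQUESTS * 0.2)
    if rng.random() < 0.01:          # roughly 1% of seconds see a surge
        requests *= 3
    return max(requests, 0.0) * KW_PER_REQUEST / 1000.0

if __name__ == "__main__":
    rng = random.Random(42)
    hour = [inference_load_mw(rng) for _ in range(3600)]
    print(f"mean load: {sum(hour) / len(hour):.1f} MW")
    print(f"peak load: {max(hour):.1f} MW")
    print(f"lowest load: {min(hour):.1f} MW")
```

Even in this toy model the peak is several times the average, and it is those short peaks, not the average, that the grid must be ready to serve.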
A Mixed-Energy Strategy for Grid Stability
The national power grid is engineered to provide stable and reliable electricity to every type of customer, from residential homes to heavy industry. The unique demands of AI have reinforced the need for a diverse energy portfolio where different sources serve distinct purposes.
"Nuclear energy provides the power that keeps multi-week training runs uninterrupted, but when the cards are down, coal and renewables do their job for the other consumers."
During AI training, the steady output from nuclear plants powers the data centers. When checkpoint intervals cause a dip in demand, flexible sources like coal, natural gas, or renewables are adjusted to prevent an oversupply of power to the rest of the grid. This ensures that voltage remains stable for homes and businesses.
For AI inference, the role of these flexible sources is even more critical. They act as the grid's shock absorbers, rapidly increasing output to handle the sharp, unpredictable bursts of power demand from millions of simultaneous user requests. This responsiveness is something that even the most advanced data centers cannot manage alone with on-site storage.
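The balancing act boils down to a simple supply-and-demand subtraction. The sketch below treats a single "flexible" fleet as a stand-in for coal, gas, and fast-ramping renewables; all of the megawatt figures are assumptions made for illustration.

```python
# Hypothetical sketch of how flexible generation balances a constant baseload.
# "Flexible" here stands in for coal, gas, hydro, or storage that can ramp
# quickly; the megawatt figures are assumptions for illustration only.

NUCLEAR_BASELOAD_MW = 100.0   # assumed constant output sized for training

def flexible_dispatch_mw(total_demand_mw: float) -> float:
    """MW the flexible fleet must add (+) or shed (-) to balance the grid."""
    return total_demand_mw - NUCLEAR_BASELOAD_MW

if __name__ == "__main__":
    scenarios = {
        "steady training run": 100.0,   # demand matches the baseload
        "checkpoint dip":       40.0,   # training pauses to save the model
        "inference surge":     160.0,   # burst of simultaneous user requests
    }
    for name, demand in scenarios.items():
        delta = flexible_dispatch_mw(demand)
        action = "ramp up" if delta > 0 else "ramp down" if delta < 0 else "hold"
        print(f"{name:<20} demand {demand:6.1f} MW -> flexible fleet: "
              f"{action} {abs(delta):.1f} MW")
```

Because the nuclear output stays fixed, every dip and spike in demand shows up entirely as a ramp order for the flexible fleet, which is exactly the role coal is currently filling.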
The Role of Fossil Fuels in a Modern Grid
In this dynamic system, coal and other fossil fuels are valued not as a primary energy source for AI, but for their ability to provide dispatchable power: electricity that can be ramped up or down on demand. While renewable sources like solar and wind are crucial for clean energy goals, their intermittent nature means they cannot always be relied upon to meet sudden demand spikes, leaving a critical gap that is currently filled by fossil fuels.
As the AI industry continues its exponential growth, the challenge for energy providers will be to modernize the grid. This will involve developing larger-scale energy storage solutions and expanding flexible, low-carbon power sources that can ensure stability without increasing reliance on fuels like coal.