Before the pandemic, U.S. energy demand was fairly predictable, driven by steady population growth and economic expansion. According to Energy Information Administration (EIA) data, U.S. electricity consumption grew at an average annual rate of just 0.1% from 2005 to 2020. But the insatiable energy appetite of artificial intelligence (AI) data centers has put an ever-growing wrinkle in that formerly predictable model.
Occasional updates to aging infrastructure are no longer enough. The energy demand from AI data centers is real, large, and structurally constrained – and that bottleneck creates opportunity.
How much energy do AI data centers use?
U.S. data centers consumed an estimated 177 to 192 terawatt-hours (TWh) of electricity in 2024 – roughly 4% to 5% of all U.S. electricity – and could consume 9% to 17% by 2030 under scenarios developed by the Electric Power Research Institute (EPRI). The updated range is about 60% higher than EPRI's own projections from 18 months earlier. But most of that energy usage is from conventional data centers.
- A hyperscale AI data center uses as much power as 100,000 homes – and the largest under construction uses 20 times that. A conventional data center draws 10 to 25 megawatts (MW), but hyperscale AI data centers are often classified as 100 MW or higher – enough power for about 100,000 homes. The largest hyperscale data center currently under construction exceeds 2 GW, and the largest planned is 5 GW, says the International Energy Agency (IEA). At 2 GW, the facility under construction is a staggering 20 times the size of a typical hyperscale data center, and the planned 5 GW facility is 200 times the high end of a conventional one.
- Data centers are growing in size because AI is far more energy intensive than basic search. Consider that a single ChatGPT query uses 10 times as much electricity as a traditional Google search, according to the EPRI. GPT-4 training required around 42.4 GWh over 14 weeks, equivalent to the daily electricity use of about 28,500 households in advanced economies, says the IEA.
- Inference, not just training, is the emerging grid constraint. Training AI models and chat-based use already challenge the grid, but the real constraints are the widespread adoption of AI agents and the increased use of AI inference, which involves applying what an AI model has been trained to do. Key chip designers, such as Nvidia (NVDA +2.57%) and Broadcom (AVGO -0.83%), are developing specialized AI chips and networking solutions for inference workloads. According to IEA data, the B200 GPU, part of Nvidia’s Blackwell architecture, is 60% more energy efficient per floating-point operation than its H100, which in turn is 80% more efficient than the A100. Data centers are growing in size not just because of increased use of models like OpenAI’s ChatGPT, Anthropic’s Claude, Alphabet's (GOOG -0.47%) Google Gemini, and Microsoft's (MSFT +2.81%) Copilot, but because inference is broadening the scope of AI’s role in business and daily life.
- Efficiency improvements in IT equipment represent the greatest opportunity to reduce AI data center energy use. IT equipment accounts for 40% to 50% of data center energy, cooling for 30% to 40%, and auxiliary systems for 10% to 30%, says the EPRI.
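The IEA's GPT-4 household comparison above can be sanity-checked with quick arithmetic. A minimal sketch, assuming a typical advanced-economy household uses roughly 15 kWh of electricity per day (an assumed figure, not from the article):

```python
# Back-of-the-envelope check: GPT-4 training energy vs. household daily use.
# Assumption (not from the article): an advanced-economy household uses
# roughly 15 kWh of electricity per day.

TRAINING_ENERGY_GWH = 42.4      # IEA estimate for GPT-4 training
TRAINING_DAYS = 14 * 7          # 14 weeks
HOUSEHOLD_KWH_PER_DAY = 15.0    # assumed typical daily household use

# Convert GWh to kWh, then spread evenly over the training run.
training_kwh_per_day = TRAINING_ENERGY_GWH * 1e6 / TRAINING_DAYS
households_matched = training_kwh_per_day / HOUSEHOLD_KWH_PER_DAY

print(f"Training drew ~{training_kwh_per_day / 1e3:,.0f} MWh per day")
print(f"Equivalent to the daily use of ~{households_matched:,.0f} households")
```

Under that assumed household figure, the result lands near the IEA's ~28,500 households, which suggests the comparison is between the training run's average daily draw and daily household consumption.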
Investors should pay attention to three structural factors: the accelerating scale of the AI data center build-out, energy demands for computing power (especially inference), and efficiency gains that history suggests won't keep pace with these new energy demands. These factors point to durable, structural growth in electricity demand, not a cyclical spike.
How much energy will AI data centers require?
Data centers could consume 9% to 17% of U.S. electricity by 2030, though the range depends on how many planned projects actually get built. EPRI's latest forecast is 60% higher than its 2025 estimate and represents a 2x to 4x increase from 2024 levels. EPRI's 2026 report notes its 2030 range is broadly consistent with Lawrence Berkeley National Laboratory (LBNL) data through 2028, despite different methodologies: LBNL forecasts 325 to 580 TWh of data center electricity demand by 2028, or 6.7% to 12% of total projected U.S. electricity demand.
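The LBNL range above implies a total projected U.S. demand figure that can be backed out directly – a quick consistency check, not a number from the article:

```python
# Back out the total U.S. electricity demand implied by the LBNL forecast:
# 325 TWh at 6.7% and 580 TWh at 12% should imply roughly the same total.

low_twh, low_share = 325, 0.067
high_twh, high_share = 580, 0.12

implied_total_low = low_twh / low_share     # total implied by the low case
implied_total_high = high_twh / high_share  # total implied by the high case

print(f"Implied total U.S. demand: ~{implied_total_low:,.0f} TWh (low case), "
      f"~{implied_total_high:,.0f} TWh (high case)")
```

Both cases imply a total of roughly 4,800 to 4,900 TWh of projected U.S. demand, so the two ends of the forecast are internally consistent.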
Will the AI revolution hit an energy bottleneck?
AI-driven energy demand is causing bottlenecks across the supply chain. Gas turbine deliveries are reportedly delayed by several years, transformer backlogs are up by at least 30%, and transformer prices have risen about 1.5x since 2020, says the IEA. According to EPRI 2026 data, information technology (IT) and power equipment, as well as skilled labor, are both regional and national challenges. Infrastructure development timelines paint an uncertain future for AI data center energy capacity.
- Data centers take 1 to 3 years to build, but new transmission lines take 4 to 8 years, and new power generation plants take 2 to 15 years, depending on the type. And that’s before factoring in the specialized industrial machinery bottlenecks discussed above.
- 20% of planned data centers could face grid connection delays by 2030 – and the development pipeline is bigger than previously realized. Half of U.S. data centers under development are in existing large clusters, raising the risk of local bottlenecks, says the IEA.
- Northern Virginia, Northern Europe, and Japan are already hitting connection limits. Northern Virginia's connection queues have ballooned to 7 years, compared with 5 to 7 years in the U.K., up to 10 years in the Netherlands, and a full-blown pause in Ireland, according to the IEA.
These bottlenecks, paired with developers racing to bring more power online, are sparking a surge in behind-the-meter (BTM) energy generation and storage – delivering power directly to a data center rather than relying on the constrained grid.
Bridging the gap between planned and operating AI data center projects takes more than just funding and next-generation chips. Access to networking, IT equipment, and specialized industrial machinery is equally important. Similarly, the energy bottleneck isn’t just about producing more energy. Transmission, permitting, and regulatory hurdles are also worth considering. Therefore, it may be useful for investors to think about the AI infrastructure build-out as an interconnected circulatory system with multiple arteries that could get blocked by various factors rather than a single pathway.
Adjacent opportunities – cooling, grid infrastructure, backup power, and efficiency
With the grid constrained by supply chain bottlenecks and connection hurdles, batteries, cooling solutions, hardware/software efficiency, grid connections, and backup power are poised to benefit from the demand-driven tailwinds of AI data centers.
- Beyond the data center: the cooling, grid, and efficiency markets the AI build-out is creating. Power usage effectiveness (PUE) measures the energy efficiency of a data center by dividing total facility energy – IT load plus overhead such as cooling, lighting, and power distribution – by IT equipment energy. A PUE of 1 would mean 100% of electricity goes solely to IT equipment. According to the IEA, the global weighted-average PUE is expected to improve from 1.41 to 1.29 between 2024 and 2030, saving 90 TWh of energy. The U.S. is already ahead of the curve, with PUEs averaging 1.32 in 2024, and according to 2026 EPRI data, large hyperscale facilities with liquid cooling currently under construction could achieve PUEs of 1.1.
- Batteries, transformers, and cooling: the infrastructure layer investors may be underweighting. Battery energy storage plays a supporting role in enabling higher renewable matching for data centers. It remains eligible for investment tax credits under current policy, says the EPRI.
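The PUE arithmetic above is easy to sketch. For a hypothetical 100 MW IT load (an illustrative figure, not from the article), the reported PUE values translate into total facility power as follows:

```python
# PUE = total facility power / IT equipment power.
# Illustrative IT load of 100 MW (assumed, not from the article).

IT_LOAD_MW = 100.0

def facility_power(pue: float, it_load_mw: float = IT_LOAD_MW) -> float:
    """Total facility power (MW) implied by a given PUE."""
    return pue * it_load_mw

for label, pue in [("2024 global average", 1.41),
                   ("2030 projected average", 1.29),
                   ("2024 U.S. average", 1.32),
                   ("Liquid-cooled hyperscale", 1.10)]:
    overhead = facility_power(pue) - IT_LOAD_MW  # non-IT power: cooling, etc.
    print(f"{label}: PUE {pue:.2f} -> {facility_power(pue):.0f} MW total, "
          f"{overhead:.0f} MW of non-IT overhead")
```

On this illustrative load, moving from a PUE of 1.41 to 1.29 cuts non-IT overhead from 41 MW to 29 MW – a roughly 29% reduction in the energy spent on cooling, lighting, and power distribution for the same computing output.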
Investors should pay close attention to battery energy storage, as it arguably offers the best antidote to the AI energy bottleneck. Battery storage is uniquely positioned to power AI data centers without relying on the constrained grid through site-specific and BTM options. Utility-scale solar paired with battery energy storage offers an alternative to fossil fuels while addressing solar's intermittency when the sun isn’t shining.
What the energy data tells investors
The AI data center build-out is moving faster than initially anticipated, leading to upward revisions to projections.
Here are five things to watch:
- Hyperscale capex is the driving force behind data center development. If capex growth rates cool, grid and energy supply constraints will likely ease.
- Industrial machinery backlogs and power constraints could become a greater bottleneck than AI chips and networking equipment.
- Despite the development of utility-scale renewable energy projects, the U.S. energy mix remains heavily dependent on fossil fuels, though nuclear energy could make a significant impact after 2030.
- Hyperscale data center PUE improvements could reduce energy needs.
- Increased AI adoption creates opportunities for grid infrastructure (transformers, cables, transmission), cooling systems, backup power, efficiency hardware, and demand-flexibility technology.
Backlogs in chip production are straining data center availability and access, but energy is an equally important limiting factor. AI computing challenges can be solved by producing more chips and building data centers, but energy is more nuanced. Even if the energy market could flip a switch to increase supply, there are still environmental and regulatory hurdles to overcome. Perhaps the biggest challenge is aligning supply with forecasted demand, especially given that those forecasts have changed so much in the last two years.
Daniel Foelber has positions in Nvidia. The Motley Fool has positions in and recommends Alphabet, Amazon, Broadcom, Meta Platforms, Microsoft, and Nvidia. The Motley Fool has a disclosure policy.





