About the Author
Daniel Foelber has positions in Nvidia. The Motley Fool has positions in and recommends Alphabet, Amazon, Broadcom, Meta Platforms, Microsoft, and Nvidia. The Motley Fool has a disclosure policy.
Data centers could consume 9% to 17% of U.S. electricity by 2030, though the range depends on how many planned projects actually get built. EPRI's latest forecast is about 60% higher than its 2025 estimate and represents a 2x to 4x increase from 2024 levels. EPRI's 2026 report notes that its 2030 range is broadly consistent with Lawrence Berkeley National Laboratory (LBNL) data through 2028, despite different methodologies. LBNL forecasts 325 to 580 TWh of data center electricity demand by 2028, which would be 6.7% to 12% of total projected U.S. electric demand.
AI-driven energy demand is causing bottlenecks across the supply chain. Gas turbine deliveries are reportedly delayed by several years, transformer backlogs have grown by at least 30%, and transformer prices have risen 1.5x since 2020, according to the IEA. EPRI's 2026 data identifies information technology (IT) equipment, power equipment, and skilled labor as challenges at both the regional and national levels. Infrastructure development timelines paint an uncertain future for AI data center energy capacity.
These bottlenecks, paired with developers racing to bring more power online, are sparking a surge in behind-the-meter (BTM) energy generation and storage, which delivers power directly to a data center rather than relying on the constrained grid.
Bridging the gap between planned and operating AI data center projects takes more than just funding and next-generation chips. Access to networking, IT equipment, and specialized industrial machinery is equally important. Similarly, the energy bottleneck isn’t just about producing more energy. Transmission, permitting, and regulatory hurdles are also worth considering. Therefore, it may be useful for investors to think about the AI infrastructure build-out as an interconnected circulatory system with multiple arteries that could get blocked by various factors rather than a single pathway.
With the grid constrained by supply chain bottlenecks and connection hurdles, batteries, cooling solutions, hardware/software efficiency, grid connections, and backup power are poised to benefit from the demand-driven tailwinds of AI data centers.
Investors should pay close attention to battery energy storage, which arguably offers the best antidote to the AI energy bottleneck. Battery storage is uniquely positioned because, through site-specific and BTM options, it can power AI data centers without relying on the constrained grid. Utility-scale solar paired with battery energy storage offers an alternative to fossil fuels while smoothing out solar's intermittency when the sun isn't shining.
The AI data center build-out is moving faster than initially anticipated, leading to upward revisions to projections.
Here are five things to watch:
Backlogs in chip production are straining data center availability and access, but energy is an equally important limiting factor. AI computing challenges can be solved by producing more chips and building more data centers, but energy is more nuanced. Even if the energy market could flip a switch to increase supply, there would still be environmental and regulatory hurdles to overcome. Perhaps the biggest challenge is aligning supply with forecasted demand, especially given how much those forecasts have changed in the last two years.
Before the pandemic, U.S. energy demand was fairly predictable: It was driven by steady population increases and economic growth. According to Energy Information Administration (EIA) data, U.S. electricity consumption increased by an average annual growth rate of just 0.1% from 2005 to 2020. But the insatiable energy appetite of artificial intelligence (AI) data centers has carved an ever-growing wrinkle in that formerly predictable model.
Occasional updates to aging infrastructure are no longer good enough. The energy demand from AI data centers is real, large, and structurally constrained – but the bottleneck creates opportunity.
U.S. data centers consumed an estimated 177 to 192 terawatt-hours (TWh) of electricity in 2024 – roughly 4% to 5% of all U.S. electricity – and could consume 9% to 17% by 2030 under scenarios developed by the Electric Power Research Institute (EPRI). The updated range is about 60% higher than EPRI's own projections from 18 months earlier. But most of that energy usage is from conventional data centers.
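For readers who want to sanity-check these figures, the implied totals follow directly from simple arithmetic (a rough sketch; the calculation and variable names are illustrative, not from EPRI):

```python
# Rough arithmetic check of the data center share figures (illustrative only).
low_twh, high_twh = 177, 192      # estimated 2024 U.S. data center consumption, TWh
low_pct, high_pct = 0.04, 0.05    # stated share of total U.S. electricity

# Implied total U.S. electricity consumption in 2024, in TWh:
# the smallest total consistent with the figures pairs 177 TWh with 5%,
# the largest pairs 192 TWh with 4%.
total_low = low_twh / high_pct
total_high = high_twh / low_pct
print(f"Implied 2024 U.S. total: {total_low:,.0f} to {total_high:,.0f} TWh")
# Implied 2024 U.S. total: 3,540 to 4,800 TWh

# A 2x to 4x rise from 2024 levels (EPRI's 2030 range)
print(f"Implied 2030 data center demand: {low_twh * 2} to {high_twh * 4} TWh")
# Implied 2030 data center demand: 354 to 768 TWh
```

The implied 2024 total of roughly 3,500 to 4,800 TWh brackets actual U.S. electricity consumption, which suggests the quoted share figures are internally consistent.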
Investors should pay attention to three structural factors: the accelerating scale of the AI data center build-out, energy demands for computing power (especially inference), and efficiency gains that history suggests won't keep pace with these new energy demands. These factors point to durable, structural growth in electricity demand, not a cyclical spike.
Outfitting hyperscale AI data centers with low-latency infrastructure optimized for speed is proving to be more costly and energy intensive than initially projected. The limiting factor isn’t cost, since hyperscalers are committing plenty of capital to fund projects, but rather energy and data center infrastructure availability.
Despite massive investment in solar photovoltaic (PV), onshore and offshore wind, battery energy storage, and nuclear energy, fossil fuels still dominate the U.S. energy mix.
Investors should focus more on total energy demand than on renewables versus fossil fuels in the energy mix. Over the long term, solar, wind, battery energy storage, and nuclear will likely make up a higher proportion of the electricity mix than natural gas and coal, but natural gas consumption could still be far higher 5 to 10 years from now than today, given AI’s outsize energy demands.