Will the AI revolution hit an energy bottleneck?
AI-driven energy demand is causing bottlenecks across the supply chain. Gas turbine deliveries are reportedly delayed by several years, transformer order backlogs have grown by at least 30%, and transformer prices have risen roughly 1.5x since 2020, according to the IEA. According to EPRI 2026 data, shortages of information technology (IT) equipment, power equipment, and skilled labor pose challenges at both the regional and national levels. Infrastructure development timelines paint an uncertain future for AI data center energy capacity.
- Data centers take 1 to 3 years to build, but new transmission lines take 4 to 8 years, and new power generation plants take 2 to 15 years, depending on the type. And that’s not even factoring in the previously discussed specialized industrial machinery bottlenecks.
- 20% of planned data centers could face grid connection delays by 2030, and the development pipeline is larger than previously estimated. And 50% of U.S. data centers under development are in existing large clusters, raising the risk of local bottlenecks, says the IEA.
- Virginia, Northern Europe, and Japan are already hitting connection limits. Northern Virginia connection queues have ballooned to 7 years, compared to 5 to 7 years for the U.K., up to 10 years for the Netherlands, and a full-blown pause for Ireland, according to the IEA.
These bottlenecks, paired with developers racing to secure more power, are sparking a surge in behind-the-meter (BTM) energy generation and storage, which delivers power directly to a data center rather than relying on the constrained grid.
Bridging the gap between planned and operating AI data center projects takes more than just funding and next-generation chips. Access to networking, IT equipment, and specialized industrial machinery is equally important. Similarly, the energy bottleneck isn’t just about producing more energy. Transmission, permitting, and regulatory hurdles are also worth considering. Therefore, it may be useful for investors to think about the AI infrastructure build-out as an interconnected circulatory system with multiple arteries that could get blocked by various factors rather than a single pathway.
Adjacent opportunities – cooling, grid infrastructure, backup power, and efficiency
With the grid constrained by supply chain bottlenecks and connection hurdles, batteries, cooling solutions, hardware/software efficiency, grid connections, and backup power are poised to benefit from the demand-driven tailwinds of AI data centers.
- Beyond the data center: the cooling, grid, and efficiency markets the AI build-out is creating. Power usage effectiveness (PUE) measures the energy efficiency of data centers by dividing total facility energy (IT load plus overhead such as cooling, lighting, and power distribution) by IT equipment energy alone. A PUE of 1 would indicate 100% of electricity is consumed solely by IT equipment. According to the IEA, the global weighted-average PUE is expected to improve from 1.41 to 1.29 between 2024 and 2030, saving 90 TWh of energy. The U.S. is already ahead of the curve, with PUEs averaging 1.32 in 2024. According to 2026 EPRI data, large hyperscale facilities with liquid cooling currently under construction could achieve PUEs of 1.1.
- Batteries, transformers, and cooling: the infrastructure layer investors may be underweighting. Battery energy storage plays a supporting role in enabling higher renewable matching for data centers, and it remains eligible for investment tax credits under current policy, according to EPRI.
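The PUE arithmetic above is straightforward to sketch. The snippet below uses the IEA's global weighted-average PUE figures (1.41 in 2024, 1.29 projected for 2030); the 100 GWh annual IT load is a hypothetical figure chosen purely for illustration.

```python
def pue(total_facility_energy: float, it_energy: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy."""
    return total_facility_energy / it_energy


def facility_energy(it_energy: float, pue_value: float) -> float:
    """Total facility energy implied by a given IT load and PUE."""
    return it_energy * pue_value


# Hypothetical data center drawing 100 GWh of IT load per year.
it_load_gwh = 100.0

# IEA global weighted-average PUE: 1.41 (2024), improving to 1.29 (2030).
before = facility_energy(it_load_gwh, 1.41)  # ~141 GWh total facility energy
after = facility_energy(it_load_gwh, 1.29)   # ~129 GWh total facility energy
savings = before - after                     # ~12 GWh of overhead saved per year

print(f"Facility energy at PUE 1.41: {before:.0f} GWh")
print(f"Facility energy at PUE 1.29: {after:.0f} GWh")
print(f"Annual overhead savings:     {savings:.0f} GWh")
```

In other words, a 0.12 improvement in PUE trims total facility energy by 12% of the IT load, which is why efficiency gains at hyperscale translate into the tens of terawatt-hours the IEA projects globally.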
Investors should pay close attention to battery energy storage, as it arguably offers the best antidote to the AI energy bottleneck. Through site-specific and BTM deployments, battery energy storage can power AI data centers without relying on the constrained grid. Utility-scale solar paired with battery energy storage offers an alternative to fossil fuels while shifting solar output to the hours when the sun isn’t shining.
What the energy data tells investors
The AI data center build-out is moving faster than initially anticipated, leading to upward revisions to projections.
Here are five things to watch:
- Hyperscale capex is the driving force behind data center development. If capex growth rates cool, grid and energy supply constraints will likely ease.
- Industrial machinery backlogs and power constraints could become a greater bottleneck than AI chips and networking equipment.
- Despite the development of utility-scale renewable energy projects, the U.S. energy mix remains heavily dependent on fossil fuels, though nuclear energy could make a significant impact after 2030.
- Hyperscale data center PUE improvements could reduce energy needs.
- Increased AI adoption creates opportunities for grid infrastructure (transformers, cables, transmission), cooling systems, backup power, efficiency hardware, and demand-flexibility technology.
Backlogs in chip production are straining data center availability and access, but energy is an equally important limiting factor. AI computing challenges can be solved by producing more chips and building data centers, but energy is more nuanced. Even if the energy market could flip a switch to increase supply, there are still environmental and regulatory hurdles to overcome. Perhaps the biggest challenge is aligning supply with forecasted demand, especially given that those forecasts have changed so much in the last two years.