Why the Future of AI Depends on the Energy Sector’s Past

by Jack Fox
Why did it take decades for cars to overtake horses as the primary mode of transportation in America (Great American Adventures, 2022)? One of the primary drivers was the lack of car-centric infrastructure: roads, signage, and supporting businesses were all geared towards horses, creating a significant barrier to entry for the car (Medium, 2025). Even once cars became widely adopted, it took a World War for America to scale its car infrastructure and build the interstate system. Just as it did with cars, America will have to face the challenge of creating and scaling the energy infrastructure needed to support our growing reliance on AI.
The computational demands of AI models are immense, and they are met primarily by sprawling data centers. That need for processing power translates directly into demand for electricity. Electricity demand from AI-specific data centers alone could more than quadruple by 2030, with each new model drawing as much power as a small town (IEA, 2025). Projections indicate a dramatic surge in consumption, rising from an estimated 3-4% of total US electricity use in 2024 to a staggering 11-12% by 2030 (McKinsey, 2024). The International Energy Agency (IEA) even warns that data centers may consume more electricity than the entire country of Japan within five years (IEA, 2025).
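To put those percentages in perspective, a rough back-of-envelope calculation helps. The sketch below assumes total US electricity consumption of roughly 4,000 TWh per year, an assumption made here for illustration rather than a figure from the sources above, and converts the 3-4% and 11-12% shares into absolute demand and an implied annual growth rate.

```python
# Back-of-envelope check of the projected jump in data center share of US
# electricity demand. The 3-4% (2024) and 11-12% (2030) shares come from the
# McKinsey projection cited above; the ~4,000 TWh/year total is assumed here.

US_TOTAL_TWH = 4_000            # assumed annual US electricity consumption

share_2024 = (0.03, 0.04)       # data center share of US demand, 2024
share_2030 = (0.11, 0.12)       # projected share, 2030
years = 2030 - 2024

for s24, s30 in zip(share_2024, share_2030):
    twh_2024 = s24 * US_TOTAL_TWH
    twh_2030 = s30 * US_TOTAL_TWH
    cagr = (twh_2030 / twh_2024) ** (1 / years) - 1
    print(f"{twh_2024:.0f} TWh -> {twh_2030:.0f} TWh "
          f"(implied growth of roughly {cagr:.0%} per year)")
```

Even under these simplified assumptions, the implied demand growth works out to roughly 20-25% per year, far faster than the grid has historically expanded.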
This escalating demand is running headfirst into the limits of our energy infrastructure. A primary bottleneck in powering these new data centers lies in connecting them to existing transmission grids (McKinsey, 2024). The backlog for new grid tie-ins in Western economies such as the US and Europe is significant, ranging anywhere from 4 to 8 years (Data Center Frontier, 2025). This constraint is so severe that as much as 20% of planned data center projects worldwide could face delays due to limitations in the electrical grid (Data Center Frontier, 2025).
The mismatch between rapidly growing electricity demand and slowly expanding supply poses a challenge for data centers and energy firms alike. However, there are several ways to bridge this gap while easing the constraints outlined above.
One key strategy involves siting and designing new data centers to make better use of existing generation. Microsoft has taken a tangible step in this direction, signing a 20-year deal to draw 835 megawatts of nuclear energy from Pennsylvania’s revived Three Mile Island plant to power its AI infrastructure (Data Center Dynamics, 2024). Amazon's $650M acquisition of a data center adjacent to the Susquehanna Steam Electric Station nuclear plant further underscores this trend (Data Center Dynamics, 2024).
Another promising solution is the development of microgrids in areas where traditional energy infrastructure is lacking. Microgrids are small, decentralized power systems that can operate without a direct connection to the central grid (And Cable, 2024). There are approximately 692 microgrids in the US today (And Cable, 2024), demonstrating an existing, albeit limited, presence. They typically rely on generation with a small local footprint, such as wind and solar paired with power storage to offset renewables’ intermittent output. Microsoft, in a forward-thinking partnership with Enchanted Rock and US Energy, is implementing a microgrid that runs on gases derived from food waste, showcasing the potential for sustainable, localized energy solutions (And Cable, 2024).
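To make the pairing of renewables and storage concrete, the sketch below simulates a toy microgrid in which a battery charges when local solar output exceeds a data center's load and discharges when it falls short. Every figure in it (the 5 MW load, the solar profile, the battery size) is invented for illustration and does not describe any of the deployments cited above.

```python
# Toy microgrid dispatch: storage absorbs the midday solar surplus and covers
# the evening shortfall. All numbers are invented for illustration.

load_mw = 5.0                                      # assumed constant load
battery_mwh = 20.0                                 # assumed storage capacity
solar_mw = [0, 0, 2, 6, 9, 10, 9, 6, 2, 0, 0, 0]   # toy output, 2-hour steps
hours_per_step = 2

soc_mwh = battery_mwh / 2                          # start half charged
for step, solar in enumerate(solar_mw):
    surplus_mwh = (solar - load_mw) * hours_per_step
    soc_mwh = min(max(soc_mwh + surplus_mwh, 0.0), battery_mwh)
    action = "charging" if surplus_mwh > 0 else "discharging"
    print(f"step {step:2d}: solar {solar:4.1f} MW, {action:11s}, "
          f"battery at {soc_mwh:4.1f} MWh")
```

In the steps where the battery runs empty, a real microgrid falls back on an additional dispatchable source, which is the role the food-waste-derived gas plays in the Microsoft example.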
Another alternative is to harness the heat that data centers themselves throw off during operation. Thermoelectric generators can convert this waste heat into electricity, which can then be used to power components of the data center (Server Lift, 2024). However, current thermoelectric generators operate at efficiencies of only 5-20% (Science Direct, 2024). While that shortcoming has hindered adoption, the option shows promise, with the thermoelectric generator industry expected to grow at a compound annual growth rate of 11.3% in the coming years (Coherent Market Insights, 2025).
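Those efficiency figures translate into modest but non-trivial recovered power. The sketch below assumes a hypothetical 10 MW facility, a figure chosen here for illustration rather than drawn from the sources above, and applies the cited 5-20% conversion range to its waste heat, since essentially all of the electricity a data center consumes ends up as heat.

```python
# Rough estimate of power recoverable from data center waste heat with
# thermoelectric generators at the 5-20% efficiencies cited above.
# The 10 MW facility size is an assumption for illustration.

it_load_mw = 10.0            # assumed IT load of a hypothetical facility
waste_heat_mw = it_load_mw   # nearly all electrical input ends up as heat

for efficiency in (0.05, 0.20):
    recovered_mw = waste_heat_mw * efficiency
    print(f"at {efficiency:.0%} efficiency: ~{recovered_mw:.1f} MW recovered, "
          f"about {recovered_mw / it_load_mw:.0%} of the facility's own draw")
```

Recovering even a few percent of a facility's own draw is meaningful at data center scale, which helps explain the projected growth of the industry despite today's low efficiencies.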
Finally, one of the most promising options for expanding available power uses the very AI that is creating the demand. AI can be employed to optimize the electricity grid in many ways, including demand forecasting, load balancing, and predictive maintenance (Power, 2025). Under certain conditions, AI-based load management alone has doubled energy reserves and cut average energy costs by 10% (Rand, 2025). The same technology can also increase accessible energy, a key consideration for data centers, by as much as 10% (Rand, 2025).
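As a concrete, if heavily simplified, picture of the demand-forecasting piece: the sketch below predicts the next hour's load from recent history so that generation and reserves can be scheduled ahead of time. Production systems use far richer models that account for weather, calendar effects, and more; a rolling average and an invented load series stand in for them here.

```python
# Minimal demand-forecasting illustration: predict the next hour's grid load
# from a short history. The load series is invented, and the rolling average
# stands in for the far richer models utilities actually deploy.

hourly_load_mw = [610, 640, 700, 820, 910, 960, 930, 870]  # toy history

def forecast_next_hour(history, window=4):
    """Forecast the next hour's load as the mean of the last `window` hours."""
    recent = history[-window:]
    return sum(recent) / len(recent)

print(f"forecast for next hour: {forecast_next_hour(hourly_load_mw):.0f} MW")
```

A grid operator, or a data center shifting flexible workloads, would then commit generation or move demand against that forecast and update it as actual load arrives.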
While these solutions are not without their drawbacks, they represent tangible ways to help meet the energy demands of AI and data centers. Just as we saw with cars, the adoption and proliferation of AI will be directly limited by our ability and willingness to build the infrastructure that supports it.