Roaming the country and asking people about their concerns with artificial intelligence, one hears a lot about two things: job displacement and energy footprints.
Both are tied to larger, more universal fears. On one hand, humans have always worried about where the next meal is coming from, and AI is now playing a significant role in that puzzle. On the other, when it comes to climate, energy, and ecology, we are increasingly focused on carbon footprints in general, so it makes sense that many of us would focus on the carbon footprint of AI and its energy use, not just because of that existential concern, but also because data centers will consume so much power.
I saw a map yesterday of where new power supplies are projected to be built. There was an enormous bubble around PJM (the Pennsylvania-New Jersey-Maryland regional grid, more on that later) and the NOVA (Northern Virginia) area around the nation’s capital, which has been hailed as an East Coast Silicon Valley of sorts.
In fact, just at a glance, that corridor seems to be the single largest area of focus. As for the real Silicon Valley, the map hardly registered any projected new generation. There were some large blips across the Midwest. Take a look.
Getting to a Gigawatt
I also heard a number of experts and pundits like Dylan Ratigan explain that there is tremendous demand for new grid-connected projects on the order of one gigawatt, roughly the output of a large power plant today, but that there are not many areas where that 1 GW is easily available.
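To put that gigawatt in perspective, here is a rough back-of-the-envelope sketch; the figure of roughly 10,500 kWh of electricity per U.S. household per year is an approximate assumption of mine, not something from the reporting above:

$$
1\ \text{GW} \times 8{,}760\ \tfrac{\text{h}}{\text{yr}} = 8.76\ \tfrac{\text{TWh}}{\text{yr}},
\qquad
\frac{8.76 \times 10^{9}\ \text{kWh/yr}}{10{,}500\ \text{kWh/household/yr}} \approx 830{,}000\ \text{households}.
$$

In other words, a single one-gigawatt, grid-connected project draws continuously on the order of what 800,000-plus homes consume.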
I looked up projections for new nuclear plants in Pennsylvania and got no results. The numbers suggest the state won’t add more than a handful of plants to its current count anytime soon. I researched whether natural gas is being looked at as a “behind-the-meter” solution and found that this approach is limited as well. There is a “Homer City” project that is projected to add 4.5 gigawatts on the site of a former coal plant, but that’s really an outlier.
What I found, in PJM and elsewhere, is that utility operators and planners are asking that new projects have significant “self-supply”; in other words, you’re more likely to be able to connect to the grid as a large load (say, 1 GW) if you can bring your own power supply.
In addition, I found that federal agencies like the DOE (the U.S. Department of Energy) and FERC (the Federal Energy Regulatory Commission) are pushing toward clearer, more consistent “interconnection” rules for these types of new users.
The Limits of Self-Supply
A novel idea. Why can’t data center operators just build their own power supply into the equation?
It turns out that they can, to some extent, but they still need backup. The power sources contemplated for new sites, whether nuclear, coal, natural gas, renewables, or something else, are not stable in the way the existing U.S. grid is stable. Even pipelines have a particular need for stability that only the grid can match. So the new sites need to be grid-connected largely for brownouts, emergencies and black swan situations, and low-power events, even if they do build in new capacity for 100% of their needs.
That stands in contrast to a grid that is already stressed, already maxed out in many places, and generally ill-equipped for this level of demand. Here’s an excerpt from a DOE analysis:
“According to DOE’s findings, which are based on similar work completed by the North American Electric Reliability Corporation (NERC), the U.S. energy grid will not be able to sustain the combined impact of coal and other plant closures, an overreliance on intermittent energy sources like wind and solar, and data center growth, highlighting the urgency of increasing dispatchable energy output … the analysis clearly demonstrates that in the absence of robust and rapid energy policy reform that prioritizes use of America’s abundant natural resources and fast infrastructure buildout, resource inadequacy will prevent development of new manufacturing in America, prohibit the re-industrialization of the US economy, drive up the cost of living for all Americans, and eliminate the potential to sustain enough data centers to win the artificial intelligence (AI) arms race.”
Of course, those goals sound ambitious, but the point is that the demand is beyond what the grid can comfortably accommodate without an energy revolution of some sort. So that concept of self-supply is going to be critical.
Eating Our Lunch
The power generation push of the Middle Kingdom across the sea is also contributing to U.S. angst over its own energy record. A team-written piece in the Financial Times includes this:
“In 2024, the PRC [People’s Republic of China] added 429GW of new power capacity — more than one-third of the entire U.S. grid, and more than half of all global electricity growth. The US contributed just 51GW, or 12 percent.”
(The FT piece also has a more interactive version of the above-cited U.S. data center energy map, so you can use it as a single destination for further research here.)
Experts point out that some of this comes down to the lack of regulatory hurdles on the Chinese side, and there’s also the broader geopolitical and economic context, which should be obvious to everyone. If the U.S. dollar loses its place as the global reserve currency, that will arguably be a more fundamental problem. But for now, the AI arms race is what commands people’s attention.
Power Plants in Space
The last resort, in some ways, is a move off the terrestrial surface.
Reporting by Jeremy Hsu of Scientific American provides this:
“In early November, Google announced ‘Project Suncatcher,’ which aims to launch solar-powered satellite constellations carrying its specialty AI chips, with a demonstration mission planned for 2027. Around the same time, the start-up Starcloud celebrated the launch of a 60-kilogram satellite with an NVIDIA H100 GPU, as a prelude to an orbital data center that is expected to require five gigawatts of electric power by 2035. Those two efforts are part of a broader wave of concepts that move some computing off-planet.”
That larger trend is visible across the energy sector, as planners brainstorm what the future of power production might look like.
That’s a little bit of what the world is contemplating to feed the AI beast, as energy demands continue to ramp up heading into 2026. What do you think?