The Double Symbolism of the Three Mile Island Restart

By Ted Nordhaus and Alex Trembath

It is hard to overstate the symbolism of Constellation Energy’s announcement last month that it plans to restart the recently shuttered Three Mile Island nuclear power plant outside Harrisburg, Pennsylvania. The partial meltdown of Unit 2 in 1979 marked the beginning of America’s turn away from nuclear energy. Dozens of orders for new reactors were canceled, and a wave of new regulations from the Nuclear Regulatory Commission drove skyrocketing costs for the plants already under construction and ended new nuclear construction for the better part of three decades.

By the time Exelon prematurely shuttered Unit 1 in 2019, the nuclear sector in the United States had reached a nadir. Not only had the United States stopped building new reactors, but it had begun to close existing plants as well. The power sector was awash in cheap natural gas, electricity demand was not growing quickly, and neither liberalized electricity markets nor policy-makers valued nuclear’s attributes as a firm, reliable source of low-carbon electricity. Nuclear energy’s problems were further exacerbated by environmental groups campaigning to shut down plants like Indian Point in New York, Diablo Canyon in California, and Three Mile Island, despite their professed concern about climate change.

Much has changed in five short years.

Natural gas is still cheap, but electricity prices have soared across much of the country. Electricity demand is once again growing robustly, as AI-driven data center growth outstrips the generation capacity that grid planners anticipated just a few years ago. Local resistance to wind, solar, and transmission, along with the hesitance of grid planners to allow more variable generation onto electricity systems without sufficient firm resources to back it up, has slowed the growth of wind generation over the last few years, though solar is still growing apace. And a welcome shift of approach by tech and other firms at the leading edge of corporate clean energy commitments, from buying renewable energy credits in bulk to claim that their carbon dioxide emissions are offset to matching their demand hour by hour with clean electricity generation, has tilted the corporate procurement landscape toward nuclear energy.

So the Three Mile Island reopening marks a remarkable turnabout in the trajectory of the US power sector, nuclear energy, and US decarbonization trends. For three decades, a combination of slow electricity demand growth, a sclerotic nuclear industry and regulator, and cheap fossil energy meant that there were not strong economic or energy security incentives to deploy new nuclear generation. Climate policy efforts primarily focused on deploying subsidized wind and solar on the margins of the electricity system while the shale gas revolution allowed the coal-to-gas transition to do the heavy lifting in terms of actual decarbonization. 

Growing penetration of variable renewable energy displaced some fossil fuel combustion. But the backbone of the system was still an enormous built infrastructure of fossil generation, augmented by incremental additions of new gas generation, with wind and solar operating mainly as fuel savers, dependent on an enormous reserve of fossil capacity standing by to ensure that the lights stayed on when the variable resources were not available.

Large electricity customers like Microsoft contributed to the growth of renewable energy during this period. But they did so mostly through accounting shenanigans. Large municipal and corporate customers could run their operations on grid electricity while claiming they were doing so with renewable energy by purchasing renewable energy credits (RECs). The growing market for RECs provided, at best, marginal additional support for the deployment of wind and solar to grids that were still heavily dependent upon fossil fuels and, at worst, no additionality at all. 

More recently, though, leading tech firms like Google and Microsoft have shifted their electricity procurement strategies to ensure their power is clean in engineering terms, not just in accounting. Earlier this year, for instance, Google, Nucor, and Microsoft announced plans to aggregate their corporate electricity demand to accelerate the development of the clean firm generation needed to fully decarbonize their operations.
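To see what that shift means in practice, here is a minimal sketch, with entirely invented hourly numbers, of how annual REC-style accounting and hourly matching can diverge for the same load and the same clean generation:

```python
# Toy comparison of annual (REC-style) vs. hourly clean-energy matching.
# All numbers are invented for illustration, not drawn from any real utility or data center.

demand = [100] * 24                        # flat data-center load, MWh in each hour of one day
clean  = [0] * 6 + [300] * 12 + [0] * 6    # clean generation concentrated in daylight hours

total_demand = sum(demand)                 # 2,400 MWh consumed
total_clean  = sum(clean)                  # 3,600 MWh of clean generation / credits purchased

# Annual accounting: as long as credits cover consumption, the load is reported as 100% clean.
annual_matched = min(total_clean / total_demand, 1.0)

# Hourly matching: only clean energy generated in the same hour as consumption counts.
hourly_matched = sum(min(d, c) for d, c in zip(demand, clean)) / total_demand

print(f"Annual accounting: {annual_matched:.0%} clean")   # 100% clean
print(f"Hourly matching:   {hourly_matched:.0%} clean")   # 50% clean: the dark hours are unmatched
```

At the annual level the two approaches look identical; matched hour by hour, half of the load in this toy case is still running on whatever the grid happens to be burning at night.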

Now, with electricity demand growing faster than anyone, including Google and Microsoft, anticipated, the sudden arrival of substantial new demand from AI and data centers, electric vehicles, and the onshoring of manufacturing has called a key question about the future of the power sector much sooner than almost anyone expected.

For a long time, many climate and renewable energy advocates dismissed concerns about how electricity systems would deal with very high penetrations of variable renewable energy without natural gas, arguing that it wasn’t an immediate issue. Fossil fuels still accounted for most generation, and policy-makers and grid managers could cross that bridge once most fossil fuel generation had been eliminated from the system. But booming electricity demand and the unique requirements of the new AI data centers for enormous amounts of always-on clean electricity have brought that question to the fore. 

A problem that has long been theoretical is now material. For the last two decades, a cohort of energy system modelers, most notoriously Stanford University’s Mark Jacobson, produced models suggesting that wind and solar energy could do most or all of the work of decarbonizing the power system. Even among more cautious advocates, it has become something of a mantra to assert that the United States can achieve “70 to 80 percent” economy-wide decarbonization within a couple of decades, relying only on renewables, heat pumps, EVs, and batteries.

But fifty years after Amory Lovins first suggested that the so-called “soft energy path” could meet the energy demands of modern, industrialized economies without nuclear energy or fossil fuels, no country in the world has even remotely approached that achievement. So the symbolism of the Three Mile Island restart goes beyond the past and future of nuclear energy. It also raises more fundamental questions about the future of the US electricity system and the plausible paths to deeply decarbonize it.

Two Natural Experiments

A decade ago, in the wake of the Fukushima nuclear accident, the world got to conduct a natural experiment of sorts on the value of nuclear energy to efforts to decarbonize electricity grids. As Japan, Germany, California, and New York proceeded to close nuclear plants, proponents of doing so, most notably environmentalists and renewable energy advocates, claimed that they would be replaced with renewable energy. Nuclear advocates, on the other hand, argued that they would be replaced by fossil fuels and that emissions would rise as a result. 

A decade later, the nuclear advocates were proven correct. In every case, emissions rose, underscoring the important and unique role that nuclear energy plays in low-carbon electricity systems.

Today, a second natural experiment is underway. As the prospect of a step change in electricity demand driven by data centers and AI came into view over the last few years, utilities, grid planners, tech firms, and other energy industry insiders were quick to predict that demand would quickly outstrip the ability of existing and planned generation to meet it. Existing resources, along with planned additions of mostly solar, wind, and natural gas, wouldn’t be enough. Not only that, but simply procuring more renewable energy wasn’t the solution, because the unique attributes of data centers required a dedicated source of electricity that would be available all the time.

Alternative energy advocates were quick to push back. When Duke Energy proposed including both small modular reactors and new gas plants in its Integrated Resource Plan a couple of years ago, our former colleague Tyler Norris, then a solar developer and now a PhD candidate at Duke University’s Nicholas School of the Environment, protested, arguing that the utility was just doing what utilities do: exaggerating coming demand to get state utility regulators to allow it to rate-base more generation capacity. The SMRs, he argued, were just a stalking horse for new gas plants, and the new demand, insofar as it materialized, could be met by simply allowing solar firms like Norris’s to pump more renewable energy into the grid, without worrying too much about how to firm it up.

Notably, in scenarios like these, the variability and occasional scarcity of electricity generation are a feature, not a bug, demanding that grid managers and large sources of load finally pursue “demand-side management” in earnest. If there wasn’t enough power when they needed it, in other words, firms like Microsoft and Google would just have to buck up and curb their demand. 

So by this spring, when the New York Times’ Brad Plumer wrote about a new analysis from the consulting firm Grid Strategies forecasting that peak load growth would double over the next five years, reaction was swift. Jon Koomey, a well-known energy efficiency advocate (and nuclear energy opponent), argued that load growth from data centers was both speculative, in that it hadn’t materialized yet, and overstated, pointing to past forecasts of load growth from data centers that had failed to account for improving energy efficiency. Others argued that utilities were focused on gas and nuclear because they could be rate-based, and that the new demand could be met by expediting the connection of wind, solar, and batteries to electrical grids, along with demand-side management.

But the Three Mile Island deal should put many of those claims to rest. It was not initiated by a regulated monopoly utility seeking to rate-base new generation capacity, but rather through a power purchase agreement between Constellation Energy, acting as a merchant power generator, and Microsoft, a tech giant that procures lots of wind, solar, and energy storage but has committed to hourly matching of clean energy to its data center demand. And like the similar reopening of the Palisades plant in Michigan, it was undertaken in a liberalized electricity market, not a vertically integrated cost-of-service system.

Were it possible to meet that demand by simply penning a deal for some new solar and wind generation and energy storage, Microsoft surely would have done so. But the basic economics of the massive data centers that AI requires and the variable nature of renewable energy generation are a terrible match. Data centers may be energy hogs compared to many other electricity customers. But energy, even very expensive energy, is a relatively small part of the cost of AI data centers. They require huge capital outlays for the massive computing power they need. The firms that build and operate these centers need to utilize them all the time to make money. Demand-side management of their electricity use is not an option.
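A back-of-the-envelope calculation makes the point. The numbers below are illustrative assumptions, not figures from Microsoft or any real facility, but the structure of the problem is the same: amortized capital dwarfs the power bill, so idle hours cost far more than expensive electricity.

```python
# Illustrative (invented) numbers for the economics of an AI data center:
# capital cost per MW of IT load, amortized over a few years, versus the price
# paid for each MWh of electricity actually consumed.

CAPEX_PER_MW = 30_000_000      # assumed capital cost per MW of IT load, USD
AMORTIZATION_YEARS = 5
HOURS_PER_YEAR = 8760

def all_in_cost_per_mwh(power_price, utilization):
    """Amortized capital plus energy cost per MWh of load actually served."""
    capital = CAPEX_PER_MW / (AMORTIZATION_YEARS * HOURS_PER_YEAR * utilization)
    return capital + power_price

print(all_in_cost_per_mwh(power_price=50, utilization=0.95))    # baseline: ~$771/MWh
print(all_in_cost_per_mwh(power_price=100, utilization=0.95))   # double the power price: ~$821/MWh
print(all_in_cost_per_mwh(power_price=50, utilization=0.50))    # halve utilization: ~$1,420/MWh
```

Doubling the electricity price raises the all-in cost by a few percent; cutting utilization in half nearly doubles it. That is why curtailing these loads to follow the weather is a non-starter.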

So the natural experiment now underway is already showing that demand from these electricity-intensive, always-on data centers cannot be met solely with variable renewable energy and batteries. Unlike the last decade, when grid managers, utilities, and customers could decarbonize the grid by adding wind and solar and running a large installed base of natural gas plants at lower capacity, robust new demand growth coming from loads that are only economical when they are powered all the time means that meeting that new demand with new solar and wind will also require adding a lot more new natural gas generation.

The alternative is nuclear, which is why Microsoft and other tech firms are so keen on it, particularly those that have committed to net zero emissions and hourly matching. You can’t run these data centers economically solely with variable renewables and batteries, and you can’t meet your climate and emissions commitments if you use natural gas to firm them up.

The problem is that, beyond a few more plants like Three Mile Island and the aforementioned Palisades in Michigan, there are not remotely enough shuttered plants to reopen or existing plants to uprate. And after that, it is unlikely that anyone can get new nuclear builds online before the beginning of the next decade at the earliest. The same holds for technologies with technical profiles similar to nuclear energy’s, like advanced deep-earth geothermal systems or Allam-cycle natural gas plants with carbon capture. So over the next decade, what we are likely to see as AI-driven electricity demand grows is a lot of new wind, solar, and unmitigated gas, alongside the reopening of nuclear plants wherever that is possible. But the need for all that gas also tells us a lot, not only about how AI demands are likely to be met in the short term but also about the real requirements for a fully decarbonized electrical system in the long term.

Gas or Nuclear, Not Nuclear or Renewables

The characteristics of data center loads are more like those of the power sector as a whole than one might think. Demand for power waxes and wanes with the time of day and the time of year. It peaks daily in the late afternoon and early evening, as people come home from work, and seasonally in the summer months, when air conditioning substantially increases electricity consumption. With the right incentives and technology, some of that demand can be shifted within a day, but not across seasons. Once that is done, though, there remains a very substantial, year-round demand that must be met all the time.

That all-the-time demand used to be called baseload, and utilities typically built their systems from base to peak, figuring out how much generation they needed that was basically always on, and then adding various sorts of additional generation on the grid that could be turned on and off quickly when demand spiked at various times of day and periods of the year. Conventional nuclear reactors and coal plants with large thermal inertia and limited ability to modulate their power output were well suited to provide baseload power to electrical grids. Natural gas turbines, with low capital and maintenance costs and substantial ability to ramp up and down quickly, were great for meeting peak demand. 
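The logic is simple enough to sketch. Below is a minimal illustration, using a made-up daily demand profile, of how the base-to-peak split falls out of the load curve:

```python
# Sketch of base-to-peak planning against a made-up daily demand profile (MW).
# The numbers are purely illustrative.

hourly_demand = [
    60, 58, 57, 57, 58, 62,    # overnight trough
    70, 78, 82, 85, 86, 88,    # morning ramp
    90, 92, 94, 96, 99, 100,   # afternoon and early-evening peak
    95, 88, 80, 72, 66, 62,    # evening decline
]

baseload = min(hourly_demand)   # the demand that is present in every hour of the day
peak     = max(hourly_demand)

# Always-on plants (nuclear, coal) were sized toward the baseload; flexible gas
# turbines covered the swing between the baseload and the daily peak.
print(f"Baseload, met by always-on generation:   {baseload} MW")
print(f"Peaking range, met by flexible capacity: {peak - baseload} MW")
```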

This approach to designing, building, and operating electrical grids began to change as significant shares of wind and solar began to show up in the electricity mix. These variable generation sources are neither baseload nor peaking power. They produce electricity when they produce electricity. You can’t just turn them on when you need them and they aren’t always on. 

The combination of falling module and turbine costs, subsidies, and deployment mandates made wind and solar attractive investments for firms able to harvest the tax credits. And grids could handle all that variable electricity as long as there was a large capacity of natural gas to ramp up and down and provide “firm capacity” to balance the variability. But even in the most optimistic modeling of electrical systems powered predominantly by wind and solar energy (grids that, in the real world, still do not exist), it has been clear for many years that as the share of wind and solar approaches a majority and climbs toward 100%, keeping the lights on all the time becomes an increasingly costly proposition.

The closer the share of wind and solar got to 100%, the more the grid required an enormous overbuild of wind, solar, and transmission across large geographies to capture electricity where the wind was blowing and the sun was shining and move it to places where it was needed. It required huge amounts of energy storage and lots of demand management to store electricity until it was needed and to shift demand to times when electricity was available. And even with all of that, it still required something close to an entirely redundant natural gas generation system, with capacity to meet most or all of the system’s needs sitting idle most of the time, to cover sustained periods of low sun and wind.

This creates two major problems: first, the massive overbuild of wind and solar, together with the massive need for energy storage, gets very expensive very quickly; and second, you are still dependent on a large natural gas infrastructure, even if you rarely use it. And that is the case for a system perfectly optimized to achieve net zero emissions. In the real world, we are always incrementally retrofitting the grid we have, and we have to accommodate a variety of other public priorities, not least the willingness of ratepayers and taxpayers to foot the bill, local resistance to building all the necessary solar, wind, and transmission in various backyards, and large, unexpected macroeconomic and sectoral shocks to the electricity system such as the one we are witnessing today. Once we leave the models and enter the real world, in other words, it all gets far more complicated.
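A toy dispatch model illustrates the dynamic. The profiles and sizes below are invented for illustration, not drawn from any real grid study, but they show the shape of the problem: serving a flat load around the clock requires both a large solar overbuild and storage sized to the entire nighttime demand, and extra overbuild beyond that point buys nothing.

```python
# Toy dispatch model: a flat load served by solar plus a battery over one
# representative day. All profiles and sizes are invented for illustration.

DEMAND_MW = 100
solar_shape = [1.0] * 10 + [0.0] * 14      # ten sunny hours, then fourteen dark hours

def fraction_served(overbuild, storage_mwh):
    """Share of the day's demand met by solar sized at `overbuild` times peak
    demand, buffered by a battery holding `storage_mwh`."""
    soc = 0.0          # battery state of charge, MWh
    served = 0.0
    for s in solar_shape:
        generation = overbuild * DEMAND_MW * s
        direct = min(DEMAND_MW, generation)
        soc = min(storage_mwh, soc + (generation - direct))   # charge with any surplus
        discharge = min(soc, DEMAND_MW - direct)              # cover any shortfall
        soc -= discharge
        served += direct + discharge
    return served / (DEMAND_MW * len(solar_shape))

for overbuild, storage in [(1.0, 0), (1.5, 500), (2.4, 1400), (3.0, 1400)]:
    share = fraction_served(overbuild, storage)
    print(f"{overbuild:.1f}x solar, {storage:>4} MWh of storage -> {share:.0%} of demand served")
```

In this stylized case, solar sized to the load serves a bit over 40 percent of demand on its own; getting to 100 percent takes roughly two and a half times that capacity plus fourteen hours of storage, and building still more solar beyond that buys nothing. A real system also has to ride through multi-day stretches of low wind and sun, which is where the mostly idle gas fleet comes back in.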

Enter (or Reenter) Nuclear 

It’s been clear for a while now, from those very same models that look at how low carbon electrical systems might work, that if you can get somewhere between 25 and 50% of your electricity in a decarbonized electricity system from nuclear, particularly next generation nuclear with significantly greater ability to ramp its output up and down, both the cost and complexity of the system are far lower. You don’t need to massively overbuild wind and solar. You don’t need a vast capacity of batteries to store energy for days or even weeks. Your high-voltage transmission requirements are much lower, since nuclear can be sited in smaller geographies closer to load. And you don’t need a big natural gas infrastructure that mostly sits idle. 

Nuclear in these scenarios is not necessarily the lion’s share of the system, but getting there does require building significant new nuclear capacity. When Biden Administration officials say that the US needs 200 GW of new nuclear by 2050 to meet its climate goals, they are talking about roughly doubling nuclear energy’s share of US electricity generation, from 20% to 40%. Wind and solar, under this scenario for a decarbonized US power sector, still produce the majority of electricity. But it still requires building another 200 large reactors over the next three decades, or somewhere in the neighborhood of 500 to 3,500 small reactors.
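The arithmetic behind those counts is simple enough to check. The unit sizes below are illustrative assumptions, roughly AP1000-scale for the large reactors and a range of 60 to 400 megawatts for small modular reactors, not figures from any official plan:

```python
# Back-of-the-envelope check on the reactor counts above. Unit sizes are
# illustrative assumptions (roughly AP1000-scale large units; small modular
# reactors spanning tens to a few hundred megawatts), not official figures.

new_capacity_mw = 200_000                    # 200 GW of new nuclear by 2050

large_unit_mw = 1_000                        # ~1 GW per large light-water reactor
print(f"Large reactors needed: ~{new_capacity_mw // large_unit_mw}")   # ~200

for smr_mw in (400, 300, 100, 60):           # assumed small-reactor unit sizes
    print(f"{smr_mw} MW units: ~{new_capacity_mw // smr_mw} reactors")
# -> roughly 500 reactors at the large end of the SMR range, well over 3,000 at the small end
```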

In this way, the data center crunch coming over the next decade is a harbinger of the larger challenge that the effort to build a deeply decarbonized electricity system over the next three decades is likely to face. You can operate the system with a lot more wind, solar, batteries, and natural gas. Or you can operate it with a lot more wind, solar, batteries, and nuclear. Choose the former and what you will likely end up with is costly electricity that can’t be fully decarbonized. Choose the latter and you gain many advantages, climate and land use chief among them, but also more robust grid reliability and resilience.

But doing so is also more easily said than done. Like the 100% renewable energy spreadsheets that Mark Jacobson is so adept at producing, saying that the nation should build a lot more nuclear is the easy part. Despite a sea change in sentiment toward nuclear energy among the public, policy-makers, business leaders, and investors, the United States is not remotely prepared today to deploy nuclear at the scale that would be necessary. In the second part of this series, we’ll take a hard look at how and where the US might build all those new reactors.
