Modelling the Apocalypse

Let’s all put on our science-fiction hats and imagine that humans get wiped off the face of the Earth tomorrow. Perhaps a mysterious superbug kills us all overnight, or maybe we organize a mass migration to live on the moon. In a matter of a day, we’re gone without a trace.

If your first response to this scenario is “What would happen to the climate now that fossil fuel burning has stopped?” then you may be afflicted with Climate Science. (I find myself reacting like this all the time now. I can’t watch The Lord of the Rings without imagining how one would model the climate of Middle Earth.)

A handful of researchers, particularly in Canada, recently became so interested in this question that they started modelling it. Their motive was more than just morbid fascination – in fact, the global temperature change that occurs in such a scenario is a very useful metric. It represents the amount of warming that we’ve already guaranteed, and a lower bound for the amount of warming we can expect.

Initial results were hopeful. Damon Matthews and Andrew Weaver ran the experiment on the UVic ESCM and published the results. In their simulations, global average temperature stabilized almost immediately after CO2 emissions dropped to zero, and stayed approximately constant for centuries. The climate didn’t recover from the changes we inflicted, but at least it didn’t get any worse. The “zero-emissions commitment” was more or less nothing. See the dark blue line in the graph below:

However, this experiment didn’t take anthropogenic impacts other than CO2 into account. In particular, the impacts of sulfate aerosols and of additional (non-CO2) greenhouse gases currently roughly cancel out, so it was assumed that they would keep cancelling and could therefore be ignored.

But is this a safe assumption? Sulfate aerosols have a very short atmospheric lifetime – as soon as it rains, they wash right out. Non-CO2 greenhouse gases last much longer (although, in most cases, not as long as CO2). Consequently, you would expect a transition period in which the cooling influence of aerosols had disappeared but the warming influence of additional greenhouse gases was still present. The two forcings would no longer cancel, and the net effect would be one of warming.
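The timescale argument above can be sketched with a toy calculation. This is purely my illustration, not any of the models discussed: each forcing decays exponentially with its own atmospheric lifetime after emissions stop, and the magnitudes and lifetimes below are round, hypothetical numbers.

```python
import math

# Hypothetical forcings (W/m^2) at the moment emissions cease:
F_CO2 = 1.6    # CO2: effectively permanent on this timescale
F_GHG = 1.0    # non-CO2 greenhouse gases: assumed ~12-year lifetime
F_AERO = -1.0  # sulfate aerosol cooling: assumed ~weeks (0.02-year) lifetime

def net_forcing(t_years):
    """Net radiative forcing t years after all emissions cease."""
    return (F_CO2
            + F_GHG * math.exp(-t_years / 12.0)
            + F_AERO * math.exp(-t_years / 0.02))

for t in [0, 1, 10, 50]:
    print(f"year {t:3d}: {net_forcing(t):+.2f} W/m^2")
```

With these made-up numbers, the net forcing jumps from 1.6 to about 2.5 W/m^2 within the first year (the aerosols wash out almost immediately) and then relaxes back toward the CO2-only value over a few decades as the other greenhouse gases decay – exactly the shape of the transition period described above.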

Damon Matthews recently repeated his experiment, this time with Kirsten Zickfeld, and took aerosols and additional greenhouse gases into account. The long-term picture was still the same – global temperature remaining at present-day levels for centuries – but the short-term response was different. For about the first decade after human influences disappeared, the temperature rose very quickly (as aerosols were eliminated from the atmosphere) but then dropped back down (as additional greenhouse gases were eliminated). This transition period wouldn’t be fun, but at least it would be short. See the light blue line in the graph below:

We’re still making an implicit assumption, though. By looking at graphs of constant global average temperature and saying “Look, the problem doesn’t get any worse!”, we’re assuming that regional temperatures are also constant everywhere on the planet. In fact, half of the world could be warming rapidly while the other half cools rapidly – a bad scenario indeed – and a single global metric would never show it.
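The arithmetic behind this point is simple enough to show directly. The numbers below are invented purely for illustration:

```python
# A flat global mean can hide diverging regional trends.
# Yearly temperature anomalies (degrees C) for two halves of a toy planet:
north = [0.0, -0.5, -1.0, -1.5]   # one half cooling steadily
south = [0.0, +0.5, +1.0, +1.5]   # the other half warming steadily

# The area-weighted global mean (equal halves) is flat the whole time:
global_mean = [(n + s) / 2 for n, s in zip(north, south)]
print(global_mean)   # -> [0.0, 0.0, 0.0, 0.0]
```

The global mean sits at zero every year, even though both regions are changing by half a degree per year.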

A team of researchers led by Nathan Gillett recently modelled the regional response to a sudden cessation of CO2 emissions (other gases were ignored). They used a more complex climate model from Environment Canada, which is better suited to regional projections than the UVic ESCM.

The results were disturbing: even though the average global temperature stayed basically constant after CO2 emissions (following the A2 scenario) disappeared in 2100, regional temperatures continued to change. Most of the world cooled slightly, but Antarctica and the surrounding ocean warmed significantly. By the year 3000, the coasts of Antarctica were 9°C above preindustrial temperatures. This might easily be enough for the West Antarctic Ice Sheet to collapse.

Why didn’t this continued warming happen in the Arctic? Remember that the Arctic is an ocean surrounded by land, and temperatures over land change relatively quickly in response to a radiative forcing. Furthermore, the Arctic Ocean is small enough that it’s heavily influenced by temperatures on the land around it. In this simulation, the Arctic sea ice actually recovered.

On the other hand, Antarctica is land surrounded by a large ocean that mixes heat particularly well. As a result, that ocean has an extraordinarily high heat capacity, and takes a very long time to fully respond to changes in radiative forcing. So, even by the year 3000, it was still reacting to the radiative forcing of the 21st century. The warming ocean surrounded the land and caused it to warm as well.

As a result of the cooling Arctic and warming Antarctic, the Intertropical Convergence Zone (a band of rising air and heavy rainfall near the equator, where the trade winds converge) shifted southward in the simulation. Consequently, precipitation over North Africa continued to decrease – a situation that was already dire by 2100. Counterintuitively, even though global warming had ceased, some of its impacts continued to worsen.

These experiments, assuming an overnight apocalypse, are purely hypothetical. By definition, we’ll never be able to test their accuracy in the real world. However, as a lower bound for the expected impacts of our actions, the results are sobering.

Nuclear Power in Context

Since its birth, nuclear power has been a target of environmental activism. To be fair, when nuclear power goes wrong, it goes wrong in a bad way. Take a look at what’s happening in Japan right now. Friday’s tsunami damaged the Fukushima Daiichi power plant, and several of its reactors have experienced partial meltdowns. Radiation from the nuclear reactions has been released into the surrounding environment, and could endanger public health in the immediate area, potentially causing cancers and birth defects.

Nuclear disasters are horrifying, and this is by no means the worst that has happened. However, nuclear isn’t the only form of energy that experiences periodic disasters. In fact, over the past century, hydroelectric disasters have killed more people than all other forms of energy disasters combined.

(Sovacool et al., 2008, Fig. 1)

So why do we worry so much more about nuclear power disasters? Is it because the idea of the resulting radiation is more disturbing than the prospect of a dam breaking, even if it’s far less common?

However, an energy source can kill people without a large-scale disaster occurring. Let’s look at fossil fuels. Think of all the miners killed by coal accidents, all the people killed by smog inhalation or exposure to toxic chemicals (such as heavy metals) that are present in fossil fuels, deaths due to gas leaks, civilians killed by wars over oil, and so on. It’s difficult to quantify these numbers, because fossil fuels have been in use for centuries, but they clearly exceed the 4,000 or so deaths due to nuclear power accidents (as well as any other deaths due to nuclear power, such as uranium mining).

We must also count the deaths due to climate change, which fossil fuel burning has caused. The World Health Organization estimates that over 150,000 people died as a result of climate change in 2000 alone, and this annual toll will increase as the warming progresses. If we don’t step away from fossil fuels in time, they could lead to a devastating amount of death and suffering.

Fossil fuels are silent, passive, indirect killers which end up being far more destructive to human life than nuclear power. However, much of the public remains opposed to nuclear energy, and I believe this is a case of letting the perfect be the enemy of the good. I feel that we hold nuclear power to an impossible standard, that we expect it to be perfect. It’s certainly not perfect, but it’s far better than the existing system, which desperately needs to be replaced.

There are also exciting developments in nuclear technology that could make it safer and more efficient. In his recent book, top climatologist James Hansen described “fast reactors”, which are a vast improvement over previous generations of nuclear reactors. Fast reactors can burn uranium-238, which makes up 99.3% of all natural uranium but is usually discarded as nuclear waste because conventional reactors aren’t equipped to use it. Another alternative is thorium, a safer and more abundant element. If we pursue these technologies, the major downsides of nuclear power – safety and waste concerns – could diminish substantially.

Renewable sources of energy, such as solar, wind, and geothermal, are safer than nuclear power, and also have a lower carbon footprint per kWh (Sovacool, 2008b, Table 8). They are clearly the ideal choice in the long run, but they can’t solve the problem completely, at least not yet. Cost is a barrier, as is the problem of storing and transporting the electricity they generate. Maybe a few decades down the line smart grids will become a reality, and we will be able to have an energy economy that is fully renewable. If we wait for that perfect situation before doing anything, though, we will overshoot and cause far more climate change than we can deal with.

I don’t know if I would describe myself as “pro-nuclear”, but I am definitely “anti-fossil-fuel”. I am aware of the risks nuclear power poses, and feel that, from a risk management perspective, it is still preferable to coal and oil by a long shot. Solving climate change will require a multi-faceted energy economy, and it would be foolish to rule out one viable option simply because it isn’t perfect.