About climatesight

Kaitlin Naughten is an ocean-ice modeller at the British Antarctic Survey in Cambridge.

Modelling the Apocalypse

Let’s all put on our science-fiction hats and imagine that humans get wiped off the face of the Earth tomorrow. Perhaps a mysterious superbug kills us all overnight, or maybe we organize a mass migration to live on the moon. In a matter of a day, we’re gone without a trace.

If your first response to this scenario is “What would happen to the climate now that fossil fuel burning has stopped?” then you may be afflicted with Climate Science. (I find myself reacting like this all the time now. I can’t watch The Lord of the Rings without imagining how one would model the climate of Middle Earth.)

A handful of researchers, particularly in Canada, recently became so interested in this question that they started modelling it. Their motive was more than just morbid fascination – in fact, the global temperature change that occurs in such a scenario is a very useful metric. It represents the amount of warming that we’ve already guaranteed, and a lower bound for the amount of warming we can expect.

Initial results were hopeful. Damon Matthews and Andrew Weaver ran the experiment on the UVic ESCM and published the results. In their simulations, global average temperature stabilized almost immediately after CO2 emissions dropped to zero, and stayed approximately constant for centuries. The climate didn’t recover from the changes we inflicted, but at least it didn’t get any worse. The “zero-emissions commitment” was more or less nothing. See the dark blue line in the graph below:

However, this experiment didn’t take anthropogenic impacts other than CO2 into account. In particular, the impacts of sulfate aerosols and additional (non-CO2) greenhouse gases currently cancel out, so it was assumed that they would keep cancelling and could therefore be ignored.

But is this a safe assumption? Sulfate aerosols have a very short atmospheric lifetime – as soon as it rains, they wash right out. Non-CO2 greenhouse gases last much longer (although, in most cases, not as long as CO2). Consequently, you would expect a transition period in which the cooling influence of aerosols had disappeared but the warming influence of additional greenhouse gases was still present. The two forcings would no longer cancel, and the net effect would be one of warming.
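To see why, consider a toy sketch (with made-up forcing magnitudes and lifetimes, not values from any of these studies): if the aerosol cooling and the non-CO2 greenhouse warming start out cancelling but decay on very different timescales, the net forcing spikes upward before fading away.

```python
import math

def net_forcing(t_years, f_aerosol=-1.0, f_ghg=1.0,
                tau_aerosol=0.1, tau_ghg=12.0):
    """Net radiative forcing (W/m2) t years after human influences stop.

    Both forcings decay exponentially: aerosols wash out within weeks
    (tau ~ 0.1 yr), non-CO2 greenhouse gases over roughly a decade.
    All numbers here are illustrative only.
    """
    aerosol = f_aerosol * math.exp(-t_years / tau_aerosol)
    ghg = f_ghg * math.exp(-t_years / tau_ghg)
    return aerosol + ghg

print(net_forcing(0))    # the two forcings cancel exactly at t = 0
print(net_forcing(1))    # aerosols gone, net warming influence remains
print(net_forcing(50))   # decades later, nearly back to zero
```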

Damon Matthews recently repeated his experiment, this time with Kirsten Zickfeld, and took aerosols and additional greenhouse gases into account. The long-term picture was still the same – global temperature remaining at present-day levels for centuries – but the short-term response was different. For about the first decade after human influences disappeared, the temperature rose very quickly (as aerosols were eliminated from the atmosphere) but then dropped back down (as additional greenhouse gases were eliminated). This transition period wouldn’t be fun, but at least it would be short. See the light blue line in the graph below:

We’re still making an implicit assumption, though. By looking at the graphs of constant global average temperature and saying “Look, the problem doesn’t get any worse!”, we’re assuming that regional temperatures are also constant for every area on the planet. In fact, half of the world could be warming rapidly and the other half could be cooling rapidly – a bad scenario indeed. From a single global metric, you just can’t tell.

A team of researchers led by Nathan Gillett recently modelled the regional response to a sudden cessation of CO2 emissions (other gases were ignored). They used a more complex climate model from Environment Canada, which is better suited to regional projections than the UVic ESCM.

The results were disturbing: even though the average global temperature stayed basically constant after CO2 emissions (following the A2 scenario) disappeared in 2100, regional temperatures continued to change. Most of the world cooled slightly, but Antarctica and the surrounding ocean warmed significantly. By the year 3000, the coasts of Antarctica were 9°C above preindustrial temperatures. This might easily be enough for the West Antarctic Ice Sheet to collapse.

Why didn’t this continued warming happen in the Arctic? Remember that the Arctic is an ocean surrounded by land, and temperatures over land change relatively quickly in response to a radiative forcing. Furthermore, the Arctic Ocean is small enough that it’s heavily influenced by temperatures on the land around it. In this simulation, the Arctic sea ice actually recovered.

On the other hand, Antarctica is land surrounded by a large ocean that mixes heat particularly well. As a result, it has an extraordinarily high heat capacity, and takes a very long time to fully respond to changes in radiative forcing. So, even by the year 3000, it was still reacting to the radiative forcing of the 21st century. The warming ocean surrounded the land and caused it to warm as well.

As a result of the cooling Arctic and warming Antarctic, the Intertropical Convergence Zone (the belt near the equator where the trade winds converge) shifted southward in the simulation. Consequently, precipitation over North Africa continued to decrease – a situation that was already bad by 2100. Counterintuitively, even though global warming had ceased, some of the impacts of warming continued to worsen.

These experiments, assuming an overnight apocalypse, are purely hypothetical. By definition, we’ll never be able to test their accuracy in the real world. However, as a lower bound for the expected impacts of our actions, the results are sobering.

Cumulative Emissions and Climate Models

As my summer research continues, I’m learning a lot about previous experiments that used the UVic ESCM (Earth System Climate Model), as well as beginning to run my own. Over the past few years, the UVic model has played an integral role in a fascinating little niche of climate research: the importance of cumulative carbon emissions.

So far, global warming mitigation policies have focused on choosing an emissions pathway: making a graph of desired CO2 emissions vs. time, where emissions slowly reduce to safer levels. However, it turns out that the exact pathway we take doesn’t actually matter. All that matters is the area under the curve: the total amount of CO2 we emit, or “cumulative emissions” (Zickfeld et al, 2009). So if society decides to limit global warming to 2°C (a common target), there is a certain amount of total CO2 that the entire world is allowed to emit. We can use it all up in the first ten years and then emit nothing, or we can spread it out – either way, it will lead to the same amount of warming.
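As a sketch of this idea (with a made-up proportionality constant), two very different pathways with the same area under the curve imply the same warming:

```python
def implied_warming(annual_emissions, slope=0.00175):
    """Warming (degrees C) from a list of annual emissions (Gt C), using
    the cumulative-emissions proportionality. The slope of 1.75 degrees C
    per 1000 Gt C is illustrative."""
    return slope * sum(annual_emissions)

# Burn 500 Gt C in a decade, or spread it over a century:
fast = [50] * 10 + [0] * 90
slow = [5] * 100
print(implied_warming(fast))  # identical results either way
print(implied_warming(slow))
```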

If you delve a little deeper into the science, it turns out that temperature change is directly proportional to cumulative emissions (Matthews et al, 2009). In other words, if you draw a graph of the total amount of warming vs. total CO2 emitted, it will be a straight line.

This is counter-intuitive, because the intermediate processes are definitely not straight lines. Firstly, the graph of warming vs. CO2 concentrations is logarithmic: as carbon dioxide builds up in the atmosphere, each extra molecule added has less and less effect on the climate.

Secondly, as carbon dioxide builds up and the climate warms, carbon sinks (which suck up some of our emissions) become less effective. For example, warmer ocean water can’t hold as much CO2, and trees subjected to heat stress often die and stop photosynthesizing. Processes that absorb CO2 become less effective, so more of our emissions actually stay in the air. Consequently, the graph of CO2 concentrations vs. CO2 emissions is exponential.

These two relationships, warming vs. concentrations and concentrations vs. emissions, more or less cancel each other out, making total warming vs. total emissions linear. It doesn’t matter how much CO2 was in the air to begin with, or how fast the allowable emissions get used up. Once society decides how much warming is acceptable, all we need to do is nail down the proportionality constant (the slope of the straight line) in order to find out how much carbon we have to work with. Then, that number can be handed to economists, who will figure out the best way to spread out those emissions while causing minimal financial disruption.
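The cancellation can be made explicit in a toy model (all constants invented for illustration): take warming as logarithmic in concentration and concentration as exponential in cumulative emissions, and their composition is exactly linear.

```python
import math

C0 = 280.0        # preindustrial CO2 concentration (ppm)
K = 3.0           # warming per e-folding of concentration (degrees C)
E_SCALE = 2000.0  # emissions per e-folding of concentration (Gt C)

def concentration(cum_emissions):
    """Toy carbon cycle: concentration grows exponentially with emissions."""
    return C0 * math.exp(cum_emissions / E_SCALE)

def warming(conc):
    """Toy radiation: warming is logarithmic in concentration."""
    return K * math.log(conc / C0)

# Composing the two gives warming = K * E / E_SCALE: a straight line in E.
for e in (0, 500, 1000, 2000):
    print(e, round(warming(concentration(e)), 3))
```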

Finding that slope is a little tricky, though. Best estimates, using models as well as observations, generally fall between 1.5°C and 2°C for every trillion tonnes of carbon emitted (Matthews et al, 2009; Allen et al, 2009; Zickfeld et al, 2009). Keep in mind that we’ve already emitted about 0.6 trillion tonnes of carbon (University of Oxford). Following a theme commonly seen in climate research, the uncertainty is larger on the high end of these slope estimates than on the low end. So if the real slope is actually lower than our best estimate, it’s probably only a little bit lower; if it’s actually higher than our best estimate, it could be much higher, and the problem could be much worse than we thought.
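Given a slope and the emissions to date, the remaining carbon budget is simple arithmetic. A quick sketch using the numbers quoted above:

```python
def remaining_budget(target_warming, slope, already_emitted=0.6):
    """Remaining allowable emissions (trillion tonnes C) for a warming
    target, given a slope in degrees C per trillion tonnes of carbon.
    The 0.6 Tt C already emitted is the figure quoted in the text."""
    return target_warming / slope - already_emitted

# Budget for 2 degrees C across the range of slope estimates:
for slope in (1.5, 1.75, 2.0):
    print(f"slope {slope}: {remaining_budget(2.0, slope):.2f} Tt C left")
```

Note how the high end of the slope range cuts the remaining budget nearly in half compared to the low end, which is why pinning down that constant matters so much.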

Also, this approach ignores other human-caused influences on global temperature, most prominently sulfate aerosols (which cause cooling) and greenhouse gases other than carbon dioxide (which cause warming). Right now, these two influences basically cancel, which is convenient for scientists because it means we can ignore both of them. Typically, we assume that they will continue to cancel far into the future, which might not be the case – there’s a good chance that developing countries like China and India will reduce their emissions of sulfate aerosols, allowing the non-CO2 greenhouse gases to dominate and cause warming. If this happened, we couldn’t even lump the extra greenhouse gases into the allowable CO2 emissions, because the warming they cause does depend on the exact pathway. For example, methane has such a short atmospheric lifetime that “cumulative methane emissions” is a useless measurement, and certainly isn’t directly proportional to temperature change.

This summer, one of my main projects at UVic is to compare the slope of temperature change vs. cumulative CO2 emissions across different models. As part of the international EMIC intercomparison project that the lab is coordinating, different modelling groups have sent us their measurements of allowable cumulative emissions for 1.5°C, 2°C, 3°C, and 4°C global warming. Right now (quite literally, as I write this) I’m running the same experiments on the UVic model. It’s very exciting to watch the results trickle in. Perhaps my excitement towards the most menial part of climate modelling, watching as the simulation chugs along, is a sign that I’m on the right career path.

Stalin believed in gravity. Do you?

Here’s a classy way to slam people you disagree with: compare them to terrorists, dictators, and mass murderers.

Such was the focus of a recent billboard campaign by the Chicago-based Heartland Institute, a PR group that denies the existence of human-caused climate change. The only billboard that was actually displayed featured Ted Kaczynski (the Unabomber) and read, “I still believe in global warming. Do you?”

The message is clear: if a monster believes something, citizens of good moral standing should believe exactly the opposite. The Internet was quick to ridicule this philosophy, with parodies such as the following:

Similar billboards featuring Charles Manson and Fidel Castro were planned, but never publicly displayed. Heartland also considered putting Osama bin Laden on a future billboard. On their website, they attempted to justify this campaign:

The people who still believe in man-made global warming are mostly on the radical fringe of society. This is why the most prominent advocates of global warming aren’t scientists. They are murderers, tyrants, and madmen.

Given that a majority of Americans accept global warming, people did not take kindly to this campaign. Public outcry and negative media coverage led Heartland to cancel the project after 24 hours. However, their statement showed little remorse:

We do not apologize for running the ad, and we will continue to experiment with ways to communicate the ‘realist’ message on the climate.

Even though the campaign has been cancelled, the Heartland Institute continues to suffer financial repercussions. Dozens of corporate donors, including State Farm Insurance and drinks firm Diageo (which owns Guinness and Smirnoff), have ended their support as a direct result of this campaign. Earlier in the year, Heartland lost financial backing from General Motors after internal documents exposed some of the group’s projects, particularly the development of an alternative curriculum to teach K-12 students that global warming is fake.

Will they recover from this failed campaign? Given Heartland’s reliance on donations, their prospects look poor. It seems that the Heartland Institute, previously one of the most influential mouthpieces for climate change denial, is going out with a bang.

Summer Research

I recently started working for the summer, with Andrew Weaver’s research group at the University of Victoria. If you’re studying climate modelling in Canada, this is the place to be. They are a fairly small group, but continually churn out world-class research.

Many of the projects here use the group’s climate model, the UVic ESCM (Earth System Climate Model). I am working with the ESCM this summer, and have previously read most of the code, so I feel pretty well acquainted with it.

The climate models that most people are familiar with are the really complex ones. GCMs (General Circulation Models or Global Climate Models, depending on who you talk to) use high resolution, a large number of physical processes, and relatively few parameterizations to emulate the climate system as realistically as possible. These are the models that take weeks to run on the world’s most powerful supercomputers.

EMICs (Earth System Models of Intermediate Complexity) are a step down in complexity. They run at a lower resolution than GCMs and have more parameterizations. Individual storms and wind patterns (and sometimes ocean currents as well) typically are not resolved – instead, the model predicts the statistics of these phenomena. Often, at least one component (such as sea ice) is two-dimensional.

The UVic ESCM is one of the most complex EMICs – it really sits somewhere between a typical EMIC and a GCM. It has a moderately high resolution, with a grid of 3.6° longitude by 1.8° latitude (ten thousand squares in all), and 19 vertical layers in the ocean. Its ocean, land, and sea ice components would all belong in a GCM. It even has a sediment component, which simulates processes that most GCMs ignore.
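The “ten thousand squares” figure follows directly from the grid spacing:

```python
# Horizontal grid size implied by the resolution quoted above.
n_lon = 360 / 3.6   # 100 grid columns around each circle of latitude
n_lat = 180 / 1.8   # 100 grid rows from pole to pole
print(round(n_lon * n_lat))  # 10000 cells
```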

The only reason that the UVic model is considered an EMIC is because of its atmosphere component. This part of the model is two-dimensional and parameterizes most processes. For example, clouds aren’t explicitly simulated – instead, as soon as the relative humidity of a region reaches 85%, the atmospheric moisture falls out as rain (or snow). You would never see this kind of atmosphere in a GCM, and it might seem strange for scientists to deliberately build an unrealistic model. However, this simplified atmosphere gives the UVic ESCM a huge advantage over GCMs: speed.

For example, today I tested out the model with an example simulation. It ran on a Linux cluster with 32 cores, which I accessed remotely from a regular desktop. It took about 7 minutes of real time to simulate each year and record annual averages for several dozen variables. In comparison, many GCMs take an entire day of real time to simulate a year, while running on a machine with thousands of cores. Most of this work is coming from the atmospheric component, which requires short time steps. Consequently, cutting down on complexity in the atmosphere gives the best return on model efficiency.
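Those run times work out to roughly a 200-fold speedup, which is what makes very long simulations practical:

```python
# Back-of-envelope comparison of the run times quoted above.
escm_min_per_year = 7        # UVic ESCM on a 32-core cluster
gcm_min_per_year = 24 * 60   # a GCM taking a full day per model year

print(f"speedup: ~{gcm_min_per_year / escm_min_per_year:.0f}x")

# Cost of a 1000-year simulation on each:
years = 1000
print(f"ESCM: {escm_min_per_year * years / (60 * 24):.1f} days of real time")
print(f"GCM:  {gcm_min_per_year * years / (60 * 24 * 365):.1f} years of real time")
```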

Because the UVic model is so fast, it’s suitable for very long runs. Simulating a century is an “overnight job”, and several millennia is no big deal (especially if you run it on WestGrid). As a result, long-term processes have come to dominate the research in this lab: carbon cycle feedbacks, sensitivity studies, circulation in the North Atlantic. It simply isn’t feasible to simulate these millennial-scale processes on a GCM – so, by sacrificing complexity, we’re able to open up brand new areas of research. Perfectly emulating the real world isn’t actually the goal of most climate modelling.

Of course, the UVic ESCM is imperfect. Like all models, it has its quirks – an absolute surface temperature that’s a bit too low, projections of ocean heat uptake that are a bit too high. It doesn’t give reliable projections of regional climate, so you can only really use globally or hemispherically averaged quantities. It’s not very good at decadal-scale projection. However, other models are suitable for these short-term and small-scale simulations: the same GCMs that suffer when it comes to speed. In this way, climate models perform “division of labour”. By developing many different models of varying complexity, we can make better use of the limited computer power available to us.

I have several projects lined up for the summer, and right now I’m reading a lot of papers to familiarize myself with the relevant sub-fields. There have been some really cool discoveries in the past few years that I wasn’t aware of. I have lots of ideas for posts to write about these papers, as well as the projects I’m involved in, so check back often!

The Day After Tomorrow: A Scientific Critique

The 2004 film The Day After Tomorrow, in which global warming leads to a new ice age, has been vigorously criticized by climate scientists. Why is this? What mistakes in the film led Dr. Andrew Weaver, Canada’s top climate modeller, to claim that “the science-fiction movie The Day After Tomorrow creatively violates every known law of thermodynamics”? What prompted Dr. Gavin Schmidt, NASA climatologist, to say that “The Day After Tomorrow was so appallingly bad, it was that that prompted me to become a more public scientist”? What could an innocent blockbuster movie have done to deserve such harsh criticisms?

A New Ice Age?

The Day After Tomorrow opens with a new scientific discovery by paleoclimatologist Jack Hall, played by Dennis Quaid. After a particularly harrowing trip to gather Antarctic ice cores, he discovers evidence of a previously unknown climate shift that occurred ten thousand years ago. Since the film is set in the early 2000s, and ice cores yielding hundreds of thousands of years of climate data have been studied extensively since the 1960s, it seems implausible that such a recent and dramatic global climatic event would have gone previously unnoticed by scientists. However, this misstep is excusable, because a brand new discovery is a vital element of many science fiction films.

Jack goes on to describe this ancient climate shift. As the world was coming out of the last glacial period, he explains, melting ice sheets added so much freshwater to the Atlantic Ocean that certain ocean circulation patterns shut down. Since thermohaline circulation is a major source of heat for the surfaces of continents, the globe was plunged back into an ice age. Jack’s portrayal of the event is surprisingly accurate: a sudden change in climate did occur around ten thousand years ago, and was most likely caused by the mechanisms he describes. To scientists, it is known as the Younger Dryas.

The world’s ascent out of the last ice age was not smooth and gradual; rather, it was punctuated by jumps in temperature coupled with abrupt returns to glacial conditions. The Younger Dryas – named after a species of flower whose pollen was preserved in ice cores during the event – was the last period of sudden cooling before the interglacial fully took over. Ice core data worldwide indicates a relatively rapid drop in global temperatures around eleven thousand years ago. The glacial conditions lasted for approximately a millennium until deglaciation resumed.

The leading hypothesis for the cause of the Younger Dryas involves a sudden influx of freshwater from the melting Laurentide Ice Sheet in North America into the Atlantic Ocean. This disruption to North Atlantic circulation likely caused North Atlantic deep water formation, a process which supplies vast amounts of heat to northern Europe, to shut down. Substantial regional cooling allowed the glaciers of Europe to expand. The ice reflected sunlight, which triggered further cooling through the ice-albedo feedback. However, the orbital changes which control glacial cycles eventually overpowered this feedback. Warming resumed, and the current interglacial period began.

While Jack Hall’s discussion of the Younger Dryas is broadly accurate, his projections for the future are far-fetched. He asserts that, since the most recent example of large-scale warming triggered glacial conditions, the global warming event currently underway will also cause an ice age. At a United Nations conference, he claims that this outcome is virtually certain and “only a matter of time”. Because it happened in the past, he reasons, it will definitely happen now. Jack seems to forget that every climate event is unique: while looking to the past can be useful to understand today’s climate system, it does not provide a perfect analogue upon which we can base predictions. Differences in continental arrangement, initial energy balance, and global ice cover, to name a few factors, guarantee that no two climate changes will develop identically.

Additionally, Jack’s statements regarding the plausibility of an imminent thermohaline shutdown due to global warming fly in the face of current scientific understanding. As the world continues to warm, and the Greenland ice sheet continues to melt, the North Atlantic circulation will probably slow down due to the added freshwater. The resulting cooling influence on parts of Europe will probably still be overwhelmed by warming due to greenhouse gases. However, a complete shutdown of North Atlantic deep water formation is extremely unlikely within this century. It’s unclear whether an eventual shutdown is even possible, largely because there is less land ice available to melt than there was during the Younger Dryas. If such an event did occur, it would take centuries and still would not cause an ice age – instead, it would simply cancel out some of the greenhouse warming that had already occurred. Cooling influences simply decrease the global energy balance by a certain amount from its initial value; they do not shift the climate into a predetermined state regardless of where it started.

Nevertheless, The Day After Tomorrow goes on to depict a complete shutdown of Atlantic thermohaline circulation in a matter of days, followed by a sudden descent into a global ice age that is spurred by physically impossible meteorological phenomena.

The Storm

Many questions about the Ice Ages remain, but the scientific community is fairly confident that the regular cycles of glacial and interglacial periods that occurred throughout the past three million years were initiated by changes in the Earth’s orbit and amplified by carbon cycle feedbacks. Although these orbital changes have been present since the Earth’s formation, they can only lead to an ice age if sufficient land mass is present at high latitudes, as has been the case in recent times. When a glacial period begins, changes in the spatial and temporal distribution of sunlight favour the growth of glaciers in the Northern Hemisphere. These glaciers reflect sunlight, which alters the energy balance of the planet. The resulting cooling decreases atmospheric concentrations of greenhouse gases, through mechanisms such as absorption by cold ocean waters and expansion of permafrost, which causes more cooling. When this complex web of feedbacks stabilizes, over tens of thousands of years, the average global temperature is several degrees lower and glaciers cover much of the Northern Hemisphere land mass.

The ice age in The Day After Tomorrow has a more outlandish origin. Following the thermohaline shutdown, a network of massive hurricane-shaped snowstorms, covering entire continents, deposits enough snow to reflect sunlight and create an ice age in a matter of days. As if that weren’t enough, the air at the eye of each storm is cold enough to freeze people instantly, placing the characters in mortal danger. Jack’s friend Terry Rapson, a climatologist from the UK, explains that cold air from the top of the troposphere is descending so quickly in the eye of each storm that it does not warm up as expected. He estimates that the air must be -150°F (approximately -100°C) or colder, since it is instantly freezing the fuel lines in helicopters.

There are two main problems with this description of the storm. Firstly, the tropopause (the highest and coldest part of the troposphere) averages -60°C, and nowhere does it reach -100°C. Secondly, the eye of a hurricane – and presumably of the hurricane-shaped snowstorms – has the lowest pressure of anywhere in the storm. This fundamental characteristic indicates that air should be rising in the eye of each snowstorm, not sinking down from the tropopause.

Later in the film, NASA scientist Janet Tokada is monitoring the storms using satellite data. She notes that temperature is decreasing within the storm “at a rate of 10 degrees per second”. Whether the measurement is in Fahrenheit or Celsius, this rate of change is implausible. In under a minute (which is likely less time than the satellite reading takes) the air would reach absolute zero, a hypothetical temperature at which all motion stops.
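A quick check of that claim, using the film’s own numbers:

```python
# Even granting the film's -150 F (about -100 C) starting temperature,
# cooling at 10 degrees per second reaches absolute zero within seconds.
start_c = -100.0
absolute_zero_c = -273.15
rate_c_per_s = 10.0

seconds_to_absolute_zero = (start_c - absolute_zero_c) / rate_c_per_s
print(seconds_to_absolute_zero)  # about 17 seconds
```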

In conclusion, there are many problems with the storm system as presented in the film, only a few of which have been summarized here. One can rest assured that such a frightening meteorological phenomenon could not happen in the real world.

Sea Level Rise

Before the snowstorms begin, extreme weather events – from hurricanes to tornadoes to giant hailstones – ravage the globe. Thrown in with these disasters is rapid sea level rise. While global warming will raise sea levels, the changes are expected to be extremely gradual. Most recent estimates project a rise of 1-2 metres by 2100 and tens of metres in the centuries following. In contrast, The Day After Tomorrow shows the ocean rising by “25 feet in a matter of seconds” along the Atlantic coast of North America. This event is not due to a tsunami, nor the storm surge of a hurricane; it is assumed to be the result of the Greenland ice sheet melting.

As the film continues and an ice age begins, the sea level should fall. The reasons for this change are twofold: first, a drop in global temperatures causes ocean water to contract; second, glacier growth over the Northern Hemisphere locks up a great deal of ice that would otherwise be present as liquid water in the ocean. However, when astronauts are viewing the Earth from space near the end of the film, the coastlines of each continent are the same as today. They have not been altered by either the 25-foot rise due to warming or the even larger fall that cooling necessitates. Since no extra water was added to the Earth from space, maintaining sea level in this manner is physically impossible.

Climate Modelling

Since the Second World War, ever-increasing computer power has allowed climate scientists to develop mathematical models of the climate system. Since there aren’t multiple Earths on which to perform controlled climatic experiments, the scientific community has settled for virtual planets instead. When calibrated, tested, and used with caution, these global climate models can produce valuable projections of climate change over the next few centuries. Throughout The Day After Tomorrow, Jack and his colleagues rely on such models to predict how the storm system will develop. However, the film’s representation of climate modelling is inaccurate in many respects.

Firstly, Jack is attempting to predict the development of the storm over the next few months, which is impossible to model accurately using today’s technology. Weather models, which project initial atmospheric conditions into the future, are only reliable for a week or two: after this time, the chaotic nature of weather causes small rounding errors to completely change the outcome of the prediction. On the other hand, climate models are concerned with average values and boundary conditions over decades, which are not affected by the principles of chaos theory. Put another way, weather modelling is like predicting the outcome of a single dice roll based on how the dice was thrown; climate modelling is like predicting the net outcome of one hundred dice rolls based on how the dice is weighted. Jack’s inquiry, though, falls right between the two: he is predicting the exact behaviour of a weather system over a relatively long time scale. Until computers become vastly more precise and powerful, this exercise is completely unreliable.

Furthermore, the characters make seemingly arbitrary distinctions between “forecast models”, “paleoclimate models”, and “grid models”. In the real world, climate models are categorized by complexity, not by purpose. For example, GCMs (General Circulation Models) represent the most processes and typically have the highest resolutions, while EMICs (Earth System Models of Intermediate Complexity) include more approximations and run at lower resolutions. All types of climate models can be used for projections (a preferred term to “forecasts” because the outcomes of global warming are dependent on emissions scenarios), but are only given credence if they can accurately simulate paleoclimatic events such as glacial cycles. All models include a “grid”, which refers to the network of three-dimensional cells used to split the virtual Earth’s surface, atmosphere, and ocean into discrete blocks.

Nevertheless, Jack gets to work converting his “paleoclimate model” to a “forecast model” so he can predict the path of the storm. It is likely that this conversion involves building a new high-resolution grid and adding dozens of new climatic processes to the model, a task which would take months to years of work by a large team of scientists. However, Jack appears to have superhuman programming abilities: he writes all the code by himself in 24 hours!

When he has finished, he decides to get some rest until the simulation has finished running. In the real world, this would take at least a week, but Jack’s colleagues wake him up after just a few hours. Evidently, their lab has access to computing resources more powerful than anything known to science today. Then, Jack’s colleagues hand him “the results” on a single sheet of paper. Real climate model output comes in the form of terabytes of data tables, which can be converted to digital maps, animations, and time plots using special software. Jack’s model appeared to simply spit out a few numbers, and what these numbers may have referred to is beyond comprehension.

If The Day After Tomorrow were set several hundred years in the future, the modelling skill of climate scientists and the computer power available to them might be plausible. Indeed, it would be very exciting to be able to build, run, and analyse models as quickly and with as much accuracy as Jack and his colleagues can. Unfortunately, in the present day, the field of climate modelling works quite differently.

Conclusions

The list of serious scientific errors in The Day After Tomorrow is unacceptably long. The film depicts a sudden shutdown of thermohaline circulation due to global warming, an event that climate scientists say is extremely unlikely, and greatly exaggerates both the severity and the rate of the resulting cooling. When a new ice age begins in a matter of days, it isn’t caused by the well-known mechanisms that triggered glacial periods in the past – rather, massive storms with physically impossible characteristics radically alter atmospheric conditions. The melting Greenland ice sheet causes the oceans to rise at an inconceivable rate, but when the ice age begins, sea level does not fall as the laws of physics dictate it should. Finally, the film depicts the endeavour of science, particularly the field of climate modelling, in a curious and inaccurate manner.

It would not have been very difficult or expensive for the film’s writing team to hire a climatologist as a science advisor – in fact, given that the plot revolves around global warming, it seems strange that they did not do so. One can only hope that future blockbuster movies about climate change will be more rigorous with regards to scientific accuracy.

Climate Change and Heat Waves

One of the most dangerous effects of climate change is its impact on extreme events. The extra energy present in a warmer world doesn’t distribute itself uniformly – it can come out in large bursts, manifesting as heat waves, floods, droughts, hurricanes, and tornadoes, to name a few. Consequently, warming the world by an average of 2 degrees is a lot more complicated than adding 2 to every weather station reading around the world.

Scientists have a difficult time studying the impacts of climate change on extreme events, because all these events could happen anyway – how can you tell if Hurricane Something is a direct result of warming, or just a fluke? Indeed, for events involving precipitation, like hurricanes or droughts, it’s not possible to answer this question. However, research is advancing to the point where we can begin to attribute individual heat waves to climate change with fairly high levels of confidence. For example, the recent extended heat wave in Texas, which was particularly devastating for farmers, probably wouldn’t have happened if it weren’t for global warming.

Extreme heat is arguably the easiest event for scientists to model. Temperature is one-dimensional and more or less follows a normal distribution for a given region. As climate change continues, temperatures increase (shifting the bell curve to the right) and become more variable (flattening the bell curve). The end result, as shown in part (c) of the figure below, is a significant increase in extremely hot weather:

Now, imagine that you get a bunch of weather station data from all across the world for 1951-1980, back before the climate had really started to warm. For every single record, find the temperature anomaly (the difference from the average value in that place on that day of the year). Plot the results, and you will get a normal distribution centred at 0. Values in the middle of the bell curve – i.e., temperatures close to the average – are the most likely, and temperatures on the far tails of the bell curve – i.e., much warmer or much colder than average – are far less likely.

As any statistics student knows, 99.7% of the values in a normal distribution fall within three standard deviations of the mean (a standard deviation is just a distance, whose size depends on how flat the bell curve is). So if we still had the same climate we did between 1951 and 1980, temperatures more than three standard deviations above the mean would cover only 0.15% of the Earth’s surface at any given time.

However, in the past few years, temperatures three standard deviations above average have covered more like 10% of the Earth’s surface. Even some individual heat waves – like the ones in Texas and Russia over the past few years – have covered so much of the Earth’s surface on their own that they blow the 0.15% statistic right out of the water. Under the “old” climate, they almost certainly wouldn’t have happened. You can only explain them by shifting the bell curve to the right and flattening it. For this reason, we can say that these heat waves were caused by global warming.
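The arithmetic behind these percentages can be sketched with the normal distribution’s tail function. Note that the shift and widening used for the “new” climate below are illustrative values of my own, not parameters fitted to the actual data:

```python
import math

def tail_fraction(mean, sd, threshold=3.0):
    """Fraction of a normal distribution lying above `threshold`,
    measured in units of the 1951-1980 standard deviation."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# Old climate: anomalies follow N(0, 1), so the area beyond +3 sigma
# is about 0.0013 -- essentially the 0.15% figure from the 99.7% rule.
old = tail_fraction(0.0, 1.0)

# Hypothetical new climate: bell curve shifted right by 0.6 sigma and
# flattened to sd = 1.2. The +3 sigma tail grows by more than tenfold.
new = tail_fraction(0.6, 1.2)
```

With these illustrative parameters, the extreme-heat tail grows from about 0.13% to about 2% of the distribution – the same qualitative story as the observed jump from 0.15% to 10% of the Earth’s surface, just with milder assumed shifts.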

Here’s a graph of the bell curves we’re talking about, in this case for the months of June, July, and August. The red, yellow and green lines are the old climate; the blue and purple lines are the new climate. Look at the area under the curve to the right of x = 3: it’s almost nothing beneath the old climate, but quite significant beneath the new climate.

It’s very exciting that, using basic statistical methods, we can now attribute specific heat waves to climate change. On the other hand, it’s also very depressing, because it goes to show that such events will become far more likely as the climate continues to change and the bell curve shifts inexorably to the right.


March Migration Data

In my life outside of climate science, I am an avid fan of birdwatching, and am always eager to connect the two. Today I’m going to share some citizen science data I collected.

Last year, I started taking notes during the spring migration. Every time I saw a species for the first time that year, I made a note of the date. I planned to repeat this process year after year, mainly so I would know when to expect new arrivals at our bird feeders, but also in an attempt to track changes in migration. Of course, this process is imperfect (it simply provides an upper bound for when the species arrives, because it’s unlikely that I witness the very first arrival in the city) but it’s better than nothing.

Like much of the Prairies and the American Midwest, we’ve just had our warmest March on record, a whopping 8°C above normal. Additionally, every single bird arrival I recorded in March was earlier than last year, sometimes by over 30 days.

I don’t think this is a coincidence. I haven’t been any more observant than last year – I’ve spent roughly the same amount of time outside in roughly the same places. It also seems unlikely that such a systematic change is a product of chance, although I would need much more data to say for sure. Also, some birds migrate based on hours of daylight rather than temperature. However, I find it very interesting that, so far, not a single species has been late.

Because I feel compelled to graph everything, I typed all this data into Excel and made a little scatterplot. The mean arrival date was 20.6 days earlier than last year, with a standard deviation of 8.9 days.
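As a quick sanity check on “product of chance”: under a null hypothesis where each species is equally likely to arrive earlier or later than last year, the probability that every single one is early shrinks geometrically with the number of species. The species count below is an assumption for illustration, since I haven’t listed my full records here:

```python
# Hypothetical count: suppose 12 species were recorded in March
# (an assumption -- my actual list isn't reproduced here).
n_species = 12

# If early vs. late were an independent coin flip for each species,
# the chance that all of them come up "early" is:
p_all_early = 0.5 ** n_species   # 1 in 4096, about 0.02%
```

Even with a modest species count, an all-early record is very unlikely under the null – consistent with my sense that this isn’t a coincidence, though a proper test would need several years of data and some care about correlated weather.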

My Earth Hour Story

Tonight is Earth Hour, when people across the world turn off all their lights and electronic devices (except the necessary ones – I don’t think you’re required to unplug the freezer) from 8:30 to 9:30 local time. This is meant to generate awareness about climate change and conservation. It’s really more of a symbolic action, to my understanding – I doubt it adds up to a significant dip in carbon emissions – but I take part anyway. I find that a lot of interesting conversations begin when there’s nothing to do but sit in the dark.

It was during the second official Earth Hour, when I was sixteen years old, that I agreed to babysit for friends of the family. Great, I thought, how am I going to get a five-year-old boy and a two-year-old girl to sit in the dark for an hour? I ended up turning it into a camping game, which was really fun. We made a tent out of chairs and blankets, ate popcorn, and played with a flashlight powered by a hand crank.

The girl was too young to understand the purpose of sitting in the dark – she just liked waving the flashlight around – but I talked to the boy a bit about why we were doing this. I told him how we needed to take care of nature, because it can be damaged if we don’t treat it well, and that can come back to bite us. I explained the purpose of recycling: “You can make paper out of trees, but you can also make paper out of old paper, and that way you don’t have to cut down any trees.” His face just lit up, and he said, “Oh! I get it now! Well, we should do more of that!” which was really great to hear.

Halfway through the hour, the kids went to bed, and I sat in the dark on my own until 9:30, when I turned the lights on and started to do homework. And that was the end of it…or so I thought.

Apparently, at some point during that hour, a neighbour had noticed that the house was in darkness and flashlights were waving around. He thought there was something wrong with that situation, and came over to knock on the door, but we were in the basement in our tent and didn’t hear him. So then he called the police.

It was 11 pm by the time they showed up. Suddenly someone was pounding on the door, and I, convinced that someone was trying to break in, was terrified. I froze in my seat, and contemplated hiding under the desk, but whoever was at the door refused to go away. Eventually I crept over to a side window and looked outside, where I saw a police car.

My first thought when I opened the door to two police officers was, “Who got in a car accident? My family, or the kids’ parents?” The concept of police coming to investigate a house that had its lights off was completely foreign to me.

“It’s Earth Hour,” I said when they told me why they were there. They replied, “Yeah, we know, but we have to answer all our calls.” They took my name and my birth date, so this incident must be mentioned somewhere in the city police records. I imagine there is a note next to my name saying, “Attempted to indoctrinate children with environmentalism.”

Luckily the kids didn’t wake up, but they heard about the incident later from their parents. I still babysit these kids, albeit less frequently now that I’m in university, and the boy often asks, “Can we turn off all the lights again? I want the police to come. That would be fun.”

An Open Letter to the Future

To the citizens of the world in the year 5000:

It’s 2012, and nobody is thinking about you.

These days, Long Term Thinking means planning for 2050, and even that is unusual. Thoughts of Future Generations don’t go beyond grandchildren. If my government knew I was thinking about people three thousand years in the future, they would probably call me a “radical”.

However, three thousand years isn’t such a long time. The ancient Greeks flourished about three thousand years ago, and we think about them all the time. Not just historians, but people in all walks of life – scientists, policymakers, teachers, and lawyers all acknowledge the contributions of this ancient civilization to today’s culture. Our society is, in many ways, modelled after the Greeks.

I was walking outside today, at the tail end of the warmest winter anyone can remember in central Canada, and thought to myself: What if the ancient Greeks had caused global climate change back in their day? What if they had not only caused it, but understood what was happening, and had actively chosen to ignore it? The effects would still be apparent today. Global temperature might have stabilized, but the biosphere would still be struggling to adapt, and the seas would still be gradually rising. What would we think of the ancient Greeks if they had bestowed this legacy upon us? Would we still look upon their civilization so favourably?

The Golden Rule is usually applied to individuals living in the same time and place, but I think we should extend it across continents and through millennia so it applies to all of human civilization. Before we make a major societal decision, like where to get our energy, we should ask ourselves: If the ancient Greeks had gone down this path, would we care?

The future is a very long time. Thinking about the future is like contemplating the size of the universe: it’s disturbing, and too abstract to fully comprehend. Time and space are analogues in this manner. 2050 is like Mars, and the year 5000 is more like Andromeda.

I can handle Andromeda. And I can handle the concept of 5000 A.D., so I think about it when I’m outside walking. My first thoughts are those of scientific curiosity. Tell me, people in 5000 – how bad did the climate get? What happened to the amphibians and the boreal forest? Did the methane hydrates give way, and if so, at what point? How much did the oceans rise?

Soon scientific curiosity gives way to societal questions. Were we smart enough to leave some coal in the ground, or did we burn it all? Did we open our doors to environmental refugees, or did we shut the borders tight and guard the food supply? How long did it take for Western civilization to collapse? What did you do then? What is life like now?

And then the inevitable guilt sets in, as I imagine what you must think of us, of this horrible thoughtless period of history that I am a part of. But with the guilt comes a desperate plea for you to understand that not everyone ignored the problem. A few of us dedicated our lives to combating denial and apathy, in a sort of Climate Change Resistance. I was one of them; I am one of them. With the guilt comes a burning desire to say that I tried.

Tar Sands vs. Coal

The term “fossil fuels” is a very large umbrella. Coal, oil, and natural gas are the usual distinctions, but there’s also unconventional oil (such as the Alberta tar sands) and unconventional gas (such as shale gas from fracking). “Unconventional” means that the fuel is produced in a roundabout way that’s less efficient and takes more energy than regular fuel. For example, oil in northern Alberta is mixed with sand and tar that’s difficult to remove. As global supplies of conventional oil and gas decline, unconventional fuels are making up a growing segment of the petroleum market.

The different types of fossil fuels are present in different amounts in the ground. Also, for each unit of energy we get from burning them, they will release different amounts of carbon emissions. Given these variables, here’s an interesting question: how much global warming would each type of fuel cause if we burned every last bit of it?

A few weeks ago, a new study addressed this question in one of the world’s top scientific journals. Neil Swart, a Ph.D. student at the University of Victoria, and his supervisor Andrew Weaver, one of Canada’s top climate scientists, used existing data to quantify the warming potential of each kind of fossil fuel. Observations show the relationship between cumulative carbon emissions and temperature change to be approximately linear, so they didn’t need to use a climate model – a back-of-the-envelope calculation was sufficient. Also, since both of the authors are Canadian, they were particularly interested in how burning the Alberta tar sands would contribute to global warming.

Swart and Weaver calculated that, if we burned every last drop of the tar sands, the planet would warm by about 0.36°C. This is about half of the warming that’s been observed so far. If we only burned the parts of the tar sands proven to be economically viable, that number drops to 0.03°C. If we don’t expand drilling any further, and stick to the wells that already exist, the world would only warm by 0.01°C, which is virtually undetectable.

Conventional oil and natural gas would each cause similarly small amounts of warming, if the respective global supplies were burned completely. Unconventional natural gas would cause several times more warming – even though it’s cleaner-burning than coal and oil, there’s a lot of it in the ground.

The real game-changer, though, is coal. If we burned all the coal in the ground, the world would warm by a staggering 15°C. There’s a large uncertainty range around this number, though, because the linear relationship between carbon emissions and temperature change breaks down under super-high emission levels. The warming could be anywhere from 8°C to 25°C. In the context of previous climate changes, it’s hard to overemphasize just how dramatic a double-digit rise in average temperatures would be.

The main reason the warming potential of coal is so high is simply that there’s so much of it. The Alberta tar sands are a huge resource base, but they’re tiny in comparison to global coal deposits. Also, coal is more polluting than any kind of oil: if you powered a lightbulb for one hour using coal, you would produce about 30% more CO2 emissions than if you ran it using conventional oil.

The tar sands are more polluting than regular oil, but exactly how much more is a very difficult question to answer. The end product that goes into your car at the gas station is essentially the same, but the refining process takes more energy. You can supply the extra energy in many different ways, though: if you use coal, tar sands become much more polluting than regular oil; if you use renewable energy that doesn’t emit carbon, tar sands are about the same. The authors didn’t include these extra emissions in their study, but they did discuss them in a supplementary document, which estimated that, in an average case, tar sands cause 17% more emissions than regular oil. Taking this into account, the tar sands would cause 0.42°C of warming if they were burned completely, rather than 0.36°C.

Therefore, headlines like “Canada’s oil sands: Not so dirty after all” are misleading. Canada’s oil sands are still very dirty. There just isn’t very much of them. If we decide to go ahead and burn all the tar sands because they only cause a little bit of warming, the same argument could be used for every individual coal plant across the world. Small numbers add up quickly.

The authors still don’t support expansion of the tar sands, or construction of pipelines like the Keystone XL. “While coal is the greatest threat to the climate globally,” Andrew Weaver writes, “the tarsands remain the largest source of greenhouse gas emission growth in Canada and are the single largest reason Canada is failing to meet its international climate commitments and failing to be a climate leader.” Nationally, tar sands are a major climate issue, because they enable our addiction to fossil fuels and create infrastructure that locks us into a future of dirty energy. Also, a myriad of other environmental and social problems are associated with the tar sands – health impacts on nearby First Nations communities, threats to iconic species such as the woodland caribou, and toxic chemicals being released into the air and water.

Tar sands are slightly preferable to coal, but clean energy is hugely preferable to both. In order to keep the climate crisis under control, we need to transition to a clean energy economy as soon as possible. From this viewpoint, further development of the tar sands is a step in the wrong direction.