A Simple Stochastic Climate Model: Climate Sensitivity

I haven’t forgotten about this project! Read the introduction and ODE derivation if you haven’t already.

Last time I derived the following ODE for temperature T at time t:

dT/dt = (S F(t) − T(t)) / τ

where S and τ are constants, and F(t) is the net radiative forcing at time t. Eventually I will discuss each of these terms in detail; this post will focus on S.

At equilibrium, when dT/dt = 0, the ODE necessitates T(t) = S F(t). A physical interpretation for S becomes apparent: it measures the equilibrium change in temperature per unit forcing, also known as climate sensitivity.
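As a sanity check on that interpretation, here's a quick sketch (with made-up values of S and τ, for illustration only) that steps the ODE forward in time and watches T relax to S F under constant forcing:

```python
S, tau = 0.8, 5.0    # made-up illustrative values, not the model's
F = 1.0              # constant forcing
T, dt = 0.0, 0.01    # initial temperature and time step

# Forward Euler on dT/dt = (S*F - T) / tau
for _ in range(int(50 * tau / dt)):
    T += dt * (S * F - T) / tau

# After many timescales tau, T has relaxed to the equilibrium S*F
```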

A great deal of research has been conducted with the aim of quantifying climate sensitivity, through paleoclimate analyses, modelling experiments, and instrumental data. Overall, these assessments show that climate sensitivity is on the order of 3 K per doubling of CO2 (divide by 5.35 ln 2 W/m2 to convert to warming per unit forcing).

The IPCC AR4 report (note that AR5 was not yet published at the time of my calculations) compared many different probability distribution functions (PDFs) of climate sensitivity, shown below. They follow the same general shape of a shifted distribution with a long tail to the right, and average 5-95% confidence intervals of around 1.5 to 7 K per doubling of CO2.

Box 10.2, Figure 1 of the IPCC AR4 WG1: Probability distribution functions of climate sensitivity (a), 5-95% confidence intervals (b).

These PDFs generally consist of discrete data points that are not publicly available. Consequently, sampling from any existing PDF would be difficult. Instead, I chose to create my own PDF of climate sensitivity, modelled as a log-normal distribution (e raised to the power of a normal distribution) with the same shape and bounds as the existing datasets.

The challenge was to find values for μ and σ, the mean and standard deviation of the corresponding normal distribution, such that a value z sampled from the log-normal distribution satisfies the 5-95% bounds above:

P(z ≤ 1.5 K) = 0.05 and P(z ≤ 7 K) = 0.95

Since erf, the error function, cannot be evaluated analytically, this two-parameter problem must be solved numerically. I built a simple particle swarm optimizer to find the solution, which consistently yielded results of μ = 1.1757, σ = 0.4683.
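As a check, these values can be verified without any optimizer: the 5th and 95th percentiles of a log-normal follow directly from the normal quantiles. A quick sketch using only the Python standard library:

```python
import math
from statistics import NormalDist

mu, sigma = 1.1757, 0.4683
z05 = NormalDist().inv_cdf(0.05)   # 5th-percentile z-score, about -1.645

# Percentiles of exp(Normal(mu, sigma)):
p05 = math.exp(mu + z05 * sigma)   # about 1.5 K
p95 = math.exp(mu - z05 * sigma)   # about 7.0 K, matching the target bounds
```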

The upper tail of a log-normal distribution is unbounded, so I truncated the distribution at 10 K, consistent with existing PDFs (see figure above). At the beginning of each simulation, climate sensitivity in my model is sampled from this distribution and held fixed for the entire run. A histogram of 10⁶ sampled points, shown below, has the desired characteristics.
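In code, the sampling step might look something like the sketch below (names and structure are my own, not the model's), using simple rejection to enforce the 10 K cap:

```python
import math
import random

MU, SIGMA = 1.1757, 0.4683     # parameters of the underlying normal
CAP = 10.0                     # truncation point, K per doubling of CO2
F_2XCO2 = 5.35 * math.log(2)   # forcing from doubled CO2, ~3.7 W/m2

def sample_sensitivity(rng=random):
    """Draw one climate sensitivity value, in K m2/W."""
    while True:
        s = math.exp(rng.gauss(MU, SIGMA))   # K per doubling of CO2
        if s <= CAP:                         # reject the unbounded upper tail
            return s / F_2XCO2               # convert to warming per unit forcing
```

Each simulation would call this once at startup and hold the result fixed for the entire run.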

Histogram of 10⁶ points sampled from the log-normal distribution used for climate sensitivity in the model.


Note that in order to be used in the ODE, the sampled points must then be converted to units of K m2/W (warming per unit forcing) by dividing by 5.35 ln 2 ≈ 3.7 W/m2, the forcing from doubled CO2.


Counting my Blessings

This is the coldest time of year in the Prairies. Below -20 °C it all feels about the same, but the fuel lines in cars freeze more easily, and outdoor sports are no longer safe. We all become grouchy creatures of the indoors for a few months each year. But as much as I hate the extreme cold, I would rather be here than in Australia right now.

A record-breaking, continent-wide heat wave has just wrapped up, and Australia has joined the Arctic in the list of regions where the temperature is so unusually warm that new colours have been added to the map legends. This short-term forecast by the ACCESS model predicts that parts of South Australia will reach between 52 and 54 °C on Monday:

For context, the highest temperature ever recorded on Earth was 56.7 °C, in Death Valley during July of 1913. Australia’s coming pretty close.

This heat wave has broken dozens of local records, but the really amazing statistics come from national average daily highs: the highest-ever value at 40.33 °C, on January 7th; and seven days in a row above 39 °C, the most ever, from January 2nd to 8th.

Would this have happened without climate change? It’s a fair question, and (for heat waves at least) one that scientists are starting to tackle – see James Hansen’s methodology that concluded recent heat waves in Texas and Russia were almost certainly the result of climate change.

At any rate, this event suggests that uninformed North Americans who claim “warming is a good thing” haven’t been to Australia.

A Summer of Extremes

Because of our emissions of greenhouse gases like carbon dioxide, a little extra energy gets trapped in our atmosphere every day. Over time, this energy builds up. It manifests itself in the form of higher temperatures, stronger storms, larger droughts, and melting ice. Global warming, then, isn’t about temperatures as much as it is about energy.

The extra energy, and its consequences, don’t get distributed evenly around the world. Weather systems, which move heat and moisture around the planet, aren’t very fair: they tend to bully some places more than others. These days, it’s almost as if the weather picks geographical targets each season to bombard with extremes, then moves on to somewhere else. This season, the main target seems to be North America.

The warmest 12 months on record for the United States recently wrapped up with a continent-wide heat wave and drought. Thousands of temperature records were broken, placing millions of citizens in danger. By the end of June, 56% of the country was experiencing at least “moderate” drought levels – the largest drought since 1956. Wildfires took over Colorado, and extreme wind storms on the East Coast knocked out power lines and communication systems for a week. Conditions have been similar throughout much of Canada, although its climate and weather reporting systems are less accessible.

“This is what global warming looks like,” said Professor Jonathan Overpeck from the University of Arizona, a sentiment that was echoed across the scientific community in the following weeks. By the end of the century, these conditions will be the new normal.

Does that mean that these particular events were caused by climate change? There’s no way of knowing. It could have just been a coincidence, but the extra energy global warming adds to our planet certainly made them more likely. Even without climate change, temperature records get broken all the time.

However, in an unchanging climate, there would be roughly the same number of record highs as record lows. In a country like the United States, where temperature records are well catalogued and publicly available, it’s easy to see that this isn’t the case. From 2000 to 2009, there were twice as many record highs as record lows, and so far this year, there have been ten times as many:

The signal of climate change on extreme weather is slowly, but surely, emerging. For those who found this summer uncomfortable, the message from the skies is clear: Get used to it. This is only the beginning.

Climate Change and Heat Waves

One of the most dangerous effects of climate change is its impact on extreme events. The extra energy that’s present in a warmer world doesn’t distribute itself uniformly – it can come out in large bursts, manifesting itself as heat waves, floods, droughts, hurricanes, and tornadoes, to name a few. Consequently, warming the world by an average of 2 degrees is a lot more complicated than adding 2 to every weather station reading around the world.

Scientists have a difficult time studying the impacts of climate change on extreme events, because all these events could happen anyway – how can you tell if Hurricane Something is a direct result of warming, or just a fluke? Indeed, for events involving precipitation, like hurricanes or droughts, it’s not possible to answer this question. However, research is advancing to the point where we can begin to attribute individual heat waves to climate change with fairly high levels of confidence. For example, the recent extended heat wave in Texas, which was particularly devastating for farmers, probably wouldn’t have happened if it weren’t for global warming.

Extreme heat is arguably the easiest event for scientists to model. Temperature is one-dimensional and more or less follows a normal distribution for a given region. As climate change continues, temperatures increase (shifting the bell curve to the right) and become more variable (flattening the bell curve). The end result, as shown in part (c) of the figure below, is a significant increase in extremely hot weather:

Now, imagine that you get a bunch of weather station data from all across the world in 1951-1980, back before the climate had really started to warm. For every single record, find the temperature anomaly (difference from the average value in that place and on that day of the year). Plot the results, and you will get a normal distribution centred at 0. So values in the middle of the bell curve – i.e., temperatures close to the average – are the most likely, and temperatures on the far tails of the bell curve – i.e. much warmer or much colder than the average – are far less likely.

As any statistics student knows, 99.7% of values in a normal distribution fall within three standard deviations of the mean – an interval whose width depends on how flat the bell curve is. So at any given time, 99.7% of the Earth’s surface should have temperatures within three standard deviations of its local mean. If we still had the same climate we did between 1951 and 1980, temperatures more than three standard deviations above the mean would cover only about 0.15% of the Earth’s surface.
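That 0.15% figure comes straight from the normal distribution and is a one-liner to check. (The exact one-sided tail is about 0.135%; the 0.15% quoted comes from halving the rounded 0.3% left outside the interval.)

```python
from statistics import NormalDist

# Fraction of a normal distribution more than 3 sigma above the mean
tail = 1 - NormalDist().cdf(3.0)   # about 0.00135, i.e. 0.135%
```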

However, in the past few years, temperatures three standard deviations above average have covered more like 10% of the Earth’s surface. Even some individual heat waves – like the ones in Texas and Russia over the past few years – have covered so much of the Earth’s surface on their own that they blow the 0.15% statistic right out of the water. Under the “old” climate, they almost certainly wouldn’t have happened. You can only explain them by shifting the bell curve to the right and flattening it. For this reason, we can say that these heat waves were caused by global warming.
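To see how touchy that tail is, here's a sketch with an assumed shift and flattening – the numbers are invented for illustration, not fitted to the Hansen data:

```python
from statistics import NormalDist

old = NormalDist(mu=0.0, sigma=1.0)   # 1951-1980 baseline (anomalies in sigma units)
new = NormalDist(mu=0.8, sigma=1.3)   # assumed shift right plus flattening

tail_old = 1 - old.cdf(3.0)   # about 0.135% of the surface
tail_new = 1 - new.cdf(3.0)   # about 4.5% -- a roughly 30-fold increase
```

Even a modest shift-and-flatten takes the extreme tail from a rounding error to several percent of the Earth's surface, which is exactly the Texas-and-Russia argument.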

Here’s a graph of the bell curves we’re talking about, in this case for the months of June, July, and August. The red, yellow and green lines are the old climate; the blue and purple lines are the new climate. Look at the area under the curve to the right of x = 3: it’s almost nothing beneath the old climate, but quite significant beneath the new climate.

It’s very exciting that, using basic statistical methods, we can now attribute specific heat waves to climate change. On the other hand, it’s very depressing, because it goes to show that such events will become far more likely as the climate continues to change, and the bell curve shifts inexorably to the right.


What’s the Warmest Year – and Does it Matter?

Cross-posted from NextGenJournal

Climate change is a worrying phenomenon, but watching it unfold can be fascinating. The beginning of a new year brings completed analysis of what last year’s conditions were like. Perhaps the most eagerly awaited annual statistic is global temperature.

This year was no different – partway through 2010, scientists could tell that it had a good chance of being the warmest year on record. It turned out to be more or less tied for first, as top temperature analysis centres recently announced:

Why the small discrepancy in the order of 1998, 2005, and 2010? The answer is mainly due to the Arctic. Weather stations in the Arctic region are few and far between, as it’s difficult to maintain a permanent station on ice floes that move around and are melting away. Scientists, then, have two choices in their analyses: extrapolate Arctic temperature anomalies from the stations they do have, or just leave the missing areas out, assuming that they’re warming at the global average rate. The first choice might lead to results that are off in either direction…but the second choice almost certainly underestimates warming, as it’s clear that climate change is affecting the Arctic much more and much faster than the global average. Currently, NASA is the only centre to do this extrapolation. A more detailed explanation is available here.

But how useful is an annual measurement of global temperature? Not very, as it turns out. Short-term climate variability, most prominently El Nino and La Nina, impacts annual temperatures significantly. Furthermore, since this oscillation occurs in the winter, the thermal influence of El Nino or La Nina can fall entirely into one calendar year, or be split between two. The result is a graph that’s rather spiky:

A far more useful analysis involves plotting a 12-month running mean. Instead of measuring only from January to December, measurements are also compiled from February to January, March to February, and so on. This results in twelve times more data points, and prevents El Nino and La Nina events from being exaggerated:

This graph is better, but still not that useful. The natural spikiness of the El Nino cycle can, in the short term, get in the way of understanding the underlying trend. Since the El Nino cycle takes between 3 and 7 years to complete, a 60-month (5-year) running mean allows the resulting ups and downs to cancel each other out. Another cycle that impacts short-term temperature is the sunspot cycle, which takes around 11 years to complete. A 132-month running mean smooths out that influence too. Both 60- and 132-month running means are shown below:

A statistic published every month, showing the average global temperature over the previous 5 or 11 years, may not be as exciting as an annual measurement for the year just ended. But that’s the reality of climate change. It doesn’t make every month or even every year warmer than the last, and a short-term trend line means virtually nothing. In the climate system, trends are always obscured by noise, and the nature of human psychology means we pay far more attention to noise. Nonetheless, the long-term warming trend since around 1975 is irrefutable when one is presented with the data. A gradual, persistent change might not make the greatest headline, but that doesn’t mean it’s worth ignoring.
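For the curious, the running means described above are simple to compute; here's a minimal sketch (the function name and toy data are my own):

```python
def running_mean(series, window):
    """Mean of each `window` consecutive values (one point per window)."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

monthly = [0.1, 0.3, -0.2, 0.4, 0.0, 0.2]   # toy monthly anomalies, degrees C
smoothed = running_mean(monthly, 3)          # a 60-month mean would use window=60
```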