Well, This is a Problem

The back gardens of Mayflower, Arkansas aren’t looking too good:

[Photo: oil in a Mayflower back garden]

Yes, that’s oil. Canadian oil, no less. You’re welcome.

I’ve heard surprisingly little about this event, which occurred when an Exxon Mobil pipeline ruptured on Friday. It appears that the press have limited access while the cleanup crews are at work. National Geographic had a good piece, though.

Call me cynical, but I think the Canadian media are purposely keeping quiet on this one. It’s a very inconvenient time for a pipeline to burst, given that all levels of government and industry are pushing for Keystone, Northern Gateway, Energy East, etc., etc.

News of this event largely relies on Mayflower citizens taking to social media. There’s no way to verify their photos and videos, but they’re striking nonetheless. Here’s a video of the situation on a residential street – note the lack of cleanup crews.

The oil is going straight into the storm drain, the man in the video says, which makes me shudder. I don’t know anything about Mayflower’s stormwater system, but where I live those storm drains are about three steps removed from the Red River. Once oil got in there, I can’t imagine it ever getting out.

I find it puzzling that the negative impacts of pipelines are so often catalogued as “environmentalists’ problems” in the Canadian media – here’s a typical example. In reality, they’re everyone’s problems. Environmentalists (as much as I detest that label) are just the people who realize it. We are not a special interest group; we represent everyone. When it comes to disasters, from short-term spills like the one in Mayflower to millennial-scale impacts like climate change, Canadian oil will affect everyone indiscriminately.

Side note: Sorry I have been so absurdly quiet recently. I am busy building two climate models – just small ones for term projects, but so enjoyable that everything else is getting neglected. I’ll be posting much more on that in about a month.


Counting my Blessings

This is the coldest time of year in the Prairies. Below -20 °C it all feels about the same, but the fuel lines in cars freeze more easily, and outdoor sports are no longer safe. We all become grouchy creatures of the indoors for a few months each year. But as much as I hate the extreme cold, I would rather be here than in Australia right now.

A record-breaking, continent-wide heat wave has just wrapped up, and Australia has joined the Arctic in the list of regions where the temperature is so unusually warm that new colours have been added to the map legends. This short-term forecast by the ACCESS model predicts that parts of South Australia will reach between 52 and 54 °C on Monday:

For context, the highest temperature ever recorded on Earth was 56.7 °C, in Death Valley during July of 1913. Australia’s coming pretty close.

This heat wave has broken dozens of local records, but the really amazing statistics come from national average daily highs: the highest-ever value at 40.33 °C, on January 7th; and seven days in a row above 39 °C, the most ever, from January 2nd to 8th.

Would this have happened without climate change? It’s a fair question, and (for heat waves at least) one that scientists are starting to tackle – see James Hansen’s methodology that concluded recent heat waves in Texas and Russia were almost certainly the result of climate change.

At any rate, this event suggests that uninformed North Americans who claim “warming is a good thing” haven’t been to Australia.

Climate Change and Atlantic Circulation

Today my very first scientific publication is appearing in Geophysical Research Letters. During my summer at UVic, I helped out with a model intercomparison project regarding the effect of climate change on Atlantic circulation, and was listed as a coauthor on the resulting paper. I suppose I am a proper scientist now, rather than just a scientist larva.

The Atlantic meridional overturning circulation (AMOC for short) is an integral part of the global ocean conveyor belt. In the North Atlantic, a massive amount of water near the surface, cooling down on its way to the poles, becomes dense enough to sink. From there it goes on a thousand-year journey around the world – inching its way along the bottom of the ocean, looping around Antarctica – before finally warming up enough to rise back to the surface. A whole multitude of currents depend on the AMOC, most famously the Gulf Stream, which keeps Europe pleasantly warm.

Some have hypothesized that climate change might shut down the AMOC: the extra heat and freshwater (from melting ice) coming into the North Atlantic could conceivably lower the density of surface water enough to stop it sinking. This happened as the world was coming out of the last ice age, in an event known as the Younger Dryas: a huge ice sheet over North America suddenly gave way, drained into the North Atlantic, and shut down the AMOC. Europe, cut off from the Gulf Stream and at the mercy of the ice-albedo feedback, experienced another thousand years of glacial conditions.

A shutdown today would not lead to another ice age, but it could cause some serious regional cooling over Europe, among other impacts that we don’t fully understand. Today, though, there’s a lot less ice to start with. Could the AMOC still shut down? If not, how much will it weaken due to climate change? So far, scientists have answered these two questions with “probably not” and “something like 25%” respectively. In this study, we analysed 30 climate models (25 complex CMIP5 models, and 5 smaller, less complex EMICs) and came up with basically the same answer. It’s important to note that none of the models include dynamic ice sheets (computational glacial dynamics is a headache and a half), which might affect our results.

Models ran the four standard RCP experiments from 2006 to 2100. Not every model completed every RCP, and some extended their simulations to 2300 or 3000. In total, there were over 30 000 model years of data. We measured the “strength” of the AMOC using the standard unit, the sverdrup (Sv), where 1 Sv is one million cubic metres of water per second.
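
If you’re curious what that measurement looks like in practice, here’s a minimal sketch (in Python, with invented numbers) of computing an overturning streamfunction from a meridional velocity field and reporting its maximum in Sverdrups. The real diagnostic works on each model’s native grid and masks out everything outside the Atlantic, but the bookkeeping is the same idea:

```python
import numpy as np

# Minimal sketch: an overturning streamfunction from a meridional velocity
# field, with its maximum reported in Sverdrups. Grid, layer thicknesses,
# and velocities are all invented for illustration.
n_z, n_lat, n_lon = 19, 100, 100
rng = np.random.default_rng(0)
v = rng.normal(0.0, 0.002, size=(n_z, n_lat, n_lon))  # meridional velocity, m/s

dz = np.full(n_z, 200.0)           # layer thickness, m (uniform here for simplicity)
dx = np.full(n_lon, 3.6 * 111e3)   # zonal grid spacing, m (roughly 3.6 degrees)

# Integrate v zonally, then cumulatively in depth: psi(z, lat) in m^3/s.
v_zonal = (v * dx).sum(axis=2)                  # m^2/s
psi = np.cumsum(v_zonal * dz[:, None], axis=0)  # m^3/s

# The "strength" is the maximum of the streamfunction, converted to Sv.
print(f"Overturning strength: {psi.max() / 1e6:.1f} Sv")
```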

Only two models simulated an AMOC collapse, and only at the tail end of the most extreme scenario (RCP8.5, which quite frankly gives me a stomachache). Bern3D, an EMIC from Switzerland, showed a MOC strength of essentially zero by the year 3000; CNRM-CM5, a GCM from France, stabilized near zero by 2300. In general, the models showed only a moderate weakening of the AMOC by 2100, with best estimates ranging from a 22% drop for RCP2.6 to a 40% drop for RCP8.5 (with respect to preindustrial conditions).

Are these somewhat-reassuring results trustworthy? Or is the Atlantic circulation in today’s climate models intrinsically too stable? Our model intercomparison also addressed that question, using a neat little scalar metric known as Fov: the net amount of freshwater travelling from the AMOC to the South Atlantic.

The current thinking in physical oceanography is that the AMOC is more or less binary – it’s either “on” or “off”. When AMOC strength is below a certain level (let’s call it A), its only stable state is “off”, and the strength will converge to zero as the currents shut down. When AMOC strength is above some other level (let’s call it B), its only stable state is “on”, and if you were to artificially shut it off, it would bounce right back up to its original level. However, when AMOC strength is between A and B, both conditions can be stable, so whether it’s on or off depends on where it started. This phenomenon is known as hysteresis, and is found in many systems in nature.

This figure was not part of the paper. I made it just now in MS Paint.
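
If the hysteresis idea seems abstract, here’s a toy version in a few lines of Python. The thresholds A and B are made-up numbers and the “model” is just an if-statement, but it shows how, between A and B, the state you end up in depends on where you came from:

```python
# Toy hysteresis: between the two thresholds, the state you end up in
# depends on where you came from. A and B are made-up numbers.
A, B = 5.0, 15.0   # below A only "off" is stable; above B only "on" is stable

def settle(strength, currently_on):
    """Return whether the circulation ends up 'on' at this strength."""
    if strength < A:
        return False        # monostable: off
    if strength > B:
        return True         # monostable: on
    return currently_on     # bistable: keeps whatever state it had

# Sweep the strength down and then back up again.
state = True
for s in [20, 12, 8, 4, 8, 12, 20]:
    state = settle(s, state)
    print(f"strength = {s:>2}: {'on' if state else 'off'}")
```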

Here’s the key part: when AMOC strength is less than A or greater than B, Fov is positive and the system is monostable. When AMOC strength is between A and B, Fov is negative and the system is bistable. The physical justification for Fov is its association with the salt advection feedback, whose sign is opposite to that of Fov: positive Fov means the salt advection feedback is negative (i.e. stabilizing the current state, so monostable); negative Fov means the salt advection feedback is positive (i.e. reinforcing changes in either direction, so bistable).

Most observational estimates (largely ocean reanalyses) have Fov as slightly negative. If models’ AMOCs really were too stable, their values of Fov would be positive. In our intercomparison, we found both positives and negatives – the models were kind of all over the place with respect to Fov. So maybe some models are overly stable, but certainly not all of them, or even the majority.

As part of this project, I got to write a new section of code for the UVic model, which calculated Fov each timestep and included the annual mean in the model output. Software development on a large, established project with many contributors can be tricky, and the process involved a great deal of head-scratching, but it was a lot of fun. Programming is so satisfying.
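
The UVic model itself is written in Fortran, so I won’t reproduce the real code here, but the diagnostic boils down to something like the following sketch, using one common formulation of Fov and invented salinity and velocity profiles:

```python
import numpy as np

# Sketch of the Fov diagnostic at the Atlantic's southern boundary (~34S),
# using one common formulation. The salinity and velocity profiles below
# are invented; the real calculation uses the model's own fields and grid.
S0 = 35.0                                   # reference salinity (psu)
dz = np.full(19, 200.0)                     # layer thickness (m)
S_bar = np.linspace(34.2, 35.4, 19)         # zonal-mean salinity per layer (psu)
v_int = np.sin(np.linspace(0, np.pi, 19)) * 1e4 - 5e3   # zonally integrated v (m^2/s)

# Remove the section mean so only the overturning (baroclinic) part remains.
v_star = v_int - (v_int * dz).sum() / dz.sum()

# Freshwater transport by the overturning, in Sv. (Subtracting S0 changes
# nothing here, since v_star integrates to zero over the section.)
Fov = -(1.0 / S0) * np.sum(v_star * (S_bar - S0) * dz) / 1e6
print(f"Fov = {Fov:.3f} Sv")
```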

Beyond that, my main contribution to the project was creating the figures and calculating the multi-model statistics, which got a bit unwieldy as the model count approached 30, but we made it work. I am now extremely well-versed in IDL graphics keywords, which I’m sure will come in handy again. Unfortunately I don’t think I can reproduce any figures here, as the paper’s not open-access.

I was pretty paranoid while coding and doing calculations, though – I kept worrying that I would make a mistake, never catch it, and have it dredged out by contrarians a decade later (“Kate-gate”, they would call it). As a climate scientist, I suppose that comes with the job these days. But I can live with it, because this stuff is just so darned interesting.

Since I Last Wrote…

Since I last wrote, I finished my summer research at Andrew Weaver’s lab (more on that in the weeks and months to come, as our papers work through peer review). I moved back home to the Prairies, which seem unnaturally hot, flat and dry compared to BC. Perhaps what I miss most is the ocean – the knowledge that the nearest coastline is more than a thousand kilometres away gives me an uncomfortable feeling akin to claustrophobia.

In the meantime, the last story I covered has developed significantly. Before September even began, Arctic sea ice extent reached record low levels. It’s currently well below the previous record, held in 2007, and will continue to decline for two or three more weeks before it levels off:

Finally, El Niño conditions are beginning to emerge in the Pacific Ocean. In central Canada we are celebrating, because El Niño tends to produce warmer-than-average winters (although last winter was mysteriously warm despite the cooling influence of La Niña – not a day below -30 °C!). The impacts of El Niño are different all over the world, but overall it tends to boost global surface temperatures. Combine this effect with the current ascent from a solar minimum and the stronger-than-ever greenhouse gas forcing, and it looks likely that 2013 will break global temperature records. That’s still a long way away, though, and who knows what will happen before then?

A Summer of Extremes

Because of our emissions of greenhouse gases like carbon dioxide, a little extra energy gets trapped in our atmosphere every day. Over time, this energy builds up. It manifests itself in the form of higher temperatures, stronger storms, larger droughts, and melting ice. Global warming, then, isn’t about temperatures as much as it is about energy.

The extra energy, and its consequences, don’t get distributed evenly around the world. Weather systems, which move heat and moisture around the planet, aren’t very fair: they tend to bully some places more than others. These days, it’s almost as if the weather picks geographical targets each season to bombard with extremes, then moves on to somewhere else. This season, the main target seems to be North America.

The warmest 12 months on record for the United States recently wrapped up with a continent-wide heat wave and drought. Thousands of temperature records were broken, placing millions of citizens in danger. By the end of June, 56% of the country was experiencing at least “moderate” drought levels – the largest drought since 1956. Wildfires took over Colorado, and extreme wind storms on the East Coast knocked out power lines and communication systems for a week. Conditions have been similar throughout much of Canada, although its climate and weather reporting systems are less accessible.

“This is what global warming looks like,” said Professor Jonathan Overpeck from the University of Arizona, a sentiment that was echoed across the scientific community in the following weeks. By the end of the century, these conditions will be the new normal.

Does that mean that these particular events were caused by climate change? There’s no way of knowing. It could have just been a coincidence, but the extra energy global warming adds to our planet certainly made them more likely. Even without climate change, temperature records get broken all the time.

However, in an unchanging climate, there would be roughly the same number of record highs as record lows. In a country like the United States, where temperature records are well catalogued and publicly available, it’s easy to see that this isn’t the case. From 2000 to 2009, there were twice as many record highs as record lows, and so far this year, there have been ten times as many:

The signal of climate change on extreme weather is slowly, but surely, emerging. For those who found this summer uncomfortable, the message from the skies is clear: Get used to it. This is only the beginning.

Modelling Geoengineering

Later in my career as a climate modeller, I expect to spend a lot of time studying geoengineering. Given the near-total absence of policy responses to prevent climate change, I think it’s very likely that governments will soon start thinking seriously about ways to artificially cool the planet. Who will they come to for advice? The climate modellers.

Some scientists are pre-emptively recognizing this need for knowledge, and beginning to run simulations of geoengineering. In fact, there’s an entire model intercomparison project dedicated to this area of study. There’s only a small handful of publications so far, but the results are incredibly interesting. Here I summarize two recent papers that model solar radiation management: the practice of offsetting global warming by partially blocking sunlight, whether by seeding clouds, adding sulfate aerosols to the stratosphere, or placing giant mirrors in space. As an added bonus, both of these papers are open access.

A group of scientists from Europe ran the same experiment on four of the world’s most complex climate models. The simulation involved instantaneously quadrupling CO2 from preindustrial levels, but offsetting it with a reduction in the solar constant, such that the net forcing was close to zero.
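
As a rough back-of-envelope (not the procedure the modelling groups actually used), you can estimate how large that solar reduction has to be from the standard simplified CO2 forcing formula and a typical planetary albedo:

```python
import math

# Back-of-envelope (not the groups' actual procedure): how much must the
# solar constant drop to offset an instantaneous quadrupling of CO2?
F_4xCO2 = 5.35 * math.log(4)   # W/m^2, simplified CO2 forcing formula
S0 = 1361.0                    # present-day solar constant, W/m^2
albedo = 0.3                   # planetary albedo

# A change dS in the solar constant changes absorbed sunlight by
# dS * (1 - albedo) / 4 (the factor of 4 spreads sunlight over the sphere).
dS = 4 * F_4xCO2 / (1 - albedo)
print(f"4xCO2 forcing: {F_4xCO2:.1f} W/m^2")
print(f"Required solar constant reduction: {dS:.0f} W/m^2 ({100 * dS / S0:.1f}% of S0)")
```

In other words, blocking a few percent of incoming sunlight.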

The global mean temperature remained at preindustrial levels. “Great,” you might think, “we’re home free!” However, climate is far more than just one globally averaged metric. Even though the average temperature stayed the same, there were still regional changes, with cooling in the tropics and warming at both poles (particularly in their respective winters):

There were regional changes in precipitation, too, but they didn’t all cancel out like with temperature. Global mean precipitation decreased, due to cloud feedbacks which are influenced by sunlight but not greenhouse gases. There were significant changes in the monsoons of south Asia, but the models disagreed as to exactly what those changes would be.

This intercomparison showed that even with geoengineering, we’re still going to get a different climate. We won’t have to worry about some of the big-ticket items like sea level rise, but droughts and forest dieback will remain a major threat. Countries will still struggle to feed their people, and species will still face extinction.

On the other side of the Atlantic, Damon Matthews and Ken Caldeira took a different approach. (By the way, what is it about Damon Matthews? All the awesome papers out of Canada seem to have his name on them.) Using the UVic ESCM, they performed a more realistic experiment in which emissions varied with time. They offset emissions from the A2 scenario with a gradually decreasing solar constant. They found that the climate responds quickly to geoengineering, and their temperature and precipitation results were very similar to the European paper.

They also examined some interesting feedbacks in the carbon cycle. Carbon sinks (ecosystems which absorb CO2, like oceans and forests) respond to climate change in two different ways. First, they respond directly to increases in atmospheric CO2 – i.e., the fertilization effect. These feedbacks (lumped together in a term we call beta) are negative, because they tend to increase carbon uptake. Second, they respond to the CO2-induced warming, with processes like forest dieback and increased respiration. These feedbacks (a term called gamma) are positive, because they decrease uptake. Currently we have both beta and gamma, and they’re partially cancelling each other out. However, with geoengineering, the heat-induced gamma goes away, and beta is entirely unmasked. As a result, carbon sinks became more effective in this experiment, and sucked extra CO2 out of the atmosphere.
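
Here’s a toy version of that bookkeeping, written out in the usual “change in uptake = beta times the CO2 change, plus gamma times the temperature change” form; the numbers are invented for illustration and aren’t taken from the paper:

```python
# Toy version of the sink bookkeeping:
#   delta_uptake = beta * delta_CO2 + gamma * delta_T
# All numbers are invented for illustration, not taken from the paper.
beta = 1.2      # extra uptake per ppm of CO2 increase (GtC/ppm), a negative feedback
gamma = -80.0   # lost uptake per degree of warming (GtC/K), a positive feedback

delta_co2 = 400.0   # ppm above preindustrial (illustrative)

uptake_no_geo = beta * delta_co2 + gamma * 3.0   # ~3 C of warming, no geoengineering
uptake_geo = beta * delta_co2 + gamma * 0.0      # temperature held flat by geoengineering

print(f"Sink uptake without geoengineering: {uptake_no_geo:.0f} GtC")
print(f"Sink uptake with geoengineering:    {uptake_geo:.0f} GtC")
```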

The really interesting part of the Matthews and Caldeira paper was when they stopped the geoengineering. This scenario is rather plausible – wars, recessions, or public disapproval could force the world to abandon the project. So, in the experiment, they brought the solar constant back to current levels overnight.

The results were pretty ugly. Global climate rapidly shifted back to the conditions it would have experienced without geoengineering. In other words, all the warming that we cancelled out came back at once. Global average temperature changed at a rate of up to 4°C per decade, or 20 times faster than at present. Given that biological, physical, and social systems worldwide are struggling to keep up with today’s warming, this rate of change would be devastating. To make things worse, gamma came back in full force, and carbon sinks spit out the extra CO2 they had soaked up. Atmospheric concentrations went up further, leading to more warming.

Essentially, if governments want to do geoengineering properly, they have to make a pact to do so forever, no matter what the side effects are or what else happens in the world. Given how much legislation is overturned every time a country has a change in government, such a promise would be almost impossible to uphold. Matthews and Caldeira consider this reality, and come to a sobering conclusion:

In the case of inconsistent or erratic deployment (either because of shifting public opinions or unilateral action by individual nations), there would be the potential for large and rapid temperature oscillations between cold and warm climate states.

Yikes. If that doesn’t scare you, what does?

Modelling the Apocalypse

Let’s all put on our science-fiction hats and imagine that humans get wiped off the face of the Earth tomorrow. Perhaps a mysterious superbug kills us all overnight, or maybe we organize a mass migration to live on the moon. In a matter of a day, we’re gone without a trace.

If your first response to this scenario is “What would happen to the climate now that fossil fuel burning has stopped?” then you may be afflicted with Climate Science. (I find myself reacting like this all the time now. I can’t watch The Lord of the Rings without imagining how one would model the climate of Middle Earth.)

A handful of researchers, particularly in Canada, recently became so interested in this question that they started modelling it. Their motive was more than just morbid fascination – in fact, the global temperature change that occurs in such a scenario is a very useful metric. It represents the amount of warming that we’ve already guaranteed, and a lower bound for the amount of warming we can expect.

Initial results were hopeful. Damon Matthews and Andrew Weaver ran the experiment on the UVic ESCM and published the results. In their simulations, global average temperature stabilized almost immediately after CO2 emissions dropped to zero, and stayed approximately constant for centuries. The climate didn’t recover from the changes we inflicted, but at least it didn’t get any worse. The “zero-emissions commitment” was more or less nothing. See the dark blue line in the graph below:

However, this experiment didn’t take anthropogenic impacts other than CO2 into account. In particular, the impacts of sulfate aerosols and additional (non-CO2) greenhouse gases currently cancel out, so it was assumed that they would keep cancelling and could therefore be ignored.

But is this a safe assumption? Sulfate aerosols have a very short atmospheric lifetime – as soon as it rains, they wash right out. Non-CO2 greenhouse gases last much longer (although, in most cases, not as long as CO2). Consequently, you would expect a transition period in which the cooling influence of aerosols had disappeared but the warming influence of additional greenhouse gases was still present. The two forcings would no longer cancel, and the net effect would be one of warming.
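
A crude two-timescale sketch makes this transition period easy to see; the forcing magnitudes and lifetimes below are round illustrative numbers, not values from any study:

```python
import numpy as np

# Crude two-timescale sketch: aerosol cooling vanishes within weeks, the
# extra (non-CO2) greenhouse gases linger for roughly a decade, so their
# sum swings positive before decaying away. Magnitudes and lifetimes are
# round numbers chosen for illustration only.
years = np.arange(0, 31)
aerosol_forcing = -1.0 * np.exp(-years / 0.02)    # W/m^2, ~1-week lifetime
other_ghg_forcing = 1.0 * np.exp(-years / 12.0)   # W/m^2, ~12-year lifetime

net = aerosol_forcing + other_ghg_forcing
for y in (0, 1, 5, 10, 20, 30):
    print(f"year {y:>2}: net non-CO2 forcing = {net[y]:+.2f} W/m^2")
```

The sum starts at zero, spikes positive almost immediately as the aerosol term vanishes, and then decays away over a couple of decades.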

Damon Matthews recently repeated his experiment, this time with Kirsten Zickfeld, and took aerosols and additional greenhouse gases into account. The long-term picture was still the same – global temperature remaining at present-day levels for centuries – but the short-term response was different. For about the first decade after human influences disappeared, the temperature rose very quickly (as aerosols were eliminated from the atmosphere) but then dropped back down (as additional greenhouse gases were eliminated). This transition period wouldn’t be fun, but at least it would be short. See the light blue line in the graph below:

We’re still making an implicit assumption, though. By looking at the graphs of constant global average temperature and saying “Look, the problem doesn’t get any worse!”, we’re assuming that regional temperatures are also constant for every area on the planet. In fact, half of the world could be warming rapidly and the other half could be cooling rapidly, a bad scenario indeed. From a single global metric, you just can’t tell.

A team of researchers led by Nathan Gillett recently modelled regional changes to a sudden cessation of CO2 emissions (other gases were ignored). They used a more complex climate model from Environment Canada, which is better for regional projections than the UVic ESCM.

The results were disturbing: even though the average global temperature stayed basically constant after CO2 emissions (following the A2 scenario) disappeared in 2100, regional temperatures continued to change. Most of the world cooled slightly, but Antarctica and the surrounding ocean warmed significantly. By the year 3000, the coasts of Antarctica were 9°C above preindustrial temperatures. This might easily be enough for the West Antarctic Ice Sheet to collapse.

Why didn’t this continued warming happen in the Arctic? Remember that the Arctic is an ocean surrounded by land, and temperatures over land change relatively quickly in response to a radiative forcing. Furthermore, the Arctic Ocean is small enough that it’s heavily influenced by temperatures on the land around it. In this simulation, the Arctic sea ice actually recovered.

On the other hand, Antarctica is land surrounded by a large ocean that mixes heat particularly well. As a result, it has an extraordinarily high heat capacity, and takes a very long time to fully respond to changes in forcing. So, even by the year 3000, it was still reacting to the radiative forcing of the 21st century. The warming ocean surrounded the land and caused it to warm as well.

As a result of the cooling Arctic and warming Antarctic, the Intertropical Convergence Zone (the band of rising air and heavy rainfall near the equator, where the trade winds converge) shifted southward in the simulation. Consequently, precipitation over North Africa continued to decrease – a situation that was already bad by 2100. Counterintuitively, even though global warming had ceased, some of the impacts of warming continued to worsen.

These experiments, assuming an overnight apocalypse, are purely hypothetical. By definition, we’ll never be able to test their accuracy in the real world. However, as a lower bound for the expected impacts of our actions, the results are sobering.

Cumulative Emissions and Climate Models

As my summer research continues, I’m learning a lot about previous experiments that used the UVic ESCM (Earth System Climate Model), as well as beginning to run my own. Over the past few years, the UVic model has played an integral role in a fascinating little niche of climate research: the importance of cumulative carbon emissions.

So far, global warming mitigation policies have focused on choosing an emissions pathway: making a graph of desired CO2 emissions vs. time, where emissions slowly reduce to safer levels. However, it turns out that the exact pathway we take doesn’t actually matter. All that matters is the area under the curve: the total amount of CO2 we emit, or “cumulative emissions” (Zickfeld et al, 2009). So if society decides to limit global warming to 2°C (a common target), there is a certain amount of total CO2 that the entire world is allowed to emit. We can use it all up in the first ten years and then emit nothing, or we can spread it out – either way, it will lead to the same amount of warming.

If you delve a little deeper into the science, it turns out that temperature change is directly proportional to cumulative emissions (Matthews et al, 2009). In other words, if you draw a graph of the total amount of warming vs. total CO2 emitted, it will be a straight line.

This is counter-intuitive, because the intermediate processes are definitely not straight lines. Firstly, the graph of warming vs. CO2 concentrations is logarithmic: as carbon dioxide builds up in the atmosphere, each extra molecule added has less and less effect on the climate.

Secondly, however, as carbon dioxide builds up and the climate warms, carbon sinks (which suck up some of our emissions) become less effective. For example, warmer ocean water can’t hold as much CO2, and trees subjected to heat stress often die and stop photosynthesizing. As these sinks weaken, more of our emissions actually stay in the air. Consequently, the graph of CO2 concentrations vs. CO2 emissions is exponential.

These two relationships, warming vs. concentrations and concentrations vs. emissions, more or less cancel each other out, making total warming vs. total emissions linear. It doesn’t matter how much CO2 was in the air to begin with, or how fast the allowable emissions get used up. Once society decides how much warming is acceptable, all we need to do is nail down the proportionality constant (the slope of the straight line) in order to find out how much carbon we have to work with. Then, that number can be handed to economists, who will figure out the best way to spread out those emissions while causing minimal financial disruption.
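
You can see the cancellation in a toy calculation: make warming logarithmic in concentration and concentration exponential in cumulative emissions, and the composition comes out exactly linear. The constants here are arbitrary, not fitted to anything real:

```python
import numpy as np

# Toy demonstration of the cancellation: warming logarithmic in concentration,
# concentration exponential in cumulative emissions, composition exactly linear.
# The constants a and b are arbitrary and not fitted to anything real.
a = 3.0      # warming per unit log-concentration (arbitrary)
b = 0.6      # fractional concentration growth per TtC emitted (arbitrary)
C0 = 280.0   # preindustrial CO2 (ppm)

E = np.linspace(0, 2, 5)     # cumulative emissions, trillion tonnes of carbon
C = C0 * np.exp(b * E)       # concentration vs. cumulative emissions
T = a * np.log(C / C0)       # warming vs. concentration (equals a * b * E)

for e, t in zip(E, T):
    print(f"{e:.1f} TtC emitted -> {t:.2f} degrees of warming (slope = {a * b:.2f} per TtC)")
```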

Finding that slope is a little tricky, though. Best estimates, using models as well as observations, generally fall between 1.5°C and 2°C for every trillion tonnes of carbon emitted (Matthews et al, 2009; Allen et al, 2009; Zickfeld et al, 2009). Keep in mind that we’ve already emitted about 0.6 trillion tonnes of carbon (University of Oxford). Following a theme commonly seen in climate research, the uncertainty is larger on the high end of these slope estimates than on the low end. So if the real slope is actually lower than our best estimate, it’s probably only a little bit lower; if it’s actually higher than our best estimate, it could be much higher, and the problem could be much worse than we thought.
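
Plugging those numbers together gives a sense of the stakes (rough arithmetic only, ignoring the asymmetric uncertainty just mentioned):

```python
# Rough arithmetic implied by the numbers above (ignoring the uncertainties).
target = 2.0                 # warming target, degrees C
already_emitted = 0.6        # trillion tonnes of carbon emitted so far

for slope in (1.5, 2.0):     # degrees C per trillion tonnes of carbon
    total_budget = target / slope
    remaining = total_budget - already_emitted
    print(f"slope {slope:.1f} C/TtC: total budget {total_budget:.2f} TtC, "
          f"about {remaining:.2f} TtC left to emit")
```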

Also, this approach ignores other human-caused influences on global temperature, most prominently sulfate aerosols (which cause cooling) and greenhouse gases other than carbon dioxide (which cause warming). Right now, these two influences basically cancel, which is convenient for scientists because it means we can ignore both of them. Typically, we assume that they will continue to cancel far into the future, which might not be the case – there’s a good chance that developing countries like China and India will reduce their emissions of sulfate aerosols, allowing the non-CO2 greenhouse gases to dominate and cause warming. If this happened, we couldn’t even lump the extra greenhouse gases into the allowable CO2 emissions, because the warming they cause does depend on the exact pathway. For example, methane has such a short atmospheric lifetime that “cumulative methane emissions” is a useless measurement, and certainly isn’t directly proportional to temperature change.

This summer, one of my main projects at UVic is to compare the slope of temperature change vs. cumulative CO2 emissions across different models. As part of the international EMIC intercomparison project that the lab is coordinating, different modelling groups have sent us their measurements of allowable cumulative emissions for 1.5°C, 2°C, 3°C, and 4°C global warming. Right now (quite literally, as I write this) I’m running the same experiments on the UVic model. It’s very exciting to watch the results trickle in. Perhaps my excitement towards the most menial part of climate modelling, watching as the simulation chugs along, is a sign that I’m on the right career path.

Summer Research

I recently started working for the summer, with Andrew Weaver’s research group at the University of Victoria. If you’re studying climate modelling in Canada, this is the place to be. They are a fairly small group, but continually churn out world-class research.

Many of the projects here use the group’s climate model, the UVic ESCM (Earth System Climate Model). I am working with the ESCM this summer, and have previously read most of the code, so I feel pretty well acquainted with it.

The climate models that most people are familiar with are the really complex ones. GCMs (General Circulation Models or Global Climate Models, depending on who you talk to) use high resolution, a large number of physical processes, and relatively few parameterizations to emulate the climate system as realistically as possible. These are the models that take weeks to run on the world’s most powerful supercomputers.

EMICs (Earth System Models of Intermediate Complexity) are a step down in complexity. They run at a lower resolution than GCMs and have more parameterizations. Individual storms and wind patterns (and sometimes ocean currents as well) typically are not resolved – instead, the model predicts the statistics of these phenomena. Often, at least one component (such as sea ice) is two-dimensional.

The UVic ESCM is one of the most complex EMICs – it really sits somewhere between a GCM and an EMIC. It has a moderately high resolution, with a grid of 3.6° longitude by 1.8° latitude (ten thousand squares in all), and 19 vertical layers in the ocean. Its ocean, land, and sea ice components would all belong in a GCM. It even has a sediment component, which simulates processes that most GCMs ignore.

The only reason that the UVic model is considered an EMIC is because of its atmosphere component. This part of the model is two-dimensional and parameterizes most processes. For example, clouds aren’t explicitly simulated – instead, as soon as the relative humidity of a region reaches 85%, the atmospheric moisture falls out as rain (or snow). You would never see this kind of atmosphere in a GCM, and it might seem strange for scientists to deliberately build an unrealistic model. However, this simplified atmosphere gives the UVic ESCM a huge advantage over GCMs: speed.
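
As a caricature (a sketch of the idea, not the UVic model’s actual Fortran), the scheme amounts to something like this:

```python
# Caricature of the threshold scheme described above: a sketch of the idea,
# not the UVic model's actual (Fortran) code.
RH_MAX = 0.85   # relative humidity threshold

def precipitate(specific_humidity, saturation_humidity, temperature_c):
    """Return (new_humidity, precipitation, phase) for one grid cell."""
    rh = specific_humidity / saturation_humidity
    if rh <= RH_MAX:
        return specific_humidity, 0.0, None
    excess = (rh - RH_MAX) * saturation_humidity
    phase = "snow" if temperature_c < 0.0 else "rain"
    return specific_humidity - excess, excess, phase

print(precipitate(0.009, 0.010, 12.0))   # RH = 90% -> the excess rains out
print(precipitate(0.008, 0.010, -5.0))   # RH = 80% -> nothing happens
```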

For example, today I tested out the model with an example simulation. It ran on a Linux cluster with 32 cores, which I accessed remotely from a regular desktop. It took about 7 minutes of real time to simulate each year and record annual averages for several dozen variables. In comparison, many GCMs take an entire day of real time to simulate a year, while running on a machine with thousands of cores. Most of that work comes from the atmospheric component, which requires short time steps. Consequently, cutting down on complexity in the atmosphere gives the best return on model efficiency.

Because the UVic model is so fast, it’s suitable for very long runs. Simulating a century is an “overnight job”, and several millennia is no big deal (especially if you run it on WestGrid). As a result, long-term processes have come to dominate the research in this lab: carbon cycle feedbacks, sensitivity studies, circulation in the North Atlantic. It simply isn’t feasible to simulate these millennial-scale processes on a GCM – so, by sacrificing complexity, we’re able to open up brand new areas of research. Perfectly emulating the real world isn’t actually the goal of most climate modelling.

Of course, the UVic ESCM is imperfect. Like all models, it has its quirks – an absolute surface temperature that’s a bit too low, projections of ocean heat uptake that are a bit too high. It doesn’t give reliable projections of regional climate, so you can only really use globally or hemispherically averaged quantities. It’s not very good at decadal-scale projection. However, other models are suitable for these short-term and small-scale simulations: the same GCMs that suffer when it comes to speed. In this way, climate models perform “division of labour”. By developing many different models of varying complexity, we can make better use of the limited computer power available to us.

I have several projects lined up for the summer, and right now I’m reading a lot of papers to familiarize myself with the relevant sub-fields. There have been some really cool discoveries in the past few years that I wasn’t aware of. I have lots of ideas for posts to write about these papers, as well as the projects I’m involved in, so check back often!

March Migration Data

In my life outside of climate science, I am an avid fan of birdwatching, and am always eager to connect the two. Today I’m going to share some citizen science data I collected.

Last year, I started taking notes during the spring migration. Every time I saw a species for the first time that year, I made a note of the date. I planned to repeat this process year after year, mainly so I would know when to expect new arrivals at our bird feeders, but also in an attempt to track changes in migration. Of course, this process is imperfect (it simply provides an upper bound for when the species arrives, because it’s unlikely that I witness the very first arrival in the city) but it’s better than nothing.

Like much of the Prairies and American Midwest, we’ve just had our warmest March on record, a whopping 8 °C above normal. Additionally, every single bird arrival I recorded in March was earlier than last year, sometimes by over 30 days.

I don’t think this is a coincidence. I haven’t been any more observant than last year – I’ve spent roughly the same amount of time outside in roughly the same places. It also seems unlikely for such a systematic change to be a product of chance, although I would need much more data to figure that out for sure. Also, some birds migrate based on hours of daylight rather than temperature. However, I find it very interesting that, so far, not a single species has been late.

Because I feel compelled to graph everything, I typed all this data into Excel and made a little scatterplot. The mean arrival date was 20.6 days earlier than last year, with a standard deviation of 8.9 days.
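
For anyone who wants to skip the Excel step, the same two statistics take only a few lines of Python; the values below are placeholders rather than my actual arrival records:

```python
import statistics

# Days earlier than last year for each species' first arrival.
# Placeholder values standing in for the real notebook entries.
days_earlier = [30, 27, 24, 22, 19, 16, 13, 11, 18, 25]

print(f"mean: {statistics.mean(days_earlier):.1f} days earlier")
print(f"standard deviation: {statistics.stdev(days_earlier):.1f} days")
```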