Open Thread

Apologies for my silence recently – I just finished writing some final exams that I missed for the AGU conference, so I’ve been studying hard ever since Boxing Day.

I am working on a larger piece about climate models: an introduction to how they work and why they are useful. That will take about a week to finish, so in the meantime, here is an open thread to keep things moving.

Some possible discussion topics from posts I’ve enjoyed:

Enjoy!

Winter in the Woods

Do not burn yourself out. Be as I am – a reluctant enthusiast… a part-time crusader, a half-hearted fanatic. Save the other half of yourselves and your lives for pleasure and adventure. It is not enough to fight for the land; it is even more important to enjoy it. While you can. While it is still there. So get out there and mess around with your friends, ramble out yonder and explore the forests, encounter the grizz, climb the mountains. Run the rivers, breathe deep of that yet sweet and lucid air, sit quietly for a while and contemplate the precious stillness, that lovely, mysterious and awesome space. Enjoy yourselves, keep your brain in your head and your head firmly attached to your body, the body active and alive, and I promise you this much: I promise you this one sweet victory over our enemies, over those deskbound people with their hearts in a safe deposit box and their eyes hypnotized by desk calculators. I promise you this: you will outlive the bastards.

So writes Edward Abbey, in a passage that Ken sent to me nearly two years ago. The quote is now stuck to my fridge, and I abide by it as best I can.

It’s pretty easy to find areas of untouched forest within my city. Living in a floodplain, it’s only practical to leave natural vegetation growing around the rivers – it acts as a natural sponge when the water rises. In the warmer months, hiking in the woods is convenient, particularly because I can bike to the edge of the river. But in the winter, it’s not so easy. The past few months have consistently been about 10 C above normal, though, and today I found a shortcut that made the trip to the woods walkable.

The aspen parkland in winter is strange. Most wildlife travel south or begin hibernating by early October, and no evergreen species grow here naturally. As you walk through the naked branches, it’s easy to think of the woods as desolate. But if you slow down, pay attention, and look around more carefully, you see signs of life in the distance:

Black-capped Chickadee

White-tailed Deer

If you stand still and do your best to look non-threatening, some of the more curious animals might come for a closer inspection:

If you imitate a bird's call well enough, it will come right up to you

A mother deer and her fawn, probably about eight months old

The species that live here year-round are some of the most resilient on the continent. They have survived 40 above and 40 below, near-annual droughts and floods, and 150 years of colonization. The Prairies have a climate of extremes, and life has evolved to thrive in those extremes.

So maybe this isn’t the land I am fighting for – it will probably be able to handle whatever climate change throws at it – but it is the land I love regardless.

Happy Christmas to everyone, and please go out and enjoy the land you’re fighting for, as a gift to yourself.

A Little Bit of Hope

I went to a public lecture on climate change last night (because I just didn’t get enough of that last week at AGU, apparently), where four professors from different departments at my university spoke about their work. They were great speeches – the evening sort of reminded me of TED Talks – but I was actually most interested in the audience questions and comments afterward.

There was the token crazy guy who stood up and said “The sun is getting hotter every day and one day we’re all going to FRY! So what does that say about your global warming theory? Besides, if it was CO2 we could all just stop breathing!” Luckily, everybody laughed at his comments…

There were also some more reasonable-sounding people, repeating common myths like “It’s a natural cycle” and “Volcanoes emit more CO2 than humans”. The speakers did a good job of explaining why these claims were false, but I still wanted to pull out the Skeptical Science app and wave it in the air…

Overall, though, the audience seemed to be composed of concerned citizens who understood the causes and severity of climate change, and were eager to learn about impacts, particularly on extreme weather. It was nice to see an audience moving past this silly public debate into a more productive one about risk management.

The best moment, though, was on the bus home. There was a first-year student in the seat behind me – I assume he came to see the lecture as well, but maybe he just talks about climate change on the bus all the time. He was telling his friend about sea level rise, and he was saying all the right things – we can expect one or two metres by the end of the century, which doesn’t sound like a lot, but it’s enough to endanger many densely populated coastal cities, as well as kill vegetation due to seawater seeping in.

He even had the statistics right! I was so proud! I was thinking about turning around to join in the conversation, but by then I had been listening in for so long that it would have been embarrassing.

It’s nice to see evidence of a shift in public understanding, even if it’s only anecdotal. Maybe we’re doing something right after all.

The Software Architecture of Global Climate Models

Last week at AGU, I presented the results of the project Steve Easterbrook and I worked on this summer. Click the thumbnail on the left for a full size PDF. Also, you can download the updated versions of our software diagrams:

  • COSMOS (COmmunity earth System MOdelS) 1.2.1
  • Model E: Oct. 11, 2011 snapshot
  • HadGEM3 (Hadley Centre Global Environmental Model, version 3): August 2009 snapshot
  • CESM (Community Earth System Model) 1.0.3
  • GFDL (Geophysical Fluid Dynamics Laboratory), Climate Model 2.1 coupled to MOM (Modular Ocean Model) 4.1
  • IPSL (Institut Pierre Simon Laplace), Climate Model 5A
  • UVic ESCM (Earth System Climate Model) 2.9

And, since the most important part of poster sessions is the spiel you give and the conversations you have, here is my spiel:

Steve and I realized that while comparisons of the output of global climate models are very common (for example, CMIP5: Coupled Model Intercomparison Project Phase 5), nobody has really sat down and compared their software structure. We tried to fill this gap in research with a qualitative comparison study of seven models. Six of them are GCMs (General Circulation Models – the most complex climate simulations) in the CMIP5 ensemble; one, the UVic model, is not in CMIP because it’s really more of an EMIC (Earth System Model of Intermediate Complexity – simpler than a GCM). However, it’s one of the most complex EMICs, and contains a full GCM ocean, so we thought it would present an interesting boundary case. (Also, the code was easier to get access to than the corresponding GCM from Environment Canada. When we write this up into a paper we will probably use that model instead.)

I created a diagram of each model’s architecture. The area of each bubble is roughly proportional to the lines of code in that component, which we think is a pretty good proxy for complexity – a more complex model will have more subroutines and functions than a simple one. The bubbles are to scale within each model, but not between models, as the total lines of code in a model varies by about a factor of 10. A bit difficult to fit on a poster and still make everything readable! Fluxes from each component are represented by coloured arrows (the same colour as the bubble), and often pass through the coupler before reaching another component.
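
For the curious, the scaling rule behind the bubbles fits in a few lines of Python. The line counts below are made up for illustration; only the square-root relationship between lines of code and radius matters:

```python
import math

# Hypothetical line counts per component -- illustrative only, not the
# real figures from the study.
lines_of_code = {"atmosphere": 300_000, "ocean": 120_000, "sea ice": 40_000}

def bubble_radius(loc):
    """Radius such that bubble AREA is proportional to lines of code."""
    return math.sqrt(loc / math.pi)

for component, loc in lines_of_code.items():
    print(f"{component:10s} radius = {bubble_radius(loc):7.1f}")
```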

We examined the amount of encapsulation of components, which varies widely between models. CESM, on one end of the spectrum, isolates every component completely, particularly in the directory structure. Model E, on the other hand, places nearly all of its files in the same directory, and has a much higher level of integration between components. This is more difficult for a user to read, but it has benefits for data transfer.

While component encapsulation is attractive from a software engineering perspective, it poses problems because the real world is not so encapsulated. Perhaps the best example of this is sea ice. It floats on the ocean, its extent changing continuously. It breaks up into separate chunks and can form slush with the seawater. How do you split up ocean code and ice code? CESM keeps the two components completely separate, with a transient boundary between them. IPSL represents ice as an encapsulated sub-component of their ocean model, NEMO (Nucleus for European Modeling of the Ocean). COSMOS integrates both ocean and ice code together in MPI-OM (Max Planck Institute Ocean Model).

GFDL took a completely different, and rather innovative, approach. Sea ice in the GFDL model is an interface, a layer over the ocean with boolean flags in each cell indicating whether or not ice is present. All fluxes to and from the ocean must pass through the “sea ice”, even if they’re at the equator and the interface is empty.
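
Here is a toy sketch of that idea – a 1-D grid, invented names, invented numbers, nothing from the actual GFDL code:

```python
import numpy as np

# Toy 1-D grid: True where sea ice is present, False elsewhere.
ice_present = np.array([True, True, False, False, False, True])
atm_to_ocean = np.array([5.0, 4.0, 10.0, 12.0, 11.0, 3.0])  # fluxes, made up

def through_ice_interface(flux, ice_mask, attenuation=0.2):
    """Every flux passes through the ice layer; where the mask is False,
    the 'interface' is empty and the flux goes through unchanged."""
    return np.where(ice_mask, flux * attenuation, flux)

print(through_ice_interface(atm_to_ocean, ice_present))
# Ice-covered cells are attenuated; open-ocean cells pass straight through.
```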

Encapsulation requires code to tie components together, since the climate system is so interconnected. Every model has a coupler, which fulfills two main functions: controlling the main time-stepping loop, and passing data between components. Some models, such as CESM, use the coupler for every interaction. However, if two components share the same grid, no interpolation is necessary, so it’s often simpler to pass the data directly between them. Sometimes this means a component can be completely disconnected from the coupler, such as the land model in IPSL; other times it still uses the coupler for other interactions, such as the HadGEM3 arrangement with direct ocean-ice fluxes but coupler-controlled ocean-atmosphere and ice-atmosphere fluxes.
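
To make the coupler’s two jobs concrete, here is a minimal sketch. Everything in it – the components, the grids, the “physics” – is a stand-in I invented for illustration; it is not code from any real model:

```python
# A minimal coupler sketch: one time-stepping loop that passes fields
# between toy components, interpolating only when grids differ.

def regrid(field, src_cells, dst_cells):
    """Placeholder interpolation: identical grids need no work at all."""
    if src_cells == dst_cells:
        return field
    return [sum(field) / len(field)] * dst_cells  # crude global average

def run_coupler(n_steps):
    atm = [1.0] * 4   # toy atmosphere state on a 4-cell grid
    ocn = [0.0] * 8   # toy ocean state on a finer 8-cell grid
    for _ in range(n_steps):
        forcing = regrid(atm, src_cells=4, dst_cells=8)    # job 2: pass data
        ocn = [o + 0.1 * f for o, f in zip(ocn, forcing)]  # stand-in physics
        atm = [a * 0.99 for a in atm]                      # stand-in physics
    return atm, ocn                                        # job 1: the loop

print(run_coupler(10))
```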

While it’s easy to see that some models are more complex than others, it’s also interesting to look at the distribution of complexity within a model. Often the bulk of the code is concentrated in one component, due to historical software development as well as the institution’s conscious goals. Most of the models are atmosphere-centric, since they were created in the 1970s when numerical weather prediction was the focus of the Earth system modelling community. Weather models require a very complex atmosphere but not a lot else, so atmospheric routines dominated the code. Over time, other components were added, but the atmosphere remained at the heart of the models. The most extreme example is HadGEM3, which actually uses the same atmosphere model for both weather prediction and climate simulations!

The UVic model is quite different. The University of Victoria is on the west coast of Canada, and does a lot of ocean studies, so the model began as a branch of the MOM ocean model from GFDL. The developers could have coupled it to a complex atmosphere model in an effort to mimic full GCMs, but they consciously chose not to. Atmospheric routines need very short time steps, so they eat up most of the run time and make very long simulations infeasible. In an effort to keep their model fast, UVic created EMBM (Energy Moisture Balance Model), an extremely simple atmospheric model (for example, it doesn’t include dynamic precipitation – it simply rains as soon as a certain humidity is reached). Since the ocean is the primary moderator of climate over the long run, the UVic ESCM still outputs global long-term averages that match up nicely with GCM results.
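
A threshold precipitation scheme really is as simple as it sounds. Here is a toy version (the threshold value below is my own invention, not the one EMBM actually uses):

```python
# A toy version of threshold precipitation, as I understand the EMBM scheme:
# no dynamics, just "rain out anything above a fixed relative humidity".
# The threshold value below is invented for illustration.

RH_THRESHOLD = 0.85

def step_precipitation(relative_humidity):
    """Return (new_humidity, precipitation) for one time step."""
    if relative_humidity > RH_THRESHOLD:
        excess = relative_humidity - RH_THRESHOLD
        return RH_THRESHOLD, excess  # all excess moisture rains out at once
    return relative_humidity, 0.0    # below the threshold: no rain at all

print(step_precipitation(0.92))  # -> (0.85, ~0.07)
print(step_precipitation(0.60))  # -> (0.6, 0.0)
```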

Finally, CESM and Model E could not be described as “land-centric”, but land is definitely catching up – it’s even surpassed the ocean model in both cases! These two GCMs are cutting-edge in terms of carbon cycle feedbacks, which are primarily terrestrial, and likely very important in determining how much warming we can expect in the centuries to come. They are currently poorly understood and difficult to model, so they are a new frontier for Earth system modelling. Scientists are moving away from a binary atmosphere-ocean paradigm and towards a more comprehensive climate system representation.

I presented this work to some computer scientists in the summer, and many of them asked, “Why do you need so many models? Wouldn’t it be better to just have one really good one that everyone collaborated on?” It might be simpler from a software engineering perspective, but for the purposes of science, a variety of diverse models is actually better. It means you can pick and choose which model suits your experiment. Additionally, it increases our confidence in climate model output, because if dozens of independent models are saying the same thing, they’re more likely to be accurate than if just one model made a given prediction. Diversity in model architecture arguably produces the software engineering equivalent of perturbed physics, although it’s not systematic or deliberate.
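
The statistical intuition behind that last point is easy to demonstrate. In the sketch below the projections are numbers I made up; the point is that agreement across independently built models, not any single output, is what raises confidence:

```python
import statistics

# Hypothetical warming projections (degrees C) from an ensemble of
# independently built models -- numbers invented purely for illustration.
projections = [2.8, 3.1, 2.5, 3.4, 2.9, 3.0, 3.2]

mean = statistics.mean(projections)
spread = statistics.stdev(projections)
print(f"ensemble mean = {mean:.2f} C, spread = {spread:.2f} C")
# A small spread across independent models is what raises confidence in
# the result -- not the output of any single model.
```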

A common question people asked me at AGU was, “Which model do you think is the best?” This question is impossible to answer, because it depends on how you define “best”, which depends on what experiment you are running. Are you looking at short-term, regional impacts at a high resolution? HadGEM3 would be a good bet. Do you want to know what the world will be like in the year 5000? Go for UVic, otherwise you will run out of supercomputer time! Are you studying feedbacks, perhaps the Paleocene-Eocene Thermal Maximum? A good choice would be CESM. So you see, every model is the best at something, and no model can be the best at everything.

You might think the ideal climate model would mimic the real world perfectly. It would still have discrete grid cells and time steps, but it would be like a digital photo, where the pixels are so small that it looks continuous even when you zoom in. It would contain every single Earth system process known to science, and would represent their connections and interactions perfectly.

Such a model would also be a nightmare to use and develop. It would run slower than real time, making predictions of the future useless. The code would not be encapsulated, so organizing teams of programmers to work on certain aspects of the model would be nearly impossible. It would use more memory than computer hardware offers us – despite the speed of computers these days, they’re still too slow for many scientific models!
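
Some rough arithmetic shows how quickly a “photo-like” grid blows up. Every number here is an illustrative assumption of mine:

```python
# Rough arithmetic behind "more memory than hardware offers": a
# near-continuous global grid gets expensive fast. All numbers here are
# illustrative assumptions.

earth_surface_km2 = 510e6   # ~5.1e8 square kilometres of Earth surface
cell_km2 = 1.0              # 1 km x 1 km cells, "photo-like" resolution
vertical_levels = 100       # layers of atmosphere and ocean
variables_per_cell = 10     # temperature, winds, humidity, and so on
bytes_per_value = 8         # double precision

total_bytes = (earth_surface_km2 / cell_km2) * vertical_levels \
    * variables_per_cell * bytes_per_value
print(f"~{total_bytes / 1e12:.0f} TB just to hold one snapshot of the state")
```

And that is a single time step, before any history is stored or any arithmetic is done on it.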

We need to balance complexity with feasibility. A hierarchy of complexity is important, as is a variety of models to choose from. Perfectly reproducing the system we’re trying to model actually isn’t the ultimate goal.

Please leave your questions below, and hopefully we can start a conversation – sort of a virtual poster session!

Labels

For a long time I have struggled with what to call the people who insist that climate change is natural/nonexistent/a global conspiracy. “Skeptics” is their preferred term, but I refuse to give such a compliment to those who don’t deserve it. Skepticism is a good thing in science, and it’s not being applied by self-professed “climate skeptics”. This worthy label has been hijacked by those who seek to redefine it.

“Deniers” is more accurate, in my opinion, but I feel uncomfortable using it. I don’t want to appear closed-minded and alienate those who are confused or undecided. Additionally, many people in the audience of deniers aren’t in denial themselves. They repeat the myths they hear from other sources, but you can easily talk them out of their misconceptions using evidence.

I posed this question to some people at AGU. Which word did they use? “Pseudoskeptics” and “misinformants” are both accurate terms, but too difficult for a new reader to understand. My favourite answer, which I think I will adopt, was “contrarians”. Simple, clear, and non-judgmental. It emphasizes what they think, not how they think. Also, it hints that they are going against the majority in the scientific community. Another good suggestion was to say someone is “in denial”, rather than “a denier” – it depersonalizes the accusation.

John Cook, when I asked him this question, turned it around: “What should we call ourselves?” he asked, and I couldn’t come up with an answer. I feel that not being a contrarian is a default position that doesn’t require a qualifier. We are just scientists, communicators, and concerned citizens, and unless we say otherwise you can assume we follow the consensus. (John thinks we should call ourselves “hotties”, but apparently it hasn’t caught on.)

“What should I call myself?” is another puzzler, since I fall into multiple categories. Officially I’m an undergrad student, but I’m also getting into research, which isn’t a required part of undergraduate studies. In some ways I am a journalist too, but I see that as a side project rather than a career goal. So I can’t call myself a scientist, or even a fledgling scientist, but I feel like I’m on that path – a scientist larva, perhaps?

Thoughts?

General Thoughts on AGU

I returned home from the AGU Fall Meeting last night, and after a good night’s sleep I am almost recovered – it’s amazing how tired science can make you!

The whole conference felt sort of surreal. Meeting and conversing with others was definitely the best part. I shook the hand of James Hansen and assured him that he is making a difference. I talked about my research with Gavin Schmidt. I met dozens of people who were previously just names on a screen, from top scientists like Michael Mann and Ben Santer to fellow bloggers like Michael Tobis and John Cook.

I filled most of a journal with notes I took during presentations, and saw literally hundreds of posters. I attended a workshop on climate science communication, run by Susan Joy Hassol and Richard Somerville, which fundamentally altered my strategies for public outreach. Be sure to check out their new website, and their widely acclaimed Physics Today paper that summarizes most of their work.

Speaking of fabulous communication, take a few minutes to watch this memorial video for Stephen Schneider – it’s by the same folks who turned Bill McKibben’s article into a video:

AGU inspired so many posts that I think I will publish something every day this week. Be sure to check back often!

AGU 2011

I know that many of you will be at the annual American Geophysical Union conference next week in San Francisco. If so, I’d invite you to come by and take a look at our poster! It will be up all Thursday morning in Halls A-C, Moscone South. I will be around for at least part of the morning to chat and answer questions.

You can view an electronic version of our poster, as well as read our abstract and leave comments, on the new AGU ePosters site.

Hope to see some of you next week!

Uncertainty

Part 5 in a series of 5 for NextGen Journal
Read Part 1, Part 2, Part 3, and Part 4

Scientists can never say that something is 100% certain, but they can come pretty close. After a while, a theory becomes so strong that the academic community accepts it and moves on to more interesting problems. Replicating an experiment for the thousandth time just isn’t a good use of scientific resources. For example, conducting a medical trial to confirm that smoking increases one’s risk of cancer is no longer very useful; we covered that decades ago. Instead, a medical trial to test the effectiveness of different strategies to help people quit smoking will lead to much greater scientific and societal benefit.

In the same manner, scientists have known since the 1970s that human emissions of greenhouse gases are exerting a warming force on the climate. More recently, the warming started to show up, in certain patterns that confirm it is caused by our activities. These facts are no longer controversial in the scientific community (the opinion pages of newspapers are another story, though). While they will always have a tiny bit of uncertainty, it’s time to move on to more interesting problems. So where are the real uncertainties? What are the new frontiers of climate science?

First of all, projections of climate change depend on what the world decides to do about climate change – a metric that is more uncertain than any of the physics underlying our understanding of the problem. If we collectively step up and reduce our emissions, both quickly and significantly, the world won’t warm too much. If we ignore the problem and do nothing, it will warm a great deal. At this point, our actions could go either way.

Additionally, even though we know the world is going to warm, we don’t know exactly how much, even given a particular emission scenario. We don’t know exactly how sensitive the climate system is, because it’s quite a complex beast. However, using climate models and historical data, we can get an idea. Here is a probability density function for climate sensitivity: the greater the area under the curve over an interval of the x-axis, the greater the probability that the climate sensitivity falls within that interval (IPCC, 2007):

This curve shows us that climate sensitivity is most likely around 3 degrees Celsius for every doubling of atmospheric carbon dioxide, since that’s where the curve peaks. There’s a small chance that it’s less than that, so the world might warm a little less. But there’s a greater chance that climate sensitivity is greater than 3 degrees, so the world will warm more. So this graph tells us something kind of scary: if we’re wrong about climate sensitivity being about 3 degrees, we’re probably wrong in the direction we don’t want – that is, the problem being worse than we expect. This metric has a lot to do with positive feedbacks (“vicious cycles” of warming) in the climate system.
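
To see how that asymmetry plays out, here is a toy calculation. The actual IPCC curve is not any simple named distribution, so I am substituting a right-skewed lognormal with its peak at 3 degrees – purely a stand-in:

```python
import math
import random

# Stand-in for the sensitivity PDF: a right-skewed lognormal whose mode
# (peak) sits at 3 C. The real IPCC curve is not lognormal; this only
# illustrates the asymmetry described above.
SIGMA = 0.3
MU = math.log(3.0) + SIGMA**2  # places the lognormal's mode at exactly 3 C

samples = [random.lognormvariate(MU, SIGMA) for _ in range(100_000)]
p_above_3 = sum(s > 3.0 for s in samples) / len(samples)
print(f"P(sensitivity > 3 C) ~ {p_above_3:.2f}")  # noticeably above 0.5
```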

Another area of uncertainty is precipitation. Temperature is a lot easier to forecast than precipitation, both regionally and globally. With global warming, the extra thermal energy in the climate system will lead to more water in the air, so there will be more precipitation overall – but the extra energy also increases evaporation of surface water. Some areas will experience flooding, and some will experience drought; many areas will experience some of each, depending on the time of year. In summary, we will have more of each extreme when it comes to precipitation, but the when and where is highly uncertain.
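
The “more water in the air” part follows from basic physics: the amount of moisture air can hold grows roughly exponentially with temperature. A quick check using Tetens’ empirical formula (my choice of approximation, not anything from the post above):

```python
import math

def saturation_vapour_pressure(temp_c):
    """Saturation vapour pressure over water in kPa (Tetens formula)."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

for t in (10, 11):
    print(t, round(saturation_vapour_pressure(t), 3))
# Each extra degree adds roughly 6-7% more moisture capacity to the air.
```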

Scientists are also unsure about the rate and extent of future sea level rise. Warming causes the sea to rise for two different reasons:

  1. Water expands as it warms, which is easy to model (see the sketch after this list);
  2. Glaciers and ice sheets melt and fall into the ocean, which is very difficult to model.
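
The first reason really is back-of-envelope material. A rough sketch, where every value is a round illustrative assumption rather than a research-grade number:

```python
# Back-of-envelope thermal expansion: the "easy to model" part of sea
# level rise. All values are round illustrative assumptions.

alpha = 2.0e-4          # thermal expansion coefficient of seawater, per C
effective_depth = 700   # metres of ocean assumed to take up the warming
delta_T = 1.0           # warming of that layer, in degrees C

rise = alpha * delta_T * effective_depth  # metres
print(f"thermal expansion alone: ~{rise * 100:.0f} cm of sea level rise")
```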

If we cause the Earth to warm indefinitely, all the ice in the world will turn into water, but we won’t get that far (hopefully). So how much ice will melt, and how fast will it go? This depends on feedbacks in the climate system, glacial dynamics, and many other phenomena that are quantitatively poorly understood.

These examples of uncertainty in climate science, just a few of many, don’t give us an excuse to do nothing about the problem. As Brian, a Master’s student from Canada, wrote, “You don’t have to have the seventh decimal place filled in to see that the number isn’t looking good.” We know that there is a problem, and it might be somewhat better or somewhat worse than scientists are currently predicting, but it won’t go away. As we noted above, in many cases it’s more likely to be worse than it is to be better. Even a shallow understanding of the implications of “worse” should be enough for anyone to see the necessity of action.

My Dishpan Climate Model

About two years ago, I discovered the concept of “dishpan climate models”, through Iain Stewart’s Climate Wars documentary. The experiment is pretty simple: a large bowl filled with water (representing one hemisphere of the Earth) with a block of ice in the middle (a polar region) rotates on a turntable with a Bunsen burner (the Sun) heating it from one side. By injecting some dye into the water, you can see regular currents from heat transport and the Coriolis effect. Spencer Weart dug up some fascinating results from the days when dishpan climate models were the only sort available: researchers were able to simulate the Hadley circulation, Rossby waves, and the Gulf Stream.

I wanted to try this out for myself. Iain Stewart had made it look easy enough, and he got some really neat currents flowing. So one Saturday afternoon a friend and I got to work in my kitchen.

We started by figuring out how to rotate the bowl. My family doesn’t own a record player, so we couldn’t use that as a turntable. We tried to rig something up out of an old toy helicopter motor, but it wasn’t strong enough. Eventually we settled for a Lazy Susan which we spun by hand. It wasn’t a constant rotation, but it would have to do.

Then Antarctica, which consisted of a handful of ice cubes, kept floating away from the centre of the bowl. Soon the ice cubes melted and there were none left in the freezer. We filled a Ziploc bag with frozen corn, which wasn’t quite as buoyant, and used that for Antarctica instead.

Unsurprisingly, there was no Bunsen burner in my kitchen cupboard, so the Sun was represented by a paraffin candle that sort of smelled like cinnamon.

The only serious problem remaining was the dye. Every kind of dye we tried – food colouring, milk, food colouring mixed with milk – would completely homogenize with the water after just a few rotations, so all the currents were invisible.

The only liquid in my kitchen that wouldn’t mix with water was vegetable oil, so we dyed some of it blue and poured it in. This was a really really bad idea. The oil seemed to be attracted to the plastic bag keeping Antarctica together, so it all washed up onto the continent like some kind of awful petroleum spill in the Antarctic Ocean.

At that point, our climate model looked like this:

I would like to try this again some day, perhaps when I have access to a better laboratory than my kitchen. Any ideas for improvement (besides the obvious)? In particular, what kind of dye does work, and how does Antarctica stay together without being encased in plastic?

A Vast Machine

I read Paul Edwards’ A Vast Machine this summer while working with Steve Easterbrook. It was highly relevant to my research, but I would recommend it to anyone interested in climate change or mathematical modelling. Think The Discovery of Global Warming, but more specialized.

Much of the public seems to perceive observational data as superior to scientific models. The U.S. government has even attempted to mandate that research institutions focus on data above models, as if data were somehow more trustworthy. This is not the case. Data can have just as many problems as models, and when the two disagree, either could be wrong. For example, in a high school physics lab, I once calculated the acceleration due to gravity to be about 30 m/s². There was nothing wrong with Newton’s Laws of Motion – our instrumentation was just faulty.

Additionally, data and models are inextricably linked. In meteorology, GCMs produce forecasts from observational data, but that same data from surface stations was fed through a series of algorithms – a model for interpolation – to make it cover an entire region. “Without models, there are no data,” Edwards proclaims, and he makes a convincing case.
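
To make “without models, there are no data” concrete, here is a minimal interpolation model of the sort Edwards describes – inverse-distance weighting, with station coordinates and readings that I invented:

```python
import math

# Invented station coordinates and temperature readings.
stations = [((0.0, 0.0), 15.2), ((10.0, 0.0), 14.1), ((0.0, 10.0), 16.8)]

def idw(point, stations, power=2):
    """Inverse-distance-weighted estimate at an unobserved point."""
    num = den = 0.0
    for (x, y), value in stations:
        d = math.hypot(point[0] - x, point[1] - y)
        if d == 0:
            return value  # exactly on a station: use its reading directly
        weight = 1.0 / d**power
        num += weight * value
        den += weight
    return num / den

# A "data" value for a place nobody measured -- it only exists via a model.
print(idw((5.0, 5.0), stations))
```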

The majority of the book discussed the history of climate modelling, from the 1800s until today. There was Arrhenius, followed by Ångström, who seemed to discredit the entire greenhouse theory, which was not revived until Callendar came along in the 1930s with a better spectroscope. There was the question of the ice ages, and the mistaken perception that forcing from CO2 and forcing from orbital changes (the Milankovitch model) were mutually exclusive.

For decades, those who studied the atmosphere were split into three groups, with three different strategies. Forecasters needed speed in their predictions, so they used intuition and historical analogues rather than numerical methods. Theoretical meteorologists wanted to understand weather using physics, but numerical methods for solving differential equations didn’t exist yet, so nothing was actually calculated. Empiricists thought the system was too complex for any kind of theory, so they just described climate using statistics, and didn’t worry about large-scale explanations.

The three groups began to merge as the computer age dawned and large amounts of calculations became feasible. Punch-cards came first, speeding up numerical forecasting considerably, but not enough to make it practical. ENIAC ran the first forecast model on a digital computer, with simulations running about as fast as real time (today the same model can run on a phone, and 24 hours are simulated in less than a second).

Before long, theoretical meteorologists “inherited” the field of climatology. Large research institutions, such as NCAR, formed in an attempt to pool computing resources. With incredibly simplistic models and primitive computers (2-3 KB storage), the physicists were able to generate simulations that looked somewhat like the real world: Hadley cells, trade winds, and so on.

There were three main fronts for progress in atmospheric modelling: better numerical methods, which decreased errors from approximation; higher resolution models with more gridpoints; and higher complexity, including more physical processes. As well as forecast GCMs, which are initialized with observations and run at maximum resolution for about a week of simulated time, scientists developed climate GCMs. These didn’t use any observational data at all; instead, the “spin-up” process fed known forcings into a static Earth, started the planet spinning, and waited until it settled down into a complex climate and circulation that looked a lot like the real world. There was still tension between empiricism and theory in models, as some factors were parameterized rather than being included in the spin-up.
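
Here is a cartoon of the spin-up procedure in code. The “physics” is simple relaxation toward a fixed equilibrium – my stand-in, nothing like a real GCM’s dynamics:

```python
# Cartoon spin-up: integrate under fixed forcing until the global mean
# temperature stops drifting. All values are illustrative assumptions.

EQUILIBRIUM_TEMP = 14.0  # target global mean temperature, C (illustrative)
TOLERANCE = 1e-4         # drift threshold per step, in degrees C

def spin_up(initial_temp=0.0, relaxation=0.05, max_steps=10_000):
    temp = initial_temp
    for step in range(max_steps):
        new_temp = temp + relaxation * (EQUILIBRIUM_TEMP - temp)
        if abs(new_temp - temp) < TOLERANCE:
            return new_temp, step  # climate has settled down: spin-up done
        temp = new_temp
    raise RuntimeError("model still drifting after max_steps")

print(spin_up())  # -> (roughly 14.0, the number of steps it took)
```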

The Cold War, despite what it did to international relations, brought tremendous benefits to atmospheric science. Much of our understanding of the atmosphere and the observation infrastructure traces back to this period, when governments were monitoring nuclear fallout, spying on enemy countries with satellites, and considering small-scale geoengineering as warfare.

I appreciated how up-to-date this book was, as it discussed AR4, the MSU “satellites show cooling!” controversy, Watts Up With That, and the Republican anti-science movement. In particular, Edwards emphasized the distinction between skepticism for scientific purposes and skepticism for political purposes. “Does this mean we should pay no attention to alternative explanations or stop checking the data?” he writes. “As a matter of science, no… As a matter of policy, yes.”

Another passage beautifully sums up the entire narrative: “Like data about the climate’s past, model predictions of its future shimmer. Climate knowledge is probabilistic. You will never get a single definitive picture, either of exactly how much the climate has already changed or of how much it will change in the future. What you will get, instead, is a range. What the range tells you is that ‘no change at all’ is simply not in the cards, and that something closer to the high end of the range – a climate catastrophe – looks all the more likely as time goes on.”