Modelling Geoengineering, Part II

Near the end of my summer at the UVic Climate Lab, all the scientists seemed to go on vacation at the same time and we summer students were left to our own devices. I was instructed to teach Jeremy, Andrew Weaver’s other summer student, how to use the UVic climate model – he had been working with weather station data for most of the summer, but was interested in Earth system modelling too.

Jeremy caught on quickly to the basics of configuration and I/O, and after only a day or two, we wanted to do something more exciting than the standard test simulations. Remembering an old post I wrote, I dug up this paper (open access) by Damon Matthews and Ken Caldeira, which modelled geoengineering by reducing incoming solar radiation uniformly across the globe. We decided to replicate their method on the newest version of the UVic ESCM, using the four RCP scenarios in place of the old A2 scenario. We only took CO2 forcing into account, though: other greenhouse gases would have been easy enough to add in, but sulphate aerosols are spatially heterogeneous and would complicate the algorithm substantially.

Since we were interested in the carbon cycle response to geoengineering, we wanted to prescribe CO2 emissions, rather than concentrations. However, the RCP scenarios prescribe concentrations, so we had to run the model with each concentration trajectory and find the equivalent emissions timeseries. Since the UVic model includes a reasonably complete carbon cycle, it can “diagnose” emissions by calculating the change in atmospheric carbon, subtracting contributions from land and ocean CO2 fluxes, and assigning the residual to anthropogenic sources.
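The bookkeeping behind diagnosed emissions is simple enough to sketch. This is a minimal illustration of the residual calculation, not the UVic model’s actual code – the function name, sign conventions, and numbers here are all hypothetical:

```python
def diagnose_emissions(d_atmos_carbon, land_to_atmos_flux, ocean_to_atmos_flux):
    """Residual anthropogenic emissions for one timestep (PgC/yr).

    d_atmos_carbon: simulated change in atmospheric carbon
    land_to_atmos_flux / ocean_to_atmos_flux: net natural fluxes into
      the atmosphere (negative when the sink is absorbing carbon)

    Whatever change in atmospheric carbon is not explained by the land
    and ocean fluxes is assigned to anthropogenic sources.
    """
    return d_atmos_carbon - land_to_atmos_flux - ocean_to_atmos_flux

# Example: the atmosphere gained 4 PgC this year while land and ocean
# each absorbed 2.5 PgC (flux into atmosphere = -2.5), so the diagnosed
# emissions are 4 - (-2.5) - (-2.5) = 9 PgC.
print(diagnose_emissions(4.0, -2.5, -2.5))  # -> 9.0
```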

After a few failed attempts to represent geoengineering without editing the model code (e.g., altering the volcanic forcing input file), we realized it was unavoidable. Model development is always a bit of a headache, but it makes you feel like a superhero when everything falls into place. The job was fairly small – just a few lines that culminated in equation 1 from the original paper – but it still took several hours to puzzle through the necessary variable names and header files! Essentially, every timestep the model calculates the forcing from CO2 and reduces incoming solar radiation to offset that, taking changing planetary albedo into account. When we were confident that the code was working correctly, we ran all four RCPs from 2006-2300 with geoengineering turned on. The results were interesting (see below for further discussion) but we had one burning question: what would happen if geoengineering were suddenly turned off?
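Equation 1 of the original paper amounts to converting the CO2 forcing into an equivalent cut in incoming sunlight. Here is a rough per-timestep sketch using the standard logarithmic forcing approximation; the real model code is quite different, and every name below is invented for illustration:

```python
import math

def solar_reduction(co2_ppm, albedo, co2_ref_ppm=280.0):
    """Reduction in the solar constant (W/m^2) needed to offset CO2
    radiative forcing, in the spirit of Matthews & Caldeira's approach.

    Uses the common approximation F = 5.35 * ln(C/C0) for CO2 forcing.
    The factor 4 / (1 - albedo) converts a global-mean forcing into an
    equivalent change in incoming solar radiation: the 4 accounts for
    the sphere-to-disc geometry, and (1 - albedo) for the fraction of
    sunlight actually absorbed by the planet.
    """
    forcing = 5.35 * math.log(co2_ppm / co2_ref_ppm)
    return 4.0 * forcing / (1.0 - albedo)

# Doubled CO2 (~3.7 W/m^2 of forcing) with planetary albedo 0.3 needs
# roughly a 21 W/m^2 cut to the solar constant (~1.5% of its value).
print(round(solar_reduction(560.0, 0.3), 1))  # -> 21.2
```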

By this time, having completed several thousand years of model simulations, we realized that we were getting a bit carried away. But nobody else had models in the queue – again, they were all on vacation – so our simulations were running three times faster than normal. Using restart files (written every 100 years) as our starting point, we instantaneously turned off geoengineering for RCPs 6.0 and 8.5, both 100 years and 200 years into the simulations.

Results

As in previous experiments, our representation of geoengineering still led to sizable regional climate changes. Although average global temperatures fell to preindustrial levels, the poles remained warmer than preindustrial while the tropics were cooler:

Also, nearly everywhere on the globe became drier than in preindustrial times. Subtropical areas were particularly hard-hit. I suspect that some of the drying over the Amazon and the Congo is due to deforestation since preindustrial times, though:

Jeremy also made some plots of key one-dimensional variables for RCP8.5, showing the results of no geoengineering (i.e. the regular RCP – yellow), geoengineering for the entire simulation (red), and geoengineering turned off in 2106 (green) or 2206 (blue):

It only took about 20 years for average global temperature to fall back to preindustrial levels. Changes in solar radiation definitely work quickly. Unfortunately, changes in the other direction work quickly too: shutting off geoengineering overnight led to rates of warming of up to 5 C / decade, as the climate system finally reacted to all the extra CO2. To put that in perspective, we’re currently warming at around 0.2 C / decade, a rate that already far surpasses historical climate changes like the Ice Ages.

Sea level rise (due to thermal expansion only – the ice sheet component of the model isn’t yet fully implemented) is directly related to temperature, but changes extremely slowly. When geoengineering is turned off, the reversals in sea level trajectory look more like linear offsets from the regular RCP.

Sea ice area, in contrast, reacts quite quickly to changes in temperature. Note that this data gives annual averages, rather than annual minimums, so we can’t tell when the Arctic Ocean first becomes ice-free. Also, note that sea ice area is declining ever so slightly even with geoengineering – this is because the poles are still warming a little bit, while the tropics cool.

Things get really interesting when you look at the carbon cycle. Geoengineering actually reduced atmospheric CO2 concentrations compared to the regular RCP. This was expected, due to the dual nature of carbon cycle feedbacks. Geoengineering allows natural carbon sinks to enjoy all the benefits of high CO2 without the associated drawbacks of high temperatures, and these sinks become stronger as a result. From looking at the different sinks, we found that the sequestration was due almost entirely to the land, rather than the ocean:

In this graph, positive values mean that the land is a net carbon sink (absorbing CO2), while negative values mean it is a net carbon source (releasing CO2). Note the large negative spikes when geoengineering is turned off: the land, adjusting to the sudden warming, spits out much of the carbon that it had previously absorbed.

Within the land component, we found that the strengthening carbon sink was due almost entirely to soil carbon, rather than vegetation:

This graph shows total carbon content, rather than fluxes – think of it as the integral of the previous graph, but discounting vegetation carbon.

Finally, the lower atmospheric CO2 led to lower dissolved CO2 in the ocean, and alleviated ocean acidification very slightly. Again, this benefit quickly went away when geoengineering was turned off.

Conclusions

Is geoengineering worth it? I don’t know. I can certainly imagine scenarios in which it’s the lesser of two evils, and find it plausible (even probable) that we will reach such a scenario within my lifetime. But it’s not something to undertake lightly. As I’ve said before, desperate governments are likely to use geoengineering whether or not it’s safe, so we should do as much research as possible ahead of time to find the safest form of implementation.

The modelling of geoengineering is in its infancy, and I have a few ideas for improvement. In particular, I think it would be interesting to use a complex atmospheric chemistry component to allow for spatial variation in the forcing reduction through sulphate aerosols: increase the aerosol optical depth over one source country, for example, and let it disperse over time. I’d also like to try modelling different kinds of geoengineering – sulphate aerosols as well as mirrors in space and iron fertilization of the ocean.

Jeremy and I didn’t research anything that others haven’t, so this project isn’t original enough for publication, but it was a fun way to stretch our brains. It was also a good topic for a post, and hopefully others will learn something from our experiments.

Above all, leave over-eager summer students alone at your own risk. They just might get into something like this.

Since I Last Wrote…

Since I last wrote, I finished my summer research at Andrew Weaver’s lab (more on that in the weeks and months to come, as our papers work through peer review). I moved back home to the Prairies, which seem unnaturally hot, flat and dry compared to BC. Perhaps what I miss most is the ocean – the knowledge that the nearest coastline is more than a thousand kilometres away gives me an uncomfortable feeling akin to claustrophobia.

During that time, the last story I covered has developed significantly. Before September even began, Arctic sea ice extent reached record low levels. It’s currently well below the previous record, held in 2007, and will continue to decline for two or three more weeks before it levels off:

Finally, El Niño conditions are beginning to emerge in the Pacific Ocean. In central Canada we are celebrating, because El Niño tends to produce warmer-than-average winters (although last winter was mysteriously warm despite the cooling influence of La Niña – not a day below -30 C!) The impacts of El Niño are different all over the world, but overall it tends to boost global surface temperatures. Combine this effect with the current ascent from a solar minimum and the stronger-than-ever greenhouse gas forcing, and it looks likely that 2013 will break global temperature records. That’s still a long way away, though, and who knows what will happen before then?

A Bad Situation in the Arctic

Arctic sea ice is in the midst of a record-breaking melt season. This is yet another symptom of human-caused climate change progressing much faster than scientists anticipated.

Every year, the frozen surface of the Arctic Ocean waxes and wanes, covering the largest area in February or March and the smallest in September. Over the past few decades, these September minima have been getting smaller and smaller. The lowest sea ice extent on record occurred in 2007, followed closely by 2011, 2008, 2010, and 2009. That is, the five lowest years on record all happened in the past five years. While year-to-year weather conditions, like summer storms, impact the variability of Arctic sea ice cover, the undeniable downward trend can only be explained by human-caused climate change.

The 2012 melt season started off hopefully, with April sea ice extent near the 1979-2000 average. Then things took a turn for the worse, and sea ice was at record or near-record low conditions for most of the summer. In early August, a storm spread out the remaining ice, exacerbating the melt. Currently, sea ice is significantly below the previous record for this time of year. See the light blue line in the figure below:

Sea ice extent is already the fifth-lowest on record for any day of the year – and the worst part is, it will keep melting for about another month. At this rate, it’s looking pretty likely that we’ll break the 2007 record and hit an all-time low in September. Sea ice volume, not just extent, is in the same situation.

Computer models of the climate system have a difficult time reproducing this sudden melt. As recently as 2007, the absolute worst-case projections showed summer Arctic sea ice disappearing around 2100. Based on observations, scientists are now confident that will happen well before 2050, and possibly within a decade. Climate models, which many pundits like to dismiss as “alarmist,” actually underestimated the severity of the problem. Uncertainty cuts both ways.

The impacts of an ice-free Arctic Ocean will be wide-ranging and severe. Luckily, melting sea ice does not contribute to sea level rise (only landlocked ice does, such as the Greenland and Antarctic ice sheets), but many other problems remain. The Inuit peoples of the north, who depend on sea ice for hunting, will lose an essential source of food and culture. Geopolitical tensions regarding ownership of the newly-accessible Arctic waters are likely. Changes to the Arctic food web, from blooming phytoplankton to dwindling polar bears, will irreversibly alter the ecosystem. While scientists don’t know exactly what this new Arctic will look like, it is certain to involve a great deal of disruption and suffering.

Daily updates on Arctic sea ice conditions are available from the NSIDC website.

More on Phytoplankton

On the heels of my last post about iron fertilization of the ocean, I found another interesting paper on the topic. This one, written by Long Cao and Ken Caldeira in 2010, was much less hopeful.

Instead of a small-scale field test, Cao and Caldeira decided to model iron fertilization using the ocean GCM from Lawrence Livermore National Laboratory. To account for uncertainties, they chose to calculate an upper bound on iron fertilization rather than a most likely scenario. That is, they maxed out phytoplankton growth until something else became the limiting factor – in this case, phosphates. On every single cell of the sea surface, the model phytoplankton were programmed to grow until phosphate concentrations were zero.

A 2008-2100 simulation implementing this method was forced with CO2 emissions data from the A2 scenario. An otherwise identical A2 simulation did not include the ocean fertilization, to act as a control. Geoengineering modelling is strange that way, because there are multiple definitions of “control run”: a non-geoengineered climate that is allowed to warm unabated, as well as preindustrial conditions (the usual definition in climate modelling).

Without any geoengineering, atmospheric CO2 reached 965 ppm by 2100. With the maximum amount of iron fertilization possible, these levels only fell to 833 ppm. The mitigation of ocean acidification was also quite modest: the sea surface pH in 2100 was 7.74 without geoengineering, and 7.80 with. Given the potential side effects of iron fertilization, is such a small improvement worth the trouble?

Unfortunately, the ocean acidification doesn’t end there. Although the problem was lessened somewhat at the surface, deeper layers in the ocean actually became more acidic. There was less CO2 being gradually mixed in from the atmosphere, but another source of dissolved carbon appeared: as the phytoplankton died and sank, they decomposed a little bit and released enough CO2 to cause a net decrease in pH compared to the control run.

In the diagram below, compare the first row (A2 control run) to the second (A2 with iron fertilization). The more red the contours are, the more acidic that layer of the ocean is with respect to preindustrial conditions. The third row contains data from another simulation in which emissions were allowed to increase just enough to offset sequestration by phytoplankton, leading to the same CO2 concentrations as the control run. The general pattern – iron fertilization reduces some acidity at the surface, but increases it at depth – is clear.

depth vs. latitude at 2100 (left); depth vs. time (right)

The more I read about geoengineering, the more I realize how poor the associated cost-benefit ratios might be. The oft-repeated assertion is true: the easiest way to prevent further climate change is, by a long shot, to simply reduce our emissions.

Feeding the Phytoplankton

While many forms of geoengineering involve counteracting global warming with induced cooling, others move closer to the source of the problem and target the CO2 increase. By artificially boosting the strength of natural carbon sinks, it might be possible to suck CO2 emissions right out of the air. Currently around 30% of human emissions are absorbed by these sinks; if we could make this metric greater than 100%, atmospheric CO2 concentrations would decline.
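A back-of-the-envelope version of that arithmetic, assuming the standard conversion of roughly 2.13 PgC of atmospheric carbon per ppm of CO2 (the function and the numbers plugged into it are illustrative only):

```python
PPM_PER_PGC = 1.0 / 2.13  # ~2.13 PgC of atmospheric carbon per ppm of CO2

def d_co2_ppm(emissions_pgc, sink_fraction):
    """Annual change in atmospheric CO2 (ppm) for a given sink fraction.

    sink_fraction is the fraction of emissions absorbed by natural
    carbon sinks; above 1.0, the sinks absorb more than we emit and
    atmospheric concentrations decline.
    """
    airborne = emissions_pgc * (1.0 - sink_fraction)
    return airborne * PPM_PER_PGC

# ~30% of 10 PgC/yr absorbed: CO2 still rises by ~3.3 ppm/yr.
print(round(d_co2_ppm(10.0, 0.3), 2))
# Sinks boosted past 100% of emissions: CO2 falls.
print(round(d_co2_ppm(10.0, 1.2), 2))
```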

One of the most prominent proposals for carbon sink enhancement involves enlisting phytoplankton, photosynthetic organisms in the ocean which take the carbon out of carbon dioxide and use it to build their bodies. When nutrients are abundant, phytoplankton populations explode and create massive blue or green blooms visible from space. Very few animals enjoy eating these organisms, so they just float there for a while. Then they run out of nutrients, die, and sink to the bottom of the ocean, taking the carbon with them.

Phytoplankton blooms are a massive carbon sink, but they still can’t keep up with human emissions. This is because CO2 is not the limiting factor for their growth. In many parts of the ocean, the limiting factor is actually iron. So this geoengineering proposal, often known as “iron fertilization”, involves dumping iron compounds into the ocean and letting the phytoplankton go to work.

A recent study from Germany (see also the Nature news article) tested out this proposal on a small scale. The Southern Ocean, which surrounds Antarctica, was the location of their field tests, since it has a strong circumpolar current that kept the iron contained. After adding several tonnes of iron sulphate, the research ship tracked the phytoplankton as they bloomed, died, and sank.

Measurements showed that at least half of the phytoplankton sank below 1 km after they died, and “a substantial portion is likely to have reached the sea floor”. At this depth, which is below the mixed layer of the ocean, the water won’t be exposed to the atmosphere for centuries. The carbon from the phytoplankton’s bodies is safely stored away, without the danger of CO2 leakage that carbon capture and storage presents. Unlike in previous studies, the researchers were able to show that iron fertilization could be effective.

However, there are other potential side effects of large-scale iron fertilization. We don’t know what the impacts of so much iron might be on other marine life. Coating the sea surface with phytoplankton would block light from entering the mixed layer, decreasing photosynthesis in aquatic plants and possibly leading to oxygen depletion or “dead zones”. It’s also possible that toxic species of algae would get a hold of the nutrients and create poisonous blooms. On the other hand, the negative impacts of ocean acidification from high levels of CO2 would be lessened, a problem which is not addressed by solar radiation-based forms of geoengineering.

Evidently, the safest way to fix the global warming problem is to stop burning fossil fuels. Most scientists agree that geoengineering should be a last resort, an emergency measure to pull out if the Greenland ice sheet is about to go, rather than an excuse for nations to continue burning coal. And some scientists, myself included, fully expect that geoengineering will be necessary one day, so we might as well figure out the safest approach.

Modelling Geoengineering

Later in my career as a climate modeller, I expect to spend a lot of time studying geoengineering. Given the near-total absence of policy responses to prevent climate change, I think it’s very likely that governments will soon start thinking seriously about ways to artificially cool the planet. Who will they come to for advice? The climate modellers.

Some scientists are pre-emptively recognizing this need for knowledge, and beginning to run simulations of geoengineering. In fact, there’s an entire model intercomparison project dedicated to this area of study. There’s only a small handful of publications so far, but the results are incredibly interesting. Here I summarize two recent papers that model solar radiation management: the practice of offsetting global warming by partially blocking sunlight, whether by seeding clouds, adding sulfate aerosols to the stratosphere, or placing giant mirrors in space. As an added bonus, both of these papers are open access.

A group of scientists from Europe ran the same experiment on four of the world’s most complex climate models. The simulation involved instantaneously quadrupling CO2 from preindustrial levels, but offsetting it with a reduction in the solar constant, such that the net forcing was close to zero.

The global mean temperature remained at preindustrial levels. “Great,” you might think, “we’re home free!” However, climate is far more than just one globally averaged metric. Even though the average temperature stayed the same, there were still regional changes, with cooling in the tropics and warming at both poles (particularly in their respective winters):

There were regional changes in precipitation, too, but they didn’t all cancel out like with temperature. Global mean precipitation decreased, due to cloud feedbacks which are influenced by sunlight but not greenhouse gases. There were significant changes in the monsoons of south Asia, but the models disagreed as to exactly what those changes would be.

This intercomparison showed that even with geoengineering, we’re still going to get a different climate. We won’t have to worry about some of the big-ticket items like sea level rise, but droughts and forest dieback will remain a major threat. Countries will still struggle to feed their people, and species will still face extinction.

On the other side of the Atlantic, Damon Matthews and Ken Caldeira took a different approach. (By the way, what is it about Damon Matthews? All the awesome papers out of Canada seem to have his name on them.) Using the UVic ESCM, they performed a more realistic experiment in which emissions varied with time. They offset emissions from the A2 scenario with a gradually decreasing solar constant. They found that the climate responds quickly to geoengineering, and their temperature and precipitation results were very similar to the European paper.

They also examined some interesting feedbacks in the carbon cycle. Carbon sinks (ecosystems which absorb CO2, like oceans and forests) respond to climate change in two different ways. First, they respond directly to increases in atmospheric CO2 – i.e., the fertilization effect. These feedbacks (lumped together in a term we call beta) are negative, because they tend to increase carbon uptake. Second, they respond to the CO2-induced warming, with processes like forest dieback and increased respiration. These feedbacks (a term called gamma) are positive, because they decrease uptake. Currently we have both beta and gamma, and they’re partially cancelling each other out. However, with geoengineering, the heat-induced gamma goes away, and beta is entirely unmasked. As a result, carbon sinks became more effective in this experiment, and sucked extra CO2 out of the atmosphere.
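The beta/gamma decomposition can be sketched as a toy linear model. The coefficient magnitudes below are invented for illustration – they are not values from the paper – but the signs follow the definitions above (here expressed in terms of sink uptake, so beta increases uptake and gamma decreases it):

```python
def sink_uptake_change(d_co2_ppm, d_temp_k, beta=1.0, gamma=-80.0):
    """Change in cumulative carbon uptake by a sink (PgC), split into a
    concentration-driven term (beta, PgC/ppm: the fertilization effect)
    and a warming-driven term (gamma, PgC/K: dieback and respiration).
    Coefficient values are illustrative only.
    """
    return beta * d_co2_ppm + gamma * d_temp_k

# Without geoengineering, warming (gamma) masks part of the
# fertilization effect (beta):
no_geo = sink_uptake_change(d_co2_ppm=300.0, d_temp_k=3.0)

# With geoengineering, warming is cancelled and beta acts unmasked,
# so the sinks absorb more carbon:
with_geo = sink_uptake_change(d_co2_ppm=300.0, d_temp_k=0.0)

print(no_geo, with_geo)  # the geoengineered sink takes up more carbon
```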

The really interesting part of the Matthews and Caldeira paper was when they stopped the geoengineering. This scenario is rather plausible – wars, recessions, or public disapproval could force the world to abandon the project. So, in the experiment, they brought the solar constant back to current levels overnight.

The results were pretty ugly. Global climate rapidly shifted back to the conditions it would have experienced without geoengineering. In other words, all the warming that we cancelled out came back at once. Global average temperature changed at a rate of up to 4°C per decade, or 20 times faster than at present. Given that biological, physical, and social systems worldwide are struggling to keep up with today’s warming, this rate of change would be devastating. To make things worse, gamma came back in full force, and carbon sinks spit out the extra CO2 they had soaked up. Atmospheric concentrations went up further, leading to more warming.

Essentially, if governments want to do geoengineering properly, they have to make a pact to do so forever, no matter what the side effects are or what else happens in the world. Given how much legislation is overturned every time a country has a change in government, such a promise would be almost impossible to uphold. Matthews and Caldeira consider this reality, and come to a sobering conclusion:

In the case of inconsistent or erratic deployment (either because of shifting public opinions or unilateral action by individual nations), there would be the potential for large and rapid temperature oscillations between cold and warm climate states.

Yikes. If that doesn’t scare you, what does?

A New Kind of Science

Cross-posted from NextGen Journal

Ask most people to picture a scientist at work, and they’ll probably imagine someone in a lab coat and safety goggles, surrounded by test tubes and Bunsen burners. If they’re fans of The Big Bang Theory, maybe they’ll picture complicated equations being scribbled on whiteboards. Others might think of the Large Hadron Collider, or people wading through a swamp taking water samples.

All of these images are pretty accurate – real scientists, in one field or another, do these things as part of their job. But a large and growing approach to science, which is present in nearly every field, replaces the lab bench or swamp with a computer. Mathematical modelling, which essentially means programming the complicated equations from the whiteboard into a computer and solving them many times, is the science of today.

Computer models are used for all sorts of research questions. Epidemiologists build models of an avian flu outbreak, to see how the virus might spread through the population. Paleontologists build biomechanical models of different dinosaurs, to figure out how fast they could run or how high they could stretch their necks. I’m a research student in climate science, where we build models of the entire planet, to study the possible effects of global warming.

All of these models simulate systems which aren’t available in the real world. Avian flu hasn’t taken hold yet, and no sane scientist would deliberately start an outbreak just so they could study it! Dinosaurs are extinct, and manipulating their fossilized bones to see how they might have moved would be cumbersome and expensive. Finally, there’s only one Earth, and it’s currently in use. So models don’t replace lab and field work – rather, they add to it. Mathematical models let us perform controlled experiments that would otherwise be impossible.

If you’re interested in scientific modelling, spend your college years learning a lot of math, particularly calculus, differential equations, and numerical methods. The actual application of the modelling, like paleontology or climatology, is less important for now – you can pick that up later, or read about it on your own time. It might seem counter-intuitive to neglect the very system you’re planning to spend your life studying, but it’s far easier this way. A few weeks ago I was writing some computer code for our lab’s climate model, and I needed to calculate a double integral of baroclinic velocity in the Atlantic Ocean. I didn’t know what baroclinic velocity was, but it only took a few minutes to dig up a paper that defined it. My work would have been a lot harder if, instead, I hadn’t known what a double integral was.
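For what it’s worth, the numerical side of a calculation like that is usually straightforward once the math is familiar. Here’s a toy Riemann-sum double integral over a regular grid – the field and cell sizes are entirely fabricated, and real model code would use the grid’s actual cell areas:

```python
def double_integral(field, dx, dy):
    """Approximate the integral of a 2-D field over a regular grid by
    summing field[i][j] * dx * dy over every cell (a simple Riemann sum).
    """
    return sum(value * dx * dy for row in field for value in row)

# A 3x4 toy grid with value 1.0 everywhere and cell size 2.0 x 0.5:
# 12 cells x 1.0 x 1.0 area each = 12.0.
field = [[1.0] * 4 for _ in range(3)]
print(double_integral(field, dx=2.0, dy=0.5))  # -> 12.0
```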

It’s also important to become comfortable with computer programming. You might think it’s just the domain of software developers at Google or Apple, but it’s also the main tool of scientists all over the world. Two or three courses in computer science, where you’ll learn a multi-purpose language like C or Java, are all you need. Any other languages you need in the future will take you days, rather than months, to master. If you own a Mac or run Linux on a PC, spend a few hours learning some basic UNIX commands – it’ll save you a lot of time down the road. (Also, if the science plan falls through, computer science is one of the only majors which will almost definitely get you a high-paying job straight out of college.)

Computer models might seem mysterious, or even untrustworthy, when the news anchor mentions them in passing. In fact, they’re no less scientific than the equations that Sheldon Cooper scrawls on his whiteboard. They’re just packaged together in a different form.

Ten Things I Learned in the Climate Lab

  1. Scientists do not blindly trust their own models of global warming. In fact, nobody is more aware of a model’s specific weaknesses than the developers themselves. Most of our time is spent comparing model output to observations, searching for discrepancies, and hunting down bugs.
  2. If 1.5 C global warming above preindustrial temperatures really does represent the threshold for “dangerous climate change” (rather than 2 C, as some have argued), then we’re in trouble. Stabilizing global temperatures at this level isn’t just climatically difficult, it’s also mathematically difficult. Given current global temperatures, and their current rate of change, it’s nearly impossible to smoothly extend the curve to stabilize at 1.5 C without overshooting.
  3. Sometimes computers do weird things. Some bugs appear for the most illogical reasons (last week, the act of declaring a variable altered every single metric of the model output). Other bugs show up once, then disappear before you can track down the source, and you’re never able to reproduce them. It’s not uncommon to fix a problem without ever understanding why the problem occurred in the first place.
  4. For anyone working with climate model output, one of the best tools to have in your arsenal is the combination of IDL and NetCDF. Hardly an hour of work goes by when I don’t use one or both of these programming tools in some way.
  5. Developing model code for the first time is a lot like moving to a new city. At first you wander around aimlessly, clutching your map and hesitantly asking for directions. Then you begin to recognize street names and orient yourself around landmarks. Eventually you’re considered a resident of the city, as your little house is there on the map with your name on it. You feel inordinately proud of the fact that you managed to build that house without burning the entire city down in the process.
  6. The RCP 8.5 scenario is really, really scary. Looking at the output from that experiment is enough to give me a stomachache. Let’s just not let that scenario happen, okay?
  7. It’s entirely possible to get up in the morning and just decide to be enthusiastic about your work. You don’t have to pretend, or lie to yourself – all you do is consciously choose to revel in the interesting discoveries, and to view your setbacks as challenges rather than chores. It works really well, and everything is easier and more fun as a result.
  8. Climate models are fabulous experimental subjects. If you run the UVic model twice with the same code, data, options, and initial state, you get exactly the same results. (I’m not sure if this holds for more complex GCMs which include elements of random weather variation.) For this reason, if you change one factor, you can be sure that the model is reacting only to that factor. Control runs are completely free of external influences, and deconstructing confounding variables is only a matter of CPU time. Most experimental scientists don’t have this element of perfection in their subjects – it makes me feel very lucky.
  9. The permafrost is in big trouble, and scientists are remarkably calm about it.
  10. Tasks that seem impossible at first glance are often second nature by the end of the day. No bug lasts forever, and no problem goes unsolved if you exert enough effort.

My Earth Hour Story

Tonight is Earth Hour, when people across the world turn off all their lights and electronic devices (except the necessary ones – I don’t think you’re required to unplug the freezer) from 8:30 to 9:30 local time. This is meant to generate awareness about climate change and conservation. It’s really more of a symbolic action, to my understanding – I doubt it adds up to a significant dip in carbon emissions – but I take part anyway. I find that a lot of interesting conversations begin when there’s nothing to do but sit in the dark.

It was during the second official Earth Hour, when I was sixteen years old, that I agreed to babysit for friends of the family. Great, I thought, how am I going to get a five-year-old boy and a two-year-old girl to sit in the dark for an hour? I ended up turning it into a camping game, which was really fun. We made a tent out of chairs and blankets, ate popcorn, and played with a flashlight powered by a hand crank.

The girl was too young to understand the purpose of sitting in the dark – she just liked waving the flashlight around – but I talked to the boy a bit about why we were doing this. I told him how we needed to take care of nature, because it can be damaged if we don’t treat it well, and that can come back to bite us. I explained the purpose of recycling: “You can make paper out of trees, but you can also make paper out of old paper, and that way you don’t have to cut down any trees.” His face just lit up, and he said, “Oh! I get it now! Well, we should do more of that!” which was really great to hear.

Halfway through the hour, the kids went to bed, and I sat in the dark on my own until 9:30, when I turned the lights on and started to do homework. And that was the end of it…or so I thought.

Apparently, at some point during that hour, a neighbour had noticed that the house was in darkness and flashlights were waving around. He thought there was something wrong with that situation, and came over to knock on the door, but we were in the basement in our tent and didn’t hear him. So then he called the police.

It was 11 pm by the time they showed up. Suddenly someone was pounding on the door, and I, convinced that someone was trying to break in, was terrified. I froze in my seat, and contemplated hiding under the desk, but whoever was at the door refused to go away. Eventually I crept over to a side window and looked outside, where I saw a police car.

My first thought when I opened the door to two police officers was, “Who got in a car accident? My family, or the kids’ parents?” The concept of police coming to investigate a house that had its lights off was completely foreign to me.

“It’s Earth Hour,” I said when they told me why they were there. They replied, “Yeah, we know, but we have to answer all our calls.” They took my name and my birth date, so this incident must be mentioned somewhere in the city police records. I imagine there is a note next to my name saying, “Attempted to indoctrinate children with environmentalism.”

Luckily the kids didn’t wake up, but they heard about the incident later from their parents. I still babysit these kids, albeit less frequently now that I’m in university, and the boy often asks, “Can we turn off all the lights again? I want the police to come. That would be fun.”

Denial in the Classroom

At one of Canada’s top comprehensive universities, a well-known climate change denier was recently discovered “educating” a class of undergraduate students about global warming.

The Instructor

Tom Harris spent much of his career acting as a PR consultant for fossil fuel companies. Today he directs the International Climate Science Coalition (ICSC), an advocacy group closely tied to the Heartland Institute. In fact, Harris is listed as a Global Warming Expert on Heartland’s website, and spoke at their 2008 conference. However, with a background in mechanical engineering, Tom Harris is hardly qualified to comment on climate science.

The ICSC’s position on climate change is, unsurprisingly, similar to Heartland’s. Their list of Core Principles includes the following gems:

  • Science is rapidly evolving away from the view that humanity’s emissions of carbon dioxide and other ‘greenhouse gases’ are a cause of dangerous climate change.
  • Climate models used by the IPCC fail to reproduce known past climates without manipulation and therefore lack the scientific integrity needed for use in climate prediction and related policy decision-making.
  • Carbon dioxide is not a pollutant – it is a necessary reactant in plant photosynthesis and so is essential for life on Earth.
  • Since science and observation have failed to substantiate the human-caused climate change hypothesis, it is premature to damage national economies with ‘carbon’ taxes, emissions trading or other schemes to control ‘greenhouse gas’ emissions.

More recently, Harris began teaching at Carleton University, an Ottawa institution that Maclean’s magazine ranks as the 7th best comprehensive university in Canada. His course, Climate Change: An Earth Sciences Perspective, looks innocuous enough, claiming to teach “the history of earth climates, geological causes of climate change and impact that rapid climate change has had on the biosphere”. As we’ll see, the real content of the course was not so benign.

The Watchdog

The Committee for the Advancement of Scientific Skepticism (CASS) is a Canadian society dedicated to scrutinizing scientific claims made in advertisements, classrooms, and the media. As part of the skeptic movement, they mainly address paranormal phenomena and alternative medicine, but have recently broadened their interests to include climate change denial.

Four members of CASS living in the Ottawa area became aware of Tom Harris’ teaching activities at Carleton, and requested access to videotapes made of his lectures. Earlier today, they published their findings in a disturbing report.

As Heard in University Lectures…

“We can’t even forecast how these clouds are going to move in the next week,” Harris remarked in the first lecture. “Our understanding of the physics is so bad that we can’t even do that. So to think that we could do a whole planet for 50 years in the future…” This kind of misconception, conflating weather and climate predictions, is understandable among laypeople whose only experience with atmospheric modelling is the 5-day forecast presented on the news each night. For a university instructor teaching a course dedicated to climate change, however, such an error is simply unacceptable.

In the next lecture, it got worse. At the time, sunspot counts were the lowest on record, and some scientists speculated that the Sun might return to Maunder Minimum conditions. Even if that occurred, however, the resulting slight negative forcing would cancel out less than ten percent of the global warming from greenhouse gases. The numbers didn’t stop Harris, though, who claimed that “we’re in for some real cooling come around 2030 because we’re going back to the conditions that existed at the time of Napoleon. So cold weather is coming.” Forget about global warming, his message was – global cooling is the real threat.

The misconceptions, oversimplifications, half-truths, and flat-out nonsense continued throughout every single lecture, leading to a whopping 142 “incorrect or equivocal claims” as tallied by the CASS report, which quoted and rebutted every single one. It’s as if Tom Harris were actively trying to hit every argument on the Skeptical Science list.

In the last lecture, the students were presented with “take-away slogans”:

  • The only constant about climate is change.
  • Carbon dioxide is plant food.
  • There is no scientific consensus about climate change causes.
  • Prepare for global cooling.
  • Climate science is changing quickly.

This clear exercise in creating young climate change deniers seems to have influenced some, as shown by the RateMyProfessors reviews of the course. “Interesting course,” wrote one student. “Nice to have some fresh perspectives on global warming rather than the dramatized fear mongering versions. Harris really loves to indulge in the facts and presents some pretty compelling evidence.”

Crossing the Line

There is a line between ensuring academic freedom and providing unqualified individuals with a platform for disseminating nonsense. It is clear to me that Carleton University crossed this line long ago. I am astounded that such material is being taught at a respectable Canadian university. If the Heartland Institute’s proposed curriculum comes through, similar material might be taught in select K-12 classrooms all over the world. As an undergraduate student, the same age as many of the students in the course, I am particularly disturbed.

I have twice encountered climate change misinformation in my university lectures, both times in the form of false balance, a strategy that I feel many professors fall back on when an area of science is debated in the media and they want to be seen to respect all viewpoints. In both cases, I printed out some articles from Science, Nature, PNAS, and the IPCC, and went to see the prof in their office hours. We had a great conversation and we both learned something from the experience. However, it took an incredible amount of courage for me to talk to my professors like this, not only because teenage girls are naturally insecure creatures, but also because a student telling their science teacher that they’ve got the science wrong just isn’t usually done.

Even by the time they reach university, most students seem to unconditionally trust what a science teacher tells them, and will not stop to question the concepts they are being taught. Although many of my professors have encouraged us to do research outside of class and read primary literature on the topic, nearly all of my peers are content to simply copy down every word of the lecture notes and memorize it all for the final exam.

By allowing Tom Harris to teach the anti-science messages of climate change denial, Carleton University is doing a great disservice to its students. They paid for a qualified instructor to teach them accurate scientific knowledge, and instead they were taken advantage of by a powerful industry seeking to indoctrinate citizens with misinformation. This should not be permitted to continue.