
Posts Tagged ‘carbon dioxide’

You may have already heard that carbon dioxide concentrations have surpassed 400 ppm. Measurements at the most famous monitoring station, Mauna Loa Observatory in Hawaii, reached this value on May 9th. Due to the seasonal cycle, CO2 levels began to decline almost immediately thereafter, but next year they will easily blow past 400 ppm.

Of course, this milestone is largely arbitrary. There’s nothing inherently special about 400 ppm. But it’s a good reminder that while we were arguing about taxation, CO2 levels continued to quietly tick up and up.


In happier news, John Cook and others have just published the most exhaustive survey of the peer-reviewed climate literature to date. Read the paper here (open access), and a detailed but accessible summary here. Unsurprisingly, they found the same 97% consensus that has come up over and over again.

Cook et al. read the abstracts of nearly 12 000 papers published between 1991 and 2011 – every single hit from the ISI Web of Science with the keywords “global climate change” or “global warming”. Several different people categorized each abstract, and the authors were contacted whenever possible to categorize their own papers. Using several independent methods like this makes the results more reliable.

Around two-thirds of the studies, particularly the more recent ones, didn’t mention the cause of climate change. This is unsurprising, since human-caused warming has been common knowledge in the field for years. Similarly, seismology papers don’t usually mention that plate tectonics causes earthquakes, particularly in abstracts, where space is limited.

Among the papers which did express a position, 97.1% said climate change was human-caused. Again, unsurprising to anyone working in the field, but it’s news to many members of the public. The study has been widely covered in the mainstream media – everywhere from The Guardian to The Australian – and even President Obama’s Twitter feed.


Congratulations are also due to Andrew Weaver, my supervisor from last summer, who has just been elected to the British Columbia provincial legislature. He is not only the first-ever Green Party MLA in BC’s history, but also (as far as I know) the first-ever climate scientist to hold public office.

Governments the world over are sorely in need of officials who actually understand the problem of climate change. Nobody fits this description better than Andrew, and I think he is going to be great. The large margin by which he won also indicates that public support for climate action is perhaps higher than we thought.


Finally, my second publication came out this week in Climate of the Past. It describes an EMIC intercomparison project the UVic lab conducted for the next IPCC report, which I helped out with while I was there. The project was so large that we split the results into two papers (the second of which is in press in Journal of Climate). This paper covers the historical experiments – comparing model results from 850-2005 to observations and proxy reconstructions – as well as some idealized experiments designed to measure metrics such as climate sensitivity, transient climate response, and carbon cycle feedbacks.
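
For readers curious about how these idealized metrics are diagnosed, here is a minimal sketch of the conventional way to compute transient climate response from a 1%-per-year CO2 increase experiment. It is not the analysis code from the paper; the function and variable names are my own illustrative assumptions.

```python
import numpy as np

def transient_climate_response(annual_tas_anomaly):
    """Transient climate response: the 20-year mean global warming centred
    on the year of CO2 doubling in a 1%-per-year CO2 increase run.

    annual_tas_anomaly: one global-mean surface temperature anomaly (K)
    per year, relative to the control run.
    """
    tas = np.asarray(annual_tas_anomaly, dtype=float)
    doubling_year = 70  # 1.01**70 is roughly 2, so CO2 doubles near year 70
    return tas[doubling_year - 10 : doubling_year + 10].mean()
```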


On the heels of my last post about iron fertilization of the ocean, I found another interesting paper on the topic. This one, written by Long Cao and Ken Caldeira in 2010, was much less hopeful.

Instead of a small-scale field test, Cao and Caldeira modelled iron fertilization using the ocean GCM from Lawrence Livermore National Laboratory. To account for uncertainties, they chose to calculate an upper bound on iron fertilization rather than a most likely scenario. That is, they maxed out phytoplankton growth until something else became the limiting factor – in this case, phosphates. In every grid cell of the sea surface, the model phytoplankton were programmed to grow until phosphate concentrations reached zero.
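
Here is a rough sketch of what that upper-bound assumption amounts to. It is not the Lawrence Livermore model’s actual code; the Redfield C:P ratio of 106:1, the array names, and the units are my own illustrative assumptions.

```python
import numpy as np

REDFIELD_C_TO_P = 106.0  # moles of carbon fixed per mole of phosphate consumed

def max_carbon_fixation(surface_phosphate_mol_per_m3, cell_volume_m3):
    """Upper bound on carbon fixed by phytoplankton in each surface grid cell
    (moles of C), assuming phosphate is drawn down to zero everywhere."""
    phosphate_moles = np.asarray(surface_phosphate_mol_per_m3) * cell_volume_m3
    return phosphate_moles * REDFIELD_C_TO_P
```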

A 2008-2100 simulation implementing this method was forced with CO2 emissions from the A2 scenario. An otherwise identical A2 simulation, without the fertilization, served as a control. Geoengineering modelling is strange that way, because there are two possible definitions of “control run”: a non-geoengineered climate that is allowed to warm unabated, and preindustrial conditions (the usual definition in climate modelling).

Without any geoengineering, atmospheric CO2 reached 965 ppm by 2100. With the maximum amount of iron fertilization possible, these levels only fell to 833 ppm. The mitigation of ocean acidification was also quite modest: the sea surface pH in 2100 was 7.74 without geoengineering, and 7.80 with. Given the potential side effects of iron fertilization, is such a small improvement worth the trouble?

Unfortunately, the ocean acidification doesn’t end there. Although the problem was lessened somewhat at the surface, deeper layers of the ocean actually became more acidic. Less CO2 was being gradually mixed in from the atmosphere, but another source of dissolved carbon appeared: as the phytoplankton died and sank, they partially decomposed and released enough CO2 at depth to cause a net decrease in pH compared to the control run.

In the diagram below, compare the first row (A2 control run) to the second (A2 with iron fertilization). The redder the contours, the more acidic that layer of the ocean is with respect to preindustrial conditions. The third row contains data from another simulation in which emissions were allowed to increase just enough to offset the sequestration by phytoplankton, leading to the same atmospheric CO2 concentrations as the control run. The general pattern – iron fertilization reduces some acidity at the surface, but increases it at depth – is clear.

Figure: ocean acidity relative to preindustrial conditions – depth vs. latitude at 2100 (left); depth vs. time (right).

The more I read about geoengineering, the more I realize how poor the associated cost-benefit ratios might be. The oft-repeated assertion is true: the easiest way to prevent further climate change is, by a long shot, to simply reduce our emissions.


While many forms of geoengineering involve counteracting global warming with induced cooling, others move closer to the source of the problem and target the CO2 increase. By artificially boosting the strength of natural carbon sinks, it might be possible to suck CO2 emissions right out of the air. Currently around 30% of human emissions are absorbed by these sinks; if they could be made to absorb more than 100% of our emissions, atmospheric CO2 concentrations would decline.
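
To make that arithmetic concrete, here is a back-of-the-envelope sketch (my own illustration, not from any study mentioned here). The conversion factor of about 2.12 GtC per ppm is a standard approximation; the emission rate is an assumed round number.

```python
GTC_PER_PPM = 2.12           # gigatonnes of carbon per ppm of atmospheric CO2
annual_emissions_gtc = 10.0  # assumed illustrative human emission rate (GtC/yr)

for sink_fraction in (0.3, 0.7, 1.2):
    airborne_gtc = annual_emissions_gtc * (1.0 - sink_fraction)
    print(f"sinks absorb {sink_fraction:.0%}: "
          f"CO2 changes by {airborne_gtc / GTC_PER_PPM:+.2f} ppm per year")
# With sinks above 100%, the change is negative: concentrations decline.
```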

One of the most prominent proposals for carbon sink enhancement involves enlisting phytoplankton, photosynthetic organisms in the ocean which take the carbon out of carbon dioxide and use it to build their bodies. When nutrients are abundant, phytoplankton populations explode and create massive blue or green blooms visible from space. Very few animals enjoy eating these organisms, so they just float there for a while. Then they run out of nutrients, die, and sink to the bottom of the ocean, taking the carbon with them.

Phytoplankton blooms are a massive carbon sink, but they still can’t keep up with human emissions. This is because CO2 is not the limiting factor for their growth. In many parts of the ocean, the limiting factor is actually iron. So this geoengineering proposal, often known as “iron fertilization”, involves dumping iron compounds into the ocean and letting the phytoplankton go to work.

A recent study from Germany (see also the Nature news article) tested out this proposal on a small scale. The Southern Ocean, which surrounds Antarctica, was the location of their field tests, since it has a strong circumpolar current that kept the iron contained. After adding several tonnes of iron sulphate, the research ship tracked the phytoplankton as they bloomed, died, and sank.

Measurements showed that at least half of the phytoplankton sank below 1 km after they died, and “a substantial portion is likely to have reached the sea floor”. At this depth, which is below the mixed layer of the ocean, the water won’t be exposed to the atmosphere for centuries. The carbon from the phytoplankton’s bodies is safely stored away, without the danger of CO2 leakage that carbon capture and storage presents. Unlike in previous studies, the researchers were able to show that iron fertilization could be effective.

However, there are other potential side effects of large-scale iron fertilization. We don’t know what the impacts of so much iron might be on other marine life. Coating the sea surface with phytoplankton would block light from entering the mixed layer, decreasing photosynthesis in aquatic plants and possibly leading to oxygen depletion or “dead zones”. It’s also possible that toxic species of algae would get hold of the nutrients and create poisonous blooms. On the other hand, the negative impacts of ocean acidification from high levels of CO2 would be lessened – a problem that solar radiation-based forms of geoengineering do not address at all.

Evidently, the safest way to fix the global warming problem is to stop burning fossil fuels. Most scientists agree that geoengineering should be a last resort, an emergency measure to pull out if the Greenland ice sheet is about to go, rather than an excuse for nations to continue burning coal. And some scientists, myself included, fully expect that geoengineering will be necessary one day, so we might as well figure out the safest approach.


As my summer research continues, I’m learning a lot about previous experiments that used the UVic ESCM (Earth System Climate Model), as well as beginning to run my own. Over the past few years, the UVic model has played an integral role in a fascinating little niche of climate research: the importance of cumulative carbon emissions.

So far, global warming mitigation policies have focused on choosing an emissions pathway: making a graph of desired CO2 emissions vs. time, where emissions slowly reduce to safer levels. However, it turns out that the exact pathway we take doesn’t actually matter. All that matters is the area under the curve: the total amount of CO2 we emit, or “cumulative emissions” (Zickfeld et al, 2009). So if society decides to limit global warming to 2°C (a common target), there is a certain amount of total CO2 that the entire world is allowed to emit. We can use it all up in the first ten years and then emit nothing, or we can spread it out – either way, it will lead to the same amount of warming.

If you delve a little deeper into the science, it turns out that temperature change is directly proportional to cumulative emissions (Matthews et al, 2009). In other words, if you draw a graph of the total amount of warming vs. total CO2 emitted, it will be a straight line.

This is counter-intuitive, because the intermediate processes are definitely not straight lines. Firstly, the graph of warming vs. CO2 concentrations is logarithmic: as carbon dioxide builds up in the atmosphere, each extra molecule added has less and less effect on the climate.
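
A widely used approximation (Myhre et al., 1998) captures this: the radiative forcing from CO2 grows with the logarithm of its concentration, so each doubling adds the same increment of warming. The sketch below uses an illustrative climate sensitivity parameter of 0.8 K per W/m², which is an assumption on my part rather than a published best estimate.

```python
import numpy as np

def co2_forcing(concentration_ppm, baseline_ppm=280.0):
    """Radiative forcing (W/m2) from CO2, using the common logarithmic fit."""
    return 5.35 * np.log(concentration_ppm / baseline_ppm)

climate_sensitivity = 0.8  # K per (W/m2); illustrative assumption
for c in (280, 400, 560, 1120):
    print(f"{c} ppm -> {climate_sensitivity * co2_forcing(c):.2f} K of warming")
# 280->560 ppm and 560->1120 ppm each add the same warming (about 3 K here).
```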

However, as carbon dioxide builds up and the climate warms, carbon sinks (which suck up some of our emissions) become less effective. For example, warmer ocean water can’t hold as much CO2, and trees subjected to heat stress often die and stop photosynthesizing. Processes that absorb CO2 become less effective, so more of our emissions actually stay in the air. Consequently, the graph of CO2 concentrations vs. CO2 emissions is exponential.

These two relationships, warming vs. concentrations and concentrations vs. emissions, more or less cancel each other out, making total warming vs. total emissions linear. It doesn’t matter how much CO2 was in the air to begin with, or how fast the allowable emissions get used up. Once society decides how much warming is acceptable, all we need to do is nail down the proportionality constant (the slope of the straight line) in order to find out how much carbon we have to work with. Then, that number can be handed to economists, who will figure out the best way to spread out those emissions while causing minimal financial disruption.

Finding that slope is a little tricky, though. Best estimates, using models as well as observations, generally fall between 1.5°C and 2°C for every trillion tonnes of carbon emitted (Matthews et al, 2009; Allen et al, 2009; Zickfeld et al, 2009). Keep in mind that we’ve already emitted about 0.6 trillion tonnes of carbon (University of Oxford). Following a theme commonly seen in climate research, the uncertainty is larger on the high end of these slope estimates than on the low end. So if the real slope is actually lower than our best estimate, it’s probably only a little bit lower; if it’s actually higher than our best estimate, it could be much higher, and the problem could be much worse than we thought.
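
As a rough worked example using the numbers above (the slope is a midpoint I have assumed within the quoted 1.5–2°C per trillion tonne range, not a published best estimate):

```python
slope_degc_per_ttc = 1.75   # assumed midpoint of the 1.5-2 degC per TtC range
warming_target_degc = 2.0
already_emitted_ttc = 0.6   # trillion tonnes of carbon emitted so far

total_budget_ttc = warming_target_degc / slope_degc_per_ttc
remaining_ttc = total_budget_ttc - already_emitted_ttc
print(f"Total allowable: {total_budget_ttc:.2f} TtC; "
      f"remaining: {remaining_ttc:.2f} TtC")
# Roughly 1.14 TtC in total, leaving about 0.54 TtC under these assumptions.
```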

Also, this approach ignores other human-caused influences on global temperature, most prominently sulfate aerosols (which cause cooling) and greenhouse gases other than carbon dioxide (which cause warming). Right now, these two influences basically cancel, which is convenient for scientists because it means we can ignore both of them. Typically, we assume that they will continue to cancel far into the future, which might not be the case – there’s a good chance that developing countries like China and India will reduce their emissions of sulfate aerosols, allowing the non-CO2 greenhouse gases to dominate and cause warming. If this happened, we couldn’t even lump the extra greenhouse gases into the allowable CO2 emissions, because the warming they cause does depend on the exact pathway. For example, methane has such a short atmospheric lifetime that “cumulative methane emissions” is a useless measurement, and certainly isn’t directly proportional to temperature change.

This summer, one of my main projects at UVic is to compare the slope of temperature change vs. cumulative CO2 emissions across different models. As part of the international EMIC intercomparison project that the lab is coordinating, different modelling groups have sent us their estimates of allowable cumulative emissions for 1.5°C, 2°C, 3°C, and 4°C of global warming. Right now (quite literally, as I write this) I’m running the same experiments on the UVic model. It’s very exciting to watch the results trickle in. Perhaps my excitement towards the most menial part of climate modelling – watching as the simulation chugs along – is a sign that I’m on the right career path.
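
For the curious, here is a minimal sketch of the kind of diagnostic involved: given a run’s annual warming and annual emissions, find the cumulative emissions at the point a warming threshold is first crossed. This is not the lab’s actual analysis code, and the argument names are assumptions.

```python
import numpy as np

def allowable_cumulative_emissions(warming_degc, emissions_gtc_per_year, threshold_degc):
    """Cumulative emissions (TtC) at the first year the warming threshold is
    reached, or None if the threshold is never crossed in this run."""
    warming = np.asarray(warming_degc, dtype=float)
    cumulative_gtc = np.cumsum(emissions_gtc_per_year)
    crossings = np.nonzero(warming >= threshold_degc)[0]
    if crossings.size == 0:
        return None
    return cumulative_gtc[crossings[0]] / 1000.0  # GtC -> trillion tonnes
```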


Apologies for my silence recently – I just finished writing some final exams that I missed for the AGU conference, so I’ve been studying hard ever since Boxing Day.

I am working on a larger piece about climate models: an introduction to how they work and why they are useful. That will take about a week to finish, so in the meantime, here is an open thread to keep things moving.

Some possible discussion topics from posts I’ve enjoyed:

Enjoy!


Cross-posted from NextGen Journal

Following the COP17 talks in Durban, South Africa – the latest attempt to create a global deal to cut carbon emissions and solve global warming – world leaders claimed they had “made history”, calling the conference “a great success” that had “all the elements we were looking for”.

So what agreement did they all come to, that has them so proud? They agreed to figure out a deal by 2015. As James Hrynyshyn writes, it is “a roadmap to an unknown strategy that may or may not produce a plan that might combat climate change”.

Did I miss a meeting? Weren’t we supposed to figure out a deal by 2010, so it could come into force when the Kyoto Protocol expires in 2012? This unidentified future deal, if it even comes to pass, will not come into force until 2020 – that’s 8 years of unchecked global carbon emissions.

At COP15 in Copenhagen, countries agreed to limit global warming to 2 degrees Celsius. The German Advisory Council on Global Change crunched the numbers and found that the sooner we start reducing emissions, the easier it will be to attain this goal. Their graph shows that if emissions peak in 2011 we have a “bunny slope” to ride, whereas if emissions peak in 2020 we face a “triple black diamond” that is almost impossible, economically. (Thanks to Richard Somerville for this analogy.)

If we stay on the path that leaders agreed on in Durban, emissions will peak long after 2020 – in the best case scenario, they will only start slowing in 2020. If the triple black diamond looks steep, imagine a graph where emissions peak in 2030 or 2040 – it’s basically impossible to achieve our goal, no matter how high we tax carbon or how many wind turbines we build.
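
A crude illustration of why (my own toy numbers, not the council’s): if the total carbon budget is fixed, emissions that stay flat until a later peak year leave less room afterwards, so the ramp down to zero has to be much steeper.

```python
def years_to_ramp_to_zero(budget_gtc, current_rate_gtc, peak_year, start_year=2011):
    """Years available for a linear ramp-down to zero emissions after the peak,
    if emissions stay flat at current_rate_gtc until peak_year."""
    remaining = budget_gtc - current_rate_gtc * (peak_year - start_year)
    if remaining <= 0:
        return 0.0  # the budget is used up before emissions even peak
    return 2.0 * remaining / current_rate_gtc  # triangle area = 0.5 * rate * years

for peak in (2011, 2020, 2030):
    print(peak, "->", round(years_to_ramp_to_zero(250, 10, peak), 1), "years to zero")
# With an illustrative budget of 250 GtC at 10 GtC/yr: about 50, 32, and 12 years.
```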

World leaders have committed our generation to a future where global warming spins out of our control. What is there to celebrate about that?

However, we shouldn’t throw our hands in the air and give up. 2 degrees is bad, but 4 degrees is worse, and 6 degrees is awful. There is never a point at which action is pointless, because the problem can always get worse if we ignore it.


I went to a public lecture on climate change last night (because I just didn’t get enough of that last week at AGU, apparently), where four professors from different departments at my university spoke about their work. They were great speeches – it sort of reminded me of TED Talks – but I was actually most interested in the audience questions and comments afterward.

There was the token crazy guy who stood up and said “The sun is getting hotter every day and one day we’re all going to FRY! So what does that say about your global warming theory? Besides, if it was CO2 we could all just stop breathing!” Luckily, everybody laughed at his comments…

There were also some more reasonable-sounding people, repeating common myths like “It’s a natural cycle” and “Volcanoes emit more CO2 than humans”. The speakers did a good job of explaining why these claims were false, but I still wanted to pull out the Skeptical Science app and wave it in the air…

Overall, though, the audience seemed to be composed of concerned citizens who understood the causes and severity of climate change, and were eager to learn about impacts, particularly on extreme weather. It was nice to see an audience moving past this silly public debate into a more productive one about risk management.

The best moment, though, was on the bus home. There was a first-year student in the seat behind me – I assume he came to see the lecture as well, but maybe he just talks about climate change on the bus all the time. He was telling his friend about sea level rise, and he was saying all the right things – we can expect one or two metres by the end of the century, which doesn’t sound like a lot, but it’s enough to endanger many densely populated coastal cities, as well as kill vegetation due to seawater seeping in.

He even had the statistics right! I was so proud! I was thinking about turning around to join in the conversation, but by then I had been listening in for so long that it would have been embarrassing.

It’s nice to see evidence of a shift in public understanding, even if it’s only anecdotal. Maybe we’re doing something right after all.


