Breaching the Mainstream

It’s hard to overestimate the influence of John and Hank Green on the Internet, particularly among people my age. John (who writes books for teenagers) and Hank (who maintains the website EcoGeek and sings songs about particle physics) run a YouTube channel that celebrates nerdiness. This Internet community is now a huge part of pop culture among self-professed teenage nerds.

Hank’s new spin-off channel SciShow, which publishes videos about popular science topics, has only been going for a month but already has 90 000 subscribers and 1 million views. So I was very excited when Hank created this entertaining, polished, and wonderfully accurate video about climate change. He discusses sea level rise, anoxic events, and even the psychology of denial:

How much is most?

A growing body of research is showing that humans are likely causing more than 100% of global warming: without our influences on the climate, the planet would actually be cooling slightly.

In 2007, the Intergovernmental Panel on Climate Change published its fourth assessment report, internationally regarded as the most credible summary of climate science to date. It concluded that “most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations”.

A clear question remains: How much is “most”? 51%? 75%? 99%? At the time that the IPCC report was written, the answer was unclear. However, a new frontier of climate research has emerged since, and scientists are working hard to quantify the answer to this question.

I recently attended the 2011 American Geophysical Union Fall Meeting, a conference of over 20 000 scientists, many of whom study the climate system. This new area of research was a hot topic of discussion at AGU, and a phrase that came up many times was “more than 100%”.

That’s right, humans are probably causing more than 100% of observed global warming. That means that part of our warming influence is being offset by natural cooling factors. If we had never started burning fossil fuels, the world would be cooling slightly.

In the long term, slow oscillations of the Earth’s orbit mean that, without human activity, we would be very slowly descending into a new ice age. There are other, short-term cooling influences, though. Large volcanic eruptions, such as Mount Pinatubo in 1991, have thrown dust into the upper atmosphere, where it blocks a small amount of sunlight. The sun, particularly in the last few years, has been less intense than usual, due to the 11-year sunspot cycle. We have also experienced several strong La Niña events in the Pacific Ocean, which move heat out of the atmosphere and into the ocean.

However, all of these cooling influences pale in comparison to the strength of the human-caused warming influences. The climate change communication project Skeptical Science recently summarized six scientific studies in this graphic:

Most of the studies estimated that humans caused over 100% of the warming since 1950, and all six put the number over 98%. Additionally, most of the studies find natural influences to be in the direction of cooling, and all six show that number to be close to zero.

If you are interested in the methodologies and uncertainty ranges of these six studies, Skeptical Science goes into more detail, and also provides links to the original journal articles.

To summarize, the perception that humans are accelerating a natural process of warming is false. We have created this problem entirely on our own. Luckily, that means we have the power to stop the problem in its tracks. We are in control, and we choose what happens in the future.

How do climate models work?

Also published at Skeptical Science

This is a climate model:

T = [(1-α)S/(4εσ)]^(1/4)

(T is temperature, α is the albedo, S is the incoming solar radiation, ε is the emissivity, and σ is the Stefan-Boltzmann constant)

An extremely simplified climate model, that is. It’s one line long, and is at the heart of every computer model of global warming. Using basic thermodynamics, it calculates the temperature of the Earth based on incoming sunlight and the reflectivity of the surface. The model is zero-dimensional, treating the Earth as a point mass at a fixed time. It doesn’t consider the greenhouse effect, ocean currents, nutrient cycles, volcanoes, or pollution.
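
If you want to try this toy model out for yourself, here is a minimal sketch in Python. The numbers are approximate, and setting the emissivity to 1 treats the Earth as a perfect blackbody, which is exactly the missing-greenhouse-effect simplification described above:

    # The one-line climate model, with rough illustrative numbers
    sigma = 5.67e-8    # Stefan-Boltzmann constant (W m^-2 K^-4)
    S = 1361.0         # incoming solar radiation at the top of the atmosphere (W m^-2)
    albedo = 0.3       # fraction of sunlight reflected straight back to space
    emissivity = 1.0   # perfect blackbody, i.e. no greenhouse effect

    T = ((1 - albedo) * S / (4 * emissivity * sigma)) ** 0.25
    print(round(T))    # about 255 K (-18 C), roughly 33 degrees colder than the
                       # observed average of 288 K, because the greenhouse effect is missing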

If you fix these deficiencies, the model becomes more and more complex. You have to derive many variables from physical laws, and use empirical data to approximate certain values. You have to repeat the calculations over and over for different parts of the Earth. Eventually the model is too complex to solve using pencil, paper and a pocket calculator. It’s necessary to program the equations into a computer, and that’s what climate scientists have been doing ever since computers were invented.

A pixellated Earth

Today’s most sophisticated climate models are called GCMs, which stands for General Circulation Model or Global Climate Model, depending on who you talk to. On average, they are about 500 000 lines of computer code long, and mainly written in Fortran, a scientific programming language. Despite the huge jump in complexity, GCMs have much in common with the one-line climate model above: they’re just a lot of basic physics equations put together.

Computers are great for doing a lot of calculations very quickly, but they have a disadvantage: computers are discrete, while the real world is continuous. To understand the term “discrete”, think about a digital photo. It’s composed of a finite number of pixels, which you can see if you zoom in far enough. The existence of these indivisible pixels, with clear boundaries between them, makes digital photos discrete. But the real world doesn’t work this way. If you look at the subject of your photo with your own eyes, it’s not pixellated, no matter how close you get – even if you look at it through a microscope. The real world is continuous (unless you’re working at the quantum level!).

Similarly, the world isn’t actually split up into three-dimensional cells (you can think of them as cubes, even though they’re usually wedge-shaped) where every climate variable – temperature, pressure, precipitation, clouds – is exactly the same everywhere in that cell. Unfortunately, that’s how scientists have to represent the world in climate models, because that’s the only way computers work. The same strategy is used for the fourth dimension, time, with discrete “timesteps” in the model, indicating how often calculations are repeated.
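
To make the idea of cells and timesteps a bit more concrete, here is a hedged sketch in Python. The grid size and the "update each cell" rule are invented purely for illustration; in a real GCM the update step is the physics described throughout this post:

    import numpy as np

    # A made-up grid: 10 vertical levels, and 4-degree-wide cells in latitude and
    # longitude (45 x 90 cells). Real GCMs use finer, often wedge-shaped cells.
    n_levels, n_lat, n_lon = 10, 45, 90
    lat = np.linspace(-88, 88, n_lat)

    # One temperature value per cell (in kelvin), warmer at the equator than the poles
    temperature = np.zeros((n_levels, n_lat, n_lon))
    temperature += (288.0 - 30.0 * np.abs(np.sin(np.radians(lat))))[None, :, None]

    timestep = 30 * 60                 # 30 minutes, in seconds
    for step in range(48):             # repeat the calculations for one model day
        # A real model would update every cell from physical equations here.
        # This stand-in just nudges each cell toward the average of the whole grid.
        temperature += 0.01 * (temperature.mean() - temperature)

    print(temperature.shape)           # (10, 45, 90): one number per cell, per variable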

It would be fine if the cells could be really tiny – like a high-resolution digital photo that looks continuous even though it’s discrete – but doing calculations on cells that small would take so much computer power that the model would run slower than real time. As it is, the cubes are on the order of 100 km wide in most GCMs, and timesteps are on the order of hours to minutes, depending on the calculation. That might seem huge, but it’s about as good as you can get on today’s supercomputers. Remember that doubling the resolution of the model won’t just double the running time – instead, the running time will increase by a factor of sixteen (one doubling for each dimension).
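
That factor of sixteen is just a matter of counting doublings, as in this back-of-the-envelope sketch (the timestep has to be halved along with the cell width to keep the calculations numerically stable):

    # Halving the width of the cells doubles the work in each of the three spatial
    # dimensions, and also forces the timestep to be halved, so:
    doublings = 3 + 1        # three spatial dimensions plus time
    print(2 ** doublings)    # 16: the running time goes up sixteenfold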

Despite the seemingly enormous computer power available to us today, GCMs have always been limited by it. In fact, early computers were developed, in large part, to facilitate atmospheric models for weather and climate prediction.

Cracking the code

A climate model is actually a collection of models – typically an atmosphere model, an ocean model, a land model, and a sea ice model. Some GCMs split up the sub-models (let’s call them components) a bit differently, but that’s the most common arrangement.

Each component represents a staggering amount of complex, specialized processes. Here are just a few examples from the Community Earth System Model, developed at the National Center for Atmospheric Research in Boulder, Colorado:

  • Atmosphere: sea salt suspended in the air, three-dimensional wind velocity, the wavelengths of incoming sunlight
  • Ocean: phytoplankton, the iron cycle, the movement of tides
  • Land: soil hydrology, forest fires, air conditioning in cities
  • Sea Ice: pollution trapped within the ice, melt ponds, the age of different parts of the ice

Each component is developed independently, and as a result, they are highly encapsulated (bundled separately in the source code). However, the real world is not encapsulated – the land and ocean and air are very interconnected. Some central code is necessary to tie everything together. This piece of code is called the coupler, and it has two main purposes:

  1. Pass data between the components. This can get complicated if the components don’t all use the same grid (system of splitting the Earth up into cells).
  2. Control the main loop, or “time stepping loop”, which tells the components to perform their calculations in a certain order, once per time step (a rough sketch of this loop follows below).
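
Here is a deliberately tiny sketch of those two jobs in Python. The component names, fields, and calling order are invented for illustration; a real coupler (like the one in CESM) also has to regrid data between components and manage parallel communication:

    # Toy "components": each one exports some fields and has a step() method.
    class ToyComponent:
        def __init__(self, name, exports):
            self.name = name
            self.fields = dict(exports)

        def step(self, inputs, dt):
            # A real component would solve its physics equations here;
            # this stand-in just remembers what the coupler handed it.
            self.received = dict(inputs)
            return self.fields

    atmosphere = ToyComponent("atmosphere", {"surface_wind": 5.0, "precipitation": 1e-4})
    ocean = ToyComponent("ocean", {"sea_surface_temperature": 288.0})
    land = ToyComponent("land", {"river_runoff": 2e-5})
    sea_ice = ToyComponent("sea ice", {"ice_fraction": 0.1})

    dt = 1800.0                  # seconds per timestep
    for step in range(4):        # the main loop: a few timesteps, just to show the shape
        # Purpose 2: call each component once per timestep, in a fixed order,
        # passing along (purpose 1) the fields the other components produced.
        atm_fields = atmosphere.step({**ocean.fields, **land.fields, **sea_ice.fields}, dt)
        ocean_fields = ocean.step({**atm_fields, **sea_ice.fields}, dt)
        land.step(atm_fields, dt)
        sea_ice.step({**atm_fields, **ocean_fields}, dt)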

For example, take a look at the IPSL (Institut Pierre Simon Laplace) climate model architecture. In the diagram below, each bubble represents an encapsulated piece of code, and the number of lines in this code is roughly proportional to the bubble’s area. Arrows represent data transfer, and the colour of each arrow shows where the data originated:

We can see that IPSL’s major components are atmosphere, land, and ocean (which also contains sea ice). The atmosphere is the most complex model, and land is the least. While both the atmosphere and the ocean use the coupler for data transfer, the land model does not – it’s simpler just to connect it directly to the atmosphere, since it uses the same grid, and doesn’t have to share much data with any other component. Land-ocean interactions are limited to surface runoff and coastal erosion, which are passed through the atmosphere in this model.

You can see diagrams like this for seven different GCMs, as well as a comparison of their different approaches to software architecture, in this summary of my research.

Show time

When it’s time to run the model, you might expect that scientists initialize the components with data collected from the real world. Actually, it’s more convenient to “spin up” the model: start with a dark, stationary Earth, turn the Sun on, start the Earth spinning, and wait until the atmosphere and ocean settle down into equilibrium. The resulting data fits perfectly into the cells, and matches up really nicely with observations. It fits within the bounds of the real climate, and could easily pass for real weather.

Scientists feed input files into the model, which contain the values of certain parameters, particularly agents that can cause climate change. These include the concentration of greenhouse gases, the intensity of sunlight, the amount of deforestation, and volcanoes that should erupt during the simulation. It’s also possible to give the model a different map to change the arrangement of continents. Through these input files, it’s possible to recreate the climate from just about any period of the Earth’s lifespan: the Jurassic Period, the last Ice Age, the present day…and even what the future might look like, depending on what we do (or don’t do) about global warming.
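
Conceptually, an input file is just a list of settings like the sketch below. Every name and number here is invented for illustration; real models read much longer configuration files, often as Fortran "namelists":

    # A made-up set of inputs for a hypothetical 1850-2100 simulation
    simulation_inputs = {
        "start_year": 1850,
        "end_year": 2100,
        "co2_concentration_ppm": {1850: 285, 1950: 310, 2000: 370, 2100: 540},
        "solar_constant": 1361.0,           # intensity of sunlight (W m^-2)
        "land_use_scenario": "moderate_deforestation",
        "volcanic_eruptions": [{"year": 1991, "name": "Pinatubo"}],
        "continent_map": "present_day",     # could be swapped for, say, the Jurassic
    }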

The highest resolution GCMs, on the fastest supercomputers, can simulate about 1 year for every day of real time. If you’re willing to sacrifice some complexity and go down to a lower resolution, you can speed things up considerably, and simulate millennia of climate change in a reasonable amount of time. For this reason, it’s useful to have a hierarchy of climate models with varying degrees of complexity.

As the model runs, every cell outputs the values of different variables (such as atmospheric pressure, ocean salinity, or forest cover) into a file, once per time step. The model can average these variables based on space and time, and calculate changes in the data. When the model is finished running, visualization software converts the rows and columns of numbers into more digestible maps and graphs. For example, this model output shows temperature change over the next century, depending on how many greenhouse gases we emit:
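
As a rough sketch of the averaging step that comes before those maps and graphs, here is how a "global average temperature" series could be squeezed out of gridded output. The array below is random stand-in data, not real model output:

    import numpy as np

    # Stand-in "output file": 120 months of surface temperature on a 45 x 90 grid
    rng = np.random.default_rng(0)
    n_months, n_lat, n_lon = 120, 45, 90
    lat = np.linspace(-88, 88, n_lat)
    output = 288.0 + rng.normal(0.0, 2.0, size=(n_months, n_lat, n_lon))

    # Cells near the equator cover more area than cells near the poles, so weight
    # each latitude band by the cosine of its latitude before averaging.
    weights = np.cos(np.radians(lat))[None, :, None]
    global_mean = (output * weights).sum(axis=(1, 2)) / (weights.sum() * n_lon)

    print(global_mean.shape)   # (120,): one global-mean value per month, ready to graph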

Predicting the past

So how do we know the models are working? Should we trust the predictions they make for the future? It’s not reasonable to wait for a hundred years to see if the predictions come true, so scientists have come up with a different test: tell the models to predict the past. For example, give the model the observed conditions of the year 1900, run it forward to 2000, and see if the climate it recreates matches up with observations from the real world.
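
The "matching up" part can be done with simple statistics. Here is a hedged sketch using invented numbers (stand-ins, not real observations or model output), just to show the idea of scoring a hindcast against the real record:

    # Invented decadal global temperature anomalies for the 20th century (degrees C)
    observed  = [-0.2, -0.3, -0.2, 0.0, 0.0, -0.1, 0.0, 0.0, 0.2, 0.4]
    simulated = [-0.3, -0.2, -0.2, 0.1, 0.0, -0.1, -0.1, 0.1, 0.2, 0.3]

    # Root-mean-square error: one simple measure of how closely the hindcast
    # tracks the real record (smaller is better).
    rmse = (sum((o - s) ** 2 for o, s in zip(observed, simulated)) / len(observed)) ** 0.5
    print(round(rmse, 2))   # about 0.08 with these made-up numbers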

This 20th-century run is one of many standard tests to verify that a GCM can accurately mimic the real world. It’s also common to recreate the last ice age, and compare the output to data from ice cores. While GCMs can travel even further back in time – for example, to recreate the climate that dinosaurs experienced – proxy data is so sparse and uncertain that you can’t really test these simulations. In fact, much of the scientific knowledge about pre-Ice Age climates actually comes from models!

Climate models aren’t perfect, but they are doing remarkably well. They pass the tests of predicting the past, and go even further. For example, scientists don’t know what causes El Niño, a phenomenon in the Pacific Ocean that affects weather worldwide. There are some hypotheses on what oceanic conditions can lead to an El Niño event, but nobody knows what the actual trigger is. Consequently, there’s no way to program El Niños into a GCM. But they show up anyway – the models spontaneously generate their own El Niños, somehow using the basic principles of fluid dynamics to simulate a phenomenon that remains fundamentally mysterious to us.

In some areas, the models are having trouble. Certain wind currents are notoriously difficult to simulate, and calculating regional climates requires an unaffordably high resolution. Phenomena that scientists can’t yet quantify, like the processes by which glaciers melt, or the self-reinforcing cycles of thawing permafrost, are also poorly represented. However, not knowing everything about the climate doesn’t mean scientists know nothing. Incomplete knowledge does not imply nonexistent knowledge – you don’t need to understand calculus to be able to say with confidence that 9 x 3 = 27.

Also, history has shown us that when climate models make mistakes, they tend to be too stable, and underestimate the potential for abrupt changes. Take the Arctic sea ice: just a few years ago, GCMs were predicting it would completely melt around 2100. Now, the estimate has been revised to 2030, as the ice melts faster than anyone anticipated:

Answering the big questions

At the end of the day, GCMs are the best prediction tools we have. If they all agree on an outcome, it would be silly to bet against them. However, the big questions, like “Is human activity warming the planet?”, don’t even require a model. The only things you need to answer those questions are a few fundamental physics and chemistry equations that we’ve known for over a century.

You could take climate models right out of the picture, and the answer wouldn’t change. Scientists would still be telling us that the Earth is warming, humans are causing it, and the consequences will likely be severe – unless we take action to stop it.

A Little Bit of Hope

I went to a public lecture on climate change last night (because I just didn’t get enough of that last week at AGU, apparently), where four professors from different departments at my university spoke about their work. They were great speeches – it sort of reminded me of TED Talks – but I was actually most interested in the audience questions and comments afterward.

There was the token crazy guy who stood up and said “The sun is getting hotter every day and one day we’re all going to FRY! So what does that say about your global warming theory? Besides, if it was CO2 we could all just stop breathing!” Luckily, everybody laughed at his comments…

There were also some more reasonable-sounding people, repeating common myths like “It’s a natural cycle” and “Volcanoes emit more CO2 than humans”. The speakers did a good job of explaining why these claims were false, but I still wanted to pull out the Skeptical Science app and wave it in the air…

Overall, though, the audience seemed to be composed of concerned citizens who understood the causes and severity of climate change, and were eager to learn about impacts, particularly on extreme weather. It was nice to see an audience moving past this silly public debate into a more productive one about risk management.

The best moment, though, was on the bus home. There was a first-year student in the seat behind me – I assume he came to see the lecture as well, but maybe he just talks about climate change on the bus all the time. He was telling his friend about sea level rise, and he was saying all the right things – we can expect one or two metres by the end of the century, which doesn’t sound like a lot, but it’s enough to endanger many densely populated coastal cities, as well as kill vegetation due to seawater seeping in.

He even had the statistics right! I was so proud! I was thinking about turning around to join in the conversation, but by then I had been listening in for so long that it would have been embarrassing.

It’s nice to see evidence of a shift in public understanding, even if it’s only anecdotal. Maybe we’re doing something right after all.

Labels

For a long time I have struggled with what to call the people who insist that climate change is natural/nonexistent/a global conspiracy. “Skeptics” is their preferred term, but I refuse to give such a compliment to those who don’t deserve it. Skepticism is a good thing in science, and it’s not being applied by self-professed “climate skeptics”. This worthy label has been hijacked by those who seek to redefine it.

“Deniers” is more accurate, in my opinion, but I feel uncomfortable using it. I don’t want to appear closed-minded and alienate those who are confused or undecided. Additionally, many people are in the audience of deniers, but aren’t in denial themselves. They repeat the myths they hear from other sources, but you can easily talk them out of their misconceptions using evidence.

I posed this question to some people at AGU. Which word did they use? “Pseudoskeptics” and “misinformants” are both accurate terms, but too difficult for a new reader to understand. My favourite answer, which I think I will adopt, was “contrarians”. Simple, clear, and non-judgmental. It emphasizes what they think, not how they think. Also, it hints that they are going against the majority in the scientific community. Another good suggestion was to say someone is “in denial”, rather than “a denier” – it depersonalizes the accusation.

John Cook, when I asked him this question, turned it around: “What should we call ourselves?” he asked, and I couldn’t come up with an answer. I feel that not being a contrarian is a default position that doesn’t require a qualifier. We are just scientists, communicators, and concerned citizens, and unless we say otherwise you can assume we follow the consensus. (John thinks we should call ourselves “hotties”, but apparently it hasn’t caught on.)

“What should I call myself?” is another puzzler, since I fall into multiple categories. Officially I’m an undergrad student, but I’m also getting into research, which isn’t a required part of undergraduate studies. In some ways I am a journalist too, but I see that as a side project rather than a career goal. So I can’t call myself a scientist, or even a fledgling scientist, but I feel like I’m on that path – a scientist larva, perhaps?

Thoughts?

General Thoughts on AGU

I returned home from the AGU Fall Meeting last night, and after a good night’s sleep I am almost recovered – it’s amazing how tired science can make you!

The whole conference felt sort of surreal. Meeting and conversing with others was definitely the best part. I shook the hand of James Hansen and assured him that he is making a difference. I talked about my research with Gavin Schmidt. I met dozens of people that were previously just names on a screen, from top scientists like Michael Mann and Ben Santer to fellow bloggers like Michael Tobis and John Cook.

I filled most of a journal with notes I took during presentations, and saw literally hundreds of posters. I attended a workshop on climate science communication, run by Susan Joy Hassol and Richard Somerville, which fundamentally altered my strategies for public outreach. Be sure to check out their new website, and their widely acclaimed Physics Today paper that summarizes most of their work.

Speaking of fabulous communication, take a few minutes to watch this memorial video for Stephen Schneider – it’s by the same folks who turned Bill McKibben’s article into a video:

AGU inspired so many posts that I think I will publish something every day this week. Be sure to check back often!

A Conversation with Gavin Schmidt

Cross-posted from NextGenJournal

Dr. Gavin Schmidt is a climate modeller at NASA’s Goddard Institute for Space Studies, as well as the editor at RealClimate. I recently had the opportunity to interview Dr. Schmidt, one of the top scientists in his field, on what we can expect from the climate in the coming decades. Here is the entirety of the interview we completed for my article Climate Change and Young People.

Kate: In a business-as-usual scenario, what range of warming can we expect within the lifetimes of today’s young people – so to about 2070 or 2080?

Gavin: Well, we don’t have a perfect crystal ball for exactly what “business-as-usual” means, but the kind of projections that people have been looking at – which involve quite high increases in population and minimal changes in technology – you are talking about global temperature changes, by about 2070, of somewhere between two, three, five degrees Celsius, depending a little bit on the scenario, and a little bit on how sensitive the climate actually is.

That metric is a bit abstract to most people, so how will that amount of warming actually impact people’s lives?

That’s a very good question, because most people don’t live in the global mean temperature, or the global mean anything. Those kinds of numbers translate to larger changes, between four and six degrees of warming, over the land. As you go towards the poles it becomes larger as well, because of the amplifying feedbacks of ice albedo changes and reductions in snow cover.

Right now the range between a cold summer and a warm summer, in most mid-latitude places, is on the order of a couple of degrees. You’ll be looking at summers then – the normal summer then – will be warmer than the warmest summers that you have now, and significantly warmer than the coldest summers. The same will be true in winter time and other seasons.

How will that impact metrics such as agriculture, food prices, the economy…?

It’s easy enough to say that there are going to be some impacts – obviously agriculture depends on the climate that exists. People will adapt to that, they’ll plant earlier, but crops are very sensitive to peak summer temperatures. So you’ll see losses in the fatally sensitive crops. But then you’ll see movement north of crops that were grown further south. You have to deal with the other changes – in nutrient balances, water availability, soil quality. We’re not talking about just moving the subtropics further toward the poles.

Lots of other things are going to change as well. Pests travel much faster with climate than do other kinds of species: invasive species tend to increase faster, because they’re moving into an empty niche, than species that are already well established. There’s going to be changes to rainfall regimes, whether it snows or rains, how heavily it rains – a lot of those things will tax infrastructure.

You’ve got changes for people living on the coast related to sea level rise. That will lead to changes in the damaging effects of storm surges when any particular storm comes through. We’re also looking at more subtle changes to the storms themselves, which could even amplify that effect.

How much of this warming, and these impacts, are now inevitable? Do we have the ability to prevent most of it, and what would that take?

Some further changes are inevitable. The system has so much inertia, and it hasn’t even caught up with what we’ve put into the atmosphere so far. As it continues to catch up, even if we don’t do anything else to the atmosphere from now on, we’ll still see further warming and further changes to the climate. But we do have a choice as to whether we try and minimize these changes in the future, or we allow the maximum change to occur. And the maximum changes really are very large. It’s been said that if we allow that to happen, we’ll end up living on a different planet, and I think there’s some certain truth to that.

I hear you talking a lot about uncertainty, and that’s something a lot of people are paralyzed by: they don’t want us to take these actions because they think everything might be fine on its own. What’s your response to that attitude?

Any decision that you’re making now that has to do with the future is uncertain. We make decisions all the time: where to invest money, whether to buy a house – these things aren’t certain, and we still have to make decisions. The issue with climate is that no action is a decision in and of itself. That one is actually laden with far more uncertainty than if we actually try and produce energy more efficiently, try and use more renewables, adjust the way we live so that we have a more sustainable future. The uncertainty comes with what would happen if we don’t make a decision, and I find that to be the dominant uncertainty. But climate change is not unique in having to deal with decision making under uncertainty. All decisions are like that. It’s nothing special about climate change in that there’s uncertainty about what’s going to happen in the future. Any time we decide to do anything, there’s uncertainty about the future, yet we still manage to get out of bed in the morning.

Probably in response to this attitude, climate science has got a lot of bad press in the past couple years. What have your experiences been – what sort of reactions have there been to your research?

There are a lot of people, particularly in the US, who perceive the science itself – just describing what’s going on and why – as a threat to their interests. To my mind, knowing what’s going on in the planet and trying to understand why should just be information, it shouldn’t be a threat. But other people see it as a threat, and instead of dealing with either their perceptions or what the science actually means, they choose to attack the science and they choose to attack the scientists. Basically, you just have people adopting a “shoot the messenger” strategy, which plays well in the media. It doesn’t get us very far in terms of better understanding what’s going on. But it does add a sort of smokescreen to divert people’s attention from what the real issues are. That’s regrettable, but I don’t think it’s at all surprising.

And finally, are you at all optimistic about the future?

It depends on the day.

The Pitfalls of General Reporting: A Case Study

Today’s edition of Nature included an alarming paper, indicating record ozone loss in the Arctic due to an unusually long period of cold temperatures in the lower stratosphere.

On the same day, coverage of the story by the Canadian Press included a fundamental error that is already contributing to public confusion about the reality of climate change.

Counter-intuitively, while global warming causes temperatures in the troposphere (the lowest layer of the atmosphere) to rise, it causes temperatures in the stratosphere (the next layer up), as well as every layer above that, to fall. The exact mechanics are complex, but the pattern of a warming troposphere and a cooling stratosphere has been both predicted and observed.

This pattern was observed in the Arctic this year. As the Nature paper mentions, the stratosphere was unusually cold in early 2011. The surface temperatures, however, were unusually warm, as data from NASA shows:

[NASA surface temperature anomaly maps: Mar-May 2011 and Dec-Feb 2011]

While we can’t know for sure whether or not the unusual stratospheric conditions were caused by climate change, this chain of cause and effect is entirely consistent with what we can expect in a warming world.

However, if all you read was an article by the Canadian Press, you could be forgiven for thinking differently.

The article states that the ozone loss was “caused by an unusually prolonged period of extremely low temperatures.” I’m going to assume that means surface temperatures, because nothing else is specified – and virtually every member of the public would assume that too. As we saw from the NASA maps, though, cold surface temperatures couldn’t be further from the truth.

The headline, which was probably written by the Winnipeg Free Press, rather than the Canadian Press, tops off the glaring misconception nicely:

Record Ozone loss over the Arctic caused by extremely cold weather: scientists

No, no, no. Weather happens in the troposphere, not the stratosphere. While the stratosphere was extremely cold, the troposphere certainly was not. It appears that the reporters assumed the word “stratosphere” in the paper’s abstract was completely unimportant. In fact, it changes the meaning of the story entirely.

The reaction to this article, as seen in the comments section, is predictable:

So with global warming our winters are colder?

First it’s global warming that is destroying Earth, now it’s being too cold?! I’m starting to think these guys know as much about this as weather guys know about forecasting the weather!

Al gore the biggest con man since the beginning of mankind!! This guys holdings leave a bigger carbon footprint than most small countries!!

I’m confused. I thought the north was getting warmer and that’s why the polar bears are roaming around Churchill looking for food. There isn’t ice for them to go fishing.

People are already confused, and deniers are already using this journalistic error as evidence that global warming is fake. All because a major science story was written by a general reporter who didn’t understand the study they were covering.

In Manitoba, high school students learn about the different layers of the atmosphere in the mandatory grade 10 science course. Now, reporters who can’t recall this information are writing science stories for the Canadian Press.

Another Sporadic Open Thread

I keep forgetting to put these up.

Possible topics for discussion:

  • La Niña is expected to continue into the winter. This is definitely not what southern U.S. states, such as Texas, want – after a summer of intense drought, the drying effect of La Niña on that area of the world won’t bring any relief.
  • For those of you going to AGU, an itinerary planner is now available to browse the program and save sessions you’re interested in. I am compiling an awesome-looking list of presentations by the likes of James Hansen, Wally Broecker and Gavin Schmidt. Our poster is entitled “The Software Architecture of Global Climate Models”, and is on the Thursday morning.
  • Has anyone read Earth: The Operators’ Manual by Richard Alley? If so, would you recommend it?

Enjoy!

What Does the Public Know?

Part 4 in a series of 5 for NextGen Journal

Like it or not, a scientific consensus exists that humans are causing the Earth to warm. However, the small number of scientists that disagree with this conclusion get a disproportionate amount of media time, particularly in the United States: most newspaper articles give the two “sides” equal weight. Does this false sense of balance in the media take a toll on public understanding of climate science? Are people getting the false impression that global warming is a tenuous and controversial theory? Recent survey data from George Mason University can help answer these questions.

65% of Americans say the world is warming, but only 46% attribute this change to human activities. Compare these numbers to 96% and 97% of climate scientists, respectively. Somewhere, the lines of communication are getting muddled.

It’s not as if people hear about scientific results but don’t believe them. Given that 76% of Americans “strongly” or “somewhat” trust scientists as sources of information on climate change, you would expect public knowledge to fall in line with scientific consensus. However, it appears that most people don’t know about this consensus. 41% of Americans say there is “a lot of disagreement among scientists” regarding global warming. Among Republicans, this figure rises to 56%; for the Tea Party, 69%.

If you could ask an expert one question about climate change, what would it be? Among survey respondents, the most popular answer (19%) was, “How do you know that global warming is caused mostly by human activities, not natural changes in the environment?” As a science communicator, this statistic intrigues me – it tells me what to focus on. For those who are interested, scientists can attribute changes in the climate to particular causes based on the way the global temperature changes: patterns of warming in different layers of the atmosphere, the rate of warming at night compared to in the day, in summer compared to in winter, and so on. You can read more about this topic here and here.

In this survey, the differences between Republicans and Democrats weren’t as extreme as I expected. Instead, it was the Tea Party that really stuck out. Self-identified Tea Party members are, based on their responses, the least informed about climate science, but also the most likely to consider themselves well-informed and the least likely to change their minds. A majority of members in every other political group would choose environmental sustainability over economic growth, if it came down to a choice; a majority of every other party thinks that the United States should reduce its greenhouse gas emissions regardless of what other countries do. But the Tea Party seems opposed to everything, including solutions as benign as urban planning.

Luckily, this anti-science movement only made up 12% of the survey respondents. Most Americans are far more willing to learn about climate change and question their knowledge, and there is no source that they trust more than scientists.