Future projections of Antarctic ice shelf melting

Climate change will increase ice shelf melt rates around Antarctica. That’s the not-very-surprising conclusion of my latest modelling study, done in collaboration with both Australian and German researchers, which was just published in Journal of Climate. Here’s the less intuitive result: much of the projected increase in melt rates is actually linked to a decrease in sea ice formation.

That’s a lot of different kinds of ice, so let’s back up a bit. Sea ice is just frozen seawater. But ice shelves (as well as ice sheets and icebergs) are originally formed of snow. Snow falls on the Antarctic continent, and over many years compacts into a system of interconnected glaciers that we call an ice sheet. These glaciers flow downhill towards the coast. If they hit the coast and keep going, floating on the ocean surface, the floating bits are called ice shelves. Sometimes the edges of ice shelves will break off and form icebergs, but they don’t really come into this story.

Climate models don’t typically include ice sheets, or ice shelves, or icebergs. This is one reason why projections of sea level rise are so uncertain. But some standalone ocean models do include ice shelves. At least, they include the little pockets of ocean beneath the ice shelves – we call them ice shelf cavities – and can simulate the melting and refreezing that happens on the ice shelf base.

We took one of these ocean/ice-shelf models and forced it with the atmospheric output of regular climate models, which project climate change from now until the end of the century. We completed four different simulations: two different greenhouse gas emissions scenarios (“Representative Concentration Pathways” or RCPs) crossed with two different choices of climate model (“ACCESS 1.0”, or “MMM” for the multi-model mean). Each simulation required 896 processors on the supercomputer in Canberra. By comparison, your laptop or desktop computer probably has about 4 processors. These are pretty sizable models!

In every simulation, and in every region of Antarctica, ice shelf melting increased over the 21st century. The total increase ranged from 41% to 129% depending on the scenario. The largest increases occurred in the Amundsen Sea region, marked with red circles in the maps below, which happens to be the region exhibiting the most severe melting in recent observations. In the most extreme scenario, ice shelf melting in this region nearly quadrupled.

Percent change in ice shelf melting, caused by the ocean, during the four future projections. The values are shown for all of Antarctica (written on the centre of the continent) as well as split up into eight sectors (colour-coded, written inside the circles). Figure 3 of Naughten et al., 2018, © American Meteorological Society.

So what processes were causing this melting? This is where the sea ice comes in. When sea ice forms, it spits out most of the salt from the seawater (brine rejection), leaving the remaining water saltier than before. Salty water is denser than fresh water, so it sinks. This drives a lot of vertical mixing, and the heat from warmer, deeper water is lost to the atmosphere. The ocean surrounding Antarctica is unusual in that the deep water is generally warmer than the surface water. We call this warm, deep water Circumpolar Deep Water, and it’s currently the biggest threat to the Antarctic Ice Sheet. (I say “warm” – it’s only about 1°C, so you wouldn’t want to go swimming in it, but it’s plenty warm enough to melt ice.)
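To get a feel for why brine rejection drives sinking, here's a back-of-envelope sketch using a simplified linear equation of state. The coefficients and water masses below are rough illustrative assumptions, not values from our model (real ocean models use the full nonlinear TEOS-10 equation of state):

```python
# Simplified linear equation of state for seawater (illustrative sketch).
# Real oceanography uses TEOS-10; these coefficients are rough assumptions.

RHO0 = 1027.0       # reference density (kg/m^3)
ALPHA = 2e-4        # thermal expansion coefficient (1/degC), assumed
BETA = 8e-4         # haline contraction coefficient (1/psu), assumed
T0, S0 = 0.0, 34.0  # reference temperature (degC) and salinity (psu)

def density(T, S):
    """Seawater density from a linear equation of state."""
    return RHO0 * (1 - ALPHA * (T - T0) + BETA * (S - S0))

# Cold surface water before and after brine rejection adds 0.5 psu:
surface = density(T=-1.8, S=34.0)
after_brine = density(T=-1.8, S=34.5)

# Warm, salty Circumpolar Deep Water sitting below (~1 degC):
cdw = density(T=1.0, S=34.7)

print(f"surface:     {surface:.3f} kg/m^3")
print(f"after brine: {after_brine:.3f} kg/m^3")
print(f"CDW:         {cdw:.3f} kg/m^3")
# Brine rejection makes the surface water dense enough to sink into the
# deep layer and mix; less sea ice formation means less of this mixing.
```

Note that even a modest salinity increase outweighs the temperature differences here: salinity, not temperature, controls density in the cold waters around Antarctica.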

In our simulations, warming winters caused a decrease in sea ice formation. So there was less brine rejection, causing fresher surface waters, causing less vertical mixing, and the warmth of Circumpolar Deep Water was no longer lost to the atmosphere. As a result, ocean temperatures near the bottom of the Amundsen Sea increased. This better-preserved Circumpolar Deep Water found its way into ice shelf cavities, causing large increases in melting.

Slices through the Amundsen Sea – you’re looking at the ocean sideways, like a slice of birthday cake, so you can see the vertical structure. Temperature is shown on the top row (blue is cold, red is warm); salinity is shown on the bottom row (blue is fresh, red is salty). Conditions at the beginning of the simulation are shown in the left 2 panels, and conditions at the end of the simulation are shown in the right 2 panels. At the beginning of the simulation, notice how the warm, salty Circumpolar Deep Water rises onto the continental shelf from the north (right side of each panel), but it gets cooler and fresher as it travels south (towards the left) due to vertical mixing. At the end of the simulation, the surface water has freshened and the vertical mixing has weakened, so the warmth of the Circumpolar Deep Water is preserved. Figure 8 of Naughten et al., 2018, © American Meteorological Society.

This link between weakened sea ice formation and increased ice shelf melting has troubling implications for sea level rise. The next step is to simulate the sea level rise itself, which requires some model development. Ocean models like the one we used for this study have to assume that ice shelf geometry stays constant, so no matter how much ice shelf melting the model simulates, the ice shelves aren’t allowed to thin or collapse. Basically, this design assumes that any ocean-driven melting is exactly compensated by the flow of the upstream glacier such that ice shelf geometry remains constant.

Of course this is not a good assumption, because we’re observing ice shelves thinning all over the place, and a few have even collapsed. But removing this assumption would necessitate coupling with an ice sheet model, which presents major engineering challenges. We’re working on it – at least ten different research groups around the world – and over the next few years, fully coupled ice-sheet/ocean models should be ready to use for the most reliable sea level rise projections yet.

A modified version of this post appeared on the EGU Cryospheric Sciences Blog.

With a Little Help from the Elephant Seals

A problem which has plagued oceanography since the very beginning is a lack of observations. We envy atmospheric scientists with their surface stations and satellite data that monitor virtually the entire atmosphere in real time. Until very recently, all that oceanographers had to work with were measurements taken by ships. This data was very sparse in space and time, and was biased towards certain ship tracks and seasons.

A lack of observations makes life difficult for ocean modellers, because there is very little to compare the simulations to. You can’t have confidence in a model if you have no way of knowing how well it’s performing, and you can’t make many improvements to a model without an understanding of its shortcomings.

Our knowledge of the ocean took a giant leap forward in 2000, when a program called Argo began. “Argo floats” are smallish instruments floating around in the ocean that control their own buoyancy, rising and sinking between the surface and about 2000 m depth. They use a CTD sensor to measure Conductivity (from which you can easily calculate salinity), Temperature, and Depth. Every 10 days they surface and send these measurements to a satellite. Argo floats are battery-powered and last for about 4 years before losing power. After this point they are sacrificed to the ocean, because collecting them would be too expensive.

This is what an Argo float looks like while it’s being deployed:

With at least 27 countries helping with deployment, the number of active Argo floats is steadily rising. At the time of this writing, there were 3748 in operation, with good coverage everywhere except in the polar oceans:

The result of this program is a massive amount of high-quality, high-resolution data for temperature and salinity in the surface and intermediate ocean. A resource like this is invaluable for oceanographers, analogous to the global network of weather stations used by atmospheric scientists. It allows us to better understand the current state of the ocean, to monitor trends in temperature and salinity as climate change continues, and to assess the skill of ocean models.

But it’s still not good enough. There are two major shortcomings to Argo floats. First, they can’t withstand the extreme pressure in the deep ocean, so they don’t sink below about 2000 m depth. Since the average depth of the world’s oceans is around 4000 m, the Argo program is only sampling the upper half. Fortunately, a new program called Deep Argo has developed floats which can withstand pressures down to 6000 m depth, covering all but the deepest ocean trenches. Last June, two prototypes were successfully deployed off the coast of New Zealand, and the data collected so far is looking good. If all future Argo floats were of the Deep Argo variety, in five or ten years we would know as much about the deep ocean’s temperature and salinity structure as we currently know about the surface. To oceanographers, particularly those studying bottom water formation and transport, there is almost nothing more exciting than this prospect.
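For a sense of what that pressure difference means, here's a quick hydrostatic estimate (constant seawater density assumed, and compressibility ignored, so this is only a rough sketch):

```python
# Back-of-envelope hydrostatic pressure at depth: p = rho * g * h.
# (Ignores compressibility and stratification; numbers are approximate.)

RHO = 1025.0    # mean seawater density (kg/m^3), assumed
G = 9.81        # gravitational acceleration (m/s^2)
ATM = 101325.0  # one atmosphere in Pa

def pressure_atm(depth_m):
    """Approximate water-column pressure at a given depth, in atmospheres."""
    return RHO * G * depth_m / ATM

p_argo = pressure_atm(2000)   # standard Argo float limit
p_deep = pressure_atm(6000)   # Deep Argo limit

print(f"2000 m: ~{p_argo:.0f} atmospheres")
print(f"6000 m: ~{p_deep:.0f} atmospheres")
```

So a Deep Argo float has to survive roughly three times the pressure of a standard float, on the order of hundreds of atmospheres.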

The other major problem with Argo floats is that they can’t handle sea ice. Even if they manage to get underneath the ice by drifting in sideways, the next time they rise to the surface they will bash into the underside of the ice, get stuck, and stay there until their battery dies. This is a major problem for scientists like me who study the Southern Ocean (surrounding Antarctica), which is largely covered with sea ice for much of the year. This ocean will be incredibly important for sea level rise, because the easiest way to destabilise the Antarctic Ice Sheet is to warm up the ocean and melt the ice shelves (the edges of the ice sheet which extend over the ocean) from below. But we can’t monitor this process using Argo data, because there is a big gap in observations over the region. There’s always the manual option – sending in scientists to take measurements – but this is very expensive, and nobody wants to go there in the winter.

Instead, oceanographers have recently teamed up with biologists to try another method of data collection, which is just really excellent:

They are turning seals into Argo floats that can navigate sea ice.

Southern elephant seals swim incredible distances in the Southern Ocean, and often dive as far as 2000 m below the surface. Scientists are utilising the seals’ natural talents to fill in the gaps in the Argo network, so far with great success. Each seal is tranquillised while a miniature CTD is glued to the fur on its head, after which it is released back into the wild. As the seal swims around, the sensors take measurements and communicate with satellites just like regular Argo floats. The next time the seal sheds its coat (once per year), the CTD falls off and the seal gets on with its life, probably wondering what that whole thing was about.

This project is relatively new and it will be a few years before it’s possible to identify trends in the data. It’s also not clear whether or not the seals tend to swim right underneath the ice shelves, where observations would be most useful. But if this dataset gains popularity among oceanographers, and seals become officially integrated into the Argo network…

…then we will be the coolest scientists of all.

Uncertainty

Part 5 in a series of 5 for NextGen Journal
Read Part 1, Part 2, Part 3, and Part 4

Scientists can never say that something is 100% certain, but they can come pretty close. After a while, a theory becomes so strong that the academic community accepts it and moves on to more interesting problems. Replicating an experiment for the thousandth time just isn’t a good use of scientific resources. For example, conducting a medical trial to confirm that smoking increases one’s risk of cancer is no longer very useful; we covered that decades ago. Instead, a medical trial to test the effectiveness of different strategies to help people quit smoking will lead to much greater scientific and societal benefit.

In the same manner, scientists have known since the 1970s that human emissions of greenhouse gases are exerting a warming force on the climate. More recently, the warming itself started to show up, in patterns that confirm it is caused by our activities. These facts are no longer controversial in the scientific community (the opinion pages of newspapers are another story, though). While they will always have a tiny bit of uncertainty, it’s time to move on to more interesting problems. So where are the real uncertainties? What are the new frontiers of climate science?

First of all, projections of climate change depend on what the world decides to do about climate change – a metric that is more uncertain than any of the physics underlying our understanding of the problem. If we collectively step up and reduce our emissions, both quickly and significantly, the world won’t warm too much. If we ignore the problem and do nothing, it will warm a great deal. At this point, our actions could go either way.

Additionally, even though we know the world is going to warm, we don’t know exactly how much, even given a particular emission scenario. We don’t know exactly how sensitive the climate system is, because it’s quite a complex beast. However, using climate models and historical data, we can get an idea. Here is a probability density function for climate sensitivity: the higher the curve at a given point on the x-axis, the greater the probability that climate sensitivity is near that value of x (IPCC, 2007):

This curve shows us that climate sensitivity is most likely around 3 degrees Celsius for every doubling of atmospheric carbon dioxide, since that’s where the curve peaks. There’s a small chance that it’s less than that, so the world might warm a little less. But there’s a greater chance that climate sensitivity is greater than 3 degrees, so the world will warm more. So this graph tells us something kind of scary: if we’re wrong about climate sensitivity being about 3 degrees, we’re probably wrong in the direction we don’t want – that is, the problem being worse than we expect. This metric has a lot to do with positive feedbacks (“vicious cycles” of warming) in the climate system.
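To see why that asymmetry matters, here's a toy illustration using a hypothetical right-skewed (lognormal) distribution whose peak sits at 3°C. This is not the actual IPCC curve, just an assumed shape with the same qualitative skew:

```python
import numpy as np

# A hypothetical right-skewed distribution for climate sensitivity,
# constructed so its peak (mode) sits at 3 degC per CO2 doubling.
# This is NOT the IPCC's actual PDF -- just an illustration of skew.

SIGMA = 0.4                   # spread parameter, assumed
MU = np.log(3.0) + SIGMA**2   # lognormal mode = exp(mu - sigma^2) = 3

rng = np.random.default_rng(0)
samples = rng.lognormal(MU, SIGMA, size=200_000)

p_below = np.mean(samples < 3.0)
p_above = np.mean(samples > 3.0)

print(f"P(sensitivity < 3 degC) ~ {p_below:.2f}")
print(f"P(sensitivity > 3 degC) ~ {p_above:.2f}")
# Even though 3 degC is the single most likely value, more of the
# probability lies above it than below -- the "scary" asymmetry.
```

Any distribution with a long tail on the warm side behaves this way: the most likely value and the expected value part company, and the expected value is higher.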

Another area of uncertainty is precipitation. Temperature is a lot easier to forecast than precipitation, both regionally and globally. With global warming, the extra thermal energy in the climate system will put more water vapour in the air, so there will be more precipitation overall – but the extra energy will also increase evaporation of surface water. Some areas will experience flooding, and some will experience drought; many areas will experience some of each, depending on the time of year. In summary, we will have more of each extreme when it comes to precipitation, but the when and where is highly uncertain.

Scientists are also unsure about the rate and extent of future sea level rise. Warming causes the sea to rise for two different reasons:

  1. Water expands as it warms, which is easy to model;
  2. Glaciers and ice sheets melt and fall into the ocean, which is very difficult to model.
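The first part really is easy – it can even be estimated on the back of an envelope. A water column of depth H that warms uniformly by ΔT expands by roughly α·ΔT·H, where α is the thermal expansion coefficient of seawater. All the numbers below are illustrative assumptions, not projections:

```python
# Rough thermosteric sea level rise: a water column of depth H that warms
# uniformly by dT expands by roughly dh = alpha * dT * H.
# All numbers are illustrative assumptions, not model output.

ALPHA = 2e-4   # thermal expansion coefficient of seawater (1/degC), assumed
H = 2000.0     # depth of the warming layer (m), assumed
DT = 1.0       # uniform warming (degC), assumed

dh = ALPHA * DT * H  # metres of sea level rise
print(f"~{dh * 100:.0f} cm of sea level rise from thermal expansion alone")
```

Real calculations integrate α (which varies with temperature, salinity and pressure) through the whole water column, but the physics is this simple at heart – which is exactly why the ice-melt term, not the expansion term, dominates the uncertainty.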

If we cause the Earth to warm indefinitely, all the ice in the world will turn into water, but we won’t get that far (hopefully). So how much ice will melt, and how fast will it go? This depends on feedbacks in the climate system, glacial dynamics, and many other phenomena that are quantitatively poorly understood.

These examples of uncertainty in climate science, just a few of many, don’t give us an excuse to do nothing about the problem. As Brian, a Master’s student from Canada, wrote, “You don’t have to have the seventh decimal place filled in to see that the number isn’t looking good.” We know that there is a problem, and it might be somewhat better or somewhat worse than scientists are currently predicting, but it won’t go away. As we noted above, in many cases it’s more likely to be worse than it is to be better. Even a shallow understanding of the implications of “worse” should be enough for anyone to see the necessity of action.