Methane gas levels begin to increase again

The amount of methane in Earth’s atmosphere shot up in 2007, bringing to an end a period of about a decade in which atmospheric levels of the potent greenhouse gas were essentially stable, according to a team led by MIT researchers.

Methane levels in the atmosphere have more than doubled since pre-industrial times, accounting for around one-fifth of the human contribution to greenhouse gas-driven global warming. Until recently, the leveling off of methane levels had suggested that the rate of its emission from the Earth’s surface was approximately balanced by the rate of its destruction in the atmosphere.

However, since early 2007 the balance has been upset, according to a paper on the new findings being published this week in Geophysical Research Letters. The paper’s lead authors, postdoctoral researcher Matthew Rigby and Ronald Prinn, the TEPCO Professor of Atmospheric Chemistry in MIT’s Department of Earth, Atmospheric and Planetary Sciences, say this imbalance has resulted in several million metric tons of additional methane in the atmosphere. Methane is produced by wetlands, rice paddies, cattle, and the gas and coal industries, and is destroyed by reaction with the hydroxyl free radical (OH), often referred to as the atmosphere’s “cleanser.”

One surprising feature of this recent growth is that it occurred almost simultaneously at all measurement locations across the globe. However, the majority of methane emissions are in the Northern Hemisphere, and it takes more than one year for gases to be mixed from the Northern Hemisphere to the Southern Hemisphere. Hence, theoretical analysis of the measurements shows that if an increase in emissions is solely responsible, these emissions must have risen by a similar amount in both hemispheres at the same time.
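The theoretical argument can be illustrated with a toy two-box model of the two hemispheres. The sketch below uses round illustrative numbers (a one-year interhemispheric mixing time, a roughly nine-year methane lifetime against OH), not the values used in the study:

```python
# Two-box sketch of interhemispheric methane mixing.
# Illustrative parameters only; not those from the Rigby/Prinn analysis.

TAU_MIX = 1.0   # interhemispheric exchange time (years)
TAU_OH = 9.0    # methane lifetime against OH destruction (years)
DT = 0.01       # integration time step (years)

def simulate(emis_n, emis_s, years=3.0):
    """Integrate concentration anomalies (arbitrary units) in each hemisphere."""
    n = s = 0.0
    for _ in range(int(years / DT)):
        mix = (n - s) / TAU_MIX          # net transport from north to south
        n += DT * (emis_n - n / TAU_OH - mix)
        s += DT * (emis_s - s / TAU_OH + mix)
    return n, s

# A step increase in northern emissions only: the south lags well behind.
print(simulate(emis_n=1.0, emis_s=0.0))
# The same total increase split evenly: both hemispheres rise in lockstep,
# which is what the near-simultaneous global observations suggest.
print(simulate(emis_n=0.5, emis_s=0.5))
```

With emissions in the north only, the southern box trails the northern one for well over a year, so a simultaneous rise at all stations points to emissions increasing in both hemispheres at once (or to a change in the OH sink).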

A rise in Northern Hemispheric emissions may be due to the very warm conditions that were observed over Siberia throughout 2007, potentially leading to increased bacterial emissions from wetland areas. However, a potential cause for an increase in Southern Hemispheric emissions is less clear.

An alternative explanation for the rise may lie, at least in part, with a drop in the concentrations of the methane-destroying OH. Theoretical studies show that if this has happened, the required global methane emissions rise would have been smaller, and more strongly biased to the Northern Hemisphere. At present, however, it is uncertain whether such a drop in hydroxyl free radical concentrations did occur because of the inherent uncertainty in the current method for estimating global OH levels.

To help pin down the cause of the methane increase, Prinn said, “the next step will be to study this using a very high-resolution atmospheric circulation model and additional measurements from other networks.” But doing that could take another year, he said, and because the detection of increased methane has important consequences for global warming the team wanted to get these initial results out as quickly as possible.

“The key thing is to better determine the relative roles of increased methane emission versus an increase in the rate of removal,” Prinn said. “Apparently we have a mix of the two, but we want to know how much of each” is responsible for the overall increase.

It is too early to tell whether this increase represents a return to sustained methane growth, or the beginning of a relatively short-lived anomaly, according to Rigby and Prinn. Given that, pound for pound, methane is 25 times more powerful as a greenhouse gas than carbon dioxide, the situation will require careful monitoring in the near future.
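The “pound for pound” comparison can be made concrete by converting a methane mass into its carbon dioxide equivalent using the 100-year global warming potential the article cites; the emission figure in the example is hypothetical:

```python
GWP_CH4 = 25  # 100-year global warming potential of methane relative to CO2

def co2_equivalent(ch4_tonnes):
    """Convert a methane mass (tonnes) to CO2-equivalent tonnes."""
    return ch4_tonnes * GWP_CH4

# A hypothetical 4 million metric tons of extra methane is, in warming terms,
# comparable to 100 million tonnes of CO2.
print(co2_equivalent(4e6))
```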

Arctic sea ice thinning at record rate

The thickness of sea ice in large parts of the Arctic declined by as much as 19% last winter compared to the previous five winters, according to data from ESA’s Envisat satellite.

Using Envisat radar altimeter data, scientists from the Centre for Polar Observation and Modelling at University College London (UCL) measured sea ice thickness over the Arctic from 2002 to 2008 and found that it had been fairly constant until the record loss of ice in the summer of 2007.
Unusually warm weather conditions were present over the Arctic in 2007, which some scientists have said explains that summer’s ice loss. However, this summer’s ice extent was the second-lowest ever recorded, even though cooler weather conditions prevailed.

Dr Katharine Giles of UCL, who led the study, said: “This summer’s low ice extent doesn’t seem to have been driven by warm weather, so the question is, was last winter’s thinning behind it?”

The research, reported in Geophysical Research Letters, showed that last winter the average thickness of sea ice over the whole Arctic fell by 26 cm (10%) compared with the average thickness of the previous five winters, but sea ice in the western Arctic lost around 49 cm of thickness.

Giles said the extent of sea ice in the Arctic is down to a number of factors, including warm temperatures, currents and wind, making it important to know how ice thickness is changing as well as the extent of the ice.

“As the Arctic ice pack is constantly moving, conventional methods can only provide sparse and intermittent measurements of ice thickness from which it is difficult to tell whether the changes are local or across the whole Arctic,” Giles said.

“Satellites provide the only means to determine trends on a consistent and wide-area basis. Envisat altimeter data have provided the critical third dimension to the satellite images that have already revealed a dramatic decrease in the area of ice cover in the Arctic.”

The team, including Dr Seymour Laxon and Andy Ridout, was the first to measure ice thickness throughout the Arctic winter, from October to March, over more than half of the Arctic.

“We will continue to use Envisat to monitor the evolution of ice thickness through this winter to see whether this downward trend will continue,” Laxon said. “Next year we will have an even better tool to measure ice thickness in the shape of ESA’s CryoSat-2 mission which will provide higher resolution data and with almost complete coverage to the pole.”

A glacier’s life

EPFL researchers have developed a numerical model that can re-create the state of Switzerland’s Rhône Glacier as it was in 1874 and predict its evolution until the year 2100. This is the longest period of time ever modeled in the life of a glacier, involving complex data analysis and mathematical techniques.
The work will serve as a benchmark study for those interested in the state of glaciers and their relation to climate change.

The Laboratory of Hydraulics, Hydrology and Glaciology at ETH Zurich has been a repository for temperature, rainfall and flow data on the Rhône Glacier since the 1800s. Researchers there have used these data to reconstruct the glacier’s mass balance, i.e. the difference between the amount of ice it accumulates over the winter and the amount that melts during the summer (see 1 below). Now, led by Professor Jacques Rappaz from EPFL’s Numerical Analysis and Simulations group, a team of mathematicians has taken the next step, using all this information to create a numerical model of glacier evolution, which they have used to simulate the history and predict the future of Switzerland’s enormous Rhône Glacier over a 226-year period.
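The mass-balance bookkeeping underlying such a model can be sketched in a few lines; the numbers below are purely illustrative, not the Rhône Glacier data:

```python
def mass_balance(winter_accumulation, summer_melt):
    """Net annual mass change: winter gain minus summer loss (same units)."""
    return winter_accumulation - summer_melt

def evolve(volume, annual_balances):
    """Apply a sequence of annual balances; volume cannot go negative."""
    for b in annual_balances:
        volume = max(0.0, volume + b)
    return volume

# Hypothetical series: two balanced years, then sustained net melt
# until the (toy) glacier disappears entirely.
print(evolve(10.0, [0.0, 0.0, -2.0, -3.0, -4.0, -5.0]))  # 0.0
```

A full model must of course also track geometry and ice flow, which is where the sophisticated numerical techniques mentioned below come in; this sketch only shows the accumulation-minus-melt accounting.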

The mathematicians developed their model using three possible future climate scenarios. “We took the most moderate one, avoiding extremely optimistic or pessimistic scenarios,” explains PhD student Guillaume Jouvet. With a temperature increase of 3.6 degrees Celsius and a decrease in rainfall of 6% over a century, the glacier’s “equilibrium line”, or the transition from the snowfall accumulation zone to the melting zone (currently situated at an altitude of around 3000 meters), rose significantly. According to this same scenario, the simulation anticipates a loss of 50% of the volume by 2060 and forecasts the complete disappearance of the Rhône Glacier around 2100.

“It is the first time that the evolution of a glacier has been numerically simulated over such a long period of time, taking into account very complex data,” notes EPFL mathematician Marco Picasso. Even though measurements have been taken for quite some time, the sophisticated numerical techniques that were needed to analyze them have only been developed very recently.

To verify their results, the mathematicians have also reconstructed a long-vanished glacier in Eastern Switzerland. They were able to pinpoint the 10,000-year-old equilibrium line from vestiges of moraines that still exist (see 2 below).

The scientists’ work will be of interest not only to climate change experts, but also to those to whom glaciers are important – from tourism professionals to hydroelectric energy suppliers. Picasso adds that this numerical model could be applied to the polar icecaps. “Mathematics and numerical methods have an important role to play in our society,” he enthuses. “They allow us to simulate with great confidence a large number of environmental phenomena.”

Scientist uses tracer to predict ancient ocean circulation

Even though the Cretaceous Period ended more than 65 million years ago, clues remain about how the ocean water circulated at that time. Measuring a chemical tracer in samples of ancient fish scales, bones and teeth, University of Missouri and University of Florida researchers have studied circulation in the Late Cretaceous North Atlantic Ocean. The Late Cretaceous was a time with high atmospheric levels of carbon dioxide and warm temperatures. Understanding such ancient greenhouse climates is important for predicting what may happen in the future. The new findings contradict some previous models.

Water masses are naturally imprinted with a chemical signature that reflects the geology in the land masses surrounding the area where they form. They carry this signature with them as they travel through the oceans, and the signature is recorded by fish skeletal material. If this fish debris is fossilized, so is the signature. MU and UF researchers collected 45 samples of 95- to 65-million-year-old fish debris from the Demerara Rise in the tropical western North Atlantic Ocean. They measured the chemical signature of these samples to estimate the source and circulation of intermediate waters during the Cretaceous Period.

“This technique allows us to track how water flowed in the Cretaceous oceans better than has been possible previously,” said Ken MacLeod, a professor of geological sciences in the MU College of Arts and Science. “Constraining ocean circulation patterns during greenhouse times, especially across the very large changes in the global carbon cycle that occurred during the interval we studied, is giving us a better understanding of how greenhouse oceans behave.”

Late Cretaceous atmospheric carbon dioxide levels were two to four times higher than today, which resulted in a greenhouse climate with tropical sea-surface temperatures rising to more than 34 degrees Celsius, 4 to 7 degrees Celsius (7 to 12 degrees Fahrenheit) warmer than today.

“The chemical signatures we measured presented two surprising findings. Values were extremely low for open-ocean sites for most of the time between 95 and 65 million years ago, but they were interrupted by a shift that was larger and more rapid than anything previously documented in marine sediments. This shift happened precisely at the time of the largest disturbance to the global carbon cycle of the past 200 million years,” MacLeod said.

Based on the results, the researchers proposed the Late Cretaceous North Atlantic was characterized by sinking of warm, salty, equatorial waters, and that circulation became more vigorous or a new source of the chemical signature was introduced at the time of the disturbance to the carbon cycle. Both the persistent formation of warm, saline intermediate waters and enhanced mixing contradict leading paleoceanographic models for these times.

Catching quakes with laptops

The interactive program built around the BOINC screensaver, designed for classroom activities. Recent earthquakes and sites of major historic earthquakes are indicated; information about these events can be retrieved by clicking on them. – Quake Catcher Network Project

Inside your laptop is a small accelerometer chip, there to protect the delicate moving parts of your hard disk from sudden jolts.

It turns out that the same chip is a pretty good earthquake sensor, too, especially if the signals from lots of them are compared in order to filter out more mundane sources of laptop vibrations, such as typing.
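The comparison idea, that a real quake shakes many machines in an area at nearly the same instant while a bumped laptop triggers alone, can be sketched as a simple coincidence test. The trigger times below are hypothetical, and QCN’s actual processing is more elaborate:

```python
def is_candidate_quake(trigger_times, window=2.0, min_sensors=3):
    """True if at least `min_sensors` triggers fall within one `window`-second span."""
    times = sorted(trigger_times)
    for i, t in enumerate(times):
        # Count triggers inside the window opening at this trigger.
        count = sum(1 for u in times[i:] if u - t <= window)
        if count >= min_sensors:
            return True
    return False

# One laptop bumped at t = 5.0 s is ignored as local noise...
print(is_candidate_quake([5.0]))                      # False
# ...but four near-simultaneous triggers look like real ground motion.
print(is_candidate_quake([12.1, 12.3, 12.4, 13.0]))   # True
```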

It’s an approach that is starting to gain acceptance. The Quake Catcher Network (QCN) project already has about 1,500 laptops connected in a network that has detected several tremors, including a magnitude 5.4 quake in Los Angeles in July. Led by Elizabeth Cochran at the University of California, Riverside, and Jesse Lawrence at Stanford University, QCN uses the same BOINC platform for volunteer computing that projects like SETI@home rely on.

One of the benefits of this new technology is price: Research-grade earthquake sensors typically cost between $10,000 and $100,000. Of course, they are much more sensitive, and can detect the subtle signals of far-away quakes that laptops will never pick up. But Lawrence notes that, “with many more cheap sensors, instead of guessing where strong motions were felt by interpolating between sensors, we should be able to know where strong motions were felt immediately, because we have sensors there.”

Another advantage is that QCN sensors can record the maximum ground shaking. Many high-sensitivity sensors cut short the full extent of the oscillations they are measuring even for moderate earthquakes. Lawrence argues that with enough sensors, eventually “we should have the ability to triangulate earthquakes for earthquake early warning, providing several seconds of warning before the earthquake hits neighboring populated regions.”
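The “several seconds of warning” arises because strong shaking mostly arrives with the slower S waves, while a detection of the first-arriving P waves can be relayed electronically almost instantly. A back-of-envelope sketch using typical crustal wave speeds (illustrative values; not QCN’s actual method):

```python
V_P = 6.0   # typical crustal P-wave speed (km/s); arrives first, shakes gently
V_S = 3.5   # typical S-wave speed (km/s); carries most of the damaging motion

def warning_seconds(distance_km):
    """Seconds between P-wave arrival and S-wave arrival at a site
    `distance_km` from the epicenter (alert transmission treated as instant)."""
    return distance_km / V_S - distance_km / V_P

# A city 60 km from the epicenter gets roughly seven seconds of warning.
print(round(warning_seconds(60.0), 1))  # 7.1
```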

There is a catch with the QCN sensors, though: getting accurate coordinates for their position. At present, since most laptops do not have GPS, the project relies on coordinates that the users type in. Fortunately, rough coordinates can also be automatically retrieved from network routers that the laptop is connected to, as a backup.

It all started with teenage mutant ninjas

Laptop accelerometers were never meant to be used this way. But in 2005, a benign hacker group called the teenage mutant ninjas figured out how to access the “sudden motion sensor” in Apple computers. A year later, Daniel Griscom at the company Suitable Systems wrote SeisMac as an educational tool for IRIS, a group of U.S. earthquake seismologists.

Cochran had the idea that this approach could be linked with BOINC. Carl Christensen, a distributed computing expert, was recruited to implement QCN in BOINC last year. A first limited release was made in March of this year, and by April the network had already detected its first quake, in Reno, Nevada.

Christensen is now working on integrating stand-alone sensors that attach to desktop machines with USB connections (since desktops don’t get bumped around like laptops, they don’t have built-in sensors). These USB sensors can be as cheap as $30, and the idea is to have large numbers of them sponsored as educational tools for schools.

Lawrence notes that “the USB accelerometers will provide a stable backbone, without which the ever-changing configuration of laptops would not be quite as reliable. The USB accelerometers can also mount directly to the floor, which means they will have better sensitivity to ground motions.”

So this is not just a neat outreach opportunity; it could one day save lives.

Less Ice In Arctic Ocean 6000-7000 Years Ago

Settlement: Astrid Lyså in August 2007 in the ruined settlement left by the Independence I Culture in North Greenland. The first immigrants to these inhospitable regions succumbed to the elements nearly 4000 years ago, when the climate became colder again. (Credit: Eiliv Larsen, NGU)

Recent mapping of a number of raised beach ridges on the north coast of Greenland suggests that the ice cover in the Arctic Ocean was greatly reduced some 6000-7000 years ago. The Arctic Ocean may have been periodically ice free.

“The climate in the northern regions has never been milder since the last Ice Age than it was about 6000-7000 years ago. We still don’t know whether the Arctic Ocean was completely ice free, but there was more open water in the area north of Greenland than there is today,” says Astrid Lyså, a geologist and researcher at the Geological Survey of Norway (NGU).

Shore features

Together with her NGU colleague, Eiliv Larsen, she has worked on the north coast of Greenland with a group of scientists from the University of Copenhagen, mapping sea-level changes and studying a number of shore features. She has also collected samples of driftwood that originated from Siberia or Alaska and had these dated, and has collected shells and microfossils from shore sediments.

“The architecture of a sandy shore depends partly on whether wave activity or pack ice has influenced its formation. Beach ridges, which are generally distinct, very long, broad features running parallel to the shoreline, form when there is wave activity and occasional storms. This requires periodically open water,” Astrid Lyså explains.

Pack-ice ridges, which form when drift ice is pressed onto the seashore, piling up shore sediments that lie in its path, have a completely different character. They are generally shorter, narrower and more irregular in shape.

Open sea

“The beach ridges which we have had dated to about 6000-7000 years ago were shaped by wave activity,” says Astrid Lyså. They are located at the mouth of Independence Fjord in North Greenland, on an open, flat plain facing directly onto the Arctic Ocean. Today, drift ice forms a continuous cover from the land here.

Astrid Lyså says that such old beach formations require that the sea all the way to the North Pole was periodically ice free for a long time.

“This stands in sharp contrast to the present-day situation where only ridges piled up by pack ice are being formed,” she says.

However, the scientists are very careful about drawing parallels with the present-day trend in the Arctic Ocean where the cover of sea ice seems to be decreasing.

“Changes that took place 6000-7000 years ago were controlled by other climatic forces than those which seem to dominate today,” Astrid Lyså believes.

Inuit immigration

The mapping at 82 degrees North took place in summer 2007 as part of the LongTerm project, a sub-project of the major International Polar Year project, SciencePub. The scientists also studied ruined settlements dating from the first Inuit immigration to these desolate coasts.

The first people from Alaska and Canada, called the Independence I Culture, travelled north-east as far as they could go on land some 4000-4500 years ago. The scientists have found that drift ice had formed on the sea again in this period, which was essential for the Inuit in connection with their hunting. No beach ridges have formed since then.

“Seals and driftwood were absolutely vital if they were to survive. They needed seals for food and clothing, and driftwood for fuel when the temperature crept towards minus 50 degrees. For us, it is inconceivable and extremely impressive,” says Eiliv Larsen, the NGU scientist and geologist.

Impacts of climate change on lakes

Lake Toya is a volcanic caldera lake in Shikotsu-Toya National Park on Hokkaido Island in the north of Japan. – Photo: Dr. Bertram Boehrer/UFZ

Climate change will have different effects on lakes in warmer and colder regions of the globe. This is the conclusion reached by Japanese and German researchers following studies of very deep caldera lakes in Japan. Scientists from Hokkaido University, the Hokkaido Institute of Environmental Sciences, Kagoshima University and the Helmholtz Centre for Environmental Research (UFZ) compared current measurements with measurements taken 70 years ago. This confirmed a rise in temperatures in the deep water layers of lakes in the south of Japan, while the deep water temperatures of lakes in the north remained the same.

Rising temperatures can lead to changes in nutrient exchange and turnover in the water. In certain circumstances, winter circulation behaviour can be so severely affected by rising temperatures and other climatic factors that oxygen supplies to the lower depths become insufficient for many organisms, leading to an accumulation of nutrients in the deep water, say the researchers writing in Geophysical Research Letters.

Measurements from 2005 and 2007 in deep Japanese caldera lakes provide information about the distribution of dissolved nutrients in the water. There are two reasons why this chain of lakes makes an excellent research subject for providing general information about circulation under changeable climatic conditions that will be valid for lakes outside the research area.

Firstly, the lakes cover a climate gradient that stretches from the south of Japan to the northern island of Hokkaido. Secondly, oxygen and nutrient exchange between the deep water and the surface in the lakes under investigation is controlled almost exclusively by temperature differences.
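The reason temperature differences alone can control this exchange is water’s density maximum near 4 °C: surface water can only sink and ventilate the depths if winter cooling makes it at least as dense as the water below. A minimal sketch, using a rough quadratic density approximation (illustrative only, not a standard limnological formula):

```python
def density(temp_c):
    """Rough approximation of fresh-water density (kg/m^3), maximal near 4 C."""
    return 1000.0 * (1.0 - 6.8e-6 * (temp_c - 4.0) ** 2)

def can_overturn(surface_temp_c, deep_temp_c):
    """Surface water sinks only if it is at least as dense as the deep water."""
    return density(surface_temp_c) >= density(deep_temp_c)

# A cold winter cools the surface to 4 C: full circulation is possible.
print(can_overturn(4.0, 4.5))   # True
# A mild winter leaves the surface at 8 C: the lake stays stratified,
# and oxygen transport to the deep water is cut off.
print(can_overturn(8.0, 4.5))   # False
```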

The researchers found that almost all of the lakes studied displayed a good distribution of the dissolved nutrients, despite their enormous depths of up to 423 metres (Lake Tazawa, Honshu). The lakes can be divided into two main depth-circulation categories based on their climatic conditions.

The researchers expect deep water temperatures of colder lakes (e.g. Lake Shikotsu, Hokkaido) to remain unchanged in warmer winters, provided the temperature rises are not excessive, while deep water temperatures in warmer lakes are likely to rise. This was confirmed by comparisons with single-point measurements from the 1930s.

The scientists warn that a very steep rise in winter temperatures over the years results in water temperatures that do not fall anywhere near as low as those of previous years, and depth circulation can cease altogether (Lake Ikeda, Kyushu). In such circumstances oxygen supplies and nutrient distribution would be interrupted, which would have impacts on organisms.

Water quality in lakes is an important economic factor for tourism, water companies and fishing businesses. Together with colleagues in Australia, Canada and Spain, UFZ scientists are therefore working on numeric lake simulation models which are designed to provide predictions about water quality under altered conditions.

Scientists Map Soils on an Extinct American Volcano

Thousands of years after the lava cooled, soil scientists conduct sophisticated mapping of the resulting soil landscape.

Union County, New Mexico, is a landscape of striking diversity. Out of expansive rangelands rise sporadic yet majestic cinder cone volcanoes and mesas preserved by basalt, part of the Raton-Clayton Volcanic Field. Capulin volcano, formed approximately 62,000 years ago, is the youngest volcano in the field. The cone rises 396 m from the plain, reaching an altitude of 2,495 m above sea level. The base of the volcano is 6.4 km in circumference, and the crater is 126 m deep and 442 m across. Four different flows of lava can be observed across the monument, indicative of different eruptive events. Conditions across the park are highly dynamic with respect to vegetation distribution, slope, and depth to bedrock, but the available soils data were highly generalized and lacked sufficient specificity to be of much use in the park’s management of natural resources.

In 2006, Dr. David C. Weindorf, Assistant Professor of Soil Classification and Land Use at the LSU AgCenter in Baton Rouge, LA, visited the volcano with a group of undergraduate soil science students. As a result of the visit, the National Park Service commissioned a more detailed study of soils in the park. The results are published in the Fall 2008 issue of Soil Survey Horizons (“High resolution soil survey of Capulin Volcano National Monument, New Mexico” by D. Weindorf, B. Rinard, Y. Zhu, S. Johnson, B. Haggard, J. McPherson, M. Dia, C. Spinks, and A. McWhirt, Soil Surv. Horiz. 49:55-62).

The unprecedented access for sampling allowed for the collection of more than 140 soil samples and the description of five soil profiles (vertical cross sections of soil extending into the subsoil). At each site, global positioning system (GPS) coordinates were recorded so the exact location of the sample could be mapped. Slope and site characteristics such as vegetative cover were also noted at each point.

In the lab, soil color, texture, organic matter, pH, and other properties were carefully examined. When all datasets were complete, they were loaded into a computer program that creates interpolated maps between data collection points. In doing so, map layers were created for each data parameter. Finally, when all maps were considered simultaneously, the research team drew the boundaries of each unique soil.
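The interpolation step can be sketched with inverse-distance weighting, one common gridding technique; the article does not name the program the team used, so both the method choice and the sample values below are purely illustrative:

```python
import math

def idw(x, y, samples, power=2):
    """Inverse-distance-weighted estimate at (x, y) from (sx, sy, value) samples."""
    num = den = 0.0
    for sx, sy, value in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return value  # query point coincides with a sample
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Hypothetical soil pH measured at three GPS points; estimate midway
# between the first two, where no sample was taken.
samples = [(0.0, 0.0, 6.0), (10.0, 0.0, 7.0), (0.0, 10.0, 6.5)]
print(round(idw(5.0, 0.0, samples), 2))  # 6.5
```

Repeating this for every lab-measured property yields the stacked map layers described above.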

An additional benefit of the study was the involvement of undergraduate students. Beatrix Haggard and Stephanie Johnson, two of the undergraduate students integrally involved in the study, stated “Research on Capulin has allowed us to apply our studies in a real-world research study and prepared us for graduate research in soil science.” Both students have now begun graduate work at LSU.

Ongoing research at the LSU AgCenter is focusing on the validation of soils data and the use of new field portable technologies such as x-ray fluorescence (XRF) spectrometry and diffuse reflectance spectroscopy (DRS) in soil survey. Accurate soils information is vital not only to agriculture, but also civil engineering, environmental science, and other disciplines.

Origin of Alps-size Antarctic mountain range unknown

A U.S.-led, multinational team of scientists this month will investigate one of the Earth’s last major unexplored places, using sophisticated airborne radar and ground-based seismologic tools to virtually peel away more than 2.5 miles of ice covering an Antarctic mountain range that rivals the Alps in elevation.

Researchers from Penn State and Washington University in St. Louis will contribute to the fieldwork by using seismic recordings of earthquakes to create images of the crust and mantle beneath the mountain range. Andrew Nyblade, professor of geosciences, Penn State, and Douglas Wiens, professor and chair, Earth and planetary sciences, Washington University, are principal investigators on the Gamburtsev Antarctic Mountains Seismic Experiment (GAMSEIS).

The seismic images they obtain will help determine how the mountain range formed. GAMSEIS will deploy an array of 23 stations spread over the mountain range that will gather seismic data.

Current scientific knowledge leads researchers to conclude that the Gamburtsev Mountain range “shouldn’t be there” at all.

The researchers from six nations hope to find answers to questions about the nature of Antarctica and specifically the massive East Antarctic Ice Sheet. Researchers want to know how Antarctica became ice-covered and whether that process began millions of years ago in the enigmatic Gamburtsev Mountain range.

Working daily at extreme altitudes, in 24 hours of sunlight and temperatures as low as minus 40 Fahrenheit, the researchers of the Antarctic Gamburtsev Province (AGAP) team hope they can answer whether the Gamburtsevs were born of tectonic activity in Antarctica or date from a period millions of years ago, when Antarctica was the center of an enormous supercontinent located at far lower latitudes.

Robin Bell, Columbia University’s Lamont-Doherty Earth Observatory, shares the leadership of the U.S. science effort and is in charge of the airborne work. She said AGAP will help scientists understand one of Antarctica’s last major unexplored regions.

“Because the heart of East Antarctica is so difficult to get to, we know very little about it,” says Bell. “The Gamburtsev mountain range is fascinating; it defies all geological understanding of how mountains evolve. It really shouldn’t be there.

“We think also that there’s a strong possibility that the mountains are the birthplace of the East Antarctic Ice Sheet. Over 30 million years ago ice began to grow around the peaks, eventually burying the range and its surrounding lakes. I’m really excited that at last we have a chance to find out what happened,” she said.

“For two and a half months our international teams will pool their resources and expertise to survey mountains the size of the Alps buried under the ice sheet that currently defy any reasonable geological explanation,” added Fausto Ferraccioli, geophysicist, British Antarctic Survey, who is leading the United Kingdom’s team. “At the same time, we will hunt for ice that is more than 1.2 million years old. Locked in this ancient ice is a detailed record of past climate change that may assist in making better predictions for our future.”

AGAP, involving researchers and support personnel from Australia, China, Germany, Japan, the United Kingdom and the United States, caps the global scientific deployment known as the International Polar Year (IPY), the largest coordinated international scientific effort in 50 years. The Gamburtsevs were discovered by a Soviet traverse during the last IPY in 1957-58 that was known as the International Geophysical Year.

Traveling deep into the Antarctic interior, roughly 394 miles from the South Pole, the science teams will spend two months at a pair of remote field camps while they complete the first major geophysical survey to map the mysterious landscape.

AGAP fieldwork is emblematic of the scientific goals of the current IPY and of the scientific advances made in the past 50 years because it will use tools and techniques that were simply unavailable during the IGY. BAS and NSF aircraft, specially equipped with ice-penetrating radar technology, a gravimeter and magnetic field sensors, will fly survey lines over an area more than twice the size of California.

“This project is possible almost uniquely at this point in time because of the international framework created by IPY, which gives researchers from many nations a single common conduit to pool their efforts for the greater scientific good,” said AGAP researcher Detlef Damaske of Germany’s Federal Institute for Geosciences and Natural Resources.

In addition to researchers from the six participating nations, AGAP requires nine aircraft, the establishment of two deep-field science camps, support from U.S. Amundsen-Scott South Pole and McMurdo research stations, the Australian Antarctic Davis Station and the British Antarctic Survey’s Rothera Research Station. Science and support teams on the Chinese tractor train from Zhongshan Station to Dome A will sample ice cores and decommission the UK-Australian Camp. Field depot camps and three other logistics support stations will ensure that food, fuel, supplies and equipment and people are in the right place at the right time.

The U.S. research teams, from Columbia; Penn State; Washington University; the Center for Remote Sensing of Ice Sheets, University of Kansas; Incorporated Research Institutions for Seismology and the U.S. Geological Survey, are supported by the National Science Foundation, which manages all U.S. research on the southernmost continent through the U.S. Antarctic Program. NSF also is the lead U.S. agency for IPY.

Scientists help to resolve long-standing puzzle in climate science

Computer-model-simulated changes in surface temperature and sea-ice extent. The models were first run with best estimates of historical changes in human and natural factors over the 20th century, and then driven by estimated future changes in greenhouse gases. Temperature and sea-ice changes are shown at four different times. Results are averages over the output of nearly two dozen individual climate models. The database of climate model output used to produce this figure is archived at LLNL.

A team led by Livermore scientists has helped reconcile the differences between simulated and observed temperature trends in the tropics.

Using state-of-the-art observational datasets and results from computer model simulations archived at Lawrence Livermore National Laboratory, LLNL researchers and colleagues from 11 other scientific institutions have refuted a recent claim that simulated temperature trends in the tropics are fundamentally inconsistent with observations. This claim was based on the application of a flawed statistical test and the use of older observational datasets.

Climate model experiments invariably predict that human-caused greenhouse gas increases should lead to more warming in the tropical troposphere (the lowest layer of the atmosphere) than at the tropical land and ocean surface. This predicted “amplification” behavior is in accord with basic theoretical expectations.

Until several years ago, however, most satellite and weather balloon records suggested that the tropical troposphere had warmed substantially less than the surface.

For nearly a decade, this apparent discrepancy between simulations and reality was a major conundrum for climate scientists. The discrepancy was at odds with the overwhelming body of other scientific evidence pointing toward a “discernible human influence” on global climate.

A paper published online last year in the International Journal of Climatology claimed to show definitively that “models and observations disagree to a statistically significant extent” in terms of their tropical temperature trends. This claim formed the starting point for an investigation by a large team of climate modelers and observational data specialists, which was led by LLNL’s Benjamin Santer.

In marked contrast to the earlier claim, Santer’s international team found that there is no fundamental discrepancy between modeled and observed trends in tropical temperatures.

“We’ve gone a long way toward reconciling modeled and observed temperature trends in the problem area of the tropics,” said Santer, the lead author of a paper now appearing online in the International Journal of Climatology.

There are two reasons for this reconciliation.

First, the analysis that reported disagreement between models and observations had applied an inappropriate statistical test, which did not account for the statistical uncertainty in observed warming trends. This uncertainty arises because the human-caused component of recent temperature changes is not perfectly known in any individual observed time series – it must be estimated from data that are influenced by both human effects and the “noise” of natural climate variability. Examples of such “noise” include large El Niño and La Niña events, which have pronounced effects on the year-to-year variability of tropical temperatures.

When the Livermore-led consortium applied this inappropriate test to randomly generated data, it revealed a strong bias in the method toward “detecting” differences that were not real.

The consortium modified the test to correctly account for uncertainty in estimating temperature trends from noisy observational data. With this modified test, there were no longer pervasive, statistically significant differences between simulated and observed tropical temperature trends.
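The logic of that correction can be illustrated with a small Monte Carlo sketch. The code below is purely illustrative (it is not the consortium's actual method, datasets or numbers): synthetic "model" and "observed" temperature series share the same true trend plus noise, and two tests are applied to each trial, one that treats the observed trend as exact and one that pools the observed trend's standard error with the model-mean uncertainty. The flawed test flags spurious "differences" far more often than the nominal 5 percent rate; the corrected test does not.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_trend(y, t):
    """Least-squares trend and its standard error (white-noise residuals assumed)."""
    n = len(t)
    tc = t - t.mean()
    b = np.sum(tc * y) / np.sum(tc**2)
    resid = y - y.mean() - b * tc
    s2 = np.sum(resid**2) / (n - 2)
    return b, np.sqrt(s2 / np.sum(tc**2))

# Illustrative parameters (all made up): 30 years, 20 models, identical true trend.
n_years, n_models, true_trend, noise = 30, 20, 0.02, 0.15
t = np.arange(n_years, dtype=float)

n_trials = 2000
flawed_rejects = corrected_rejects = 0
for _ in range(n_trials):
    # "Model" trends: multimodel mean and the standard error of that mean.
    model_trends = [ols_trend(true_trend * t + rng.normal(0, noise, n_years), t)[0]
                    for _ in range(n_models)]
    b_mod = np.mean(model_trends)
    se_mod = np.std(model_trends, ddof=1) / np.sqrt(n_models)

    # One synthetic "observed" record with the same underlying trend.
    b_obs, se_obs = ols_trend(true_trend * t + rng.normal(0, noise, n_years), t)

    # Flawed test: ignores the uncertainty in the observed trend.
    z_flawed = (b_mod - b_obs) / se_mod
    # Corrected test: pools model-mean and observational trend uncertainty.
    z_fixed = (b_mod - b_obs) / np.hypot(se_mod, se_obs)

    flawed_rejects += abs(z_flawed) > 1.96
    corrected_rejects += abs(z_fixed) > 1.96

# Both series share one true trend, so any "rejection" is a false alarm.
print(f"flawed test false-alarm rate:    {flawed_rejects / n_trials:.2f}")
print(f"corrected test false-alarm rate: {corrected_rejects / n_trials:.2f}")
```

Even though model and observations agree by construction, the flawed test rejects in a large majority of trials, while the corrected test stays near the expected 5 percent: a single noisy observed record simply cannot be compared against a tightly constrained multimodel mean as if its trend were known exactly.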

The second reason for the reconciliation of models and observations was the availability of new and improved observational datasets, both for surface and tropospheric temperatures. The developers of these datasets used different procedures to identify and adjust for biases (such as those caused by changes over time in the instruments and platforms used to measure temperature).

Access to multiple, independently produced datasets provided the LLNL-led consortium with a valuable perspective on the inherent uncertainty in observations. Many of the recently developed observational datasets showed larger warming aloft than at the surface, and were more consistent with climate model results.

Even with improved datasets, there are still important uncertainties in observational estimates of recent tropospheric temperature trends that may never be fully resolved. These uncertainties are partly a consequence of historical observing strategies, which were geared toward weather forecasting rather than climate monitoring.

“We should apply what we learned in this study toward improving existing climate monitoring systems, so that future model evaluation studies are less sensitive to observational ambiguity,” Santer said.

Other researchers in this international consortium were Karl Taylor, Peter Gleckler and Stephen Klein (all at Livermore); Peter Thorne at the United Kingdom Meteorological Office Hadley Centre; Leo Haimberger at the University of Vienna; Tom Wigley and Doug Nychka at the National Center for Atmospheric Research; John Lanzante at the National Oceanic and Atmospheric Administration (NOAA)/Geophysical Fluid Dynamics Laboratory; Susan Solomon at the NOAA/Earth System Research Laboratory; Melissa Free at the NOAA/Air Resources Laboratory; Phil Jones at the University of East Anglia; Tom Karl at the NOAA/National Climatic Data Center; Carl Mears and Frank Wentz at Remote Sensing Systems; Gavin Schmidt at the NASA/Goddard Institute for Space Studies; and Steve Sherwood at Yale University.