New study measures methane emissions from natural gas production and offers insights into two large sources

A team of researchers from the Cockrell School of Engineering at The University of Texas at Austin and environmental testing firm URS reports that a small subset of natural gas wells is responsible for the majority of methane emissions from two major sources — liquid unloadings and pneumatic controller equipment — at natural gas production sites.

With natural gas production in the United States expected to continue to increase during the next few decades, there is a need for a better understanding of methane emissions during natural gas production. The study team believes this research, published Dec. 9 in Environmental Science & Technology, will help to provide a clearer picture of methane emissions from natural gas production sites.

The UT Austin-led field study closely examined two major sources of methane emissions — liquid unloadings and pneumatic controller equipment — at well pad sites across the United States. Researchers found that 19 percent of the pneumatic devices accounted for 95 percent of the emissions from pneumatic devices, and 20 percent of the wells with unloading emissions that vent to the atmosphere accounted for 65 percent to 83 percent of those emissions.

“To put this in perspective, over the past several decades, 10 percent of the cars on the road have been responsible for the majority of automotive exhaust pollution,” said David Allen, chemical engineering professor at the Cockrell School and principal investigator for the study. “Similarly, a small group of sources within these two categories are responsible for the vast majority of pneumatic and unloading emissions at natural gas production sites.”

Additionally, for pneumatic devices, the study confirmed regional differences in methane emissions first reported by the study team in 2013. The researchers found that methane emissions from pneumatic devices were highest in the Gulf Coast and lowest in the Rocky Mountains.

The study is the second phase of the team’s 2013 study, which included some of the first measurements for methane emissions taken directly at hydraulically fractured well sites. Both phases of the study involved a partnership between the Environmental Defense Fund, participating energy companies, an independent Scientific Advisory Panel and the UT Austin study team.

The unprecedented access to natural gas production facilities and equipment allowed researchers to acquire direct measurements of methane emissions.

Study and Findings on Pneumatic Devices

Pneumatic devices, which use gas pressure to control the opening and closing of valves, emit gas as they operate. These emissions are estimated to be among the larger sources of methane emissions from the natural gas supply chain. The Environmental Protection Agency reports that 477,606 pneumatic (gas actuated) devices are in use at natural gas production sites throughout the U.S.

“Our team’s previous work established that pneumatics are a major contributor to emissions,” Allen said. “Our goal here was to measure a more diverse population of wells to characterize the features of high-emitting pneumatic controllers.”

The research team measured emissions from 377 gas actuated (pneumatic) controllers at natural gas production sites and a small number of oil production sites throughout the U.S.

The researchers sampled all identifiable pneumatic controller devices at each well site, a more comprehensive approach than the random sampling previously conducted. The average methane emissions per pneumatic controller reported in this study are 17 percent higher than the average emissions per pneumatic controller in the 2012 EPA greenhouse gas national emission inventory (released in 2014), but the average from the study is dominated by a small subpopulation of the controllers. Specifically, 19 percent of controllers, with measured emission rates in excess of 6 standard cubic feet per hour (scf/h), accounted for 95 percent of emissions.
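To see how such a skewed distribution plays out, the short sketch below uses made-up controller emission rates (not the study's measurements) and the 6 scf/h threshold to compute what fraction of devices accounts for what fraction of total emissions.

```python
# Illustrative sketch, not the study's data: how a small share of pneumatic
# controllers can dominate total emissions. Hypothetical emission rates in scf/h.
rates_scfh = [0.1, 0.2, 0.3, 0.5, 0.4, 0.2, 0.1, 0.3, 8.0, 25.0]

threshold = 6.0  # scf/h cutoff the study used to characterize high emitters
high = [r for r in rates_scfh if r > threshold]

share_of_devices = len(high) / len(rates_scfh)
share_of_emissions = sum(high) / sum(rates_scfh)

print(f"{share_of_devices:.0%} of controllers produce {share_of_emissions:.0%} of emissions")
# With these made-up numbers: 20% of controllers produce ~94% of emissions,
# mirroring the reported 19% of controllers accounting for 95%.
```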

The high-emitting pneumatic devices are a combination of devices that are not operating as designed, are used in applications that cause them to release gas frequently or are designed to emit continuously at a high rate.

The researchers also observed regional differences in methane emission levels, with the lowest emissions per device measured in the Rocky Mountains and the highest emissions in the Gulf Coast, similar to the earlier 2013 study. At least some of the regional differences in emission rates can be attributed to the difference in controller type (continuous vent vs. intermittent vent) among regions.

Study and Findings on Liquid Unloadings

After observing variable emissions for liquid unloadings for a limited group of well types in the 2013 study, the research team made more extensive measurements and confirmed that a majority of emissions come from a small fraction of wells that vent frequently. Although it is not surprising to see some correlation between frequency of unloadings and higher annual emissions, the study’s findings indicate that wells with a high frequency of unloadings have annual emissions that are 10 or more times as great as wells that unload less frequently.

The team’s field study, which measured emissions from liquid unloadings at 107 natural gas production wells throughout the U.S., represents the most extensive measurement of emissions associated with liquid unloadings in the scientific literature thus far.

A liquid unloading is one method used to clear wells of accumulated liquids to increase production. Because older wells typically produce less gas as they near the end of their life cycle, liquid unloadings happen more often in those wells than in newer wells. The team found a statistical correlation between the age of wells and the frequency of liquid unloadings. The researchers found that the key identifier for high-emitting wells is how many times the well unloads in a given year.

Because liquid unloadings can employ a variety of liquid lifting mechanisms, the study results also reflect differences in liquid unloadings emissions between wells that use two different mechanisms (wells with plunger lifts and wells without plunger lifts). Emissions for unloading events for wells without plunger lifts averaged 21,000 scf (standard cubic feet) to 35,000 scf. For wells with plunger lifts that vent to the atmosphere, emissions averaged 1,000 scf to 10,000 scf of methane per event. Although the emissions per event were higher for wells without plunger lifts, these wells had, on average, fewer events than wells with plunger lifts. Wells without plunger lifts averaged fewer than 10 unloading events per year, and wells with plunger lifts averaged more than 200 events per year. Overall, wells with plunger lifts were estimated to account for 70 percent of emissions from unloadings nationally.
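The arithmetic behind that comparison is straightforward; the sketch below multiplies the reported per-event volumes by the reported event frequencies to show, in rough terms, why the frequent-but-smaller plunger-lift unloadings can dominate annual totals. Combining the averages this way is only an illustration, not the study's estimation method.

```python
# Back-of-the-envelope annual totals from the averages reported above
# (per-event volumes and event counts are the article's figures; combining
# them this way is only an illustration, not the study's method).
no_plunger_scf_per_event = (21_000, 35_000)  # scf per unloading event
no_plunger_events_per_year = 10              # "fewer than 10" events per year

plunger_scf_per_event = (1_000, 10_000)      # scf per event
plunger_events_per_year = 200                # "more than 200" events per year

no_plunger_annual = [v * no_plunger_events_per_year for v in no_plunger_scf_per_event]
plunger_annual = [v * plunger_events_per_year for v in plunger_scf_per_event]

print("Without plunger lift: {:,} to {:,} scf/yr".format(*no_plunger_annual))
print("With plunger lift:    {:,} to {:,} scf/yr".format(*plunger_annual))
# Roughly 210,000-350,000 scf/yr versus 200,000-2,000,000 scf/yr: frequent small
# events can outweigh rare large ones, consistent with plunger-lift wells
# accounting for about 70 percent of national unloading emissions.
```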

Additionally, researchers found that the Rocky Mountain region, with its large number of wells with a high frequency of unloadings that vent to the atmosphere, accounts for about half of overall emissions from liquid unloadings.

The study team hopes its measurements of liquid unloadings and pneumatic devices will provide a clearer picture of methane emissions from natural gas well sites and of the relationship between well characteristics and emissions.

The study was a cooperative effort involving experts from the Environmental Defense Fund, Anadarko Petroleum Corporation, BG Group PLC, Chevron, ConocoPhillips, Encana Oil & Gas (USA) Inc., Pioneer Natural Resources Company, SWEPI LP (Shell), Statoil, Southwestern Energy and XTO Energy, a subsidiary of ExxonMobil.

The University of Texas at Austin is committed to transparency and disclosure of all potential conflicts of interest of its researchers. Lead researcher David Allen serves as chair of the Environmental Protection Agency’s Science Advisory Board and in this role is a paid Special Governmental Employee. He is also a journal editor for the American Chemical Society and has served as a consultant for multiple companies, including Eastern Research Group, ExxonMobil and the Research Triangle Institute. He has worked on other research projects funded by a variety of governmental, nonprofit and private sector sources including the National Science Foundation, the Environmental Protection Agency, the Texas Commission on Environmental Quality, the American Petroleum Institute and an air monitoring and surveillance project that was ordered by the U.S. District Court for the Southern District of Texas. Adam Pacsi and Daniel Zavala-Araiza, who were graduate students at The University of Texas at the time this work was done, have accepted positions at Chevron Energy Technology Company and the Environmental Defense Fund, respectively.

Financial support for this work was provided by the Environmental Defense Fund (EDF), Anadarko Petroleum Corporation, BG Group PLC, Chevron, ConocoPhillips, Encana Oil & Gas (USA) Inc., Pioneer Natural Resources Company, SWEPI LP (Shell), Statoil, Southwestern Energy and XTO Energy, a subsidiary of ExxonMobil.

Major funding for the EDF’s 30-month methane research series, including their portion of the University of Texas study, is provided for by the following individuals and foundations: Fiona and Stan Druckenmiller, the Heising-Simons Foundation, Bill and Susan Oberndorf, Betsy and Sam Reeves, the Robertson Foundation, TomKat Charitable Trust and the Walton Family Foundation.

No laughing matter: Nitrous oxide rose at end of last ice age

Researchers measured increases in atmospheric nitrous oxide concentrations about 16,000 to 10,000 years ago using ice from Taylor Glacier in Antarctica. – Adrian Schilt

Nitrous oxide (N2O) is an important greenhouse gas that doesn’t receive as much notoriety as carbon dioxide or methane, but a new study confirms that atmospheric levels of N2O rose significantly as the Earth came out of the last ice age and addresses the cause.

An international team of scientists analyzed air extracted from bubbles enclosed in ancient polar ice from Taylor Glacier in Antarctica, allowing for the reconstruction of the past atmospheric composition. The analysis documented a 30 percent increase in atmospheric nitrous oxide concentrations from 16,000 years ago to 10,000 years ago. This rise in N2O was caused by changes in environmental conditions in the ocean and on land, scientists say, and contributed to the warming at the end of the ice age and the melting of large ice sheets that then existed.

The findings add an important new element to studies of how Earth may respond to a warming climate in the future. Results of the study, which was funded by the U.S. National Science Foundation and the Swiss National Science Foundation, are being published this week in the journal Nature.

“We found that marine and terrestrial sources contributed about equally to the overall increase of nitrous oxide concentrations and generally evolved in parallel at the end of the last ice age,” said lead author Adrian Schilt, who did much of the work as a post-doctoral researcher at Oregon State University. Schilt then continued to work on the study at the Oeschger Centre for Climate Change Research at the University of Bern in Switzerland.

“The end of the last ice age represents a partial analog to modern warming and allows us to study the response of natural nitrous oxide emissions to changing environmental conditions,” Schilt added. “This will allow us to better understand what might happen in the future.”

Nitrous oxide is perhaps best known as laughing gas, but it is also produced by microbes on land and in the ocean in processes that occur naturally but can be enhanced by human activity. Marine nitrous oxide production is closely linked to low-oxygen conditions in the upper ocean, and global warming is predicted to intensify the low-oxygen zones in many of the world’s ocean basins. N2O also destroys ozone in the stratosphere.

“Warming makes terrestrial microbes produce more nitrous oxide,” noted co-author Edward Brook, an Oregon State paleoclimatologist whose research team included Schilt. “Greenhouse gases go up and down over time, and we’d like to know more about why that happens and how it affects climate.”

Nitrous oxide is among the most difficult greenhouse gases to study in attempting to reconstruct the Earth’s climate history through ice core analysis. The specific technique that the Oregon State research team used requires large samples of pristine ice that date back to the desired time of study – in this case, between about 16,000 and 10,000 years ago.

The unusual way in which Taylor Glacier is configured allowed the scientists to extract ice samples from the surface of the glacier instead of drilling deep in the polar ice cap because older ice is transported upward near the glacier margins, said Brook, a professor in Oregon State’s College of Earth, Ocean, and Atmospheric Sciences.

The scientists were able to discern the contributions of marine and terrestrial nitrous oxide through analysis of isotopic ratios, which fingerprint the different sources of N2O in the atmosphere.

“The scientific community knew roughly what the N2O concentration trends were prior to this study,” Brook said, “but these findings confirm that and provide more exact details about changes in sources. As nitrous oxide in the atmosphere continues to increase – along with carbon dioxide and methane – we now will be able to more accurately assess where those contributions are coming from and the rate of the increase.”

Atmospheric N2O was roughly 200 parts per billion at the peak of the ice age about 20,000 years ago, then rose to 260 ppb by 10,000 years ago. As of 2014, atmospheric N2O was measured at about 327 ppb, an increase attributed primarily to agricultural influences.
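A quick check of those figures, using only the concentrations quoted above:

```python
# Quick arithmetic check of the concentration changes quoted above
# (all values are taken from the article).
glacial_ppb = 200         # peak of the last ice age, ~20,000 years ago
early_holocene_ppb = 260  # ~10,000 years ago
modern_ppb = 327          # 2014

deglacial_rise = (early_holocene_ppb - glacial_ppb) / glacial_ppb
rise_since_holocene = (modern_ppb - early_holocene_ppb) / early_holocene_ppb

print(f"Deglacial rise: {deglacial_rise:.0%}")                     # ~30%, as reported
print(f"Rise since 10,000 years ago: {rise_since_holocene:.0%}")   # ~26%, largely agricultural
```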

Although the N2O increase at the end of the last ice age was almost equally attributable to marine and terrestrial sources, the scientists say, there were some differences.

“Our data showed that terrestrial emissions changed faster than marine emissions, which was highlighted by a fast increase of emissions on land that preceded the increase in marine emissions,” Schilt pointed out. “It appears to be a direct response to a rapid temperature change between 15,000 and 14,000 years ago.”

That finding underscores the complexity of analyzing how Earth responds to changing conditions that have to account for marine and terrestrial influences; natural variability; the influence of different greenhouse gases; and a host of other factors, Brook said.

“Natural sources of N2O are predicted to increase in the future, and this study will help us test predictions on how the Earth will respond,” Brook said.

Abandoned wells can be ‘super-emitters’ of greenhouse gas

One of the wells the researchers tested, this one in the Allegheny National Forest. – Princeton University

Princeton University researchers have uncovered a previously unknown, and possibly substantial, source of the greenhouse gas methane to the Earth’s atmosphere.

After testing a sample of abandoned oil and natural gas wells in northwestern Pennsylvania, the researchers found that many of the old wells leaked substantial quantities of methane. Because there are so many abandoned wells nationwide (a recent study from Stanford University concluded there were roughly 3 million abandoned wells in the United States) the researchers believe the overall contribution of leaking wells could be significant.

The researchers said their findings identify a need to make measurements across a wide variety of regions, not only in Pennsylvania but also in other states with a long history of oil and gas development, such as California and Texas.

“The research indicates that this is a source of methane that should not be ignored,” said Michael Celia, the Theodore Shelton Pitney Professor of Environmental Studies and professor of civil and environmental engineering at Princeton. “We need to determine how significant it is on a wider basis.”

Methane is the unprocessed form of natural gas. Scientists say that after carbon dioxide, methane is the most important contributor to the greenhouse effect, in which gases in the atmosphere trap heat that would otherwise radiate from the Earth. Pound for pound, methane has about 20 times the heat-trapping effect of carbon dioxide. Methane is produced naturally, by processes including decomposition, and by human activity such as landfills and oil and gas production.

While oil and gas companies work to minimize the amount of methane emitted by their operations, almost no attention has been paid to wells that were drilled decades ago. These wells, some of which date back to the 19th century, are typically abandoned and not recorded on official records.

Mary Kang, then a doctoral candidate at Princeton, originally began looking into methane emissions from old wells after researching techniques to store carbon dioxide by injecting it deep underground. While examining ways that carbon dioxide could escape underground storage, Kang wondered about the effect of old wells on methane emissions.

“I was looking for data, but it didn’t exist,” said Kang, now a postdoctoral researcher at Stanford.

In a paper published Dec. 8 in the Proceedings of the National Academy of Sciences, the researchers describe how they chose 19 wells in the adjacent McKean and Potter counties in northwestern Pennsylvania. The wells chosen were all abandoned, and records about the origin of the wells and their conditions did not exist. Only one of the wells was on the state’s list of abandoned wells. Some of the wells, which can look like a pipe emerging from the ground, are located in forests and others in people’s yards. Kang said the lack of documentation made it hard to tell when the wells were originally drilled or whether any attempt had been made to plug them.

“What surprised me was that every well we measured had some methane coming out,” said Celia.

To conduct the research, the team placed enclosures called flux chambers over the tops of the wells. They also placed flux chambers nearby to measure the background emissions from the terrain and make sure the methane was emitted from the wells and not the surrounding area.

Although all the wells registered some level of methane, about 15 percent emitted the gas at a markedly higher level — thousands of times greater than the lower-level wells. Denise Mauzerall, a Princeton professor and a member of the research team, said a critical task is to discover the characteristics of these super-emitting wells.

Mauzerall said the relatively low number of high-emitting wells could offer a workable solution: while trying to plug every abandoned well in the country might be too costly to be realistic, dealing with the smaller number of high emitters could be possible.

“The fact that most of the methane is coming out of a small number of wells should make it easier to address if we can identify the high-emitting wells,” said Mauzerall, who has a joint appointment as a professor of civil and environmental engineering and as a professor of public and international affairs at the Woodrow Wilson School.

The researchers have used their results to extrapolate total methane emissions from abandoned wells in Pennsylvania, although they stress that the results are preliminary because of the relatively small sample. But based on that data, they estimate that emissions from abandoned wells represent as much as 10 percent of methane from human activities in Pennsylvania — about the same amount as caused by current oil and gas production. Also, unlike working wells, which have productive lifetimes of 10 to 15 years, abandoned wells can continue to leak methane for decades.
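The extrapolation is conceptually simple: scale an average per-well flux by the number of abandoned wells. The sketch below illustrates the idea with placeholder numbers; the fluxes and the well count are hypothetical, not values from the paper.

```python
# Hypothetical extrapolation of the kind described above. The per-well fluxes
# and the well count are placeholders, NOT the paper's values.
import statistics

sample_fluxes_kg_per_day = [0.001, 0.002, 0.0005, 0.003,
                            0.001, 0.26, 0.002, 0.31]  # two "super-emitters"

mean_flux = statistics.mean(sample_fluxes_kg_per_day)  # dominated by the high emitters
abandoned_wells = 300_000                              # assumed well count (placeholder)

annual_tonnes = mean_flux * abandoned_wells * 365 / 1000
print(f"Extrapolated emissions: ~{annual_tonnes:,.0f} tonnes CH4 per year")
# The authors stress such estimates are preliminary: the sample is small and
# the mean is controlled by a handful of high-emitting wells.
```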

“This may be a significant source,” Mauzerall said. “There is no single silver bullet but if it turns out that we can cap or capture the methane coming off these really big emitters, that would make a substantial difference.”


Besides Kang, who is the paper’s lead author, Celia and Mauzerall, the paper’s co-authors include: Tullis Onstott, a professor of geosciences at Princeton; Cynthia Kanno, who was a Princeton undergraduate and who is a graduate student at the Colorado School of Mines; Matthew Reid, who was a graduate student at Princeton and is a postdoctoral researcher at EPFL in Lausanne, Switzerland; Xin Zhang, a postdoctoral researcher in the Woodrow Wilson School at Princeton; and Yuheng Chen, an associate research scholar in geosciences at Princeton.

Technology-dependent emissions of gas extraction in the US

The KIT measurement instrument, on board a minivan, directly measures atmospheric emissions on site with a high temporal resolution. – Photo: F. Geiger/KIT

Not all boreholes are the same. Scientists of the Karlsruhe Institute of Technology (KIT) used mobile measurement equipment to analyze gaseous compounds emitted by the extraction of oil and natural gas in the USA. For the first time, organic pollutants emitted during a fracking process were measured at a high temporal resolution. The highest values measured exceeded typical mean values in urban air by a factor of one thousand, as reported in the journal Atmospheric Chemistry and Physics (DOI: 10.5194/acp-14-10977-2014).

The KIT researchers, together with US institutes, studied the emission of trace gases by oil and gas fields in the USA (Utah and Colorado). Background concentrations and the waste gas plumes of single extraction plants and fracking facilities were analyzed. The air quality measurements, which lasted several weeks, took place under the “Uintah Basin Winter Ozone Study” coordinated by the National Oceanic and Atmospheric Administration (NOAA).

The KIT measurements focused on health-damaging aromatic hydrocarbons in air, such as carcinogenic benzene. Maximum concentrations were determined in the waste gas plumes of boreholes. Some extraction plants emitted up to about a hundred times more benzene than others. The highest values of some milligrams of benzene per cubic meter of air were measured downstream of an open fracking facility, where returning drilling fluid is stored in open tanks and basins. Emissions were much lower at other oil and gas extraction plants and at facilities with closed production processes. In Germany, benzene concentration is subject to strict limits: the Federal Emission Control Ordinance sets an annual benzene limit of five micrograms per cubic meter for the protection of human health, which is smaller by a factor of about one thousand than the values now measured at the open fracking facility in the US. The researchers published the results in the journal Atmospheric Chemistry and Physics (ACP).
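The factor-of-a-thousand comparison is a unit conversion: peak values of milligrams per cubic meter set against a limit expressed in micrograms per cubic meter. A minimal sketch, assuming a round peak value of 5 mg per cubic meter:

```python
# Unit arithmetic behind the factor-of-a-thousand comparison above. The peak
# value is an assumed round number consistent with "some milligrams per cubic meter".
german_annual_limit_ug_m3 = 5.0   # German annual benzene limit, micrograms per m3
measured_peak_mg_m3 = 5.0         # assumed peak near the open fracking facility

measured_peak_ug_m3 = measured_peak_mg_m3 * 1000.0   # 1 mg = 1,000 micrograms
factor = measured_peak_ug_m3 / german_annual_limit_ug_m3
print(f"Peak exceeds the annual limit by a factor of about {factor:,.0f}")  # ~1,000
# Note the comparison sets a short-term peak against an annual-mean limit,
# as the article itself does.
```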

“Characteristic emissions of trace gases are encountered everywhere. These are symptomatic of oil and gas extraction. But the values measured for different technologies differ considerably,” explains Felix Geiger of the Institute of Meteorology and Climate Research (IMK) of KIT, one of the first authors of the study. By means of closed collection tanks and so-called vapor capture systems, for instance, the gases released during operation can be collected and reduced significantly.

“The gas fields in the sparsely populated areas of North America are a good showcase for estimating the range of impacts of different extraction and fracking technologies,” explains Professor Johannes Orphal, Head of IMK. “In densely populated Germany, framework conditions are much stricter and much more attention is paid to reducing and monitoring emissions.”

Fracking is increasingly discussed as a technology to extract fossil resources from unconventional deposits. Hydraulic fracturing of suitable shale layers opens up the fossil fuels stored there and makes them accessible for economically efficient use. For this purpose, boreholes are drilled into these rock formations and then subjected to high pressure using large amounts of water and auxiliary materials such as sand, cement, and chemicals. The oil or gas can flow to the surface through the opened microstructures in the rock.

Typically, the return flow of the aqueous fracking liquid with the dissolved oil and gas constituents lasts several days before the actual production of purer oil or natural gas begins. This return flow is collected and reused until it finally has to be disposed of. Air pollution mainly depends on how this return flow is handled at the extraction plant, and currently practiced fracking technologies differ considerably in this respect.

For the first time, the resulting local atmospheric emissions were studied at a high temporal resolution, so that emissions can be assigned directly to the different sections of an extraction plant. For the measurements, KIT’s newly developed, compact, and highly sensitive instrument, a so-called proton transfer reaction mass spectrometer (PTR-MS), was installed on board a minivan and driven to within a few tens of meters of the different extraction points. In this way, the waste gas plumes of individual extraction sources and fracking processes were studied in detail.

Warneke, C., Geiger, F., Edwards, P. M., Dube, W., Pétron, G., Kofler, J., Zahn, A., Brown, S. S., Graus, M., Gilman, J. B., Lerner, B. M., Peischl, J., Ryerson, T. B., de Gouw, J. A., and Roberts, J. M.: Volatile organic compound emissions from the oil and natural gas industry in the Uintah Basin, Utah: oil and gas well pad emissions compared to ambient air composition, Atmos. Chem. Phys., 14, 10977-10988, doi:10.5194/acp-14-10977-2014, 2014.

Re-thinking Southern California earthquake scenarios in Coachella Valley, San Andreas Fault

The Coachella Valley segment of the southernmost section of the San Andreas Fault in California has a high likelihood for a large rupture in the near future, since it has a recurrence interval of about 180 years but has not ruptured in over 300 years. – UMass Amherst and Google Earth

New three-dimensional (3D) numerical modeling that captures far more of the geometric complexity of an active fault segment in southern California than any previous model suggests that the overall earthquake hazard for towns on the west side of the Coachella Valley, such as Palm Springs and Palm Desert, may be slightly lower than previously believed.

New simulations of deformation on three alternative fault configurations for the Coachella Valley segment of the San Andreas Fault conducted by geoscientists Michele Cooke and Laura Fattaruso of the University of Massachusetts Amherst, with Rebecca Dorsey of the University of Oregon, appear in the December issue of Geosphere.

The Coachella Valley segment is the southernmost section of the San Andreas Fault in California. It has a high likelihood for a large rupture in the near future, since it has a recurrence interval of about 180 years but has not ruptured in over 300 years, the authors point out.

The researchers acknowledge that their new modeling offers “a pretty controversial interpretation” of the data. Many geoscientists do not accept a dipping active fault geometry for the San Andreas Fault in the Coachella Valley, they say. Some argue that the data do not confirm the dipping structure. “Our contribution to this debate is that we add an uplift pattern to the data that support a dipping active fault and it rejects the other models,” say Cooke and colleagues.

Their new model yields an estimated 10 percent increase in shaking overall for the Coachella segment. But for the towns to the west of the fault where most people live, it yields decreased shaking due to the dipping geometry. It yields a doubling of shaking in mostly unpopulated areas east of the fault. “This isn’t a direct outcome of our work but an implication,” they add.

Cooke says, “Others have used a dipping San Andreas in their models but they didn’t include the degree of complexity that we did. By including the secondary faults within the Mecca Hills we more accurately capture the uplift pattern of the region.”

Fattaruso adds, “Others were comparing to different data sets, such as geodesy, and since we were comparing to uplift it is important that we have this complexity.” In this case, geodesy is the science of measuring and representing the Earth and its crustal motion, taking into account the competition of geological processes in 3D over time.

Most other models of deformation, stress, rupture and ground shaking have assumed that the southern San Andreas Fault is vertical, say Cooke and colleagues. However, seismic imaging, aerial magnetometric surveys and GPS-based strain observations suggest that the fault dips 60 to 70 degrees toward the northeast, a hypothesis they set out to investigate.

Specifically, they explored three alternative geometric models of the fault’s Coachella Valley segment with added complexity such as including smaller faults in the nearby Indio and Mecca Hills. “We use localized uplift patterns in the Mecca Hills to assess the most plausible geometry for the San Andreas Fault in the Coachella Valley and better understand the interplay of fault geometry and deformation,” they write.

Cooke and colleagues say the fault structures in their favored model agree with distributions of local seismicity, and are consistent with geodetic observations of recent strain. “Crustal deformation models that neglect the northeast dip of the San Andreas Fault in the Coachella Valley will not replicate the ground shaking in the region and therefore inaccurately estimate seismic hazard,” they note.

This work was supported by the National Science Foundation.
More: http://geosphere.gsapubs.org/content/10/6/1235.abstract

Antarctica: Heat comes from the deep

The Antarctic ice sheet is a giant water reservoir. The ice cap on the southern continent is on average 2,100 meters thick and contains about 70 percent of the world’s fresh water. If this ice mass were to melt completely, it could raise the global sea level by 60 meters. Therefore, scientists carefully observe changes in the Antarctic. In the renowned international journal Science, researchers from Germany, the UK, the US and Japan are now publishing data showing that water temperatures, particularly in the shallow shelf seas of West Antarctica, are rising. “There are many large glaciers in the area. The elevated temperatures have accelerated the melting and sliding of these glaciers in recent decades and there are no indications that this trend is changing,” says the lead author of the study, Dr. Sunke Schmidtko from GEOMAR Helmholtz Centre for Ocean Research Kiel.

For their study, he and his colleagues of the University of East Anglia, the California Institute of Technology and the University of Hokkaido (Japan) evaluated all oceanographic data from the waters around Antarctica from 1960 to 2014 that were available in public databases. These data show that five decades ago, the water masses in the West Antarctic shelf seas were already warmer than in other parts of Antarctica, for example, in the Weddell Sea. However, the temperature difference is not constant. Since 1960, the temperatures in the West Antarctic Amundsen Sea and the Bellingshausen Sea have been rising. “Based on the data we were able to see that this shelf process is induced from the open ocean,” says Dr. Schmidtko.

Around Antarctica, water masses with temperatures of 0.5 to 1.5°C (33-35°F) predominate at greater depths along the continental slope. These temperatures are very warm for Antarctic conditions. “These waters have warmed in West Antarctica over the past 50 years. And they are significantly shallower than 50 years ago,” says Schmidtko. Especially in the Amundsen Sea and Bellingshausen Sea, they now increasingly spill onto the shelf and warm it.

“These are the regions in which accelerated glacial melting has been observed for some time. We show that oceanographic changes over the past 50 years have probably caused this melting. If the water continues to warm, the increased penetration of warmer water masses onto the shelf will likely further accelerate this process, with an impact on the rate of global sea level rise,” explains Professor Karen Heywood from the University of East Anglia.

The scientists also draw attention to the shoaling of warm water masses in the southwestern Weddell Sea. Here, very cold temperatures (less than minus 1.5°C, or 29°F) prevail on the shelf, and large-scale melting of shelf ice has not yet been observed. If the shoaling of warm water masses continues, major environmental changes are expected, with dramatic consequences for the Filchner and Ronne ice shelves, too. For the first time, glaciers outside West Antarctica could experience enhanced melting from below.

To what extent the diverse biology of the Southern Ocean is influenced by the observed changes is not fully understood. The shelf areas include spawning areas for the Antarctic krill, a shrimp species widespread in the Southern Ocean, which plays a key role in the Antarctic food chain. Research results have shown that spawning cycles could change in warmer conditions. A final assessment of the impact has not yet been made.

The exact reasons for the increased warming and shoaling of warm water masses have not yet been fully resolved. “We suspect that they are related to large-scale variations in wind systems over the southern hemisphere. But which processes specifically play a role must be evaluated in more detail,” says Dr. Schmidtko.

Geophysicists challenge traditional theory underlying the origin of mid-plate volcanoes

Traditional thought holds that hot updrafts from the Earth’s core cause volcanoes, but researchers say eruptions may stem from the asthenosphere, a layer closer to the surface. – Virginia Tech

A long-held assumption about the Earth is discussed in today’s edition of Science, as Don L. Anderson, an emeritus professor with the Seismological Laboratory of the California Institute of Technology, and Scott King, a professor of geophysics in the College of Science at Virginia Tech, look at how a layer beneath the Earth’s crust may be responsible for volcanic eruptions.

The discovery challenges conventional thought that volcanoes are caused when plates that make up the planet’s crust shift and release heat.

Instead of coming from deep within the planet’s interior, the source lies closer to the surface, about 80 to 200 kilometers deep, in a layer beneath the Earth’s crust known as the asthenosphere.

“For nearly 40 years there has been a debate over a theory that volcanic island chains, such as Hawaii, have been formed by the interaction between plates at the surface and plumes of hot material that rise from the core-mantle boundary nearly 1,800 miles below the Earth’s surface,” King said. “Our paper shows that a hot layer beneath the plates may explain the origin of mid-plate volcanoes without resorting to deep conduits from halfway to the center of the Earth.”

Traditionally, the asthenosphere has been viewed as a passive structure that separates the moving tectonic plates from the mantle.

As tectonic plates move several inches every year, the boundaries between the plates spawn most of the planet’s volcanoes and earthquakes.

“As the Earth cools, the tectonic plates sink and displace warmer material deep within the interior of the Earth,” explained King. “This material rises as two broad, passive updrafts that seismologists have long recognized in their imaging of the interior of the Earth.”

The work of Anderson and King, however, shows that the hot, weak region beneath the plates acts as a lubricating layer, preventing the plates from dragging the material below along with them as they move.

The researchers show this lubricating layer is also the hottest part of the mantle, so there is no need for heat to be carried up to explain mid-plate volcanoes.

“We’re taking the position that plate tectonics and mid-plate volcanoes are the natural results of processes in the plates and the layer beneath them,” King said.

West Antarctic melt rate has tripled: UC Irvine-NASA

A comprehensive, 21-year analysis of the fastest-melting region of Antarctica has found that the melt rate of glaciers there has tripled during the last decade.

The glaciers in the Amundsen Sea Embayment in West Antarctica are hemorrhaging ice faster than any other part of Antarctica and are the most significant Antarctic contributors to sea level rise. This study is the first to evaluate and reconcile observations from four different measurement techniques to produce an authoritative estimate of the amount and the rate of loss over the last two decades.

“The mass loss of these glaciers is increasing at an amazing rate,” said scientist Isabella Velicogna, jointly of UC Irvine and NASA’s Jet Propulsion Laboratory. Velicogna is a coauthor of a paper on the results, which has been accepted for Dec. 5 publication in the journal Geophysical Research Letters.

Lead author Tyler Sutterley, a UCI doctoral candidate, and his team did the analysis to verify that the melting in this part of Antarctica is shifting into high gear. “Previous studies had suggested that this region is starting to change very dramatically since the 1990s, and we wanted to see how all the different techniques compared,” Sutterley said. “The remarkable agreement among the techniques gave us confidence that we are getting this right.”

The researchers reconciled measurements of the mass balance of glaciers flowing into the Amundsen Sea Embayment. Mass balance is a measure of how much ice the glaciers gain and lose over time from accumulating or melting snow, discharges of ice as icebergs, and other causes. Measurements from all four techniques were available from 2003 to 2009. Combined, the four data sets span the years 1992 to 2013.

The glaciers in the embayment lost mass throughout the entire period. The researchers calculated two separate quantities: the total amount of loss, and the changes in the rate of loss.

The total amount of loss averaged 83 gigatons per year (91.5 billion U.S. tons). By comparison, Mt. Everest weighs about 161 gigatons, meaning the Antarctic glaciers lost a Mount Everest’s worth of water weight every two years over the last 21 years.

The rate of loss accelerated an average of 6.1 gigatons (6.7 billion U.S. tons) per year since 1992.

From 2003 to 2009, when all four observational techniques overlapped, the melt rate increased an average of 16.3 gigatons per year — almost three times the rate of increase for the full 21-year period. The total amount of loss was close to the average at 84 gigatons.
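These comparisons follow from simple unit conversions and ratios; the sketch below reproduces them from the figures quoted in the article.

```python
# Reproducing the simple comparisons quoted above (figures from the article).
GT_TO_BILLION_US_TONS = 1.102   # one gigatonne is about 1.102 billion U.S. short tons

avg_loss_gt_per_yr = 83         # average mass loss, 1992-2013
everest_gt = 161                # mass of Mt. Everest as quoted above
accel_full_period = 6.1         # acceleration (Gt/yr per year), 1992-2013
accel_2003_2009 = 16.3          # acceleration (Gt/yr per year), 2003-2009

print(f"{avg_loss_gt_per_yr} Gt/yr ≈ {avg_loss_gt_per_yr * GT_TO_BILLION_US_TONS:.1f} billion U.S. tons/yr")
print(f"One Mt. Everest of ice lost every ~{everest_gt / avg_loss_gt_per_yr:.1f} years")
print(f"2003-2009 acceleration ≈ {accel_2003_2009 / accel_full_period:.1f}x the 21-year average")
```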

The four sets of observations include NASA’s Gravity Recovery and Climate Experiment (GRACE) satellites, laser altimetry from NASA’s Operation IceBridge airborne campaign and earlier ICESat satellite, radar altimetry from the European Space Agency’s Envisat satellite, and mass budget analyses using radars and the University of Utrecht’s Regional Atmospheric Climate Model.

The scientists noted that glacier and ice sheet behavior worldwide is by far the greatest uncertainty in predicting future sea level. “We have an excellent observing network now. It’s critical that we maintain this network to continue monitoring the changes,” Velicogna said, “because the changes are proceeding very fast.”


New research highlights the key role of ozone in climate change

Many of the complex computer models which are used to predict climate change could be missing an important ozone ‘feedback’ factor in their calculations of future global warming, according to new research led by the University of Cambridge and published today (1 December) in the journal Nature Climate Change.

Computer models play a crucial role in informing climate policy. They are used to assess the effect that carbon emissions have had on the Earth’s climate to date, and to predict possible pathways for the future of our climate.

Increasing computing power combined with increasing scientific knowledge has led to major advances in our understanding of the climate system during the past decades. However, the Earth’s inherent complexity, and the still limited computational power available, means that not every variable can be included in current models. Consequently, scientists have to make informed choices in order to build models which are fit for purpose.

“These models are the only tools we have in terms of predicting the future impacts of climate change, so it’s crucial that they are as accurate and as thorough as we can make them,” said the paper’s lead author Peer Nowack, a PhD student in the Centre for Atmospheric Science, part of Cambridge’s Department of Chemistry.

The new research has highlighted a key role that ozone, a major component of the stratosphere, plays in how climate change occurs, and the possible implications for predictions of global warming. Changes in ozone are often either not included, or are included in a very simplified manner, in current climate models. This is due to the complexity and the sheer computational power it takes to calculate these changes, an important deficiency in some studies.

In addition to its role in protecting the Earth from the Sun’s harmful ultraviolet rays, ozone is also a greenhouse gas. The ozone layer is part of a vast chemical network, and changes in environmental conditions, such as changes in temperature or the atmospheric circulation, result in changes in ozone abundance. This process is known as an atmospheric chemical feedback.

Using a comprehensive atmosphere-ocean chemistry-climate model, the Cambridge team, working with researchers from the University of East Anglia, the National Centre for Atmospheric Science, the Met Office and the University of Reading, compared ozone at pre-industrial levels with how it evolves in response to a quadrupling of CO2 in the atmosphere, which is a standard climate change experiment.

What they discovered is a reduction in global surface warming of approximately 20% – equating to 1° Celsius – when compared with most models after 75 years. This difference is due to ozone changes in the lower stratosphere in the tropics, which are mainly caused by changes in the atmospheric circulation under climate change.
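Taken together, those two figures imply the scale of warming in the comparison runs; the small sketch below is a rough consistency check derived only from the numbers quoted above, not model output.

```python
# Relating the two figures quoted above; a rough consistency check, not model output.
reduction_fraction = 0.20   # ~20% less surface warming with interactive ozone
reduction_deg_c = 1.0       # ~1 degree Celsius

implied_reference_warming = reduction_deg_c / reduction_fraction
print(f"Implied warming without the ozone feedback: ~{implied_reference_warming:.0f} °C")
# About 5 °C of warming 75 years after the abrupt CO2 quadrupling in the
# comparison models, reduced by roughly 1 °C when ozone changes are included.
```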

“This research has shown that ozone feedback can play a major role in global warming and that it should be included consistently in climate models,” said Nowack. “These models are incredibly complex, just as the Earth is, and there are an almost infinite number of different processes which we could include. Many different processes have to be simplified in order to make them run effectively within the model, but what this research shows is that ozone feedback plays a major role in climate change, and therefore should be included in models in order to make them as accurate as we can make them. However, this particular feedback is especially complex since it depends on many other climate processes that models still simulate differently. Therefore, the best option to represent this feedback consistently might be to calculate ozone changes in every model, in spite of the high computational costs of such a procedure.

“Climate change research is all about having the best data possible. Every climate model currently in use shows that warming is occurring and will continue to occur, but the difference is in how and when they predict warming will happen. Having the best models possible will help make the best climate policy.”

The paper, “A large ozone-circulation feedback and its implications for global warming assessments,” is published in the journal Nature Climate Change (DOI: 10.1038/nclimate2451).



UW team explores large, restless volcanic field in Chile

If Brad Singer knew for sure what was happening three miles under an odd-shaped lake in the Andes, he might be less eager to spend a good part of his career investigating a volcanic field that has erupted 36 times during the last 25,000 years. As he leads a large scientific team exploring a region in the Andes called Laguna del Maule, Singer hopes the area remains quiet.

But the primary reason to expend so much effort on this area boils down to one fact: The rate of uplift is among the highest ever observed by satellite measurement for a volcano that is not actively erupting.

That uplift is almost certainly due to a large intrusion of magma — molten rock — beneath the volcanic complex. For seven years, an area larger than the city of Madison has been rising by 10 inches per year.

That rapid rise provides a major scientific opportunity: to explore a mega-volcano before it erupts. That effort, and the hazard posed by the restless magma reservoir beneath Laguna del Maule, are described in a major research article in the December issue of the Geological Society of America’s GSA Today.

“We’ve always been looking at these mega-eruptions in the rear-view mirror,” says Singer. “We look at the lava, dust and ash, and try to understand what happened before the eruption. Since these huge eruptions are rare, that’s usually our only option. But we look at the steady uplift at Laguna del Maule, which has a history of regular eruptions, combined with changes in gravity, electrical conductivity and swarms of earthquakes, and we suspect that conditions necessary to trigger another eruption are gathering force.”

Laguna del Maule looks nothing like a classic, cone-shaped volcano, since the high-intensity erosion caused by heavy rain and snow has carried most of the evidence to the nearby Pacific Ocean. But the overpowering reason for the absence of “typical volcano cones” is the nature of the molten rock underground. It’s called rhyolite, and it’s the most explosive type of magma on the planet.

The eruption of a rhyolite volcano is too quick and violent to build up a cone. Instead, this viscous, water-rich magma often explodes into vast quantities of ash that can form deposits hundreds of yards deep, followed by a slower flow of glassy magma that can be tens of yards tall and measure more than a mile in length.

The next eruption could be in the size range of Mount St. Helens — or it could be vastly bigger, Singer says. “We know that over the past million years or so, several eruptions at Laguna del Maule or nearby volcanoes have been more than 100 times larger than Mount St. Helens,” he says. “Those are rare, but they are possible.” Such a mega-eruption could change the weather, disrupt the ecosystem and damage the economy.

Trying to anticipate what Laguna del Maule holds in store, Singer is heading a new $3 million, five-year effort sponsored by the National Science Foundation to document its behavior before an eruption. With colleagues from Chile, Argentina, Canada, Singapore, and Cornell and Georgia Tech universities, he is masterminding an effort to build a scientific model of the underground forces that could lead to eruption. “This model should capture how this system has evolved in the crust at all scales, from the microscopic to basinwide, over the last 100,000 years,” Singer says. “It’s like a movie from the past to the present and into the future.”

Over the next five years, Singer says he and 30 colleagues will “throw everything, including the kitchen sink, at the problem — geology, geochemistry, geochronology and geophysics — to help measure, and then model, what’s going on.”

One key source of information on volcanoes is seismic waves. Ground shaking triggered by the movement of magma can signal an impending eruption. Team member Clifford Thurber, a seismologist and professor of geoscience at UW-Madison, wants to use distant earthquakes to locate the underground magma body.

As many as 50 seismometers will eventually be emplaced above and around the magma at Laguna del Maule, in the effort to create a 3-D image of Earth’s crust in the area.

By tracking multiple earthquakes over several years, Thurber and his colleagues want to pinpoint the size and location of the magma body — roughly estimated as an oval measuring five kilometers (3.1 miles) by 10 kilometers (6.2 miles).

Each seismometer will record the travel time of earthquake waves originating within a few thousand kilometers, Thurber explains. Since soft rock transmits sound less efficiently than hard rock, “we expect that waves that pass through the presumed magma body will be delayed,” Thurber says. “It’s very simple. It’s like a CT scan, except instead of density we are looking at seismic wave velocity.”
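As a rough illustration of that idea, the sketch below computes the extra travel time a wave would accumulate crossing a slow, partially molten zone; the path length and wave speeds are assumptions chosen for illustration, not measurements from Laguna del Maule.

```python
# Illustrative travel-time delay through a slow, partially molten zone.
# Path length and wave speeds are assumptions, not measurements from Laguna del Maule.
path_km = 8.0          # ray path length through the presumed magma body
v_solid_km_s = 6.0     # typical upper-crustal P-wave speed (assumed)
v_magma_km_s = 4.5     # slower speed inside a partially molten zone (assumed)

delay_s = path_km / v_magma_km_s - path_km / v_solid_km_s
print(f"Extra travel time through the slow zone: ~{delay_s:.2f} s")
# ~0.44 s here. Mapping such delays across many stations and many earthquakes is
# what allows the tomography to outline the low-velocity, magma-rich region.
```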

As Singer, who has been visiting Laguna del Maule since 1998, notes, “The rate of uplift — among the highest ever observed — has been sustained for seven years, and we have discovered a large, fluid-rich zone in the crust under the lake using electrical resistivity methods. Thus, there are not many possible explanations other than a big, active body of magma at a shallow depth.”

The expanding body of magma could freeze in place — or blow its top, he says. “One thing we know for sure is that the surface cannot continue rising indefinitely.”