New study measures methane emissions from natural gas production and offers insights into 2 large sources

A team of researchers from the Cockrell School of Engineering at The University of Texas at Austin and environmental testing firm URS reports that a small subset of natural gas wells is responsible for the majority of methane emissions from two major sources — liquid unloadings and pneumatic controller equipment — at natural gas production sites.

With natural gas production in the United States expected to continue increasing over the next few decades, a better understanding of the methane emitted during production is needed. The study team believes this research, published Dec. 9 in Environmental Science & Technology, will help provide a clearer picture of methane emissions from natural gas production sites.

The UT Austin-led field study closely examined two major sources of methane emissions — liquid unloadings and pneumatic controller equipment — at well pad sites across the United States. Researchers found that 19 percent of the pneumatic devices accounted for 95 percent of the emissions from pneumatic devices, and 20 percent of the wells with unloading emissions that vent to the atmosphere accounted for 65 percent to 83 percent of those emissions.

“To put this in perspective, over the past several decades, 10 percent of the cars on the road have been responsible for the majority of automotive exhaust pollution,” said David Allen, chemical engineering professor at the Cockrell School and principal investigator for the study. “Similarly, a small group of sources within these two categories are responsible for the vast majority of pneumatic and unloading emissions at natural gas production sites.”

Additionally, for pneumatic devices, the study confirmed regional differences in methane emissions first reported by the study team in 2013. The researchers found that methane emissions from pneumatic devices were highest in the Gulf Coast and lowest in the Rocky Mountains.

The study is the second phase of the team’s 2013 study, which included some of the first measurements for methane emissions taken directly at hydraulically fractured well sites. Both phases of the study involved a partnership between the Environmental Defense Fund, participating energy companies, an independent Scientific Advisory Panel and the UT Austin study team.

The unprecedented access to natural gas production facilities and equipment allowed researchers to acquire direct measurements of methane emissions.

Study and Findings on Pneumatic Devices

Pneumatic devices, which use gas pressure to control the opening and closing of valves, emit gas as they operate. These emissions are estimated to be among the larger sources of methane emissions from the natural gas supply chain. The Environmental Protection Agency reports that 477,606 pneumatic (gas-actuated) devices are in use at natural gas production sites throughout the U.S.

“Our team’s previous work established that pneumatics are a major contributor to emissions,” Allen said. “Our goal here was to measure a more diverse population of wells to characterize the features of high-emitting pneumatic controllers.”

The research team measured emissions from 377 gas-actuated (pneumatic) controllers at natural gas production sites and a small number of oil production sites throughout the U.S.

The researchers sampled all identifiable pneumatic controller devices at each well site, a more comprehensive approach than the random sampling previously conducted. The average methane emissions per pneumatic controller reported in this study are 17 percent higher than the average emissions per pneumatic controller in the 2012 EPA greenhouse gas national emission inventory (released in 2014), but the average from the study is dominated by a small subpopulation of the controllers. Specifically, 19 percent of controllers, with measured emission rates in excess of 6 standard cubic feet per hour (scf/h), accounted for 95 percent of emissions.
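
To make the arithmetic behind that kind of skewed distribution concrete, here is a minimal Python sketch (with made-up measurement values, not data from the study) that computes what share of total emissions comes from the controllers above a chosen rate threshold:

```python
# Illustrative only: synthetic controller emission rates in scf/h, NOT study data.
# Demonstrates how a small number of high emitters can dominate both the total
# and the fleet-average emission rate.

rates_scfh = [0.0, 0.1, 0.2, 0.3, 0.5, 0.8, 1.0, 1.5, 2.0, 40.0]

threshold = 6.0  # scf/h, the cutoff cited in the study
total = sum(rates_scfh)
high_emitters = [r for r in rates_scfh if r > threshold]

device_share = len(high_emitters) / len(rates_scfh)
emission_share = sum(high_emitters) / total
mean_rate = total / len(rates_scfh)

print(f"{device_share:.0%} of devices exceed {threshold} scf/h")
print(f"those devices account for {emission_share:.0%} of total emissions")
print(f"fleet-average rate: {mean_rate:.2f} scf/h")
```

With these invented numbers, one device out of ten contributes roughly 86 percent of the total and pulls the fleet average well above the typical controller, the same qualitative pattern the study reports.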

The high-emitting pneumatic devices are a combination of devices that are not operating as designed, are used in applications that cause them to release gas frequently or are designed to emit continuously at a high rate.

The researchers also observed regional differences in methane emission levels, with the lowest emissions per device measured in the Rocky Mountains and the highest emissions in the Gulf Coast, similar to the earlier 2013 study. At least some of the regional differences in emission rates can be attributed to the difference in controller type (continuous vent vs. intermittent vent) among regions.

Study and Findings on Liquid Unloadings

After observing variable emissions for liquid unloadings for a limited group of well types in the 2013 study, the research team made more extensive measurements and confirmed that a majority of emissions come from a small fraction of wells that vent frequently. Although it is not surprising to see some correlation between frequency of unloadings and higher annual emissions, the study’s findings indicate that wells with a high frequency of unloadings have annual emissions that are 10 or more times as great as wells that unload less frequently.

The team’s field study, which measured emissions from unloadings at 107 natural gas production wells throughout the U.S., represents the most extensive measurement of emissions associated with liquid unloadings in the scientific literature thus far.

A liquid unloading is one method used to clear wells of accumulated liquids to increase production. Because older wells typically produce less gas as they near the end of their life cycle, liquid unloadings happen more often in those wells than in newer wells. The team found a statistical correlation between the age of wells and the frequency of liquid unloadings. The researchers found that the key identifier for high-emitting wells is how many times the well unloads in a given year.

Because liquid unloadings can employ a variety of liquid lifting mechanisms, the study results also reflect differences in liquid unloadings emissions between wells that use two different mechanisms (wells with plunger lifts and wells without plunger lifts). Emissions from unloading events at wells without plunger lifts averaged 21,000 to 35,000 scf (standard cubic feet). For wells with plunger lifts that vent to the atmosphere, emissions averaged 1,000 scf to 10,000 scf of methane per event. Although the emissions per event were higher for wells without plunger lifts, these wells had, on average, fewer events than wells with plunger lifts. Wells without plunger lifts averaged fewer than 10 unloading events per year, and wells with plunger lifts averaged more than 200 events per year. Overall, wells with plunger lifts were estimated to account for 70 percent of emissions from unloadings nationally.
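
To see why the frequently venting plunger-lift wells can dominate national unloading emissions even though each event is smaller, a back-of-the-envelope annual comparison using the averages quoted above (the per-event and per-year figures are the article's; combining them this way is a simplification for illustration) looks like this:

```python
# Rough annual unloading emissions per well, combining the article's averages.
# Treating "fewer than 10" as 10 and "more than 200" as 200 events per year is
# an assumption made only to illustrate the arithmetic.

no_plunger_scf_per_event = (21_000, 35_000)   # larger events...
no_plunger_events_per_year = 10               # ...but few of them

plunger_scf_per_event = (1_000, 10_000)       # smaller events...
plunger_events_per_year = 200                 # ...but many of them

def annual_scf(per_event_range, events_per_year):
    low, high = per_event_range
    return low * events_per_year, high * events_per_year

print("without plunger lift:", annual_scf(no_plunger_scf_per_event,
                                           no_plunger_events_per_year))
print("with plunger lift:   ", annual_scf(plunger_scf_per_event,
                                           plunger_events_per_year))
```

The rough totals work out to about 210,000 to 350,000 scf per year for wells without plunger lifts versus roughly 200,000 to 2,000,000 scf per year for frequently venting plunger-lift wells, consistent with the finding that plunger-lift wells account for the bulk of unloading emissions nationally.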

Additionally, researchers found that the Rocky Mountain region, with its large number of wells with a high frequency of unloadings that vent to the atmosphere, accounts for about half of overall emissions from liquid unloadings.

The study team hopes its measurements of liquid unloadings and pneumatic devices will provide a clearer picture of methane emissions from natural gas well sites and of the relationship between well characteristics and emissions.

The study was a cooperative effort involving experts from the Environmental Defense Fund, Anadarko Petroleum Corporation, BG Group PLC, Chevron, ConocoPhillips, Encana Oil & Gas (USA) Inc., Pioneer Natural Resources Company, SWEPI LP (Shell), Statoil, Southwestern Energy and XTO Energy, a subsidiary of ExxonMobil.

The University of Texas at Austin is committed to transparency and disclosure of all potential conflicts of interest of its researchers. Lead researcher David Allen serves as chair of the Environmental Protection Agency’s Science Advisory Board and in this role is a paid Special Governmental Employee. He is also a journal editor for the American Chemical Society and has served as a consultant for multiple companies, including Eastern Research Group, ExxonMobil and the Research Triangle Institute. He has worked on other research projects funded by a variety of governmental, nonprofit and private sector sources including the National Science Foundation, the Environmental Protection Agency, the Texas Commission on Environmental Quality, the American Petroleum Institute and an air monitoring and surveillance project that was ordered by the U.S. District Court for the Southern District of Texas. Adam Pacsi and Daniel Zavala-Araiza, who were graduate students at The University of Texas at the time this work was done, have accepted positions at Chevron Energy Technology Company and the Environmental Defense Fund, respectively.

Financial support for this work was provided by the Environmental Defense Fund (EDF), Anadarko Petroleum Corporation, BG Group PLC, Chevron, ConocoPhillips, Encana Oil & Gas (USA) Inc., Pioneer Natural Resources Company, SWEPI LP (Shell), Statoil, Southwestern Energy and XTO Energy, a subsidiary of ExxonMobil.

Major funding for the EDF’s 30-month methane research series, including its portion of the University of Texas study, is provided by the following individuals and foundations: Fiona and Stan Druckenmiller, the Heising-Simons Foundation, Bill and Susan Oberndorf, Betsy and Sam Reeves, the Robertson Foundation, TomKat Charitable Trust and the Walton Family Foundation.

NASA study finds 1934 had worst drought of last thousand years

A new study using a reconstruction of North American drought history over the last 1,000 years found that the drought of 1934 was the driest and most widespread of the last millennium.

Using a tree-ring-based drought record from the years 1000 to 2005 and modern records, scientists from NASA and Lamont-Doherty Earth Observatory found the 1934 drought was 30 percent more severe than the runner-up drought (in 1580) and extended across 71.6 percent of western North America. For comparison, the average extent of the 2012 drought was 59.7 percent.

“It was the worst by a large margin, falling pretty far outside the normal range of variability that we see in the record,” said climate scientist Ben Cook at NASA’s Goddard Institute for Space Studies in New York. Cook is lead author of the study, which will be published in the Oct. 17 edition of Geophysical Research Letters.

Two sets of conditions led to the severity and extent of the 1934 drought. First, a high-pressure system in winter sat over the west coast of the United States and turned away wet weather – a pattern similar to that which occurred in the winter of 2013-14. Second, the spring of 1934 saw dust storms, caused by poor land management practices, suppress rainfall.

“In combination then, these two different phenomena managed to bring almost the entire nation into a drought at that time,” said co-author Richard Seager, professor at the Lamont-Doherty Earth Observatory of Columbia University in New York. “The fact that it was the worst of the millennium was probably in part because of the human role.”

According to the recent Fifth Assessment Report of the Intergovernmental Panel on Climate Change, or IPCC, climate change is likely to make droughts in North America worse, and the southwest in particular is expected to become significantly drier as are summers in the central plains. Looking back one thousand years in time is one way to get a handle on the natural variability of droughts so that scientists can tease out anthropogenic effects – such as the dust storms of 1934.

“We want to understand droughts of the past to understand to what extent climate change might make it more or less likely that those events occur in the future,” Cook said.

The abnormal high-pressure system is one lesson from the past that informs scientists’ understanding of the current severe drought in California and the western United States.

“What you saw during this last winter and during 1934, because of this high pressure in the atmosphere, is that all the wintertime storms that would normally come into places like California instead got steered much, much farther north,” Cook said. “It’s these wintertime storms that provide most of the moisture in California. So without getting that rainfall it led to a pretty severe drought.”

This type of high-pressure system is part of normal variation in the atmosphere, and whether or not it will appear in a given year is difficult to predict in computer models of the climate. Models are more attuned to droughts caused by La Niña’s colder sea surface temperatures in the Pacific Ocean, which likely triggered the multi-year Dust Bowl drought throughout the 1930s. In a normal La Niña year, the Pacific Northwest receives more rain than usual and the southwestern states typically dry out.

But a comparison of weather data to models looking at La Niña effects showed that the rain-blocking high-pressure system in the winter of 1933-34 overrode the effects of La Niña for the western states. This dried out areas from northern California to the Rockies that otherwise might have been wetter.

As winter ended, the high-pressure system shifted eastward, interfering with spring and summer rains that typically fall on the central plains. The dry conditions were exacerbated and spread even farther east by dust storms.

“We found that a lot of the drying that occurred in the spring time occurred downwind from where the dust storms originated,” Cook said, “suggesting that it’s actually the dust in the atmosphere that’s driving at least some of the drying in the spring and really allowing this drought event to spread upwards into the central plains.”

Dust clouds reflect sunlight and block solar energy from reaching the surface. That prevents evaporation that would otherwise help form rain clouds, meaning that the presence of the dust clouds themselves leads to less rain, Cook said.

“Previous work and this work offers some evidence that you need this dust feedback to explain the real anomalous nature of the Dust Bowl drought in 1934,” Cook said.

Dust storms like the ones in the 1930s aren’t a problem in North America today. The agricultural practices that gave rise to the Dust Bowl were replaced by those that minimize erosion. Still, agricultural producers need to pay attention to the changing climate and adapt accordingly, not forgetting the lessons of the past, said Seager. “The risk of severe mid-continental droughts is expected to go up over time, not down,” he said.

Icebergs once drifted to Florida, new climate model suggests

This is a map showing the pathway taken by icebergs from Hudson Bay, Canada, to Florida. The blue colors (behind the arrows) are an actual snapshot from the authors’ high resolution model showing how much less salty the water is than normal. The more blue the color the less salty it is than normal. In this case, blue all the way along the coast shows that very fresh, cold waters are flowing along the entire east coast from Hudson Bay to Florida. – UMass Amherst

Using a first-of-its-kind, high-resolution numerical model to describe ocean circulation during the last ice age about 21,000 years ago, oceanographer Alan Condron of the University of Massachusetts Amherst has shown that icebergs and meltwater from the North American ice sheet would have regularly reached South Carolina and even southern Florida. The models are supported by the discovery of iceberg scour marks on the sea floor along the entire continental shelf.

Such a view of past meltwater and iceberg movement implies that the mechanisms of abrupt climate change are more complex than previously thought, Condron says. “Our study is the first to show that when the large ice sheet over North America known as the Laurentide ice sheet began to melt, icebergs calved into the sea around Hudson Bay and would have periodically drifted along the east coast of the United States as far south as Miami and the Bahamas in the Caribbean, a distance of more than 3,100 miles, about 5,000 kilometers.”

His work, conducted with Jenna Hill of Coastal Carolina University, is described in the current advance online issue of Nature Geoscience. “Determining how far south of the subpolar gyre icebergs and meltwater penetrated is vital for understanding the sensitivity of North Atlantic Deep Water formation and climate to past changes in high-latitude freshwater runoff,” the authors say.

Hill analyzed high-resolution images of the sea floor from Cape Hatteras to Florida and identified about 400 scour marks on the seabed that were formed by enormous icebergs plowing through mud on the sea floor. These characteristic grooves and pits were formed as icebergs moved into shallower water and their keels bumped and scraped along the ocean floor.

“The depth of the scours tells us that icebergs drifting to southern Florida were at least 1,000 feet, or 300 meters thick,” says Condron. “This is enormous. Such icebergs are only found off the coast of Greenland today.”

To investigate how icebergs might have drifted as far south as Florida, Condron simulated the release of a series of glacial meltwater floods in his high-resolution ocean circulation model at four different levels for two locations, Hudson Bay and the Gulf of St. Lawrence.

Condron reports, “In order for icebergs to drift to Florida, our glacial ocean circulation model tells us that enormous volumes of meltwater, similar to a catastrophic glacial lake outburst flood, must have been discharging into the ocean from the Laurentide ice sheet, from either Hudson Bay or the Gulf of St. Lawrence.”

Further, during these large meltwater flood events, the surface ocean current off the coast of Florida would have undergone a complete, 180-degree flip in direction, so that the warm, northward flowing Gulf Stream would have been replaced by a cold, southward flowing current, he adds.

As a result, waters off the coast of Florida would have been only a few degrees above freezing. Such events would have led to the sudden appearance of massive icebergs along the east coast of the United States all the way to the Florida Keys, Condron points out. These events would have been abrupt and short-lived, probably less than a year, he notes.

“This new research shows that much of the meltwater from the Greenland ice sheet may be redistributed by narrow coastal currents and circulate through subtropical regions prior to reaching the subpolar ocean. It’s a more complicated picture than we believed before,” Condron says. He and Hill say that future research on mechanisms of abrupt climate change should take into account coastal boundary currents in redistributing ice sheet runoff and subpolar fresh water.

Team advances understanding of the Greenland Ice Sheet’s meltwater channels

An international team of researchers deployed to western Greenland to study the melt rates of the Greenland Ice Sheet. – Matt Hoffman, Los Alamos National Laboratory

An international research team’s field work in Greenland, drilling boreholes and measuring melt rates and ice sheet movement, is showing that things are, in fact, more complicated than previously thought.

“Although the Greenland Ice Sheet initially speeds up each summer in its slow-motion race to the sea, the network of meltwater channels beneath the sheet is not necessarily forming the slushy racetrack that had been previously considered,” said Matthew Hoffman, a Los Alamos National Laboratory scientist on the project.

A high-profile paper appearing in Nature this week notes that observations of moulins (vertical conduits connecting water on top of the glacier down to the bed of the ice sheet) and boreholes in Greenland show that subglacial channels ameliorate the speedup caused by water delivery to the base of the ice sheet in the short term. By midsummer, however, the channels stabilize and are unable to grow any larger. In a previous paper appearing in Science, researchers had posited that the undersheet channels were not even a consideration in Greenland, but as happens in the science world, more data fills in the complex mosaic of facts and clarifies the evolution of the meltwater flow rates over the seasons.

In reality, these two papers are not inconsistent – they are studying different places at different times – and both agree that channelization is less important than previously assumed, said Hoffman.

The Greenland Ice Sheet’s movement speeds up each summer as melt from the surface penetrates kilometer-thick ice through moulins, lubricating the bed of the ice sheet. Greater melt is predicted for Greenland in the future, but its impact on ice sheet flux and associated sea level rise is uncertain: direct observations of the subglacial drainage system are lacking and its evolution over the melt season is poorly understood.

“Everyone wants to know what’s happening under Greenland as it experiences more and more melt,” said study coauthor Ginny Catania, a research scientist at the University of Texas Institute for Geophysics and an associate professor in the University of Texas at Austin’s Jackson School of Geosciences. “This subglacial plumbing may or may not be critical for sea level rise in the next 100 years, but we don’t really know until we fully understand it.”

To resolve these unknowns, the research team drilled and instrumented 13 boreholes through 700-meter-thick ice in west Greenland. There they performed the first combined analysis of Greenland ice velocity and water pressure in moulins and boreholes, and they determined that moulin water pressure does not decrease over the latter half of the melt season, indicating a limited role for high-efficiency channels in subglacial drainage.

Instead they found that boreholes monitor a hydraulically isolated region of the bed, but decreasing water pressure seen in some boreholes can explain the decreasing ice velocity seen over the melt season.

“Like loosening the seal of a bathtub drain, the hydrologic changes that occur each summer may cause isolated pockets of pressurized water to slowly drain out from under the ice sheet, resulting in more friction,” said Hoffman.

Their observations identify a previously unrecognized role of changes in hydraulically isolated regions of the bed in controlling evolution of subglacial drainage over summer. Understanding this process will be crucial for predicting the effect of increasing melt on summer speedup and associated autumn slowdown of the ice sheet into the future.

###

The research letter is published in this week’s Nature magazine as “Direct observations of evolving subglacial drainage beneath the Greenland Ice Sheet.” The project was an international collaboration between the University of Texas at Austin, Los Alamos National Laboratory, NASA Goddard Space Flight Center, Michigan Technological University, University of Zurich, the Swiss Federal Institute of Technology and Dartmouth College.

This project was supported by the United States National Science Foundation, the Swiss National Science Foundation and the National Geographic Society. The work at Los Alamos was supported by NASA Cryospheric Sciences and through climate modeling programs within the U.S. Department of Energy, Office of Science.

Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS for the Department of Energy’s National Nuclear Security Administration.

Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

Gas leaks from faulty wells linked to contamination in some groundwater

A study has pinpointed the likely source of most natural gas contamination in drinking-water wells associated with hydraulic fracturing, and it’s not the source many people may have feared.

What’s more, the problem may be fixable: improved construction standards for cement well linings and casings at hydraulic fracturing sites.

A team led by a researcher at The Ohio State University and composed of researchers at Duke, Stanford, Dartmouth, and the University of Rochester devised a new method of geochemical forensics to trace how methane migrates under the earth. The study identified eight clusters of contaminated drinking-water wells in Pennsylvania and Texas.

Most important among their findings, published this week in the Proceedings of the National Academy of Sciences, is that neither horizontal drilling nor hydraulic fracturing of shale deposits seems to have caused any of the natural gas contamination.

“There is no question that in many instances elevated levels of natural gas are naturally occurring, but in a subset of cases, there is also clear evidence that there were human causes for the contamination,” said study leader Thomas Darrah, assistant professor of earth sciences at Ohio State. “However our data suggests that where contamination occurs, it was caused by poor casing and cementing in the wells,” Darrah said.

In hydraulic fracturing, water is pumped underground to break up shale at a depth far below the water table, he explained. The long vertical pipes that carry the resulting gas upward are encircled in cement to keep the natural gas from leaking out along the well. The study suggests that natural gas that has leaked into aquifers is the result of failures in the cement used in the well.

“Many of the leaks probably occur when natural gas travels up the outside of the borehole, potentially even thousands of feet, and is released directly into drinking-water aquifers,” said Robert Poreda, professor of geochemistry at the University of Rochester.

“These results appear to rule out the migration of methane up into drinking water aquifers from depth because of horizontal drilling or hydraulic fracturing, as some people feared,” said Avner Vengosh, professor of geochemistry and water quality at Duke.

“This is relatively good news because it means that most of the issues we have identified can potentially be avoided by future improvements in well integrity,” Darrah said.

“In some cases homeowner’s water has been harmed by drilling,” said Robert B. Jackson, professor of environmental and earth sciences at Stanford and Duke. “In Texas, we even saw two homes go from clean to contaminated after our sampling began.”

The method that the researchers used to track the source of methane contamination relies on the basic physics of the noble gases (which happen to leak out along with the methane). Noble gases such as helium and neon are so called because they don’t react much with other chemicals, although they mix with natural gas and can be transported with it.

That means that when they are released underground, they can flow long distances without getting waylaid by microbial activity or chemical reactions along the way. The only important variable is the atomic mass, which determines how the ratios of noble gases change as they tag along with migrating natural gas. These properties allow the researchers to determine the source of fugitive methane and the mechanism by which it was transported into drinking water aquifers.
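
As a generic illustration of that mass dependence (this is textbook Graham's-law diffusion, offered only as an analogy and not the specific transport model used in the study), lighter noble gases migrate faster by diffusion than heavier ones, so gas that seeped slowly through rock carries a different helium-to-neon signature than gas swept rapidly up a leaky well along with the methane:

```python
# Generic illustration (not the study's model): by Graham's law, diffusive
# transport rates scale with 1/sqrt(atomic mass), so slow diffusive migration
# enriches light noble gases relative to heavy ones, while fast bulk flow with
# methane moves them all together and preserves the original ratios.
from math import sqrt

masses = {"He-4": 4.0, "Ne-20": 20.0, "Ar-40": 40.0}

# Diffusion rate of each gas relative to helium-4.
relative_rate = {gas: sqrt(masses["He-4"] / m) for gas, m in masses.items()}

for gas, rate in relative_rate.items():
    print(f"{gas}: {rate:.2f}x the diffusion rate of He-4")
# He-4: 1.00x, Ne-20: 0.45x, Ar-40: 0.32x
```

Differences like these in how the ratios shift are what allow gas that migrated slowly through the subsurface to be distinguished from gas that traveled quickly along a compromised well.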

The researchers were able to distinguish between the signatures of naturally occurring methane and stray gas contamination from shale gas drill sites overlying the Marcellus shale in Pennsylvania and the Barnett shale in Texas.

The researchers sampled water from the sites in 2012 and 2013. Sampling sites included wells where contamination had been debated previously; wells known to have naturally high levels of methane and salts, which tend to co-occur in areas overlying shale gas deposits; and wells located both within and beyond a one-kilometer distance from drill sites.

As hydraulic fracturing starts to develop around the globe, including in countries such as South Africa, Argentina, China, Poland, Scotland, and Ireland, Darrah and his colleagues are continuing their work in the United States and internationally. And because the method the researchers employed relies on the basic physics of the noble gases, it can be used anywhere. Their hope is that their findings will help highlight the need to improve well integrity.
