New study measures methane emissions from natural gas production and offers insights into two large sources

A team of researchers from the Cockrell School of Engineering at The University of Texas at Austin and environmental testing firm URS reports that a small subset of natural gas wells is responsible for the majority of methane emissions from two major sources — liquid unloadings and pneumatic controller equipment — at natural gas production sites.

With natural gas production in the United States expected to continue to increase during the next few decades, there is a need for a better understanding of methane emissions during natural gas production. The study team believes this research, published Dec. 9 in Environmental Science & Technology, will help to provide a clearer picture of methane emissions from natural gas production sites.

The UT Austin-led field study closely examined two major sources of methane emissions — liquid unloadings and pneumatic controller equipment — at well pad sites across the United States. Researchers found that 19 percent of the pneumatic devices accounted for 95 percent of the emissions from pneumatic devices, and 20 percent of the wells with unloading emissions that vent to the atmosphere accounted for 65 percent to 83 percent of those emissions.

“To put this in perspective, over the past several decades, 10 percent of the cars on the road have been responsible for the majority of automotive exhaust pollution,” said David Allen, chemical engineering professor at the Cockrell School and principal investigator for the study. “Similarly, a small group of sources within these two categories are responsible for the vast majority of pneumatic and unloading emissions at natural gas production sites.”

Additionally, for pneumatic devices, the study confirmed regional differences in methane emissions first reported by the study team in 2013. The researchers found that methane emissions from pneumatic devices were highest in the Gulf Coast and lowest in the Rocky Mountains.

The study is the second phase of the team’s 2013 study, which included some of the first measurements for methane emissions taken directly at hydraulically fractured well sites. Both phases of the study involved a partnership between the Environmental Defense Fund, participating energy companies, an independent Scientific Advisory Panel and the UT Austin study team.

The unprecedented access to natural gas production facilities and equipment allowed researchers to acquire direct measurements of methane emissions.

Study and Findings on Pneumatic Devices

Pneumatic devices, which use gas pressure to control the opening and closing of valves, emit gas as they operate. These emissions are estimated to be among the larger sources of methane emissions from the natural gas supply chain. The Environmental Protection Agency reports that 477,606 pneumatic (gas-actuated) devices are in use at natural gas production sites throughout the U.S.

“Our team’s previous work established that pneumatics are a major contributor to emissions,” Allen said. “Our goal here was to measure a more diverse population of wells to characterize the features of high-emitting pneumatic controllers.”

The research team measured emissions from 377 gas-actuated (pneumatic) controllers at natural gas production sites and a small number of oil production sites throughout the U.S.

The researchers sampled all identifiable pneumatic controller devices at each well site, a more comprehensive approach than the random sampling previously conducted. The average methane emissions per pneumatic controller reported in this study are 17 percent higher than the average emissions per pneumatic controller in the 2012 EPA greenhouse gas national emission inventory (released in 2014), but the average from the study is dominated by a small subpopulation of the controllers. Specifically, 19 percent of controllers, with measured emission rates in excess of 6 standard cubic feet per hour (scf/h), accounted for 95 percent of emissions.
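
The arithmetic behind that skew is easy to reproduce. The sketch below is illustrative only, using made-up per-controller emission rates rather than the study's measurements, and shows how the share of total emissions attributable to controllers above the 6 scf/h cutoff can be computed from a list of per-device rates:

```python
# Illustration only: made-up per-controller emission rates (scf/h),
# not the study's measurements.
rates = [0.1, 0.2, 0.3, 0.4, 0.5, 0.8, 1.2, 6.5, 25.0, 40.0]

THRESHOLD_SCFH = 6.0  # high-emitter cutoff reported in the study

high = [r for r in rates if r > THRESHOLD_SCFH]
share_of_devices = len(high) / len(rates)
share_of_emissions = sum(high) / sum(rates)

print(f"{share_of_devices:.0%} of devices produce {share_of_emissions:.0%} of emissions")
# With this toy data: 30% of devices produce 95% of emissions; the study
# found 19% of controllers accounting for 95% of measured emissions.
```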

The high-emitting pneumatic devices are a combination of devices that are not operating as designed, are used in applications that cause them to release gas frequently, or are designed to emit continuously at a high rate.

The researchers also observed regional differences in methane emission levels, with the lowest emissions per device measured in the Rocky Mountains and the highest emissions in the Gulf Coast, similar to the earlier 2013 study. At least some of the regional differences in emission rates can be attributed to the difference in controller type (continuous vent vs. intermittent vent) among regions.

Study and Findings on Liquid Unloadings

After observing variable emissions for liquid unloadings for a limited group of well types in the 2013 study, the research team made more extensive measurements and confirmed that a majority of emissions come from a small fraction of wells that vent frequently. Although it is not surprising to see some correlation between frequency of unloadings and higher annual emissions, the study’s findings indicate that wells with a high frequency of unloadings have annual emissions that are 10 or more times as great as wells that unload less frequently.

The team’s field study, which measured liquid unloading emissions at 107 natural gas production wells throughout the U.S., represents the most extensive measurement of emissions associated with liquid unloadings in the scientific literature thus far.

A liquid unloading is one method used to clear wells of accumulated liquids to increase production. Because older wells typically produce less gas as they near the end of their life cycle, liquid unloadings happen more often in those wells than in newer wells. The team found a statistical correlation between the age of wells and the frequency of liquid unloadings. The researchers found that the key identifier for high-emitting wells is how many times the well unloads in a given year.

Because liquid unloadings can employ a variety of liquid lifting mechanisms, the study results also reflect differences in liquid unloadings emissions between wells that use two different mechanisms (wells with plunger lifts and wells without plunger lifts). Emissions for unloading events for wells without plunger lifts averaged 21,000 scf (standard cubic feet) to 35,000 scf. For wells with plunger lifts that vent to the atmosphere, emissions averaged 1,000 scf to 10,000 scf of methane per event. Although the emissions per event were higher for wells without plunger lifts, these wells had, on average, fewer events than wells with plunger lifts. Wells without plunger lifts averaged fewer than 10 unloading events per year, and wells with plunger lifts averaged more than 200 events per year. Overall, wells with plunger lifts were estimated to account for 70 percent of emissions from unloadings nationally.
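
A back-of-envelope calculation using the averages quoted above shows why plunger-lift wells can dominate national unloading emissions even though each of their events is smaller. The single values picked inside each reported range below are illustrative assumptions:

```python
# The article's reported averages; the specific values chosen inside
# each range are illustrative assumptions.
scf_per_event_no_plunger = 28_000   # midpoint of the 21,000-35,000 scf range
scf_per_event_plunger = 5_500       # midpoint of the 1,000-10,000 scf range

events_per_year_no_plunger = 10     # "fewer than 10" events per year
events_per_year_plunger = 200       # "more than 200" events per year

annual_no_plunger = scf_per_event_no_plunger * events_per_year_no_plunger
annual_plunger = scf_per_event_plunger * events_per_year_plunger

print(f"without plunger lift: ~{annual_no_plunger:,} scf/year")  # ~280,000
print(f"with plunger lift:    ~{annual_plunger:,} scf/year")     # ~1,100,000
```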

Additionally, researchers found that the Rocky Mountain region, with its large number of wells with a high frequency of unloadings that vent to the atmosphere, accounts for about half of overall emissions from liquid unloadings.

The study team hopes its measurements of liquid unloadings and pneumatic devices will provide a clearer picture of methane emissions from natural gas well sites and of the relationship between well characteristics and emissions.

The study was a cooperative effort involving experts from the Environmental Defense Fund, Anadarko Petroleum Corporation, BG Group PLC, Chevron, ConocoPhillips, Encana Oil & Gas (USA) Inc., Pioneer Natural Resources Company, SWEPI LP (Shell), Statoil, Southwestern Energy and XTO Energy, a subsidiary of ExxonMobil.

The University of Texas at Austin is committed to transparency and disclosure of all potential conflicts of interest of its researchers. Lead researcher David Allen serves as chair of the Environmental Protection Agency’s Science Advisory Board and in this role is a paid Special Governmental Employee. He is also a journal editor for the American Chemical Society and has served as a consultant for multiple companies, including Eastern Research Group, ExxonMobil and the Research Triangle Institute. He has worked on other research projects funded by a variety of governmental, nonprofit and private sector sources including the National Science Foundation, the Environmental Protection Agency, the Texas Commission on Environmental Quality, the American Petroleum Institute and an air monitoring and surveillance project that was ordered by the U.S. District Court for the Southern District of Texas. Adam Pacsi and Daniel Zavala-Araiza, who were graduate students at The University of Texas at the time this work was done, have accepted positions at Chevron Energy Technology Company and the Environmental Defense Fund, respectively.

Financial support for this work was provided by the Environmental Defense Fund (EDF), Anadarko Petroleum Corporation, BG Group PLC, Chevron, ConocoPhillips, Encana Oil & Gas (USA) Inc., Pioneer Natural Resources Company, SWEPI LP (Shell), Statoil, Southwestern Energy and XTO Energy, a subsidiary of ExxonMobil.

Major funding for the EDF’s 30-month methane research series, including their portion of the University of Texas study, is provided by the following individuals and foundations: Fiona and Stan Druckenmiller, the Heising-Simons Foundation, Bill and Susan Oberndorf, Betsy and Sam Reeves, the Robertson Foundation, TomKat Charitable Trust and the Walton Family Foundation.

NASA study finds 1934 had worst drought of last thousand years

A new study using a reconstruction of North American drought history over the last 1,000 years found that the drought of 1934 was the driest and most widespread of the last millennium.

Using a tree-ring-based drought record from the years 1000 to 2005 and modern records, scientists from NASA and Lamont-Doherty Earth Observatory found the 1934 drought was 30 percent more severe than the runner-up drought (in 1580) and extended across 71.6 percent of western North America. For comparison, the average extent of the 2012 drought was 59.7 percent.
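
The extent figures are essentially the fraction of a spatial grid whose drought-index values fall below a threshold. Here is a minimal sketch of that metric, using a hypothetical PDSI-style grid rather than the study's tree-ring reconstruction; the grid values and threshold are assumptions:

```python
import numpy as np

# Hypothetical drought-index grid standing in for the tree-ring
# reconstruction (PDSI-like units; more negative = drier).
rng = np.random.default_rng(0)
index_grid = rng.normal(loc=-1.5, scale=2.0, size=(60, 100))

DROUGHT_THRESHOLD = -1.0  # a common PDSI cutoff for at least mild drought

extent = np.mean(index_grid < DROUGHT_THRESHOLD)
print(f"fraction of the domain in drought: {extent:.1%}")
# The study reports an extent of 71.6% for 1934 versus 59.7% for 2012.
```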

“It was the worst by a large margin, falling pretty far outside the normal range of variability that we see in the record,” said climate scientist Ben Cook at NASA’s Goddard Institute for Space Studies in New York. Cook is lead author of the study, which will be published in the Oct. 17 edition of Geophysical Research Letters.

Two sets of conditions led to the severity and extent of the 1934 drought. First, a winter high-pressure system sat over the west coast of the United States and turned away wet weather, a pattern similar to the one that occurred in the winter of 2013-14. Second, in the spring of 1934, dust storms caused by poor land-management practices suppressed rainfall.

“In combination then, these two different phenomena managed to bring almost the entire nation into a drought at that time,” said co-author Richard Seager, professor at the Lamont-Doherty Earth Observatory of Columbia University in New York. “The fact that it was the worst of the millennium was probably in part because of the human role.”

According to the recent Fifth Assessment Report of the Intergovernmental Panel on Climate Change, or IPCC, climate change is likely to make droughts in North America worse, and the southwest in particular is expected to become significantly drier as are summers in the central plains. Looking back one thousand years in time is one way to get a handle on the natural variability of droughts so that scientists can tease out anthropogenic effects – such as the dust storms of 1934.

“We want to understand droughts of the past to understand to what extent climate change might make it more or less likely that those events occur in the future,” Cook said.

The abnormal high-pressure system is one lesson from the past that informs scientists’ understanding of the current severe drought in California and the western United States.

“What you saw during this last winter and during 1934, because of this high pressure in the atmosphere, is that all the wintertime storms that would normally come into places like California instead got steered much, much farther north,” Cook said. “It’s these wintertime storms that provide most of the moisture in California. So without getting that rainfall it led to a pretty severe drought.”

This type of high-pressure system is part of normal variation in the atmosphere, and whether or not it will appear in a given year is difficult to predict in computer models of the climate. Models are more attuned to droughts caused by La Niña’s colder sea surface temperatures in the Pacific Ocean, which likely triggered the multi-year Dust Bowl drought throughout the 1930s. In a normal La Niña year, the Pacific Northwest receives more rain than usual and the southwestern states typically dry out.

But a comparison of weather data to models looking at La Niña effects showed that the rain-blocking high-pressure system in the winter of 1933-34 overrode the effects of La Niña for the western states. This dried out areas from northern California to the Rockies that otherwise might have been wetter.

As winter ended, the high-pressure system shifted eastward, interfering with spring and summer rains that typically fall on the central plains. The dry conditions were exacerbated and spread even farther east by dust storms.

“We found that a lot of the drying that occurred in the spring time occurred downwind from where the dust storms originated,” Cook said, “suggesting that it’s actually the dust in the atmosphere that’s driving at least some of the drying in the spring and really allowing this drought event to spread upwards into the central plains.”

Dust clouds reflect sunlight and block solar energy from reaching the surface. That prevents evaporation that would otherwise help form rain clouds, meaning that the presence of the dust clouds themselves leads to less rain, Cook said.

“Previous work and this work offers some evidence that you need this dust feedback to explain the real anomalous nature of the Dust Bowl drought in 1934,” Cook said.

Dust storms like the ones in the 1930s aren’t a problem in North America today. The agricultural practices that gave rise to the Dust Bowl were replaced by those that minimize erosion. Still, agricultural producers need to pay attention to the changing climate and adapt accordingly, not forgetting the lessons of the past, said Seager. “The risk of severe mid-continental droughts is expected to go up over time, not down,” he said.

Icebergs once drifted to Florida, new climate model suggests

This is a map showing the pathway taken by icebergs from Hudson Bay, Canada, to Florida. The blue colors (behind the arrows) are an actual snapshot from the authors’ high resolution model showing how much less salty the water is than normal. The more blue the color the less salty it is than normal. In this case, blue all the way along the coast shows that very fresh, cold waters are flowing along the entire east coast from Hudson Bay to Florida. – UMass Amherst

Using a first-of-its-kind, high-resolution numerical model to describe ocean circulation during the last ice age about 21,000 years ago, oceanographer Alan Condron of the University of Massachusetts Amherst has shown that icebergs and meltwater from the North American ice sheet would have regularly reached South Carolina and even southern Florida. The models are supported by the discovery of iceberg scour marks on the sea floor along the entire continental shelf.

Such a view of past meltwater and iceberg movement implies that the mechanisms of abrupt climate change are more complex than previously thought, Condron says. “Our study is the first to show that when the large ice sheet over North America known as the Laurentide ice sheet began to melt, icebergs calved into the sea around Hudson Bay and would have periodically drifted along the east coast of the United States as far south as Miami and the Bahamas in the Caribbean, a distance of more than 3,100 miles, about 5,000 kilometers.”

His work, conducted with Jenna Hill of Coastal Carolina University, is described in the current advance online issue of Nature Geoscience. “Determining how far south of the subpolar gyre icebergs and meltwater penetrated is vital for understanding the sensitivity of North Atlantic Deep Water formation and climate to past changes in high-latitude freshwater runoff,” the authors say.

Hill analyzed high-resolution images of the sea floor from Cape Hatteras to Florida and identified about 400 scour marks on the seabed that were formed by enormous icebergs plowing through mud on the sea floor. These characteristic grooves and pits were formed as icebergs moved into shallower water and their keels bumped and scraped along the ocean floor.

“The depth of the scours tells us that icebergs drifting to southern Florida were at least 1,000 feet, or 300 meters thick,” says Condron. “This is enormous. Such icebergs are only found off the coast of Greenland today.”

To investigate how icebergs might have drifted as far south as Florida, Condron simulated the release of a series of glacial meltwater floods in his high-resolution ocean circulation model at four different levels for two locations, Hudson Bay and the Gulf of St. Lawrence.

Condron reports, “In order for icebergs to drift to Florida, our glacial ocean circulation model tells us that enormous volumes of meltwater, similar to a catastrophic glacial lake outburst flood, must have been discharging into the ocean from the Laurentide ice sheet, from either Hudson Bay or the Gulf of St. Lawrence.”

Further, during these large meltwater flood events, the surface ocean current off the coast of Florida would have undergone a complete, 180-degree flip in direction, so that the warm, northward flowing Gulf Stream would have been replaced by a cold, southward flowing current, he adds.

As a result, waters off the coast of Florida would have been only a few degrees above freezing. Such events would have led to the sudden appearance of massive icebergs along the east coast of the United States all the way to the Florida Keys, Condron points out. These events would have been abrupt and short-lived, probably less than a year, he notes.

“This new research shows that much of the meltwater from the Greenland ice sheet may be redistributed by narrow coastal currents and circulate through subtropical regions prior to reaching the subpolar ocean. It’s a more complicated picture than we believed before,” Condron says. He and Hill say that future research on mechanisms of abrupt climate change should take into account coastal boundary currents in redistributing ice sheet runoff and subpolar fresh water.

Team advances understanding of the Greenland Ice Sheet’s meltwater channels

An international team of researchers deployed to western Greenland to study the melt rates of the Greenland Ice Sheet. – Matt Hoffman, Los Alamos National Laboratory

An international research team’s field work, drilling and measuring melt rates and ice sheet movement in Greenland is showing that things are, in fact, more complicated than we thought.

“Although the Greenland Ice Sheet initially speeds up each summer in its slow-motion race to the sea, the network of meltwater channels beneath the sheet is not necessarily forming the slushy racetrack that had been previously considered,” said Matthew Hoffman, a Los Alamos National Laboratory scientist on the project.

A high-profile paper appearing in Nature this week notes that observations of moulins (vertical conduits connecting water on top of the glacier down to the bed of the ice sheet) and boreholes in Greenland show that subglacial channels ameliorate the speedup caused by water delivery to the base of the ice sheet in the short term. By midsummer, however, the channels stabilize and are unable to grow any larger. In a previous paper appearing in Science, researchers had posited that subglacial channels played little role in Greenland, but as happens in the science world, more data fills in the complex mosaic of facts and clarifies how meltwater flow rates evolve over the seasons.

In reality, these two papers are not inconsistent – they are studying different places at different times – and both agree that channelization is less important than previously assumed, said Hoffman.

The Greenland Ice Sheet’s movement speeds up each summer as melt from the surface penetrates kilometer-thick ice through moulins, lubricating the bed of the ice sheet. Greater melt is predicted for Greenland in the future, but its impact on ice sheet flux and associated sea level rise is uncertain: direct observations of the subglacial drainage system are lacking and its evolution over the melt season is poorly understood.

“Everyone wants to know what’s happening under Greenland as it experiences more and more melt,” said study coauthor Ginny Catania, a research scientist at the University of Texas Institute for Geophysics and an associate professor in the University of Texas at Austin’s Jackson School of Geosciences. “This subglacial plumbing may or may not be critical for sea level rise in the next 100 years, but we don’t really know until we fully understand it.”

To resolve these unknowns, the research team drilled and instrumented 13 boreholes through 700-meter-thick ice in west Greenland. There they performed the first combined analysis of Greenland ice velocity and water pressure in moulins and boreholes, and they determined that moulin water pressure does not lower over the latter half of the melt season, indicating a limited role of high-efficiency channels in subglacial drainage.

Instead, they found that the boreholes monitor a hydraulically isolated region of the bed, and that decreasing water pressure seen in some boreholes can explain the decreasing ice velocity seen over the melt season.

“Like loosening the seal of a bathtub drain, the hydrologic changes that occur each summer may cause isolated pockets of pressurized water to slowly drain out from under the ice sheet, resulting in more friction,” said Hoffman.

Their observations identify a previously unrecognized role of changes in hydraulically isolated regions of the bed in controlling evolution of subglacial drainage over summer. Understanding this process will be crucial for predicting the effect of increasing melt on summer speedup and associated autumn slowdown of the ice sheet into the future.

###

The research letter is published in this week’s issue of Nature as “Direct observations of evolving subglacial drainage beneath the Greenland Ice Sheet.” The project was an international collaboration between the University of Texas at Austin, Los Alamos National Laboratory, NASA Goddard Space Flight Center, Michigan Technological University, University of Zurich, the Swiss Federal Institute of Technology and Dartmouth College.

This project was supported by the United States National Science Foundation, the Swiss National Science Foundation and the National Geographic Society. The work at Los Alamos was supported by NASA Cryospheric Sciences and through climate modeling programs within the US Department of Energy, Office of Science.

Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS for the Department of Energy’s National Nuclear Security Administration.

Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

Gas leaks from faulty wells linked to contamination in some groundwater

A study has pinpointed the likely source of most natural gas contamination in drinking-water wells associated with hydraulic fracturing, and it’s not the source many people may have feared.

What’s more, the problem may be fixable: improved construction standards for cement well linings and casings at hydraulic fracturing sites.

A team led by a researcher at The Ohio State University and composed of researchers at Duke, Stanford, Dartmouth, and the University of Rochester devised a new method of geochemical forensics to trace how methane migrates under the earth. The study identified eight clusters of contaminated drinking-water wells in Pennsylvania and Texas.

Most important among their findings, published this week in the Proceedings of the National Academy of Sciences, is that neither horizontal drilling nor hydraulic fracturing of shale deposits seems to have caused any of the natural gas contamination.

“There is no question that in many instances elevated levels of natural gas are naturally occurring, but in a subset of cases, there is also clear evidence that there were human causes for the contamination,” said study leader Thomas Darrah, assistant professor of earth sciences at Ohio State. “However our data suggests that where contamination occurs, it was caused by poor casing and cementing in the wells,” Darrah said.

In hydraulic fracturing, water is pumped underground to break up shale at a depth far below the water table, he explained. The long vertical pipes that carry the resulting gas upward are encircled in cement to keep the natural gas from leaking out along the well. The study suggests that natural gas that has leaked into aquifers is the result of failures in the cement used in the well.

“Many of the leaks probably occur when natural gas travels up the outside of the borehole, potentially even thousands of feet, and is released directly into drinking-water aquifers,” said Robert Poreda, professor of geochemistry at the University of Rochester.

“These results appear to rule out the migration of methane up into drinking water aquifers from depth because of horizontal drilling or hydraulic fracturing, as some people feared,” said Avner Vengosh, professor of geochemistry and water quality at Duke.

“This is relatively good news because it means that most of the issues we have identified can potentially be avoided by future improvements in well integrity,” Darrah said.

“In some cases homeowners’ water has been harmed by drilling,” said Robert B. Jackson, professor of environmental and earth sciences at Stanford and Duke. “In Texas, we even saw two homes go from clean to contaminated after our sampling began.”

The method that the researchers used to track the source of methane contamination relies on the basic physics of the noble gases (which happen to leak out along with the methane). Noble gases such as helium and neon are so called because they don’t react much with other chemicals, although they mix with natural gas and can be transported with it.

That means that when they are released underground, they can flow long distances without getting waylaid by microbial activity or chemical reactions along the way. The only important variable is the atomic mass, which determines how the ratios of noble gases change as they tag along with migrating natural gas. These properties allow the researchers to determine the source of fugitive methane and the mechanism by which it was transported into drinking water aquifers.
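
While the paper's actual forensics are more involved, the underlying source-apportionment idea can be illustrated with a generic two-endmember mixing calculation of the kind used throughout tracer geochemistry. Everything in this sketch, including the endmember values and the function name, is hypothetical rather than taken from the study:

```python
def source_fraction(c_sample: float, c_a: float, c_b: float) -> float:
    """Fraction of a two-source mixture contributed by endmember A,
    assuming the tracer is conservative (transported without reacting)."""
    return (c_sample - c_b) / (c_a - c_b)

# Hypothetical tracer concentrations, for illustration only:
# c_a = deep thermogenic gas endmember, c_b = shallow background endmember.
f = source_fraction(c_sample=0.4, c_a=1.0, c_b=0.1)
print(f"fraction from the deep source: {f:.0%}")  # ~33%
```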

The researchers were able to distinguish between the signatures of naturally occurring methane and stray gas contamination from shale gas drill sites overlying the Marcellus shale in Pennsylvania and the Barnett shale in Texas.

The researchers sampled water from the sites in 2012 and 2013. Sampling sites included wells where contamination had been debated previously; wells known to have naturally high levels of methane and salts, which tend to co-occur in areas overlying shale gas deposits; and wells located both within and beyond a one-kilometer distance from drill sites.

As hydraulic fracturing starts to develop around the globe, including in South Africa, Argentina, China, Poland, Scotland, and Ireland, Darrah and his colleagues are continuing their work in the United States and internationally. And, since the method that the researchers employed relies on the basic physics of the noble gases, it can be applied anywhere. Their hope is that their findings can help highlight the necessity to improve well integrity.

Yellowstone supereruption would send ash across North America

An example of the possible distribution of ash from a month-long Yellowstone supereruption. The distribution map was generated by a new model developed by the US Geological Survey using wind information from January 2001. The improved computer model, detailed in a new study published in Geochemistry, Geophysics, Geosystems, finds that the hypothetical, large eruption would create a distinctive kind of ash cloud known as an umbrella, which expands evenly in all directions, sending ash across North America. Ash distribution will vary depending on cloud height, eruption duration, diameter of volcanic particles in the cloud, and wind conditions, according to the new study. – Credit: USGS

In the unlikely event of a volcanic supereruption at Yellowstone National Park, the northern Rocky Mountains would be blanketed in meters of ash, and millimeters would be deposited as far away as New York City, Los Angeles and Miami, according to a new study.

An improved computer model developed by the study’s authors finds that the hypothetical, large eruption would create a distinctive kind of ash cloud known as an umbrella, which expands evenly in all directions, sending ash across North America.

A supereruption is the largest class of volcanic eruption, during which more than 1,000 cubic kilometers (240 cubic miles) of material is ejected. If such a supereruption were to occur, which is extremely unlikely, it could shut down electronic communications and air travel throughout the continent, and alter the climate, the study notes.

A giant underground reservoir of hot and partly molten rock feeds the volcano at Yellowstone National Park. It has produced three huge eruptions about 2.1 million, 1.3 million and 640,000 years ago. Geological activity at Yellowstone shows no signs that volcanic eruptions, large or small, will occur in the near future. The most recent volcanic activity at Yellowstone, a relatively non-explosive lava flow at the Pitchstone Plateau in the southern section of the park, occurred 70,000 years ago.

Researchers at the U.S. Geological Survey used a hypothetical Yellowstone supereruption as a case study to run their new model that calculates ash distribution for eruptions of all sizes. The model, Ash3D, incorporates data on historical wind patterns to calculate the thickness of ash fall for a supereruption like the one that occurred at Yellowstone 640,000 years ago.

The new study provides the first quantitative estimates of the thickness and distribution of ash in cities around the U.S. if the Yellowstone volcanic system were to experience this type of huge, yet unlikely, eruption.

Cities close to the modeled Yellowstone supereruption could be covered by more than a meter (a few feet) of ash. There would be centimeters (a few inches) of ash in the Midwest, while cities on both coasts would see millimeters (a fraction of an inch) of accumulation, according to the new study that was published online today in Geochemistry, Geophysics, Geosystems, a journal of the American Geophysical Union. The paper has been made available at no charge at http://onlinelibrary.wiley.com/doi/10.1002/2014GC005469/abstract.

The model results help scientists understand the extremely widespread distribution of ash deposits from previous large eruptions at Yellowstone. Other USGS scientists are using the Ash3D model to forecast possible ash hazards at currently restless volcanoes in Alaska.

Unlike smaller eruptions, whose ash deposition looks roughly like a fan when viewed from above, the spreading umbrella cloud from a supereruption deposits ash in a pattern more like a bull’s eye – heavy in the center and diminishing in all directions – and is less affected by prevailing winds, according to the new model.

“In essence, the eruption makes its own winds that can overcome the prevailing westerlies, which normally dominate weather patterns in the United States,” said Larry Mastin, a geologist at the USGS Cascades Volcano Observatory in Vancouver, Washington, and the lead author of the new paper. Westerly winds blow from the west.

“This helps explain the distribution from large Yellowstone eruptions of the past, where considerable amounts of ash reached the west coast,” he added.

The three large past eruptions at Yellowstone sent ash over many tens of thousands of square kilometers (thousands of square miles). Ash deposits from these eruptions have been found throughout the central and western United States and Canada.

Erosion has made it difficult for scientists to accurately estimate ash distribution from these deposits. Previous computer models also lacked the ability to accurately determine how the ash would be transported.

Using their new model, the study’s authors found that during very large volcanic eruptions, the expansion rate of the ash cloud’s leading edge can exceed the average ambient wind speed for hours or days, depending on the length of the eruption. This outward expansion is capable of driving ash more than 1,500 kilometers (932 miles) upwind (westward) and crosswind (north to south), producing a bull’s-eye-like pattern centered on the eruption site.
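
A toy model makes the contrast concrete. The sketch below is not Ash3D; it only shows how an exponentially decaying radial deposit looks symmetric when the umbrella's expansion dominates, and skewed downwind when wind advection dominates. All parameter values here are assumptions for illustration:

```python
import numpy as np

def toy_thickness(x_km, y_km, wind_offset_km=0.0, scale_km=400.0, t0_m=1.0):
    """Deposit thickness (m) at (x, y) km from the vent: exponential
    decay with distance from a deposition center shifted downwind."""
    r = np.hypot(x_km - wind_offset_km, y_km)
    return t0_m * np.exp(-r / scale_km)

# Umbrella-dominated case: nearly the same thickness upwind and downwind.
print(toy_thickness(-500, 0), toy_thickness(500, 0))
# Wind-dominated case: the deposit is strongly skewed downwind.
print(toy_thickness(-500, 0, wind_offset_km=300),
      toy_thickness(500, 0, wind_offset_km=300))
```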

In the simulated modern-day eruption scenario, cities within 500 kilometers (311 miles) of Yellowstone, like Billings, Montana, and Casper, Wyoming, would be covered by centimeters (inches) to more than a meter (more than three feet) of ash. Upper Midwestern cities, like Minneapolis, Minnesota, and Des Moines, Iowa, would receive centimeters (inches), and those on the East and Gulf coasts, like New York and Washington, D.C., would receive millimeters or less (fractions of an inch). California cities would receive millimeters to centimeters (less than an inch to less than two inches) of ash, while Pacific Northwest cities like Portland, Oregon, and Seattle, Washington, would receive up to a few centimeters (more than an inch).

Even small accumulations only millimeters or centimeters (less than an inch to an inch) thick could cause major effects around the country, including reduced traction on roads, shorted-out electrical transformers and respiratory problems, according to previous research cited in the new study. Prior research has also found that multiple inches of ash can damage buildings, block sewer and water lines, and disrupt livestock and crop production, the study notes.

The study also found that other eruptions, powerful but much smaller than a Yellowstone supereruption, might also generate an umbrella cloud.

“These model developments have greatly enhanced our ability to anticipate possible effects from both large and small eruptions, wherever they occur,” said Jacob Lowenstern, USGS Scientist-in-Charge of the Yellowstone Volcano Observatory in Menlo Park, California, and a co-author on the new paper.

Severe drought is causing the western US to rise

The severe drought gripping the western United States in recent years is changing the landscape well beyond localized effects of water restrictions and browning lawns. Scientists at Scripps Institution of Oceanography at UC San Diego have now discovered that the growing, broad-scale loss of water is causing the entire western U.S. to rise up like an uncoiled spring.

Investigating ground positioning data from GPS stations throughout the west, Scripps researchers Adrian Borsa, Duncan Agnew, and Dan Cayan found that the water shortage is causing an “uplift” effect of up to 15 millimeters (more than half an inch) in California’s mountains and, on average, four millimeters (0.15 of an inch) across the west. From the GPS data, they estimate the water deficit at nearly 240 gigatons (62 trillion gallons of water), equivalent to a six-inch layer of water spread out over the entire western U.S.
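
The reported figures are mutually consistent, as a quick unit-conversion check shows. The western-U.S. area used below is an assumed round number, not a value from the paper:

```python
# 1 gigaton of water = 1e12 kg = 1e9 m^3 (at 1,000 kg/m^3).
deficit_gt = 240                    # reported water deficit, gigatons
volume_m3 = deficit_gt * 1e9

area_km2 = 2.0e6                    # assumed western-U.S. area, km^2
layer_m = volume_m3 / (area_km2 * 1e6)

gallons = volume_m3 * 264.172       # 1 m^3 = 264.172 U.S. gallons
print(f"equivalent layer: {layer_m * 1000:.0f} mm (~{layer_m / 0.0254:.1f} in)")
print(f"volume: ~{gallons / 1e12:.0f} trillion gallons")
# ~120 mm (about 5 in) and ~63 trillion gallons, in line with the rounded
# "six-inch layer" and "62 trillion gallons" quoted above.
```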

Results of the study, which was supported by the U.S. Geological Survey (USGS), appear in the August 21 online edition of the journal Science.

While poring through various sets of data of ground positions from highly precise GPS stations within the National Science Foundation’s Plate Boundary Observatory and other networks, Borsa, a Scripps assistant research geophysicist, kept noticing the same pattern over the 2003-2014 period: All of the stations moved upwards in the most recent years, coinciding with the timing of the current drought.

Agnew, a Scripps Oceanography geophysics professor who specializes in studying earthquakes and their impact on shaping the earth’s crust, says the GPS data can only be explained by rapid uplift of the tectonic plate upon which the western U.S. rests (Agnew cautions that the uplift has virtually no effect on the San Andreas fault and therefore does not increase the risk of earthquakes).

For Cayan, a research meteorologist with Scripps and USGS, the results paint a new picture of the dire hydrological state of the west.

“These results quantify the amount of water mass lost in the past few years,” said Cayan. “It also represents a powerful new way to track water resources over a very large landscape. We can home in on the Sierra Nevada mountains and critical California snowpack. These results demonstrate that this technique can be used to study changes in fresh water stocks in other regions around the world, if they have a network of GPS sensors.”

Induced quakes rattle less than tectonic quakes, except near epicenter

Induced earthquakes generate significantly lower shaking than tectonic earthquakes with comparable magnitudes, except within 10 km of the epicenter, according to a study to be published online August 19 in the Bulletin of the Seismological Society of America (BSSA). Within 10 km of the epicenter, the reduced intensity of shaking is likely offset by the increased intensity of shaking due to the shallow source depths of injection-induced earthquakes.

Using data from the USGS “Did You Feel It?” system, seismologist Susan Hough explored the shaking intensities of 11 earthquakes in the central and eastern United States (CEUS) considered likely to have been caused by fluid injection.

“Although moderate injection-induced earthquakes in the CEUS will be widely felt due to low regional attenuation,” writes Hough, “the damage from earthquakes induced by injection will be more concentrated in proximity to the event epicenters than shaking from tectonic earthquakes.”

Scientists warn it is time to stop drilling in the dark

In areas where shale-drilling/hydraulic fracturing is heavy, a dense web of roads, pipelines and well pads turns continuous forests and grasslands into fragmented islands. – Simon Fraser University PAMR

The co-authors of a new study, including two Simon Fraser University research associates, cite new reasons why scientists, industry representatives and policymakers must collaborate closely on minimizing damage to the natural world from shale gas development. Viorel Popescu and Maureen Ryan, David H. Smith Conservation Research Fellows in SFU’s Biological Sciences department, are among eight international co-authors of the newly published research in Frontiers in Ecology and the Environment.

Shale gas development is the extraction of natural gas from shale formations via deep injection of high-pressure aqueous chemicals to create fractures (i.e., hydraulic fracturing), which releases trapped gas. With shale gas production projected to increase exponentially internationally during the next 30 years, the scientists say their key findings are cause for significant concern and decisive mitigation measures.

“Our findings are highly relevant to British Columbians given the impetus for developing shale resources in northeastern B.C. and the massive LNG facilities and pipeline infrastructure under development throughout the province,” notes Popescu. The SFU Earth2Ocean Group member is also a research associate in the Centre for Environmental Research at the University of Bucharest in Romania.

Key study findings:

  • One of the greatest threats to animal and plant-life is the cumulative impact of rapid, widespread shale development, with each individual well contributing collectively to air, water, noise and light pollution.

    “Think about the landscape and its habitats as a canvas,” explains Popescu. “At first, the few well pads, roads and pipelines from shale development seem like tiny holes and cuts, and the canvas still holds. But if you look at a heavily developed landscape down the road, you see more holes and cuts than natural habitats. Forests or grasslands that were once continuous are now islands fragmented by a dense web of roads, pipelines and well pads. At what point does the canvas fall apart? And what are the ecological implications for wide-ranging, sensitive species such as caribou or grizzly bears?”

  • Determining the environmental impact of chemical contamination from spills, well-casing failure and other accidents associated with shale gas production must become a top priority.

    Shale-drilling operations for oil and natural gas have increased by more than 700 per cent in the United States since 2007, and Western Canada is undergoing a similar shale gas production boom. But the industry’s effects on nature and wildlife are not well understood. Accurate data on the release of fracturing chemicals into the environment needs to be gathered before understanding can improve.

  • The lack of accessible and reliable information on spills, wastewater disposal and fracturing fluids is greatly impeding improved understanding. The study finds that only five of 24 American states with active shale gas reservoirs maintain public records of spills and accidents.

The authors reviewed chemical disclosure statements for 150 wells in three top gas-producing American states and found that, on average, two out of three wells were fractured with at least one undisclosed chemical. Some of the wells in the chemical disclosure registry were fractured with fluid containing 20 or more undisclosed chemicals.

The authors call this an arbitrary and inconsistent standard of chemical disclosure. This is particularly worrisome given that the chemical makeup of fracturing fluid and wastewater, which can include carcinogens and radioactive substances, is often unknown.

“Past lessons from large-scale resource extraction and energy development (large dams, intensive forestry, or biofuel plantations) have shown us that development that outpaces our understanding of ecological impacts can have dire unintended consequences,” notes Ryan. She is a research fellow in the University of Washington’s School of Environmental and Forest Sciences.

“It’s our responsibility to look forward. For example, here in Canada, moving natural gas from northeastern B.C. to the 16 proposed LNG plants would require hundreds of kilometers of new pipeline and road infrastructure, and large port terminals on top of the effects of drilling. We must not just consider the impact of these projects individually, but also try to evaluate the ecological impacts holistically.”

The bend in the Appalachian mountain chain is finally explained

A dense, underground block of volcanic rock (shown in red) helped shape the well-known bend in the Appalachian mountain range. – Graphic by Michael Osadciw/University of Rochester.

The 1,500-mile Appalachian mountain chain runs along a nearly straight line from Alabama to Newfoundland, except for a curious bend in Pennsylvania and New York State. Researchers from the College of New Jersey and the University of Rochester now know what caused that bend: a dense, underground block of rigid, volcanic rock forced the chain to shift eastward as it was forming millions of years ago.

According to Cindy Ebinger, a professor of earth and environmental sciences at the University of Rochester, scientists had previously known about the volcanic rock structure under the Appalachians. “What we didn’t understand was the size of the structure or its implications for mountain-building processes,” she said.

The findings have been published in the journal Earth and Planetary Science Letters.

When the North American and African continental plates collided more than 300 million years ago, the North American plate began folding and thrusting upwards as it was pushed westward into the dense underground rock structure, in what is now the northeastern United States. The dense rock created a barricade, forcing the Appalachian mountain range to spring up with its characteristic bend.

The research team, which also included Margaret Benoit, an associate professor of physics at the College of New Jersey, and graduate student Melanie Crampton at the College of New Jersey, studied data collected by the EarthScope project, which is funded by the National Science Foundation. EarthScope makes use of 136 GPS receivers and an array of 400 portable seismometers deployed in the northeast United States to measure ground movement.

Benoit and Ebinger also made use of the North American Gravity Database, a compilation of open-source data from the U.S., Canada, and Mexico. The database, started two decades ago, contains measurements of the gravitational pull over the North American terrain. Most people assume that gravity has a constant value, but when gravity is experimentally measured, it changes from place to place due to variations in the density and thickness of Earth’s rock layers. Certain parts of the Earth are denser than others, causing the gravitational pull to be slightly greater in those places.

Data on the changes in gravitational pull and seismic velocity together allowed the researchers to determine the density of the underground structure and conclude that it is volcanic in origin, with dimensions of 450 kilometers by 100 kilometers. This information, along with data from the EarthScope project, ultimately helped the researchers to model how the bend was formed.
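
To see why gravity data can reveal such a structure at all, consider the simplest textbook approximation: the anomaly of an infinite horizontal slab, Δg = 2πGΔρt. This sketch is illustrative only; the density contrast and thickness are assumed round numbers, not values from the paper:

```python
import math

G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
delta_rho = 200.0     # assumed density contrast, kg/m^3
thickness_m = 10_000  # assumed slab thickness, m (10 km)

delta_g = 2 * math.pi * G * delta_rho * thickness_m  # m/s^2
print(f"slab anomaly ≈ {delta_g * 1e5:.0f} mGal")    # 1 mGal = 1e-5 m/s^2
# ~84 mGal: tiny next to g itself (~9.8 m/s^2) but far above survey
# precision, which is why dense buried bodies show up in gravity maps.
```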

Ebinger called the research project a “foundation study” that will improve scientists’ understanding of the Earth’s underlying structures. As an example, Ebinger said their findings could provide useful information in the debate over hydraulic fracturing, popularly known as hydrofracking, in New York State.

Hydrofracking is a mining technique used to extract natural gas from deep in the earth. It involves drilling horizontally into shale formations, then injecting the rock with sand, water, and a cocktail of chemicals to free the trapped gas for removal. The region just west of the Appalachian Basin, the Marcellus Shale formation, is rich in natural gas reserves and is being considered for development by drilling companies.