New study measures methane emissions from natural gas production and offers insights into 2 large sources

A team of researchers from the Cockrell School of Engineering at The University of Texas at Austin and environmental testing firm URS reports that a small subset of natural gas wells is responsible for the majority of methane emissions from two major sources — liquid unloadings and pneumatic controller equipment — at natural gas production sites.

With natural gas production in the United States expected to continue to increase during the next few decades, there is a need for a better understanding of methane emissions during natural gas production. The study team believes this research, published Dec. 9 in Environmental Science & Technology, will help to provide a clearer picture of methane emissions from natural gas production sites.

The UT Austin-led field study closely examined two major sources of methane emissions — liquid unloadings and pneumatic controller equipment — at well pad sites across the United States. Researchers found that 19 percent of the pneumatic devices accounted for 95 percent of the emissions from pneumatic devices, and 20 percent of the wells with unloading emissions that vent to the atmosphere accounted for 65 percent to 83 percent of those emissions.

“To put this in perspective, over the past several decades, 10 percent of the cars on the road have been responsible for the majority of automotive exhaust pollution,” said David Allen, chemical engineering professor at the Cockrell School and principal investigator for the study. “Similarly, a small group of sources within these two categories are responsible for the vast majority of pneumatic and unloading emissions at natural gas production sites.”

Additionally, for pneumatic devices, the study confirmed regional differences in methane emissions first reported by the study team in 2013. The researchers found that methane emissions from pneumatic devices were highest in the Gulf Coast and lowest in the Rocky Mountains.

The study is the second phase of the team’s 2013 study, which included some of the first measurements for methane emissions taken directly at hydraulically fractured well sites. Both phases of the study involved a partnership between the Environmental Defense Fund, participating energy companies, an independent Scientific Advisory Panel and the UT Austin study team.

The unprecedented access to natural gas production facilities and equipment allowed researchers to acquire direct measurements of methane emissions.

Study and Findings on Pneumatic Devices

Pneumatic devices, which use gas pressure to control the opening and closing of valves, emit gas as they operate. These emissions are estimated to be among the larger sources of methane emissions from the natural gas supply chain. The Environmental Protection Agency reports that 477,606 pneumatic (gas actuated) devices are in use at natural gas production sites throughout the U.S.

“Our team’s previous work established that pneumatics are a major contributor to emissions,” Allen said. “Our goal here was to measure a more diverse population of wells to characterize the features of high-emitting pneumatic controllers.”

The research team measured emissions from 377 gas actuated (pneumatic) controllers at natural gas production sites and a small number of oil production sites throughout the U.S.

The researchers sampled all identifiable pneumatic controller devices at each well site, a more comprehensive approach than the random sampling previously conducted. The average methane emissions per pneumatic controller reported in this study are 17 percent higher than the average emissions per pneumatic controller in the 2012 EPA greenhouse gas national emission inventory (released in 2014), but the average from the study is dominated by a small subpopulation of the controllers. Specifically, 19 percent of controllers, with measured emission rates in excess of 6 standard cubic feet per hour (scf/h), accounted for 95 percent of emissions.
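A small sketch can make the skew concrete. The numbers below are hypothetical, not the study's measurements; they only illustrate how a small subpopulation of high emitters (here, 19 of 100 controllers at an assumed 20 scf/h) can account for nearly all emissions and dominate the fleet-wide average.

```python
# Hypothetical fleet: 81 low-emitting controllers at 0.1 scf/h and
# 19 high emitters at 20 scf/h (values assumed for illustration).
rates_scf_h = [0.1] * 81 + [20.0] * 19

total = sum(rates_scf_h)
# Share of emissions from controllers above the study's 6 scf/h cutoff.
high_share = sum(r for r in rates_scf_h if r > 6.0) / total
mean_rate = total / len(rates_scf_h)

print(round(high_share, 3))  # ~0.979: the high emitters dominate
print(round(mean_rate, 2))   # fleet average pulled far above the typical device
```

In this toy fleet the 19 percent of high emitters produce about 98 percent of total emissions, and the average rate sits well above what the typical controller emits, mirroring the pattern the study reports.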

The high-emitting pneumatic devices are a combination of devices that are not operating as designed, are used in applications that cause them to release gas frequently or are designed to emit continuously at a high rate.

The researchers also observed regional differences in methane emission levels, with the lowest emissions per device measured in the Rocky Mountains and the highest emissions in the Gulf Coast, similar to the earlier 2013 study. At least some of the regional differences in emission rates can be attributed to the difference in controller type (continuous vent vs. intermittent vent) among regions.

Study and Findings on Liquid Unloadings

After observing variable emissions for liquid unloadings for a limited group of well types in the 2013 study, the research team made more extensive measurements and confirmed that a majority of emissions come from a small fraction of wells that vent frequently. Although it is not surprising to see some correlation between frequency of unloadings and higher annual emissions, the study’s findings indicate that wells with a high frequency of unloadings have annual emissions that are 10 or more times as great as wells that unload less frequently.

The team’s field study, which measured emissions from unloadings from wells at 107 natural gas production wells throughout the U.S., represents the most extensive measurement of emissions associated with liquid unloadings in scientific literature thus far.

A liquid unloading is one method used to clear wells of accumulated liquids to increase production. Because older wells typically produce less gas as they near the end of their life cycle, liquid unloadings happen more often in those wells than in newer wells. The team found a statistical correlation between the age of wells and the frequency of liquid unloadings. The researchers found that the key identifier for high-emitting wells is how many times the well unloads in a given year.

Because liquid unloadings can employ a variety of liquid lifting mechanisms, the study results also reflect differences in liquid unloadings emissions between wells that use two different mechanisms (wells with plunger lifts and wells without plunger lifts). Emissions for unloading events at wells without plunger lifts averaged 21,000 to 35,000 scf (standard cubic feet). For wells with plunger lifts that vent to the atmosphere, emissions averaged 1,000 to 10,000 scf of methane per event. Although the emissions per event were higher for wells without plunger lifts, these wells had, on average, far fewer events: wells without plunger lifts averaged fewer than 10 unloading events per year, while wells with plunger lifts averaged more than 200 events per year. Overall, wells with plunger lifts were estimated to account for 70 percent of emissions from unloadings nationally.
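The arithmetic behind that conclusion can be sketched from the figures above. Treating "fewer than 10" and "more than 200" events per year as rough bounds (an assumption for illustration, not the study's method), annualized emissions for the two well types come out as:

```python
# Per-event emission ranges reported in the article (scf per unloading).
non_plunger_scf_event = (21_000, 35_000)
plunger_scf_event = (1_000, 10_000)

# Event frequencies, taken as rough bounds from "fewer than 10" and
# "more than 200" events per year (illustrative assumption).
non_plunger_events_yr = 10
plunger_events_yr = 200

non_plunger_annual = tuple(e * non_plunger_events_yr for e in non_plunger_scf_event)
plunger_annual = tuple(e * plunger_events_yr for e in plunger_scf_event)

print(non_plunger_annual)  # (210000, 350000) scf/yr
print(plunger_annual)      # (200000, 2000000) scf/yr
```

Even with these coarse bounds, the high event frequency of plunger-lift wells can push their annual emissions well past those of non-plunger wells, consistent with plunger-lift wells accounting for an estimated 70 percent of national unloading emissions.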

Additionally, researchers found that the Rocky Mountain region, with its large number of wells with a high frequency of unloadings that vent to the atmosphere, accounts for about half of overall emissions from liquid unloadings.

The study team hopes its measurements of liquid unloadings and pneumatic devices will provide a clearer picture of methane emissions from natural gas well sites and about the relationship between well characteristics and emissions.

The study was a cooperative effort involving experts from the Environmental Defense Fund, Anadarko Petroleum Corporation, BG Group PLC, Chevron, ConocoPhillips, Encana Oil & Gas (USA) Inc., Pioneer Natural Resources Company, SWEPI LP (Shell), Statoil, Southwestern Energy and XTO Energy, a subsidiary of ExxonMobil.

The University of Texas at Austin is committed to transparency and disclosure of all potential conflicts of interest of its researchers. Lead researcher David Allen serves as chair of the Environmental Protection Agency’s Science Advisory Board and in this role is a paid Special Governmental Employee. He is also a journal editor for the American Chemical Society and has served as a consultant for multiple companies, including Eastern Research Group, ExxonMobil and the Research Triangle Institute. He has worked on other research projects funded by a variety of governmental, nonprofit and private sector sources including the National Science Foundation, the Environmental Protection Agency, the Texas Commission on Environmental Quality, the American Petroleum Institute and an air monitoring and surveillance project that was ordered by the U.S. District Court for the Southern District of Texas. Adam Pacsi and Daniel Zavala-Araiza, who were graduate students at The University of Texas at the time this work was done, have accepted positions at Chevron Energy Technology Company and the Environmental Defense Fund, respectively.

Financial support for this work was provided by the Environmental Defense Fund (EDF), Anadarko Petroleum Corporation, BG Group PLC, Chevron, ConocoPhillips, Encana Oil & Gas (USA) Inc., Pioneer Natural Resources Company, SWEPI LP (Shell), Statoil, Southwestern Energy and XTO Energy, a subsidiary of ExxonMobil.

Major funding for the EDF’s 30-month methane research series, including their portion of the University of Texas study, is provided for by the following individuals and foundations: Fiona and Stan Druckenmiller, the Heising-Simons Foundation, Bill and Susan Oberndorf, Betsy and Sam Reeves, the Robertson Foundation, TomKat Charitable Trust and the Walton Family Foundation.

Technology-dependent emissions of gas extraction in the US

The KIT measurement instrument on board a minivan directly measures atmospheric emissions on site with a high temporal resolution. – Photo: F. Geiger/KIT

Not all boreholes are the same. Scientists of the Karlsruhe Institute of Technology (KIT) used mobile measurement equipment to analyze gaseous compounds emitted during the extraction of oil and natural gas in the USA. For the first time, organic pollutants emitted during a fracking process were measured at high temporal resolution. The highest values measured exceeded typical mean values in urban air by a factor of one thousand, as reported in the journal Atmospheric Chemistry and Physics (DOI: 10.5194/acp-14-10977-2014).

Emission of trace gases by oil and gas fields was studied by the KIT researchers in the USA (Utah and Colorado) together with US institutes. Background concentrations and the waste gas plumes of single extraction plants and fracking facilities were analyzed. The air quality measurements of several weeks duration took place under the “Uintah Basin Winter Ozone Study” coordinated by the National Oceanic and Atmospheric Administration (NOAA).

The KIT measurements focused on health-damaging aromatic hydrocarbons in air, such as carcinogenic benzene. Maximum concentrations were found in the waste gas plumes of boreholes. Some extraction plants emitted up to about a hundred times more benzene than others. The highest values, of some milligrams of benzene per cubic meter of air, were measured downwind of an open fracking facility, where returning drilling fluid is stored in open tanks and basins. Emissions were much lower at oil and gas extraction plants with closed production processes. In Germany, benzene concentration in air is subject to strict limits: the Federal Emission Control Ordinance sets an annual benzene limit of five micrograms per cubic meter for the protection of human health, roughly one thousand times lower than the values now measured at the open fracking facility in the US. The researchers published their results in the journal Atmospheric Chemistry and Physics (ACP).
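The factor-of-one-thousand comparison is a straightforward unit conversion. The sketch below takes "some milligrams per cubic meter" as an illustrative 5 mg/m³ (the article gives only an order of magnitude) and compares it with the 5 µg/m³ annual limit:

```python
# German annual benzene limit for ambient air (micrograms per cubic meter).
limit_ug_m3 = 5.0

# Peak value downwind of the open fracking facility: "some milligrams
# per cubic meter" -- 5 mg/m3 assumed here purely for illustration.
measured_mg_m3 = 5.0
measured_ug_m3 = measured_mg_m3 * 1000.0  # 1 mg = 1000 micrograms

factor = measured_ug_m3 / limit_ug_m3
print(factor)  # 1000.0
```

Since milligrams exceed micrograms by three orders of magnitude, any reading of a few mg/m³ is roughly a thousand times the 5 µg/m³ annual limit, as the article states.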

“Characteristic emissions of trace gases are encountered everywhere; they are symptomatic of oil and gas extraction. But the values measured for different technologies differ considerably,” explains Felix Geiger of the Institute of Meteorology and Climate Research (IMK) at KIT, one of the first authors of the study. By means of closed collection tanks and so-called vapor capture systems, for instance, the gases released during operation can be collected and reduced significantly.

“The gas fields in the sparsely populated areas of North America are a good showcase for estimating the range of impacts of different extraction and fracking technologies,” explains Professor Johannes Orphal, Head of IMK. “In densely populated Germany, framework conditions are much stricter and much more attention is paid to reducing and monitoring emissions.”

Fracking is increasingly discussed as a technology to extract fossil resources from unconventional deposits. Hydraulic fracturing of suitable shale layers opens up the fossil fuels stored there and makes them accessible for economically efficient use. For this purpose, boreholes are drilled into these rock formations and subjected to high pressure using large amounts of water and auxiliary materials, such as sand, cement, and chemicals. The oil or gas can then flow to the surface through the opened microstructures in the rock. Typically, the return flow of the aqueous fracking liquid with the dissolved oil and gas constituents lasts several days before the actual production phase of purer oil or natural gas begins. This return flow is collected and reused until it finally has to be disposed of. Air pollution mainly depends on how this return flow is handled at the extraction plant, and currently practiced fracking technologies differ considerably in this respect.

For the first time, the resulting local atmospheric emissions were now studied at high temporal resolution. Based on the results, emissions can be assigned directly to the different sections of an extraction plant. For the measurements, a newly developed, compact, and highly sensitive KIT instrument, a so-called proton transfer reaction mass spectrometer (PTR-MS), was installed on board a minivan and driven to within a few tens of meters of the different extraction points. In this way, the waste gas plumes of individual extraction sources and fracking processes were studied in detail.

Warneke, C., Geiger, F., Edwards, P. M., Dube, W., Pétron, G., Kofler, J., Zahn, A., Brown, S. S., Graus, M., Gilman, J. B., Lerner, B. M., Peischl, J., Ryerson, T. B., de Gouw, J. A., and Roberts, J. M.: Volatile organic compound emissions from the oil and natural gas industry in the Uintah Basin, Utah: oil and gas well pad emissions compared to ambient air composition, Atmos. Chem. Phys., 14, 10977-10988, doi:10.5194/acp-14-10977-2014, 2014.

Aiming to improve the air quality in underground mines

Reducing diesel particulate matter emitted by the diesel powered vehicles used for underground mine work is the aim of researchers from Monash University. – Monash University

Reducing miners' exposure to diesel particulate matter (DPM) in underground coalmines will be a step closer to reality with the awarding of a research grant to engineers from Monash University.

The $275,000 grant from the Australian Coal Association Research Programme (ACARP) goes to a multi-disciplinary team from the Maintenance Technology Institute (MTI), the Laboratory for Turbulence Research in Aerospace and Combustion (LTRAC) and the Australian Pulp and Paper Institute (APPI).

The grant will allow them to collaborate with leading industry original equipment manufacturers and mine site personnel as part of a broader long-term strategy to minimise DPM emissions in the mining industry.

Joint project leader Associate Professor Damon Honnery said it was important to find a way to reduce miners' exposure to DPM that is both effective and cost-efficient.

“DPM has recently been classified as a Group 1 carcinogen by the World Health Organisation, and is a significant problem for operators of underground coalmines,” Associate Professor Honnery said.

“Diesel powered vehicles are widely used for underground mine work and are generally fitted with diesel particulate filters (DPFs) to reduce particulate emissions. These filters have a very limited service life – typically around one or two shifts – resulting in excessive costs and ineffective control of DPM.”

The new project will complement an earlier ACARP project by the team that focussed on improving the service life of DPFs used in underground coalmines, which found reconditioned filters could be reused up to five times without compromising filter integrity or DPM filtration efficiency.

Fellow Project leader Dr Daya Dayawansa said while the earlier results offer a viable short-term solution to the DPM problem, a medium-term solution requires the careful examination and possible redesign of the entire exhaust conditioning system, in combination with improved diesel particulate filters.

Ultimately, the researchers believe that many diesel engines used in underground mining could be replaced by electric motors, despite the stringent regulations relating to electric systems in the potentially explosive underground atmosphere.

“While filter use will continue to reduce the impact of DPM emission in underground mines, the only truly effective long term solution is to remove the source from the mines altogether. Working with our partners, we hope to achieve this through the development of electric powered vehicles,” Dr Dayawansa said.

First-ever 3D image created of the structure beneath Sierra Negra volcano

This is a photo of the Sierra Negra volcano on Isabela Island in the Galápagos Archipelago. – Cynthia Ebinger, University of Rochester

The Galápagos Islands are home to some of the most active volcanoes in the world, with more than 50 eruptions in the last 200 years. Yet until recently, scientists knew far more about the history of finches, tortoises, and iguanas than of the volcanoes on which these unusual fauna had evolved.

Now research out of the University of Rochester is providing a better picture of the subterranean plumbing system that feeds the Galápagos volcanoes, as well as a major difference from another Pacific island chain, the Hawaiian Islands. The findings have been published in the Journal of Geophysical Research: Solid Earth.

“With a better understanding of what’s beneath the volcanoes, we’ll now be able to more accurately measure underground activity,” said Cynthia Ebinger, a professor of earth and environmental sciences. “That should help us better anticipate earthquakes and eruptions, and mitigate the hazards associated with them.”

Ebinger’s team, which included Mario Ruiz from the Instituto Geofisico Escuela Politecnica Nacional in Quito, Ecuador, buried 15 seismometers around Sierra Negra, the largest and most active volcano in the Galápagos. The equipment was used to measure the velocity and direction of different sound waves generated by earthquakes as they traveled under Sierra Negra. Since the behavior of the waves varies according to the temperature and types of material they’re passing through, the data collected allowed the researchers to construct a 3D image of the plumbing system beneath the volcano, using a technique similar to a CAT-scan.

Five kilometers down is the beginning of a large magma chamber lying partially within old oceanic crust that has been buried by more than 8 km of eruptive rock layers. The oceanic crust has what appears to be a thick underplating of rock, formed when magma working its way toward the surface became trapped under the crust and cooled, very much like the processes that occur under the Hawaiian Islands.

The researchers found that the Galápagos had something else in common with the Hawaiian Islands. Their data suggest the presence of a large chamber filled with crystal-mush magma: cooled magma that includes crystallized minerals.

The Galápagos Islands formed from a hotspot of magma located beneath an oceanic plate called Nazca, about 600 miles west of Ecuador, in a process very similar to how the Hawaiian Islands were created. Magma rising from the hotspot eventually hardened into an island. Then, as the Nazca plate inched its way westward, new islands formed in the same manner, resulting in the present-day Galápagos Archipelago.

While there are several similarities between the two island chains, Ebinger uncovered a major difference. The older volcanoes in the Hawaiian Islands are dormant because they have moved away from the hotspot that provided the source of magma. In the Galápagos, the volcanoes are connected to the same plumbing system. By studying satellite views of the volcanoes, Ebinger and colleagues noticed that, as the magma would sink in one volcano, it would rise in a different one, indicating that some of the youngest volcanoes had magma connections, even if those connections were temporary.

“Not only do we have a better understanding of the physical properties of Sierra Negra,” said Ebinger, “we have increased our knowledge of island volcano systems in general.”


World’s first magma-enhanced geothermal system created in Iceland

This image shows a flow test of the IDDP-1 well at Krafla. Note the transparent superheated steam at the top of the rock muffler. – Kristján Einarsson

In 2009, a borehole drilled at Krafla, northeast Iceland, as part of the Icelandic Deep Drilling Project (IDDP), unexpectedly penetrated magma (molten rock) at only 2,100 meters depth, with a temperature of 900-1,000 °C. The borehole, IDDP-1, was the first in a series of wells drilled by the IDDP in Iceland in the search for high-temperature geothermal resources.

The January 2014 issue of the international journal Geothermics is dedicated to scientific and engineering results arising from that unusual occurrence. This issue is edited by Wilfred Elders, a professor emeritus of geology at the University of California, Riverside, who also co-authored three of the research papers in the special issue with Icelandic colleagues.

“Drilling into magma is a very rare occurrence anywhere in the world and this is only the second known instance, the first one, in 2007, being in Hawaii,” Elders said. “The IDDP, in cooperation with Iceland’s National Power Company, the operator of the Krafla geothermal power plant, decided to investigate the hole further and bear part of the substantial costs involved.”

Accordingly, a steel casing, perforated in the bottom section closest to the magma, was cemented into the well. The hole was then allowed to heat slowly and eventually allowed to flow superheated steam for the next two years, until July 2012, when it was closed down in order to replace some of the surface equipment.

“In the future, the success of this drilling and research project could lead to a revolution in the energy efficiency of high-temperature geothermal areas worldwide,” Elders said.

He added that several important milestones were achieved in the project: despite some difficulties, the team was able to drill down into the molten magma and control the well; it was possible to set steel casing in the bottom of the hole; the hole was allowed to blow superheated, high-pressure steam for months at temperatures exceeding 450 °C, a world record for geothermal heat (this well was the hottest in the world and one of the most powerful); steam from the IDDP-1 well could be fed directly into the existing power plant at Krafla; and the IDDP-1 demonstrated that a high-enthalpy geothermal system could be successfully utilized.

“Essentially, the IDDP-1 created the world’s first magma-enhanced geothermal system,” Elders said. “This unique engineered geothermal system is the world’s first to supply heat directly from a molten magma.”

Elders explained that in various parts of the world so-called enhanced or engineered geothermal systems are being created by pumping cold water into hot dry rocks at 4-5 kilometers depths. The heated water is pumped up again as hot water or steam from production wells. In recent decades, considerable effort has been invested in Europe, Australia, the United States, and Japan, with uneven, and typically poor, results.

“Although the IDDP-1 hole had to be shut in, the aim now is to repair the well or to drill a new similar hole,” Elders said. “The experiment at Krafla suffered various setbacks that tried personnel and equipment throughout. However, the process itself was very instructive, and, apart from scientific articles published in Geothermics, comprehensive reports on practical lessons learned are nearing completion.”

The IDDP is a collaboration of three energy companies – HS Energy Ltd., National Power Company and Reykjavik Energy – and a government agency, the National Energy Authority of Iceland. It will drill the next borehole, IDDP-2, in southwest Iceland at Reykjanes in 2014-2015. From the onset, international collaboration has been important to the project, and in particular a consortium of U.S. scientists, coordinated by Elders, has been very active, authoring several research papers in the special issue of Geothermics.

Scientists anticipated size and location of 2012 Costa Rica earthquake

Andrew Newman, an associate professor in the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology, performs a GPS survey in Costa Rica's Nicoya Peninsula in 2010. – Lujia Feng

Scientists using GPS to study changes in the Earth’s shape accurately forecasted the size and location of the magnitude 7.6 Nicoya earthquake that occurred in 2012 in Costa Rica.

The Nicoya Peninsula in Costa Rica is one of the few places where land sits atop the portion of a subduction zone where the Earth’s greatest earthquakes take place. Costa Rica’s location therefore makes it the perfect spot for learning how large earthquakes rupture. Because earthquakes greater than about magnitude 7.5 have occurred in this region roughly every 50 years, with the previous event striking in 1950, scientists have been preparing for this earthquake through a number of geophysical studies. The most recent study used GPS to map out the area along the fault storing energy for release in a large earthquake.

“This is the first place where we’ve been able to map out the likely extent of an earthquake rupture along the subduction megathrust beforehand,” said Andrew Newman, an associate professor in the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology.

The study was published online Dec. 22, 2013, in the journal Nature Geoscience. The research was supported by the National Science Foundation and was a collaboration of researchers from Georgia Tech, the Costa Rica Volcanological and Seismological Observatory (OVSICORI) at Universidad Nacional, the University of California, Santa Cruz, and the University of South Florida.

Subduction zones are locations where one tectonic plate is forced under another one. The collision of tectonic plates during this process can unleash devastating earthquakes, and sometimes devastating tsunamis. The magnitude 9.0 earthquake off the coast of Japan in 2011 was just such a subduction zone earthquake. The Cascadia subduction zone in the Pacific Northwest is capable of unleashing a similarly sized quake. Damage from the Nicoya earthquake was not as bad as might be expected from a magnitude 7.6 quake.

“Fortunately there was very little damage considering the earthquake’s size,” said Marino Protti of OVSICORI and the study’s lead author. “The historical pattern of earthquakes not only allowed us to get our instruments ready, it also allowed Costa Ricans to upgrade their buildings to be earthquake safe.”

Plate tectonics are the driving force for subduction zones. As tectonic plates converge, strain temporarily accumulates across the plate boundary when portions of the interface between these tectonic plates, called a megathrust, become locked together. The strain can accumulate to dangerous levels before eventually being released as a massive earthquake.

“The Nicoya Peninsula is an ideal natural lab for studying these events, because the coastline geometry uniquely allows us to get our equipment close to the zone of active strain accumulation,” said Susan Schwartz, professor of earth sciences at the University of California, Santa Cruz, and a co-author of the study.

Through a series of studies starting in the early 1990s using land-based tools, the researchers mapped regions where tectonic plates were completely locked along the subduction interface. Detailed geophysical observations of the region allowed the researchers to create an image of where the faults had locked.

The researchers published a study a few months before the earthquake, describing the particular locked patch with the clearest potential for the next large earthquake in the region. The team projected the total amount of energy that could have accumulated across that region and forecasted that, if the locking had remained similar since the last major earthquake in 1950, there was enough stored energy for an earthquake on the order of magnitude 7.8.

Because of limits in technology and scientific understanding about processes controlling fault locking and release, scientists cannot say much about precisely where or when earthquakes will occur. However, earthquakes in Nicoya have occurred about every 50 years, so seismologists had been anticipating another one around 2000, give or take 20 years, Newman said. The earthquake occurred in September of 2012 as a magnitude 7.6 quake.

“It occurred right in the area we determined to be locked and it had almost the size we expected,” Newman said.

The researchers hope to apply what they’ve learned in Costa Rica to other environments. Virtually every damaging subduction zone earthquake occurs far offshore.

“Nicoya is the only place on Earth where we’ve actually been able to get a very accurate image of the locked patch because it occurs directly under land,” Newman said. “If we really want to understand the seismic potential for most of the world, we have to go offshore.”

Scientists have been able to reasonably map portions of these locked areas offshore using data on land, but the resolution is poor, particularly in the regions that are most responsible for generating tsunamis, Newman said. He hopes that his group’s work in Nicoya will be a driver for geodetic studies on the seafloor to observe such Earth deformation. These seafloor geodetic studies are rare and expensive today.

“If we want to understand the potential for large earthquakes, then we really need to start doing more seafloor observations,” Newman said. “It’s a growing push in our community and this study highlights the type of results that one might be able to obtain for most other dangerous environments, including offshore the Pacific Northwest.”

Terahertz time-domain spectroscopy for oil and gas detection

This image shows the R0% (vitrinite reflectance) dependence of α (absorption coefficients) of kerogen of different maturities at selected frequencies. ©Science China Press

A greater understanding of the evolutionary stage of kerogen for hydrocarbon generation would play a role in easing the world’s current energy problem. Professor ZHAO Kun and his group from the Key Laboratory of Oil and Gas Terahertz Spectrum and Photoelectric Detection (CPCIF, China University of Petroleum, Beijing) set out to tackle this problem. After five years of innovative research, they have developed terahertz time-domain spectroscopy (THz-TDS) as an effective method to detect the generation of oil and gas from kerogen. Their work, entitled “Applying terahertz time-domain spectroscopy to probe the evolution of kerogen in close pyrolysis systems”, was published in Science China Physics, Mechanics & Astronomy, 2013, Vol. 56(8).

The evolution stages of kerogen and hydrocarbon generation are critical aspects of oil and gas exploration and source-rock evaluation. In sedimentary rock, about 95% of the organic matter is kerogen, the key intermediate in the formation of oil and gas. The specific kerogen type and maturity level determine the characteristics of the hydrocarbons that will be generated. Previous research has led to two primary observations: (i) kerogen is a significant energy source in the form of recoverable shale oil and coal, whose reserves far exceed the remaining petroleum reserves; and (ii) kerogen possesses a significant sorption capacity for organic compounds. Kerogen is composed primarily of alicyclic, aromatic, and other functional groups, so its ability to generate oil and gas is determined mainly by its specific composition and structure. However, each existing characterization technique has advantages and disadvantages depending on the specific parameters of the kerogen. Thus, there is a need for new methods to characterize the numerous stages and mechanisms of hydrocarbon generation from kerogen.

Vitrinite reflectance (R0%), defined as the proportion of normally incident light reflected by a polished planar surface of vitrinite (found in kerogen), is commonly used to characterize the maturity stage of kerogen. Those stages are defined as: the immature (IM) stage, in which kerogen generally cannot produce oil and gas (R0%<0.5); the early mature (EM) stage, or heavy oil zone (0.5<R0%<0.7); the middle mature (MM) stage, the primary zone of crude oil generation, also referred to as the oil window (0.7<R0%<1.2); and the late mature (LM) stage, or zone of light oil and natural gas (1.2<R0%<2.0). Beyond R0% of about 2.0, kerogen is considered over mature (OM).
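These thresholds amount to a simple lookup. A minimal sketch of the stage boundaries just described (the OM label above R0% = 2.0 is inferred from the stage list used later in the article, not an explicit definition here):

```python
def maturity_stage(r0):
    """Classify kerogen maturity from vitrinite reflectance R0 (%),
    using the stage boundaries described above."""
    if r0 < 0.5:
        return "IM"   # immature: generally no oil or gas
    if r0 < 0.7:
        return "EM"   # early mature: heavy oil zone
    if r0 < 1.2:
        return "MM"   # middle mature: the oil window
    if r0 < 2.0:
        return "LM"   # late mature: light oil and natural gas
    return "OM"       # over mature

print(maturity_stage(0.9))  # → MM
```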

To meet the challenges of applying optical characterization in oil and gas exploration, the team applied THz-TDS as a nondestructive, contact-free tool for identifying the transformational paths and hydrocarbon generation ability of kerogen. Specifically, the absorption coefficients at different temperatures and pressures indicated the maturity regime of the kerogen, in good agreement with the results of programmed pyrolysis experiments.
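For context, the absorption coefficient in transmission THz-TDS is commonly extracted with a thick-slab approximation. A sketch using that textbook formula, which is not necessarily the exact extraction procedure the authors used:

```python
import math

def absorption_coefficient(t_amp, n, d_m):
    """Absorption coefficient alpha (1/m) from the measured THz amplitude
    transmission t_amp, sample refractive index n, and thickness d_m,
    using the common thick-slab approximation
    alpha = (2/d) * ln(4n / (t * (n+1)^2))."""
    return (2.0 / d_m) * math.log(4.0 * n / (t_amp * (n + 1.0) ** 2))

# A lossless sample (transmission equal to the Fresnel losses alone)
# should give alpha ~ 0:
n = 1.5
t_fresnel = 4.0 * n / (n + 1.0) ** 2
print(abs(absorption_coefficient(t_fresnel, n, 1e-3)) < 1e-9)  # → True
```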

By comparing the kerogen THz curves at different R0% values with the maturity stages of the hydrocarbons, the researchers conclude that a relationship exists between the kerogen THz optical constants and the maturity stage. The THz optical-constant curves at a given frequency can be divided into several sections denoted by the IM, EM, MM, LM, and OM (over mature) stages. Kerogen cannot generate any significant amount of oil or gas in the IM stage (R0%<0.5); its functional groups and characteristics do not alter, so little change is observed in the THz optical constants. In the primary oil generation zone (0.7<R0%<1.2), methyl, methylene, aromatic hydrocarbon, oxygen, and nitrogen functional groups separate from the kerogen, and oil and gas begin to be generated. The residual kerogen forms macromolecules with aromatic components. From the changes in the molecular structures and features relative to those of the initial kerogen, the values of the first peak of the THz absorption coefficient curve (see Figure) and the real parts of the relative dielectric permittivity curves characterize the oil-generating stage of kerogen. At a more mature stage (R0%>1.2), alkyls in aromatic groups separate from the kerogen and begin to generate hydrocarbons in the primary gas zone (see Figure).

This study was a collaborative effort involving many university and company researchers. It was supported by a grant from the National Key Scientific Instruments and Equipment Development program, a 973 grant from the Department of Science and Technology of China, and a grant from the Beijing Natural Science Foundation. Being nondestructive and contactless, the method shows great promise for improving kerogen analysis. It still needs to be applied in more settings involving reservoir rocks, and further research will determine whether it can be established as a key tool in petroleum exploration and make an impact on the oil and gas industry.

Technical reports examine hydraulic fracturing in Michigan

University of Michigan researchers today released seven technical reports that together form the most comprehensive Michigan-focused resource on hydraulic fracturing, the controversial natural gas and oil extraction process commonly known as fracking.

The studies, totaling nearly 200 pages, examine seven critical topics related to the use of hydraulic fracturing in Michigan, with an emphasis on high-volume methods: technology, geology and hydrogeology, environment and ecology, public health, policy and law, economics, and public perceptions.

While considerable natural gas reserves are believed to exist in the state and high-volume hydraulic fracturing has the potential to help access them, possible impacts to the environment and to public health must be addressed, the U-M researchers concluded.

Though modern high-volume hydraulic fracturing is not widely used in Michigan today, a main premise of the U-M study is that the technique could become more widespread due to a desire for job creation, economic growth, energy independence and cleaner fuels.

“There’s a lot of interest in high-volume hydraulic fracturing, but there really isn’t much activity at the moment in Michigan,” said John Callewaert, project director and director of integrated assessment at U-M’s Graham Sustainability Institute, which is overseeing the project. “That’s why now is a good time to do this assessment.”

These reports conclude the first phase of a two-year U-M project known formally as the Hydraulic Fracturing in Michigan Integrated Assessment. The seven documents, which should not be characterized or cited as final products of the integrated assessment, provide a solid informational foundation for the project's next phase, an analysis of various hydraulic fracturing policy options. That analysis is expected to be completed in mid-2014 and will be shared with government officials, industry experts, other academics, advocacy groups and the general public.

“Nothing like this has been done before in Michigan,” Callewaert said. “Having this comprehensive, state-specific set of reports will be an invaluable resource that will help guide future decision-making on this issue, and hopefully will help Michigan avoid some of the pitfalls encountered in other states.”

Conclusions of the reports, which were written by faculty-led, student-staffed teams from various disciplines, include:

  • Technology. In view of the current low price of natural gas, the high cost of drilling deep shale formations and the absence of new oil discoveries, it is unlikely that there will be significant growth of the oil and gas industry in Michigan in the near-term future. However, considerable reserves of natural gas are believed to exist in deep shale formations such as the Utica-Collingwood, which underlies much of Michigan and eastern Lake Huron and extends into Ontario, Canada.

  • Geology/hydrogeology. A recent flurry of mineral rights acquisitions in the state associated with exploratory drilling suggests the potential for growth in natural gas production through high-volume hydraulic fracturing, though only a handful of such wells have been drilled to date. “Michigan is thus in a unique position to assess the future of high-volume hydraulic fracturing before the gas boom begins.”

  • Environment/ecology. Potential impacts of hydraulic fracturing on the environment are significant and include increased erosion and sedimentation, increased risk of aquatic contamination from chemical spills or equipment runoff, habitat fragmentation and resulting impacts on aquatic and terrestrial organisms, loss of stream riparian zones, and reduction of surface waters available to plants and animals due to the lowering of groundwater levels.

  • Public health. Possible hazards in the surrounding environment include impaired local and regional air quality, water pollution and degradation of ecosystems. Possible hazards in nearby communities include increased traffic and motor vehicle accidents, stress related to risk perception among residents, and boomtown-associated effects such as a strained health care system and road degradation.

  • Policy/law. The state is the primary source of law and policy governing hydraulic fracturing in Michigan. The operator of a high-volume hydraulically fractured well must disclose to the state Department of Environmental Quality the hazardous constituents of each chemical additive within 60 days of well completion. Unlike most other states, the DEQ does not require operators to report to FracFocus.org, a nationwide chemical disclosure registry.

  • Economics. The gas extraction industry creates employment and income for Michigan, but the employment effects are modest compared with other industries and not large enough to “make or break” the state’s economy. In the future, the number of technical jobs in the industry will likely increase, while less-skilled laborer positions will decline.

  • Public perceptions. A slight majority of Michigan residents believe the benefits of fracking outweigh the risks, but significant concerns remain about the potential impacts to human health, the environment and groundwater quality. The public tends to view the word “fracking” as the entirety of the natural gas development process, from leasing and permitting, to drilling and well completion, to transporting and storing wastewater and chemicals. Industry and regulatory agencies hold a much narrower definition that is limited to the process of injecting hydraulic fracturing fluids into a well. These differences in perceived meaning can lead to miscommunications that ultimately increase mistrust among stakeholders.

In fracking, water, sand and chemicals (in a mix known as hydraulic fracturing fluid) are injected under high pressure deep underground to crack sedimentary rocks, such as shale, and free trapped natural gas or oil. Though the process has been used for more than half a century to improve well production, recent technical advances have helped unlock vast stores of previously inaccessible natural gas and oil, resulting in a boom in some parts of the United States.

Chief among the technical advances are directional drilling and high-volume hydraulic fracturing, which are often used together. In directional drilling, the well operator bores vertically down to the rock formation, then follows the formation horizontally.

High-volume fracking, the focus of recent attention and public concern, is defined by the state of Michigan as hydraulic fracturing that uses more than 100,000 gallons of fracturing fluid. For reference, an Olympic-size swimming pool holds about 660,000 gallons of water.

Since the late 1940s, an estimated 12,000 gas and oil wells have been drilled in Michigan using hydraulic fracturing, without any reported contamination issues. Most of those wells have been relatively shallow vertical wells that each used about 50,000 gallons of water.

But recently, a small number of deep, directionally drilled, high-volume hydraulically fractured wells have been completed in the northern part of the Lower Peninsula. Those wells sometimes use several million gallons of water, and one Michigan well required more than 20 million gallons.
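The volumes quoted above are easy to put on the article's Olympic-pool scale. A trivial sketch using the figures cited here:

```python
OLYMPIC_POOL_GAL = 660_000   # approximate pool volume cited above
HV_THRESHOLD_GAL = 100_000   # Michigan's high-volume threshold

def pools(gallons):
    """Express a water volume in Olympic-pool equivalents."""
    return gallons / OLYMPIC_POOL_GAL

print(round(pools(HV_THRESHOLD_GAL), 2))   # → 0.15
print(round(pools(20_000_000), 1))         # → 30.3
```

So the high-volume threshold is roughly a sixth of a pool, while the largest Michigan well cited used about 30 pools' worth of water.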

Since 2010, when the Petoskey Pioneer Well spurred interest in high-volume hydraulically fractured wells in Michigan, 19 such wells are known to have been completed in the state, according to Sara Gosman, a lecturer at the U-M Law School and author of the technical report on policy/law.

In the public perceptions report, authors Kim Wolske and Andrew Hoffman of the U-M Erb Institute for Global Sustainable Enterprise note that chemical additives in high-volume hydraulic fracturing fluids “remain a primary point of contention for many stakeholders in Michigan.” Many nonprofits and concerned citizens stress the point that operators of high-volume wells are not required to report the composition of those fluids to the state until 60 days after the hydraulic fracturing event.

The often-repeated concern is that if a spill were to occur, responders would not be as well-prepared as they would have been if the fluid composition had been known beforehand, Wolske and Hoffman note.

Though groundwater contamination is often cited as a top concern, surface contamination from spills and improper disposal of waste fluids likely carries the greatest risk for harmful water-quality impacts, due to proximity to potable water resources, according to the geology/hydrogeology report written by Brian Ellis, assistant professor in the Department of Civil and Environmental Engineering.

When a well is fracked, the fluid is injected into rock formations to create cracks and to prop them open. Of the total volume of hydraulic fracturing fluids injected into a well, amounts varying from 10 percent to 70 percent may return to the surface as “flowback water” after the pressure is reduced and gas or oil begin to flow toward the wellhead.

In Michigan high-volume hydraulically fractured wells, the average amount of flowback water returning to the surface is about 37 percent of injected volumes, according to the Ellis report.
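As a worked example of the flowback figures above (the 37 percent Michigan average and the 10 to 70 percent range are from the Ellis report; the injected volume below is illustrative):

```python
def flowback_volume(injected_gal, fraction=0.37):
    """Estimated flowback water returning to the surface. The 0.37
    default is the Michigan average cited in the Ellis report;
    individual wells range from roughly 10 to 70 percent."""
    return injected_gal * fraction

# For a hypothetical 5-million-gallon high-volume well:
print(round(flowback_volume(5_000_000)))  # → 1850000
```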

The flowback water is highly saline and can contain elevated levels of heavy metals and naturally occurring radioactive elements, in addition to methane and the original chemical additives in the fracturing fluids. In Michigan, common hydraulic fracturing fluid additives include ethylene glycol, hydrochloric acid, isopropyl alcohol, methanol and ammonium persulfate, according to the Ellis report.

“However, since in Michigan all flowback is disposed of by deep-well injection and it is not allowed to sit in open pits, the risk of this type of contamination will be lower than in other states without such disposal opportunities and regulations,” Ellis wrote.

On the topic of potential water contamination, the environment/ecology report notes that Michigan’s dense, interconnected aquatic ecosystems (streams, rivers, lakes, inland and coastal wetlands) and the groundwater aquifers to which they are linked are of particular concern. The connectivity between surface and groundwater bodies “can lead to impacts distant from, as well as close to, drilling sites,” according to the report by G. Allen Burton, professor in the School of Natural Resources and Environment and director of the U-M Water Center, and Knute Nadelhoffer, professor of ecology and evolutionary biology and director of the U-M Biological Station.

The potential migration of methane, the main component of natural gas, into groundwater reservoirs has also received a lot of attention lately.

But the probability of significant methane leakage associated with deep-shale drilling involving hydraulic fracturing in Michigan “is quite low provided that best practices are adhered to,” according to the U-M report on hydraulic fracturing technologies written by John Wilson, a consultant to the U-M Energy Institute, and Johannes Schwank, professor of chemical engineering.

The greatest challenge to understanding the potential public health risks of hydraulic fracturing in Michigan is the lack of state-specific data, according to Niladri Basu, author of the public health technical report and a former faculty member at the U-M School of Public Health. While thousands of hydraulically fractured wells have been drilled in Michigan, the potential public health risks related to these facilities have been poorly documented, Basu wrote.

For example, while operators of high-volume fracking wells are required to disclose the contents of their hydraulic fracturing fluids, operators of the 12,000 or so low-volume wells in the state are not. “There needs to be much greater understanding of what chemicals are being used in every well, with information related to volumes, amounts, disposal plans, etc., made available,” Basu wrote.

The U-M hydraulic fracturing study is expected to cost at least $600,000 and is being funded by U-M through its Graham Sustainability Institute, Energy Institute and Risk Science Center. State regulators, oil and gas industry representatives, staffers from environmental nonprofits, and peer reviewers provided input to the technical reports, and more than 100 public comments were considered.

In addition to the study authors mentioned above, the technical report authors include Roland Zullo, assistant research scientist, U-M Institute for Research on Labor, Employment and the Economy (economics report).

Field geologists (finally) going digital

Not very long ago a professional geologist’s field kit consisted of a Brunton compass, rock hammer, magnifying glass, and field notebook. No longer. In the field and in the labs and classrooms, studying Earth has undergone an explosive change in recent years, fueled by technological leaps in handheld digital devices, especially tablet computers and cameras.

Geologist Terry Pavlis’ digital epiphany came almost 20 years ago when he was in a museum looking at a 19th-century geology exhibit that included a Brunton compass. “Holy moly!” he remembers thinking. “We’re still using this tool.” Yet technological changes over the last 10 years have not only made the Brunton compass obsolete but have swept away paper field notebooks as well (the rock hammer and hand-lens magnifier remain unchallenged, however).

The key technologies replacing the 19th-century field tools are the smartphone, the PDA, the handheld GPS unit, and tablet computers such as the tablet PC and iPad. Modern tablets, in particular, can do everything a Brunton compass can, plus take pictures, act as both a notebook and a mapping device, and gather precise location data using GPS. They can even be equipped with open-source GIS software.

Pavlis, a geology professor at The University of Texas at El Paso, and Stephen Whitmeyer of James Madison University will be presenting the 21st-century way to do field geology on Monday, 5 Nov., at the meeting of the Geological Society of America (GSA) in Charlotte, N.C. The presentations are a part of a digital poster Pardee Keynote Symposium titled, “Digital Geology Speed-Dating: An Innovative Coupling of Interactive Presentations and Hands-On Workshop.”

“I had a dream we would not be touching paper anymore,” says Pavlis. “I’m now sort of an evangelist on this subject.”

That’s not to say that the conversion to digital field geology is anywhere near complete. The new technology is not quite catching on in some university field courses because the technology is more expensive and becomes obsolete quickly, says Pavlis.

“Field geology courses are expensive enough for students,” he notes. As a result, the matter of teaching field geology with digital tools is actually rather controversial among professors.

Meanwhile, on the classroom side of earth science education, new digital tools are bringing the field into the classroom. One of them is the GigaPan, a gigantic panorama image.

“A GigaPan is basically a really big picture that’s made of lots of full-resolution zoomed-in photos,” explains geologist Callan Bentley of Northern Virginia Community College. To make a GigaPan, you need a GigaPan robot, which surveys the scene, breaks it into a grid, and then shoots each cell of the grid. That can result in hundreds or even thousands of images, which the GigaPan software then stitches together. The resulting stitched image is uploaded to the GigaPan.org website, where anyone can view it.
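The grid logic Bentley describes comes down to simple field-of-view arithmetic, sketched below. This is illustrative geometry only; GigaPan's actual capture planning may differ:

```python
import math

def grid_shots(pan_deg, tilt_deg, hfov_deg, vfov_deg, overlap=0.3):
    """Number of photos a panorama robot needs to cover a scene, given
    the camera's horizontal and vertical field of view per frame and
    the fractional overlap between neighboring frames."""
    cols = math.ceil(pan_deg / (hfov_deg * (1.0 - overlap)))
    rows = math.ceil(tilt_deg / (vfov_deg * (1.0 - overlap)))
    return cols * rows

# A 180-degree by 60-degree outcrop shot with a telephoto lens
# covering about 6 x 4 degrees per frame:
print(grid_shots(180, 60, 6, 4))  # → 946
```

This is why a single zoomed-in panorama can run to hundreds or thousands of frames: narrowing the per-frame field of view raises the count quadratically.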

“In geology, we look at things in multiple scales,” says Bentley. “A well-composed GigaPan is very useful.” Bentley will be presenting GigaPans at the same GSA meeting session as Pavlis, along with others using the latest technology to study and teach geology.

GigaPans were developed by Google, NASA, and the robotics lab at Carnegie Mellon University. Bentley got involved when the “Fine Outreach for Science” program recruited him. Since then, he has been documenting the geology of the Mid-Atlantic region.

“I have used some of it in the classroom,” said Bentley. “I have students look at a scene, make a hypothesis, then look closer to test the hypothesis.”