Ice-free Arctic winters could explain amplified warming during Pliocene

Year-round ice-free conditions across the surface of the Arctic Ocean could explain why the Earth was substantially warmer during the Pliocene Epoch than it is today, despite similar concentrations of carbon dioxide in the atmosphere, according to new research carried out at the University of Colorado Boulder.

In early May, instruments at the Mauna Loa Observatory in Hawaii marked a new record: The concentration of carbon dioxide climbed to 400 parts per million for the first time in modern history.

The last time the carbon dioxide concentration in the atmosphere reached 400 ppm, researchers believe, was between 3 and 5 million years ago, during the Pliocene, when the Earth was about 3.5 to 9 degrees Fahrenheit (2 to 5 degrees Celsius) warmer than it is today. During that period, trees overtook the tundra, sprouting right to the edges of the Arctic Ocean, and the seas swelled, pushing sea levels 65 to 80 feet higher.

Scientists’ understanding of the climate during the Pliocene has largely been pieced together from fossil records preserved in sediments deposited beneath lakes and on the ocean floor.

“When we put 400 ppm carbon dioxide into a model, we don’t get as warm a planet as we see when we look at paleorecords from the Pliocene,” said Jim White, director of CU-Boulder’s Institute of Arctic and Alpine Research and co-author of the new study published online in the journal Palaeogeography, Palaeoclimatology, Palaeoecology. “That tells us that there may be something missing in the climate models.”

Scientists have proposed several hypotheses in the past to explain the warmer Pliocene climate. One idea, for example, was that the formation of the Isthmus of Panama, the narrow strip of land linking North and South America, could have altered ocean circulations during the Pliocene, forcing warmer waters toward the Arctic. But many of those hypotheses, including the Panama possibility, have not proved viable.

For the new study, led by Ashley Ballantyne, a former CU-Boulder doctoral student who is now an assistant professor of bioclimatology at the University of Montana, the research team decided to see what would happen if they forced the model to assume that the Arctic was free of ice in the winter as well as the summer during the Pliocene. Without that additional constraint, climate models set to emulate atmospheric conditions during the Pliocene show ice-free summers followed by a layer of ice reforming during the sunless winters.

“We tried a simple experiment in which we said, ‘We don’t know why sea ice might be gone all year round, but let’s just make it go away,’ ” said White, who also is a professor of geological sciences. “And what we found was that we got the right kind of temperature change and we got a dampened seasonal cycle, both of which are things we think we see in the Pliocene.”

In the model simulation, year-round ice-free conditions warmed the Arctic because the open water surface allowed evaporation. Evaporation requires energy, and the resulting water vapor carried that energy into the atmosphere as latent heat. The water vapor also formed clouds, which trapped heat near the planet’s surface.
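
As a rough, back-of-the-envelope illustration of the latent-heat part of that mechanism (not a calculation from the study; the evaporation rate below is an assumed value), consider the energy flux an open ocean surface can export:

```python
# Back-of-the-envelope sketch of the latent-heat mechanism described above.
# The evaporation rate is an assumed illustrative value, not from the study.
L_V = 2.5e6        # latent heat of vaporization of water, J/kg (approximate)
evap_rate = 1e-5   # assumed evaporation from open water, kg/m^2/s (~0.9 mm/day)

latent_heat_flux = L_V * evap_rate  # W/m^2 carried into the atmosphere
print(f"Latent heat flux: {latent_heat_flux:.0f} W/m^2")
# ~25 W/m^2 of energy that an ice-covered surface would not release, and that
# the resulting water vapor and clouds can re-radiate back toward the surface.
```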

“Basically, when you take away the sea ice, the Arctic Ocean responds by creating a blanket of water vapor and clouds that keeps the Arctic warmer,” White said.

White and his colleagues are now trying to understand what types of conditions could bridge the standard model simulations with the simulations in which ice-free Arctic conditions are imposed. If they’re successful, computer models would be able to capture the transition from a time when ice reformed in the winter to a time when the ocean remained ice-free throughout the year.

Such a model also would offer insight into what could happen in our future. Currently, about 70 percent of Arctic sea ice melts away each summer before reforming in the winter.

“We’re trying to understand what happened in the past but with a very keen eye to the future and the present,” White said. “The piece that we’re looking at in the future is what is going to happen as the Arctic Ocean warms up and becomes more ice-free in the summertime.

“Will we continue to return to an ice-covered Arctic in the wintertime? Or will we start to see some of the feedbacks that now aren’t very well represented in our climate models? If we do, that’s a big game changer.”

Sequestration and fuel reserves

A technique for trapping the greenhouse gas carbon dioxide deep underground could at the same time be used to release the last fraction of natural gas liquids from ailing reservoirs, thus offsetting some of the environmental impact of burning fossil fuels. So says a paper to be published in the peer-reviewed International Journal of Oil, Gas and Coal Technology.

While so-called “fracking” as a method for extracting previously untapped fossil fuel reserves has been in the headlines recently, there are alternative ways to obtain the remaining hydrocarbons from gas/condensate reservoirs, according to Kashy Aminian of West Virginia University in Morgantown, USA, and colleagues there and at Kuwait University in Safat.

Earlier experiments suggest that using carbon dioxide instead of nitrogen or methane to blast out the hydrocarbon stock from depleted reservoirs might be highly effective and have the added benefit of trapping, or sequestering, the carbon dioxide underground. Aminian and colleagues have calculated the economic benefits associated with the enhanced liquid recovery and demonstrated that the approach is technically and financially viable.

The team explains that mixing carbon dioxide with the condensate reservoir fluid reduces the saturation pressure, the liquid drop-out, and the compressibility factor, boosting recovery of useful hydrocarbons and allowing the carbon dioxide to be trapped within. The team found that the process works well regardless of the characteristics of the reservoir or the rate at which the carbon dioxide is injected; the amount recovered remains just as high. Moreover, because of the compressibility of the carbon dioxide, it is possible to squeeze out 1.5 to 2 times the volume of reservoir gas for the amount of carbon dioxide pumped in, and there is then the possibility of pumping in an additional 15 percent once as much reservoir liquid as can be retrieved has been extracted.
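
Those quoted figures lend themselves to simple volume arithmetic. A minimal sketch of that arithmetic (illustrative only; the function and its inputs are ours, not the paper's):

```python
def recovery_estimate(co2_injected, ratio_low=1.5, ratio_high=2.0,
                      extra_fraction=0.15):
    """Volume arithmetic from the figures quoted above: each volume of CO2
    pumped in displaces 1.5 to 2 volumes of reservoir gas, and a further 15%
    of CO2 can be injected once recoverable liquids are extracted.
    Purely illustrative; all quantities share the same (reservoir) units."""
    gas_displaced = (co2_injected * ratio_low, co2_injected * ratio_high)
    extra_co2_storable = co2_injected * extra_fraction
    return gas_displaced, extra_co2_storable

(low, high), extra = recovery_estimate(1.0)
print(f"1 volume of CO2 -> {low}-{high} volumes of gas recovered, "
      f"plus capacity for {extra} more volumes of CO2 afterward.")
```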

A journey through Cuba’s culture and geology

Few destinations capture the imagination like Cuba, a forbidden fruit to U.S. citizens since the 1960s. Recently, 14 earth scientists from the U.S.-based Association for Women Geoscientists travelled there to explore its geology and culture.

The expedition is chronicled in the August issue of EARTH Magazine. While Cuba is an intriguing destination as an actor on the global political stage, its geological history captures events that tell scientists even more about the history of the planet.

While there, the scientists studied rocks that record the extraterrestrial impact blamed for the demise of the dinosaurs, including shocked quartz and tsunami deposits. The scientists also learned how local limestone was used to build forts intended to protect Cuba’s harbors from pirate attacks. Their guide even took them to sites that record the breakup of the supercontinent Pangaea. The rocks observed in Cuba have been shown to be closely related to those of the Mediterranean.

Any earth scientist would agree that the geologic history contained on this island is astounding. Just as importantly, these scientists visited Cuba to experience UNESCO World Heritage sites and to share in “people-to-people” experiences between two cultures that remain divided. Read more about the geological diversity of Cuba, including miles of underground cave networks and risks posed by a San Andreas-like fault, at: http://bit.ly/152DT0u.

Don’t miss other exciting stories in this month’s issue of EARTH, available at the Digital Newsstand: http://www.earthmagazine.org/digital. Read about the improvements scientists are making in hurricane forecasts, water challenges faced by a tropical paradise, and the discovery of sauropod embryos in southern China.

Online tools accelerating earthquake-engineering progress

Santiago Pujol, at far left, a Purdue associate professor of civil engineering, surveys a private residence damaged in a Haiti earthquake. The building was among 170 surveyed by civil engineers studying the effects of the January 2010 earthquake. Such photos and research-related information regarding earthquakes are part of a database maintained and serviced by the National Science Foundation’s George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), based at Purdue. (Purdue University photo/Kari T. Nasi)
A publication-quality image is available at https://news.uns.purdue.edu/images/2013/hacker-cyberinfrastructure.jpg

A new study has found that online tools, access to experimental data and other services provided through “cyberinfrastructure” are helping to accelerate progress in earthquake engineering and science.

The research is affiliated with the National Science Foundation’s George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), based at Purdue University. NEES includes 14 laboratories for earthquake engineering and tsunami research, tied together with cyberinfrastructure to provide information technology for the network.

The cyberinfrastructure includes a centrally maintained, Web-based science gateway called NEEShub, which houses experimental results and makes them available for reuse by researchers, practitioners and educational communities.

“It’s a one-stop shopping site for the earthquake-engineering community to access really valuable intellectual contributions as well as experimental data generated from projects at the NEES sites,” said Thomas Hacker, an associate professor in the Department of Computer and Information Technology at Purdue and co-leader of information technology for NEES. “The NEES cyberinfrastructure provides critical information technology services in support of earthquake engineering research and helps to accelerate science and engineering progress in a substantial way.”

Findings from a recent study about cyberinfrastructure’s impact on the field were detailed in a paper published in a special issue of the Journal of Structural Engineering, coinciding with the NEES Quake Summit 2013 on Aug. 7-8 in Reno. The paper was authored by Hacker; Rudolf Eigenmann, a professor in Purdue’s School of Electrical and Computer Engineering; and Ellen Rathje, a professor in the Department of Civil, Architectural, and Environmental Engineering at the University of Texas, Austin.

A major element of the NEES cyberinfrastructure is a “project warehouse” that provides a place for researchers to upload project data, documents, papers and dissertations containing important experimental knowledge for the NEES community to access.

“A key factor in our efforts is the very strong involvement of experts in earthquake engineering and civil engineering in every aspect of our IT,” Hacker said. “The software we develop and services we provide are driven by user requirements prioritized by the community. This is an example of a large-scale cyberinfrastructure project that is really working to address big-data needs and developing technologies and solutions that work today. It’s a good example of how cyberinfrastructure can help knit together distributed communities of researchers into something greater than the sum of its parts.”

The effort requires two key aspects: technological elements and sociological elements.

“The technological elements include high-speed networks, laptops, servers and software,” he said. “The sociology includes the software-development process, the way we gather and prioritize user requirements and needs and our work with user communities. To be successful, a cyberinfrastructure effort needs to address both the technology and social elements, which has been our approach.”

The project warehouse and NEEShub collect “metadata,” or descriptive information about research needed to ensure that the information can be accessed in the future.

“Say you have an experiment with sensors over a structure to collect data like voltages over time or force displacements over time,” Eigenmann said. “What’s important for context is not only the data collected, but from which sensor, when the experiment was conducted, where the sensor was placed on the structure. When someone comes along later to reuse the information they need the metadata.”
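
A hypothetical metadata record along the lines Eigenmann describes might look like this (the field names are invented for illustration; this is not NEEShub's actual schema):

```python
# Hypothetical sensor-metadata record; field names invented for illustration.
sensor_record = {
    "experiment_id": "shake-table-042",        # made-up identifier
    "conducted_on": "2013-08-07",              # when the experiment ran
    "sensor": {
        "id": "accel-17",
        "type": "accelerometer",
        "location": "NE column, 2nd floor",    # where it sat on the structure
        "units": "g",
    },
    "readings": [                              # (time in s, value) pairs
        (0.00, 0.001),
        (0.01, 0.014),
        (0.02, 0.032),
    ],
}
```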

The resources are curated, meaning the data are organized in a fashion that ensures they haven’t been modified and are valid for reference in the future.

“We take extra steps to ensure the long-term integrity of the data,” Hacker said.

NEEShub contains more than 1.6 million project files stored in more than 398,000 project directories and has served at least 65,000 users over the past year. Other usage metrics are available at http://nees.org/usage.

“We are seeing continued growth in the number of users,” Rathje said. “We are helping to facilitate and enable the discovery process. We have earthquake engineering experts and civil engineering experts closely involved with every aspect of our IT and cyberinfrastructure, and we are constantly getting feedback and prototyping.”

To help quantify the impact on research, projects are ranked by how many times they are downloaded. One project alone has had 3.3 million files downloaded.

“We have a curation dashboard for each project, which gives the curation status of the information so that users know whether it’s ready to be cited and used,” Hacker said.

The site also has a DOI, or digital object identifier, for each project.

“It’s like a permanent identifier that goes with the data set,” he said. “It gives you a permanent link to the data.”

NEES researchers will continue to study the impact of cyberinfrastructure on engineering and scientific progress.

“The use and adoption of cyberinfrastructure by a community is a process,” Hacker said. “At the beginning of the process we can measure the number of visitors and people accessing information. The ultimate impact of the cyberinfrastructure will be reflected in outcomes such as the number of publications that have benefited from using the cyberinfrastructure. It takes several years to follow that process, and we are in the middle of that right now, but evidence points to a significant impact.”

Potential well water contaminants highest near natural gas drilling

Brian Fontenot, who earned his Ph.D. in quantitative biology from UT Arlington, worked with Kevin Schug, UT Arlington associate professor of chemistry and biochemistry, and a team of researchers to analyze samples from 100 private water wells. (UT Arlington)

A new study of 100 private water wells in and near the Barnett Shale showed elevated levels of potential contaminants such as arsenic and selenium closest to natural gas extraction sites, according to a team of researchers that was led by UT Arlington associate professor of chemistry and biochemistry Kevin Schug.

The results of the North Texas well study were published online by the journal Environmental Science & Technology Thursday. The peer-reviewed paper focuses on the presence of metals such as arsenic, barium, selenium and strontium in water samples. Many of these heavy metals occur naturally at low levels in groundwater, but disturbances from natural gas extraction activities could cause them to occur at elevated levels.

“This study alone can’t conclusively identify the exact causes of elevated levels of contaminants in areas near natural gas drilling, but it does provide a powerful argument for continued research,” said Brian Fontenot, a UT Arlington graduate with a doctorate in quantitative biology and lead author on the new paper.

He added: “We expect this to be the first of multiple projects that will ultimately help the scientific community, the natural gas industry, and most importantly, the public, understand the effects of natural gas drilling on water quality.”

Researchers believe the increased presence of metals could be due to a variety of factors, including industrial accidents such as faulty gas well casings; mechanical vibrations from natural gas drilling activity disturbing particles in neglected water well equipment; and the lowering of water tables through drought or the removal of water used for the hydraulic fracturing process. Any of these scenarios could release dangerous compounds into shallow groundwater.

Researchers gathered samples from private water wells of varying depth within a 13-county area in or near the Barnett Shale in North Texas over four months in the summer and fall of 2011. Ninety-one samples were drawn from what they termed “active extraction areas,” or areas that had one or more gas wells within a five-kilometer radius. Another nine samples were taken from sites either inside the Barnett Shale and more than 14 kilometers from a natural gas drilling site, or from sites outside the Barnett Shale altogether. The locations of those sites were referred to as “non-active/reference areas” in the study.
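
Stated compactly, the sampling design amounts to a simple distance-based classification. A sketch of that logic (the helper is invented for illustration, not the authors' code):

```python
def classify_site(km_to_nearest_gas_well, inside_barnett_shale):
    """Label a water well per the sampling rules described above: 'active'
    if any gas well lies within 5 km; 'reference' if the site is outside
    the Barnett Shale, or inside it but more than 14 km from drilling.
    Invented helper for illustration, not the authors' code."""
    if km_to_nearest_gas_well <= 5.0:
        return "active extraction area"
    if not inside_barnett_shale or km_to_nearest_gas_well > 14.0:
        return "non-active/reference area"
    return "unclassified (5-14 km from drilling, inside the shale)"

print(classify_site(2.3, inside_barnett_shale=True))   # active extraction area
print(classify_site(20.0, inside_barnett_shale=True))  # non-active/reference area
```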

Researchers accepted no outside funding to ensure the integrity of the study. They compared the samples to historical data on water wells in these counties from the Texas Water Development Board groundwater database for 1989-1999, prior to the proliferation of natural gas drilling.

In addition to standard water quality tests, the researchers used gas chromatography-mass spectrometry (GC-MS), headspace gas chromatography (HS-GC) and inductively coupled plasma-mass spectrometry (ICP-MS). Many of the tests were conducted in the Shimadzu Center for Advanced Analytical Chemistry on the UT Arlington campus.

“Natural gas drilling is one of the most talked about issues in North Texas and throughout the country. This study was an opportunity for us to use our knowledge of chemistry and statistical analysis to put people’s concerns to the test and find out whether they would be backed by scientific data,” said Schug, who is also the Shimadzu Distinguished Professor of Analytical Chemistry in the UT Arlington College of Science.

On average, researchers detected the highest levels of these contaminants within 3 kilometers of natural gas wells, including several samples that had arsenic and selenium above levels considered safe by the Environmental Protection Agency. For example, 29 wells within the study’s active natural gas drilling area exceeded the EPA’s Maximum Contaminant Level of 10 micrograms per liter for arsenic, a potentially dangerous situation.

The areas lying outside of active drilling areas or outside the Barnett Shale did not show the same elevated levels for most of the metals.

Other leaders of the Texas Gas Wells team were Laura Hunt, who conducted her post-doctoral research in biology at UT Arlington, and Zacariah Hildenbrand, who earned his doctorate in biochemistry from the University of Texas at El Paso and performed post-doctoral research at UT Southwestern Medical Center. Hildenbrand is also the founder of Inform Environmental, LLC. Fontenot and Hunt work for the EPA regional office in Dallas, but the study is unaffiliated with the EPA and both received permission to work on this project outside the agency.

Scientists note in the paper that they did not find uniformity among the contamination in the active natural gas drilling areas. In other words, not all gas well sites were associated with higher levels of the metals in well water.

Some of the most notable results concerned the following heavy metals; a brief screening sketch against the cited limits follows the list:

  • Arsenic occurs naturally in the region’s water and was detected in 99 of the 100 samples. But the concentrations of arsenic were significantly higher in the active extraction areas compared to non-extraction areas and historical data. The maximum concentration from an extraction area sample was 161 micrograms per liter, or 16 times the EPA safety standard set for drinking water. According to the EPA, people who drink water containing arsenic well in excess of the safety standard for many years “could experience skin damage or problems with their circulatory system, and may have an increased risk of getting cancer.”
  • Selenium was found in 10 samples near extraction sites, and all of those samples showed selenium levels were higher than the historical average. Two samples exceeded the standard for selenium set by the EPA. Circulation problems as well as hair or fingernail loss are some possible consequences of long-term exposure to high levels of selenium, according to the EPA.
  • Strontium was also found in almost all the samples, with concentrations significantly higher than historical levels in the areas of active gas extraction. A toxicological profile by the federal government’s Agency for Toxic Substances and Disease Registry recommends no more than 4,000 micrograms of strontium per liter in drinking water. Seventeen samples from the active extraction area and one from the non-active areas exceeded that recommended limit. Exposure to high levels of stable strontium can result in impaired bone growth in children, according to the toxic substances agency.
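
As a quick arithmetic check on the figures above, a minimal screening sketch (sample values invented; the selenium limit of 50 micrograms per liter is the EPA MCL, which the release does not state):

```python
# Screen measured concentrations (micrograms per liter) against the limits
# cited above: EPA MCLs for arsenic (10) and selenium (50), and the ATSDR
# recommendation for strontium (4,000). Sample values below are invented.
LIMITS_UG_PER_L = {"arsenic": 10.0, "selenium": 50.0, "strontium": 4000.0}

def exceedances(sample):
    """Return the analytes in `sample` that exceed their cited limit."""
    return {metal: conc for metal, conc in sample.items()
            if conc > LIMITS_UG_PER_L[metal]}

well = {"arsenic": 161.0, "selenium": 12.0, "strontium": 5200.0}  # invented
for metal, conc in exceedances(well).items():
    factor = conc / LIMITS_UG_PER_L[metal]
    print(f"{metal}: {conc} ug/L is {factor:.1f}x the cited limit")
# arsenic at 161 ug/L, the study's reported maximum, is 16.1x the 10 ug/L MCL
```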

“After we put the word out about the study, we received numerous calls from landowner volunteers and their opinions about the natural gas drilling in their communities varied,” Hildenbrand said. “By participating in the study, they were able to get valuable data about their water, whether it be for household or land use.

“Their participation has been incredibly important to this study and has helped us bring to light some of the important environmental questions surrounding this highly contentious issue.”

The paper also recommends further research on levels of methanol and ethanol in water wells. Twenty-nine private water wells in the study contained methanol, with the highest concentrations in the active extraction areas. Twelve samples, four of which were from the non-active extraction sites, contained measurable ethanol. Both ethanol and methanol can occur naturally or as a result of industrial contamination.

Historical data on methanol and ethanol was not available, researchers said in the paper.

The paper is called “An evaluation of water quality in private drinking water wells near natural gas extraction sites in the Barnett Shale formation.” It is available on the Just Accepted page of the journal’s website. A YouTube interview with some of the study’s authors is available here: http://www.youtube.com/watch?v=H1_WDDtWR_k&feature=youtu.be.

Other co-authors include: Qinhong “Max” Hu, associate professor of earth and environmental sciences at UT Arlington; Doug D. Carlton Jr., a Ph.D. student in the chemistry and biochemistry department at UT Arlington; Hyppolite Oka, a recent graduate of the environmental and earth sciences master’s program at UT Arlington; Jayme L. Walton, a recent graduate of the biology master’s program at UT Arlington; and Dan Hopkins, of Carrollton-based Geotech Environmental Equipment, Inc.

Alexandria Osorio and Bryan Bjorndal of Assure Controls, Inc. in Vista, Calif., also are co-authors. The team used Assure’s Qwiklite system to test for toxicity in well samples, and those results are being prepared for a separate publication.

Many from the research team are now conducting well water sampling in the Permian Basin region of Texas, establishing a baseline set of data prior to gas well drilling activities there. That baseline will be used for a direct comparison to samples that will be collected during and after upcoming natural gas extraction. The team hopes that these efforts will shed further light on the relationship between natural gas extraction and ground water quality.
