Researchers resolve the Karakoram glacier anomaly, a cold case of climate science

The researchers found that low-resolution models and a lack of reliable observational data obscured the Karakoram's dramatic shifts in elevation over a small area and its heavy winter snowfall. They created a higher-resolution model that showed the elevation and snow water equivalent for (inlaid boxes, from left to right) the Karakoram range and northwest Himalayas, the central Himalayas that include Mount Everest, and the southeast Himalayas and the Tibetan Plateau. For elevation (left), the high-resolution model showed the sharp variations between roughly 2,500 and 5,000 meters above sea level (yellow to brown) for the Karakoram, while other areas of the map have comparatively more consistent elevations. The model also showed that the Karakoram receives much more annual snowfall (right) than the other Himalayan ranges, an average of 100 centimeters (brown). The researchers found that the main precipitation season in the Karakoram occurs during the winter and is influenced by cold winds coming from Central Asian countries, as opposed to the heavy summer monsoons that provide the majority of precipitation to the other Himalayan ranges. – Image by Sarah Kapnick, Program in Atmospheric and Oceanic Sciences

Researchers from Princeton University and other institutions may have hit upon an answer to a climate-change puzzle that has eluded scientists for years, one that could help gauge the future availability of water for hundreds of millions of people.

In a phenomenon known as the “Karakoram anomaly,” glaciers in the Karakoram mountains, a range within the Himalayas, have remained stable and even increased in mass while many glaciers nearby — and worldwide — have receded during the past 150 years, particularly in recent decades. Himalayan glaciers provide freshwater to a densely populated area that includes China, Pakistan and India, and are the source of the Ganges and Indus rivers, two of the world’s major waterways.

While there have been many attempts to explain the stability of the Karakoram glaciers, the researchers report in the journal Nature Geoscience that the ice is sustained by a unique and localized seasonal pattern that keeps the mountain range relatively cold and dry during the summer. Other Himalayan ranges and the Tibetan Plateau — where glaciers have increasingly receded as Earth’s climate has warmed — receive most of their precipitation from heavy summer monsoons out of hot South and Southeast Asian nations such as India. The main precipitation season in the Karakoram, however, occurs during the winter and is influenced by cold winds coming from Central Asian countries such as Afghanistan to the west, while the main Himalayan range blocks the warmer air from the southeast throughout the year.

The researchers determined that snowfall, which is critical to maintaining glacier mass, will remain stable and even increase in magnitude at elevations above 4,500 meters (14,764 feet) in the Karakoram through at least 2100. On the other hand, snowfall over much of the Himalayas and Tibet is projected to decline even as the Indian and Southeast Asian monsoons increase in intensity under climate change.

First author Sarah Kapnick, a postdoctoral research fellow in Princeton’s Program in Atmospheric and Oceanic Sciences, said that a shortage of reliable observational data and the use of low-resolution computer models had obscured the subtleties of the Karakoram seasonal cycle and prevented scientists from unraveling the causes of the anomaly.

For models, the complication is that the Karakoram features dramatic shifts in elevation over a small area, Kapnick said. The range boasts four mountains that are more than 8,000 meters (26,246 feet) high — including K2, the world’s second highest peak — and numerous summits that exceed 7,000 meters, all of which are packed into a length of about 500 kilometers (300 miles).

Kapnick and her co-authors overcame this obstacle with a high-resolution computer model that broke the Karakoram into 50-kilometer pieces, meaning that those sharp fluctuations in altitude were better represented.

In their study, the researchers compared their model with climate models from the United Nations’ Intergovernmental Panel on Climate Change (IPCC), which average a resolution of 210-kilometer grid squares, Kapnick said. At that scale, the Karakoram is flattened to an average height that is too low, producing simulated temperatures that are too warm to sustain sufficient snowfall throughout the year and too sensitive to future temperature increases.
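The resolution effect Kapnick describes can be illustrated with a toy calculation. This is not the GFDL model: the terrain profile below is invented, and the constant 6.5 degrees Celsius per kilometer is the standard average tropospheric lapse rate, used here only to show why averaging away a peak warms a grid cell.

```python
# Toy illustration (not the GFDL model) of why coarse grids warm high terrain.
# Assumptions: a made-up elevation profile and a constant 6.5 degC/km lapse rate.

LAPSE_RATE = 6.5  # degC of cooling per kilometer of elevation gain

def grid_cell_temp(elevations_m, sea_level_temp_c):
    """Temperature of one grid cell whose elevation is the mean of the
    terrain it covers, cooled from sea level at a constant lapse rate."""
    mean_elev_km = sum(elevations_m) / len(elevations_m) / 1000.0
    return sea_level_temp_c - LAPSE_RATE * mean_elev_km

# A hypothetical 210-km cell spanning a sharp ridge: deep valleys next to
# 7,000-8,000 m peaks, as in the Karakoram.
terrain = [1500, 2500, 7000, 8000, 3000, 1800]

coarse = grid_cell_temp(terrain, 25.0)    # one cell averages everything
fine_peak = grid_cell_temp([8000], 25.0)  # a finer cell can isolate the peak

print(f"coarse-cell temperature: {coarse:.1f} degC")     # near freezing
print(f"fine-cell peak temperature: {fine_peak:.1f} degC")  # far below freezing
```

The averaged cell hovers near the freezing point and is therefore acutely sensitive to any warming, while the resolved peak stays deeply below freezing, which is the qualitative behavior the article attributes to the 210-km versus 50-km grids.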

Thus, by the IPCC’s models, it would appear that the Karakoram’s glaciers are imperiled by climate change due to reduced snowfall, Kapnick said. This region has been a great source of controversy ever since the IPCC’s last major report, in 2007, when the panel misreported that Himalayan glaciers would likely succumb to climate change by 2035. More recent papers using current IPCC models have similarly reported snowfall losses in this region because the models do not accurately portray the topography of the Karakoram, Kapnick said.

“The higher resolution allowed us to explore what happens at these higher elevations in a way that hasn’t been able to be done,” Kapnick said. “Something that climate scientists always have to keep in mind is that models are useful for certain types of questions and not necessarily for other types of questions. While the IPCC models can be particularly useful for other parts of the world, you need a higher resolution for this area.”

Jeff Dozier, a professor of snow hydrology, earth system science and remote sensing at the University of California-Santa Barbara, said that the research addresses existing shortcomings in how mountain climates are modeled and predicted, particularly in especially steep and compact ranges. Dozier, who was not involved in the research, conducts some of his research in the Hindu Kush mountains west of the Karakoram.

Crucial information regarding water availability is often lost in computer models, observational data and other tools that typically do not represent ranges such as the Karakoram accurately enough, Dozier said. For instance, a severe 2011 drought in Northern Afghanistan was a surprise partly due to erroneous runoff forecasts based on insufficient models and surface data, he said. The high-resolution model Kapnick and her co-authors developed for the Karakoram potentially resolves many of the modeling issues related to mountain ranges with similar terrain, he said.

“The Karakoram Anomaly has been a puzzle, and this paper gives a credible explanation,” Dozier said. “Climate in the mountains is obviously affected strongly by the elevation, but most global climate models don’t resolve the topography well enough. So, the higher-resolution model is appropriate. About a billion people worldwide get their water resources from melting snow and many of these billion get their water from High Mountain Asia.”

The researchers used the high-resolution global-climate model GFDL-CM2.5 at the Geophysical Fluid Dynamics Laboratory (GFDL), which is on Princeton’s Forrestal Campus and administered by the National Oceanic and Atmospheric Administration (NOAA). The researchers simulated the global climate — with a focus on the Karakoram — based on observational data from 1861 to 2005, and on the IPCC’s greenhouse-gas projections for 2006-2100, which will be included in its Fifth Assessment Report scheduled for release in November.

The 50-kilometer resolution revealed conditions in the Karakoram on a monthly basis, Kapnick said. It was then that she and her colleagues could observe that the monsoon months in the Karakoram not only lack heavy rainfall, but also bring frigid westerly winds that keep conditions in the mountain range cold enough for nearly year-round snowfall.

“There is precipitation during the summer, it just doesn’t dominate the seasonal cycle. This region, even at the same elevation as the rest of the Himalayas, is just colder,” Kapnick said.

“The high-resolution model shows us that things don’t happen perfectly across seasons. You can have statistical variations in one month but not another,” she continued. “This allows us to piece out those significant changes from one month to the next.”

Kapnick, who received her bachelor’s degree in mathematics from Princeton in 2004, worked with Thomas Delworth, a NOAA scientist and Princeton lecturer of geosciences and atmospheric and oceanic sciences; Moestasim Ashfaq, a scientist at the Oak Ridge National Laboratory Climate Change Science Institute; Sergey Malyshev, a climate modeler in Princeton’s Department of Ecology and Evolutionary Biology based at GFDL; and P.C.D. “Chris” Milly, a research hydrologist for the U.S. Geological Survey based at GFDL who received his bachelor’s degree in civil engineering from Princeton in 1978.

While the researchers show that the Karakoram will receive consistent — and perhaps increased — snowfall through 2100, more modeling work is needed to understand how the existing glaciers may change over time as a result of melt, avalanches and other factors, Kapnick said.

“Our work is an important piece to understanding the Karakoram anomaly,” Kapnick said. “But that balance of what’s coming off the glacier versus what’s coming in also matters for understanding how the glacier will change in the future.”

The paper, “Snowfall less sensitive to warming in Karakoram than in Himalayas due to a unique seasonal cycle,” was published online ahead of print Oct. 12 by Nature Geoscience.

NASA study finds 1934 had worst drought of last thousand years

A new study using a reconstruction of North American drought history over the last 1,000 years found that the drought of 1934 was the driest and most widespread of the last millennium.

Using a tree-ring-based drought record from the years 1000 to 2005 and modern records, scientists from NASA and Lamont-Doherty Earth Observatory found the 1934 drought was 30 percent more severe than the runner-up drought (in 1580) and extended across 71.6 percent of western North America. For comparison, the average extent of the 2012 drought was 59.7 percent.

“It was the worst by a large margin, falling pretty far outside the normal range of variability that we see in the record,” said climate scientist Ben Cook at NASA’s Goddard Institute for Space Studies in New York. Cook is lead author of the study, which will publish in the Oct. 17 edition of Geophysical Research Letters.

Two sets of conditions led to the severity and extent of the 1934 drought. First, a high-pressure system in winter sat over the west coast of the United States and turned away wet weather – a pattern similar to that which occurred in the winter of 2013-14. Second, the spring of 1934 saw dust storms, caused by poor land management practices, suppress rainfall.

“In combination then, these two different phenomena managed to bring almost the entire nation into a drought at that time,” said co-author Richard Seager, professor at the Lamont-Doherty Earth Observatory of Columbia University in New York. “The fact that it was the worst of the millennium was probably in part because of the human role.”

According to the recent Fifth Assessment Report of the Intergovernmental Panel on Climate Change, or IPCC, climate change is likely to make droughts in North America worse, and the southwest in particular is expected to become significantly drier as are summers in the central plains. Looking back one thousand years in time is one way to get a handle on the natural variability of droughts so that scientists can tease out anthropogenic effects – such as the dust storms of 1934.

“We want to understand droughts of the past to understand to what extent climate change might make it more or less likely that those events occur in the future,” Cook said.

The abnormal high-pressure system is one lesson from the past that informs scientists’ understanding of the current severe drought in California and the western United States.

“What you saw during this last winter and during 1934, because of this high pressure in the atmosphere, is that all the wintertime storms that would normally come into places like California instead got steered much, much farther north,” Cook said. “It’s these wintertime storms that provide most of the moisture in California. So without getting that rainfall it led to a pretty severe drought.”

This type of high-pressure system is part of normal variation in the atmosphere, and whether or not it will appear in a given year is difficult to predict in computer models of the climate. Models are more attuned to droughts caused by La Niña’s colder sea surface temperatures in the Pacific Ocean, which likely triggered the multi-year Dust Bowl drought throughout the 1930s. In a normal La Niña year, the Pacific Northwest receives more rain than usual and the southwestern states typically dry out.

But a comparison of weather data to models looking at La Niña effects showed that the rain-blocking high-pressure system in the winter of 1933-34 overrode the effects of La Niña for the western states. This dried out areas from northern California to the Rockies that otherwise might have been wetter.

As winter ended, the high-pressure system shifted eastward, interfering with spring and summer rains that typically fall on the central plains. The dry conditions were exacerbated and spread even farther east by dust storms.

“We found that a lot of the drying that occurred in the spring time occurred downwind from where the dust storms originated,” Cook said, “suggesting that it’s actually the dust in the atmosphere that’s driving at least some of the drying in the spring and really allowing this drought event to spread upwards into the central plains.”

Dust clouds reflect sunlight and block solar energy from reaching the surface. That prevents evaporation that would otherwise help form rain clouds, meaning that the presence of the dust clouds themselves leads to less rain, Cook said.

“Previous work and this work offers some evidence that you need this dust feedback to explain the real anomalous nature of the Dust Bowl drought in 1934,” Cook said.

Dust storms like the ones in the 1930s aren’t a problem in North America today. The agricultural practices that gave rise to the Dust Bowl were replaced by those that minimize erosion. Still, agricultural producers need to pay attention to the changing climate and adapt accordingly, not forgetting the lessons of the past, said Seager. “The risk of severe mid-continental droughts is expected to go up over time, not down,” he said.

2015 DOE JGI’s science portfolio delves deeper into the Earth’s data mine

The U.S. Department of Energy Joint Genome Institute (DOE JGI), a DOE Office of Science user facility, has announced that 32 new projects have been selected for the 2015 Community Science Program (CSP). From Antarctic lakes to Caribbean waters, and from plant-root micro-ecosystems to the subsurface beneath the water table in forested watersheds, the CSP 2015 project portfolio highlights diverse environments where DOE mission-relevant science can be extracted.

“These projects catalyze JGI’s strategic shift in emphasis from solving an organism’s genome sequence to enabling an understanding of what this information enables organisms to do,” said Jim Bristow, DOE JGI Science Deputy who oversees the CSP. “To accomplish this, the projects selected combine DNA sequencing with large-scale experimental and computational capabilities, and in some cases include JGI’s new capability to write DNA in addition to reading it. These projects will expand research communities, and help to meet the DOE JGI imperative to translate sequence to function and ultimately into solutions for major energy and environmental problems.”

The CSP 2015 projects were selected by an external review panel from 76 full proposals, which resulted from 85 letters of intent. The total allocation for the CSP 2015 portfolio is expected to exceed 60 trillion bases (terabases, or Tb) of plant, fungal and microbial genome sequences, the equivalent of 20,000 human genomes. The full list of projects may be found at http://jgi.doe.gov/our-projects/csp-plans/fy-2015-csp-plans/. The DOE JGI Community Science Program also accepts proposals for smaller-scale microbial, resequencing and DNA synthesis projects and reviews them twice a year. The CSP advances projects that harness the DOE JGI’s capability in massive-scale DNA sequencing, analysis and synthesis in support of the DOE missions in alternative energy, global carbon cycling, and biogeochemistry.

Among the CSP 2015 projects selected is one from Regina Lamendella of Juniata College, who will investigate how microbial communities in the Marcellus shale, the country’s largest shale gas field, respond to hydraulic fracturing and natural gas extraction. Because fracking uses chemicals, researchers are interested in how the microbial communities can break down environmental contaminants, and how they respond to the methane released during gas extraction operations.

Some 1,500 miles south of those gas extraction sites, Monica Medina-Munoz of Penn State University will study the effect of thermal stress on the Caribbean coral Orbicella faveolata and the metabolic contribution of its algal symbiont Symbiodinium. The calcium carbonate in coral reefs acts as a carbon sink, but reef health depends on microbial communities. If the photosynthetic symbionts are removed from the coral host, for example, the corals can die and calcification rates decrease. Understanding how to maintain stability in the coral-microbiome community can provide information on the coral’s contribution to the global ocean carbon cycle.

Longtime DOE JGI collaborator Jill Banfield of the University of California (UC), Berkeley is profiling the diversity of microbial communities found in the subsurface of the Rifle aquifer adjacent to the Colorado River. The subsurface is a massive, yet poorly understood, repository of organic carbon as well as greenhouse gases. Because these microbial populations sit close to both the water table and the river, another research question is how they impact the carbon, nitrogen and sulfur cycles. Her project is part of the first coordinated attempt to quantify the metabolic potential of an entire subsurface ecosystem under the aegis of the Lawrence Berkeley National Laboratory’s Subsurface Biogeochemistry Scientific Focus Area.

Banfield also successfully competed for a second CSP project to characterize the tree-root microbial interactions that occur below the soil mantle in the unsaturated zone or vadose zone, which extends into unweathered bedrock. The project’s goal is to understand how microbial communities this deep underground influence tree-based carbon fixation in forested watersheds by the Eel River in northwestern California.

Several fungal projects were selected for the 2015 CSP portfolio, including one led by Kabir Peay of Stanford University. He and his colleagues will study how fungal communities in animal feces decompose organic matter. His project has a stated end goal of developing a model system that emulates the ecosystem at Point Reyes National Seashore, where Tule elk are the largest native herbivores.

Another selected fungal project comes from Timothy James of the University of Michigan, who will explore the so-called “dark matter fungi” – those not represented in culture collections. By sequencing several dozen species of unculturable zoosporic fungi from freshwater, soils and animal feces, he and his colleagues hope to develop a kingdom-wide fungal phylogenetic framework.

Christian Wurzbacher of Germany’s Leibniz Institute of Freshwater Ecology and Inland Fisheries (IGB) will characterize fungi from the deep sea to peatlands to freshwater streams to understand the potentially novel adaptations necessary to thrive in aquatic environments. The genomic information would shed light on their metabolic capabilities for breaking down cellulose, lignin and other plant cell wall components, as well as animal polymers such as keratin and chitin.

Many of the selected projects focus on DOE JGI Flagship Plant Genomes, with most centered on the poplar (Populus trichocarpa). For example, longtime DOE JGI collaborator Steve DiFazio of West Virginia University is interested in poplar but will study its reproductive development with the help of a close relative, the willow (Salix purpurea). With its shorter generation time, the plant is a good model system and comparator for understanding sex determination, which can help bioenergy crop breeders by, for example, either accelerating or preventing flowering.

Another project comes from Posy Busby of the University of Washington, who will study the interactions between the poplar tree and its fungal, non-pathogenic symbionts, or endophytes. As disease-causing pathogens interact with endophytes in leaves, she noted in her proposal, understanding the roles and functions of endophytes could prove useful to meeting future fuel and food requirements.

Along the lines of poplar endophytes, Carolin Frank at UC Merced will investigate the nitrogen-fixing endophytes in poplar, willow, and pine, with the aim of improving growth in grasses and agricultural crops under nutrient-poor conditions.

Rotem Sorek from the Weizmann Institute of Science in Israel takes a different approach starting from the hypothesis that poplar trees have an adaptive immunity system rooted in genome-encoded immune memory. Through deep sequencing of tissues from single poplar trees (some over a century old, others younger) his team hopes to gain insights into the tree genome’s short-term evolution and how its gene expression profiles change over time, as well as to predict how trees might respond under various climate change scenarios.

Tackling a different DOE JGI Flagship Plant Genome, Debbie Laudencia-Chingcuangco of the USDA-ARS will develop a genome-wide collection of several thousand mutants of the model grass Brachypodium distachyon to help domesticate the grasses that are being considered as candidate bioenergy feedstocks. This work is being done in collaboration with researchers at the Great Lakes Bioenergy Research Center, as the team there considers Brachypodium “critical to achieving its mission of developing productive energy crops that can be easily processed into fuels.”

Continuing the theme of candidate bioenergy grasses, Kankshita Swaminathan from the University of Illinois will study gene expression in the polyploid grasses Miscanthus and sugarcane, comparing them against the closely related diploid grass sorghum to understand how these plants recycle nutrients.

Baohong Zhang of East Carolina University also focuses on a bioenergy grass: his project will look at the microRNAs in switchgrass. These regulatory molecules are each just a couple dozen nucleotides in length and can downregulate (decrease the quantity of) a cellular component. With a library of these small transcripts, he and his team hope to identify the gene expression variation associated with desirable biofuel traits in switchgrass, such as increased biomass and responses to drought and salinity stressors.

Nitin Baliga of the Institute for Systems Biology will use DOE JGI genome sequences to build a working model of the networks that regulate lipid accumulation in Chlamydomonas reinhardtii, another DOE JGI Flagship Plant Genome and a model for characterizing biofuel production by algae.

Other accepted projects include:

The study of the genomes of 32 fungi of the Agaricales order, including 16 fungi to be sequenced for the first time, will be carried out by Jose Maria Barrasa of Spain’s University of Alcala. While many of the basidiomycete fungi involved in wood degradation that have been sequenced are from the Polyporales, he noted in his proposal, many of the fungi involved in breaking down leaf litter and buried wood are from the order Agaricales.

Now at the University of Connecticut, Jonathan Klassen conducted postdoctoral studies at GLBRC researcher Cameron Currie’s lab at University of Wisconsin-Madison. His project will study interactions in ant-microbial community fungus gardens in three states to learn more about how the associated bacterial metagenomes contribute to carbon and nitrogen cycling.

Hinsby Cadillo-Quiroz, at Arizona State University, will conduct a study of the microbial communities in the Amazon peatlands to understand their roles in both emitting greenhouse gases and in storing and cycling carbon. The peatlands are hotspots of soil organic carbon accumulation, and in the tropical regions, they are estimated to hold between 11 percent and 14 percent, or nearly 90 gigatons, of the global carbon stored in soils.

Barbara Campbell of Clemson University will study carbon cycling mechanisms of active bacteria and associated viruses in the freshwater-to-marine transition zone of the Delaware Bay. Understanding the microbes’ metabolism would help researchers understand their capabilities for dealing with contaminants, as well as their roles in the nitrogen, sulfur and carbon cycles.

Jim Fredrickson of Pacific Northwest National Laboratory will characterize functional profiles of microbial mats in California, Washington and Yellowstone National Park to understand various functions such as how they produce hydrogen and methane, and break down cellulose.

Joyce Loper of USDA-ARS will carry out a comparative analysis of Pseudomonas bacteria, using DOE JGI sequences of just over 100 type strains to infer the evolutionary history of the genus — a phylogeny — characterize its genomic diversity, and determine the distribution of genes linked to key observable traits in this non-uniform group of bacteria.

Holly Simon of Oregon Health & Science University is studying microbial populations in the Columbia River estuary, in part to learn how they enhance production of the greenhouse gases carbon dioxide, methane and nitrous oxide.

Michael Thon from Spain’s University of Salamanca will explore sequences of strains of the Colletotrichum species complex, which include fungal pathogens that infect many crops. One of the questions he and his team will ask is how these fungal strains have adapted to break down the range of plant cell wall compositions.

Kathleen Treseder of UC Irvine will study genes involved in sensitivity to higher temperatures in fungi from a warming experiment in an Alaskan boreal forest. The team’s plan is to fold the genomic information gained into a trait-based ecosystem model called DEMENT to predict carbon dioxide emissions under global warming.

Mary Wildermuth of UC Berkeley will study nearly a dozen genomes of powdery mildew fungi, including three that infect designated bioenergy crops. The project will identify the mechanisms by which the fungi successfully infect plants, information that could lead to the development of crops with improved resistance to fungal infection, limiting fungicide use and allowing more sustainable agricultural practices.

Several researchers who have previously collaborated with the DOE JGI have new projects:

Ludmila Chistoserdova from the University of Washington had a pioneering collaboration with the DOE JGI to study microbial communities in Lake Washington. In her new project, she and her team will look at the microbes in the Lake Washington sediment to understand their role in metabolizing the potent greenhouse gas methane.

Rick Cavicchioli of Australia’s University of New South Wales will track how microbial communities change throughout a complete annual cycle in three millennia-old Antarctic lakes and a near-shore marine site. By establishing what the microbes do in different seasons, he noted in his proposal, he and his colleagues hope to learn which microbial processes change and about the factors that control the evolution and speciation of marine-derived communities in cold environments.

With samples collected from surface waters down to the deep ocean, Steve Hallam from Canada’s University of British Columbia will explore metabolic pathways and compounds involved in marine carbon cycling processes to understand how carbon is regulated in the oceans.

The project of Hans-Peter Klenk, of DSMZ in Germany, will generate sequences of 1,000 strains of Actinobacteria, which represents the third most populated bacterial phylum, and will look for genes that encode cellulose-degrading enzymes or enzymes involved in synthesizing novel natural products.

Han Wosten of the Netherlands’ Utrecht University will take a functional genomics approach to wood degradation by looking at Agaricomycetes, in particular the model white rot fungus Schizophyllum commune and the more potent wood-degrading white rots Phanerochaete chrysosporium and Pleurotus ostreatus that the DOE JGI has previously sequenced.

Wen-Tso Liu of the University of Illinois and his colleagues want to understand the microbial ecology of anaerobic digesters, key components of the wastewater treatment process. They will study microbial communities in anaerobic digesters from the United States, East Asia and Europe to understand the composition and function of the microbes as they are harnessed for this low-cost municipal wastewater treatment strategy, which efficiently removes waste and produces methane as a sustainable energy source.

Another project that involves wastewater, albeit indirectly, comes from Erica Young of the University of Wisconsin. She has been studying algae grown in wastewater to track how they use nitrogen and phosphorus, and how cellulose and lipids are produced. Her CSP project will characterize the relationship between the algae and the bacteria that help stabilize these algal communities, particularly the diversity of the bacterial community and the pathways and interactions involved in nutrient uptake and carbon sequestration.

Previous CSP projects and other DOE JGI collaborations are highlighted in some of the DOE JGI Annual User Meeting talks that can be seen here: http://usermeeting.jgi.doe.gov/past-speakers/. The 10th Annual Genomics of Energy and Environment Meeting will be held March 24-26, 2015 in Walnut Creek, Calif. A preliminary speakers list is posted here (http://usermeeting.jgi.doe.gov/) and registration will open in the first week of November.

Severe drought is causing the western US to rise

The severe drought gripping the western United States in recent years is changing the landscape well beyond localized effects of water restrictions and browning lawns. Scientists at Scripps Institution of Oceanography at UC San Diego have now discovered that the growing, broad-scale loss of water is causing the entire western U.S. to rise up like an uncoiled spring.

Investigating ground positioning data from GPS stations throughout the west, Scripps researchers Adrian Borsa, Duncan Agnew, and Dan Cayan found that the water shortage is causing an “uplift” effect of up to 15 millimeters (more than half an inch) in California’s mountains and, on average, four millimeters (0.15 inches) across the west. From the GPS data, they estimate the water deficit at nearly 240 gigatons (62 trillion gallons of water), equivalent to a six-inch layer of water spread out over the entire western U.S.
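The headline figures can be sanity-checked with simple unit arithmetic. A minimal sketch, using only standard conversion constants (the GPS inversion itself is far more involved):

```python
# Back-of-envelope check of the Scripps water-deficit figures.
M3_PER_GIGATON = 1e9     # 1 gigaton of water = 1e9 m^3 (density 1000 kg/m^3)
GAL_PER_M3 = 264.172     # US gallons per cubic meter

deficit_gt = 240
volume_m3 = deficit_gt * M3_PER_GIGATON
gallons = volume_m3 * GAL_PER_M3           # ~62-63 trillion gallons

depth_m = 6 * 0.0254                       # a six-inch layer of water
implied_area_km2 = volume_m3 / depth_m / 1e6   # area that layer would cover

print(f"{gallons/1e12:.0f} trillion gallons over ~{implied_area_km2/1e6:.1f} million km^2")
```

The implied area, roughly 1.6 million square kilometers, is consistent with the quoted "entire western U.S." footprint.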

Results of the study, which was supported by the U.S. Geological Survey (USGS), appear in the August 21 online edition of the journal Science.

While poring over various sets of ground-position data from highly precise GPS stations within the National Science Foundation’s Plate Boundary Observatory and other networks, Borsa, a Scripps assistant research geophysicist, kept noticing the same pattern over the 2003-2014 period: all of the stations moved upward in the most recent years, coinciding with the timing of the current drought.

Agnew, a Scripps Oceanography geophysics professor who specializes in studying earthquakes and their impact on shaping the earth’s crust, says the GPS data can only be explained by rapid uplift of the tectonic plate upon which the western U.S. rests (Agnew cautions that the uplift has virtually no effect on the San Andreas fault and therefore does not increase the risk of earthquakes).

For Cayan, a research meteorologist with Scripps and USGS, the results paint a new picture of the dire hydrological state of the west.

“These results quantify the amount of water mass lost in the past few years,” said Cayan. “It also represents a powerful new way to track water resources over a very large landscape. We can home in on the Sierra Nevada mountains and critical California snowpack. These results demonstrate that this technique can be used to study changes in fresh water stocks in other regions around the world, if they have a network of GPS sensors.”

Sea-level spikes, volcanic risk, volcanoes cause drought

Unforeseen, short-term increases in sea level caused by strong winds, pressure changes and fluctuating ocean currents can cause more damage to beaches on the East Coast over the course of a year than a powerful hurricane making landfall. The research suggests that these sea-level anomalies could be more of a threat to coastal homes and businesses than previously thought, and could become higher and more frequent as a result of climate change, according to a new study accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union.

From this week’s Eos: Assessing Volcanic Risk in Saudi Arabia: An Integrated Approach


The Kingdom of Saudi Arabia has numerous large volcanic fields, known locally as “harrats.” The largest of these, Harrat Rahat, produced a basaltic fissure eruption in 1256 A.D. with lava flows traveling within 20 kilometers of the city of Al-Madinah, which has a current population of 1.5 million plus an additional 3 million pilgrims annually. With more than 950 visible vents and periodic seismic swarms, an understanding of the risk of future eruptions in this volcanic field is vital. The Volcanic Risk in Saudi Arabia (VORISA) project was developed as a multidisciplinary international research collaboration that integrates geological, geophysical, hazard, and risk studies in this important area.

From AGU’s journals: Large volcanic eruptions cause drought in eastern China


In most cases, the annual East Asian Monsoon brings heavy rains and widespread flooding to southeast China and drought conditions to the northeast. At various points throughout history, however, large volcanic eruptions have upset the regular behavior of the monsoon.

Sulfate aerosols injected high into the atmosphere by powerful eruptions can lower the land-sea temperature contrast that powers the monsoon circulation. How this altered aerosol forcing affects precipitation is not entirely clear, however, as climate models do not always agree with observations of the nature and scale of the effect.

Using two independent records of historical volcanic activity along with two different measures of rainfall, including one 3,000-year-long record derived from local flood and drought observations, Zhuo et al. analyze how large volcanic eruptions changed conditions on the ground for the period 1368 to 1911. Understanding the effect of sulfate aerosols on monsoon behavior is particularly important now, as researchers explore aerosol seeding as a means of climate engineering.

The authors find that large Northern Hemispheric volcanic eruptions cause strong droughts in much of eastern China. The drought begins in the north in the second or third summer following an eruption and slowly moves southward over the next 2 to 3 years. They find that the severity of the drought scales with the amount of aerosol injected into the atmosphere, and that it takes 4 to 5 years for precipitation to recover. The drying pattern agrees with observations from three large modern eruptions.

China’s northeast is the country’s major grain-producing region. The results suggest that any geoengineering schemes meant to mimic the effect of a large volcanic eruption could potentially trigger devastating consequences for China’s food supply.

Warm US West, cold East: A 4,000-year pattern

University of Utah geochemist Gabe Bowen led a new study, published in Nature Communications, showing that the curvy jet stream pattern that brought mild weather to western North America and intense cold to the eastern states this past winter has become more dominant during the past 4,000 years than it was from 8,000 to 4,000 years ago. The study suggests global warming may aggravate the pattern, meaning such severe winter weather extremes may be worse in the future. – Lee J. Siegel, University of Utah.

Last winter’s curvy jet stream pattern brought mild temperatures to western North America and harsh cold to the East. A University of Utah-led study shows that pattern became more pronounced 4,000 years ago, and suggests it may worsen as Earth’s climate warms.

“If this trend continues, it could contribute to more extreme winter weather events in North America, as experienced this year with warm conditions in California and Alaska and intrusion of cold Arctic air across the eastern USA,” says geochemist Gabe Bowen, senior author of the study.

The study was published online April 16 by the journal Nature Communications.

“A sinuous or curvy winter jet stream means unusual warmth in the West, drought conditions in part of the West, and abnormally cold winters in the East and Southeast,” adds Bowen, an associate professor of geology and geophysics at the University of Utah. “We saw a good example of extreme wintertime climate that largely fit that pattern this past winter,” although in the typical pattern California often is wetter.

Scientists have long forecast that the current warming of Earth’s climate due to carbon dioxide, methane and other “greenhouse” gases already has led to increased weather extremes and will continue to do so.

The new study shows the jet stream pattern that brings North American wintertime weather extremes is millennia old – “a longstanding and persistent pattern of climate variability,” Bowen says. Yet it also suggests global warming may enhance the pattern so there will be more frequent or more severe winter weather extremes or both.

“This is one more reason why we may have more winter extremes in North America, as well as something of a model for what those extremes may look like,” Bowen says. Human-caused climate change is reducing equator-to-pole temperature differences; the atmosphere is warming more at the poles than at the equator. Based on what happened in past millennia, that could make a curvy jet stream even more frequent and-or intense than it is now, he says.

Bowen and his co-authors analyzed previously published data on oxygen isotope ratios in lake sediment cores and cave deposits from sites in the eastern and western United States and Canada. Those isotopes were deposited in ancient rainfall and incorporated into calcium carbonate. They reveal jet stream directions during the past 8,000 years, a geological time known as middle and late stages of the Holocene Epoch.

Next, the researchers did computer modeling or simulations of jet stream patterns – both curvy and more direct west to east – to show how changes in those patterns can explain changes in the isotope ratios left by rainfall in the old lake and cave deposits.

They found that the jet stream pattern – known technically as the Pacific North American teleconnection – shifted to a generally more “positive phase” – meaning a curvy jet stream – over a 500-year period starting about 4,000 years ago. In addition to this millennial-scale change in jet stream patterns, they also noted a cycle in which increases in the sun’s intensity every 200 years make the jet stream flatter.

Bowen conducted the study with Zhongfang Liu of Tianjin Normal University in China, Kei Yoshimura of the University of Tokyo, Nikolaus Buenning of the University of Southern California, Camille Risi of the French National Center for Scientific Research, Jeffrey Welker of the University of Alaska at Anchorage, and Fasong Yuan of Cleveland State University.

The study was funded by the National Science Foundation, National Natural Science Foundation of China, Japan Society for the Promotion of Science and a joint program by the society and Japan’s Ministry of Education, Culture, Sports, Science and Technology: the Program for Risk Information on Climate Change.

Sinuous Jet Stream Brings Winter Weather Extremes

The Pacific North American teleconnection, or PNA, “is a pattern of climate variability” with positive and negative phases, Bowen says.

“In periods of positive PNA, the jet stream is very sinuous. As it comes in from Hawaii and the Pacific, it tends to rocket up past British Columbia to the Yukon and Alaska, and then it plunges down over the Canadian plains and into the eastern United States. The main effect in terms of weather is that we tend to have cold winter weather throughout most of the eastern U.S. You have a freight car of arctic air that pushes down there.”

Bowen says that when the jet stream is curvy, “the West tends to have mild, relatively warm winters, and Pacific storms tend to occur farther north. So in Northern California, the Pacific Northwest and parts of western interior, it tends to be relatively dry, but tends to be quite wet and unusually warm in northwest Canada and Alaska.”

This past winter, there were times of a strongly curving jet stream, and times when the Pacific North American teleconnection was in its negative phase, which means “the jet stream is flat, mostly west-to-east oriented,” and sometimes split, Bowen says. In years when the jet stream pattern is more flat than curvy, “we tend to have strong storms in Northern California and Oregon. That moisture makes it into the western interior. The eastern U.S. is not affected by arctic air, so it tends to have milder winter temperatures.”

The jet stream pattern – whether curvy or flat – has its greatest effects in winter and less impact on summer weather, Bowen says. The curvy pattern is enhanced by another climate phenomenon, the El Nino-Southern Oscillation, which sends a pool of warm water eastward to the eastern Pacific and affects climate worldwide.

Traces of Ancient Rains Reveal Which Way the Wind Blew

Over the millennia, oxygen in ancient rain water was incorporated into calcium carbonate deposited in cave and lake sediments. The ratio of rare, heavy oxygen-18 to the common isotope oxygen-16 in the calcium carbonate tells geochemists whether clouds that carried the rain were moving generally north or south during a given time.

Previous research determined the dates and oxygen isotope ratios for sediments in the new study, allowing Bowen and colleagues to use the ratios to tell if the jet stream was curvy or flat at various times during the past 8,000 years.

Bowen says air flowing over the Pacific picks up water from the ocean. As a curvy jet stream carries clouds north toward Alaska, the air cools and some of the water falls out as rain, with greater proportions of heavier oxygen-18 falling, thus raising the oxygen-18-to-16 ratio in rain and certain sediments in western North America. Then the jet stream curves south over the middle of the continent, and the water vapor, already depleted in oxygen-18, falls in the East as rain with lower oxygen-18-to-16 ratios.

When the jet stream is flat and moving west-to-east, oxygen-18 in rain is still elevated in the West and depleted in the East, but the difference is much less than when the jet stream is curvy.

By examining oxygen isotope ratios in lake and cave sediments in the West and East, Bowen and colleagues showed that a flatter jet stream pattern prevailed from about 8,000 to 4,000 years ago in North America, but then, over only 500 years, the pattern shifted so that curvy jet streams became more frequent or severe or both. The method can’t distinguish frequency from severity.

The new study is based mainly on isotope ratios at Buckeye Creek Cave, W. Va.; Lake Grinell, N.J.; Oregon Caves National Monument; and Lake Jellybean, Yukon.

Additional data supporting increasing curviness of the jet stream over recent millennia came from seven other sites: Crawford Lake, Ontario; Castor Lake, Wash.; Little Salt Spring, Fla.; Estancia Lake, N.M.; Crevice Lake, Mont.; and Dog and Felker lakes, British Columbia. Some sites provided oxygen isotope data; others showed changes in weather patterns based on tree ring growth or spring deposits.

Simulating the Jet Stream

As a test of what the cave and lake sediments revealed, Bowen’s team did computer simulations of climate using software that takes isotopes into account.

Simulations of climate and oxygen isotope changes in the Middle Holocene and today resemble, respectively, today’s flat and curvy jet stream patterns, supporting the switch toward increasing jet stream sinuosity 4,000 years ago.

Why did the trend start then?

“It was a time when seasonality became weaker,” Bowen says. The Northern Hemisphere was closer to the sun during the summer 8,000 years ago than it was 4,000 years ago or is now due to a 20,000-year cycle in Earth’s orbit. He envisions a tipping point 4,000 years ago when weakening summer sunlight reduced the equator-to-pole temperature difference and, along with an intensifying El Nino climate pattern, pushed the jet stream toward greater curviness.

Acid mine drainage reduces radioactivity in fracking waste

Much of the naturally occurring radioactivity in fracking wastewater might be removed by blending it with another wastewater from acid mine drainage, according to a Duke University-led study.

“Fracking wastewater and acid mine drainage each pose well-documented environmental and public health risks. But in laboratory tests, we found that by blending them in the right proportions we can bind some of the fracking contaminants into solids that can be removed before the water is discharged back into streams and rivers,” said Avner Vengosh, professor of geochemistry and water quality at Duke’s Nicholas School of the Environment.

“This could be an effective way to treat Marcellus Shale hydraulic fracturing wastewater, while providing a beneficial use for acid mine drainage that currently is contaminating waterways in much of the northeastern United States,” Vengosh said. “It’s a win-win for the industry and the environment.”

Blending fracking wastewater with acid mine drainage also could help reduce the depletion of local freshwater resources by giving drillers a source of usable recycled water for the hydraulic fracturing process, he added.

“Scarcity of fresh water in dry regions or during periods of drought can severely limit shale gas development in many areas of the United States and in other regions of the world where fracking is about to begin,” Vengosh said. “Using acid mine drainage or other sources of recycled or marginal water may help solve this problem and prevent freshwater depletion.”

The peer-reviewed study was published in late December 2013 in the journal Environmental Science & Technology.

In hydraulic fracturing – or fracking, as it is sometimes called – millions of tons of water are injected at high pressure down wells to crack open shale deposits buried deep underground and extract natural gas trapped within the rock. Some of the water flows back up through the well, along with natural brines and the natural gas. This “flowback fluid” typically contains high levels of salts, naturally occurring radioactive materials such as radium, and metals such as barium and strontium.

A study last year by the Duke team showed that standard treatment processes only partially remove these potentially harmful contaminants from Marcellus Shale wastewater before it is discharged back into streams and waterways, causing radioactivity to accumulate in stream sediments near the disposal site.

Acid mine drainage flows out of abandoned coal mines into many streams in the Appalachian Basin. It can be highly toxic to animals, plants and humans, and affects the quality of hundreds of waterways in Pennsylvania and West Virginia.

Because much of the current Marcellus shale gas development is taking place in regions where large amounts of historic coal mining occurred, some experts have suggested that acid mine drainage could be used to frack shale gas wells in place of fresh water.

To test that hypothesis, Vengosh and his team blended different mixtures of Marcellus Shale fracking wastewater and acid mine drainage, all of which were collected from sites in western Pennsylvania and provided to the scientists by the industry.

After 48 hours, the scientists examined the chemical and radiological contents of 26 different mixtures. Geochemical modeling was used to simulate the chemical and physical reactions that had occurred after the blending; the results of the modeling were then verified using x-ray diffraction and by measuring the radioactivity of the newly formed solids.

“Our analysis suggested that several ions, including sulfate, iron, barium and strontium, as well as between 60 and 100 percent of the radium, had precipitated within the first 10 hours into newly formed solids composed mainly of strontium barite,” Vengosh said. These radioactive solids could be removed from the mixtures and safely disposed of at licensed hazardous-waste facilities, he said. The overall salinity of the blended fluids was also reduced, making the treated water suitable for re-use at fracking sites.
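The blending arithmetic can be illustrated with a toy mass balance. All numbers below are hypothetical, chosen only to show the idea, not taken from the Duke measurements:

```python
# Hypothetical illustration of radium removal when blending fracking
# flowback with acid mine drainage (AMD). Values are invented examples.
def radium_removed(ra_flowback, frac_flowback, frac_precipitated):
    """Radium activity removed per liter of blend (pCi/L).

    ra_flowback       -- radium activity in the flowback (pCi/L)
    frac_flowback     -- volume fraction of flowback in the blend
    frac_precipitated -- fraction of radium captured in the barite solids
    """
    blended = ra_flowback * frac_flowback    # AMD radium assumed negligible
    return blended * frac_precipitated

# e.g. a 25:75 flowback:AMD blend capturing 80% of the radium
removed = radium_removed(ra_flowback=5000.0, frac_flowback=0.25,
                         frac_precipitated=0.8)
print(removed)
```

The captured activity ends up in a small volume of solids that can be disposed of at a licensed facility, which is the crux of the treatment scheme.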

“The next step is to test this in the field. While our laboratory tests show that it is technically possible to generate recycled, treated water suitable for hydraulic fracturing, field-scale tests are still necessary to confirm its feasibility under operational conditions,” Vengosh said.

Natural gas saves water, even when factoring in water lost to hydraulic fracturing

For every gallon of water used to produce natural gas through hydraulic fracturing, Texas saved 33 gallons of water by generating electricity with that natural gas instead of coal (in 2011). – University of Texas at Austin

A new study finds that in Texas, the U.S. state that annually generates the most electricity, the transition from coal to natural gas for electricity generation is saving water and making the state less vulnerable to drought.

Even though exploration for natural gas through hydraulic fracturing requires significant water consumption in Texas, the new consumption is easily offset by the overall water efficiencies of shifting electricity generation from coal to natural gas. The researchers estimate that water saved by shifting a power plant from coal to natural gas is 25 to 50 times as great as the amount of water used in hydraulic fracturing to extract the natural gas. Natural gas also enhances drought resilience by providing so-called peaking plants to complement increasing wind generation, which doesn’t consume water.

The results of The University of Texas at Austin study are published this week in the journal Environmental Research Letters.

The researchers estimate that in 2011 alone, Texas would have consumed an additional 32 billion gallons of water – enough to supply 870,000 average residents – if all its natural gas-fired power plants were instead coal-fired plants, even after factoring in the additional consumption of water for hydraulic fracturing to extract the natural gas.
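Those headline numbers are easy to sanity-check with unit arithmetic alone (the per-capita figure below is an implied value, not one the study states):

```python
# Sanity check: 32 billion gallons supplying 870,000 residents for a year
# implies a per-person daily use typical of US residential consumption.
extra_gal = 32e9        # extra water if gas-fired plants had been coal-fired
residents = 870_000
per_capita_daily = extra_gal / residents / 365
print(round(per_capita_daily))   # 101, i.e. roughly 100 gallons per person per day
```

That is in line with commonly quoted US residential water use, so the two quoted figures are mutually consistent.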

Hydraulic fracturing is a process in which water, sand and chemicals are pumped at high pressure into a well to fracture surrounding rocks and allow oil or gas to more easily flow. Hydraulic fracturing and horizontal drilling are the main drivers behind the current boom in U.S. natural gas production.

Environmentalists and others have raised concerns about the amount of water that is consumed. In Texas, concerns are heightened because the use of hydraulic fracturing is expanding rapidly while water supplies are dwindling as the third year of a devastating drought grinds on. Because most electric power plants rely on water for cooling, the electric power supply might be particularly vulnerable to drought.

“The bottom line is that hydraulic fracturing, by boosting natural gas production and moving the state from water-intensive coal technologies, makes our electric power system more drought resilient,” says Bridget Scanlon, senior research scientist at the university’s Bureau of Economic Geology, who led the study.

To study the drought resilience of Texas power plants, Scanlon and her colleagues collected water use data for all 423 of the state’s power plants from the Energy Information Administration and from state agencies including the Texas Commission on Environmental Quality and the Texas Water Development Board, as well as other data.

Since the 1990s, the primary type of power plant built in Texas has been the natural gas combined cycle (NGCC) plant with cooling towers, which uses fuel and cooling water more efficiently than older steam turbine technologies. About a third of Texas power plants are NGCC. NGCC plants consume about a third as much water as coal steam turbine (CST) plants.

The other major type of natural gas plant in the state is a natural gas combustion turbine (NGCT) plant. NGCT plants can also help reduce the state’s water consumption for electricity generation by providing “peaking power” to support expansion of wind energy. Wind turbines don’t require water for cooling, yet wind doesn’t always blow when electricity is needed. NGCT generators can be brought online in a matter of seconds to smooth out swings in electricity demand. By combining NGCT generation with wind generation, total water use can be lowered even further compared with coal-fired power generation.
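The water-savings logic reduces to a weighted average of per-MWh consumption across the generation mix. A sketch with illustrative intensities (chosen to match the "about a third" NGCC-to-coal ratio quoted above; they are not the study's own figures):

```python
# Illustrative water intensity (gallons consumed per MWh generated).
# Values are rough placeholders: coal steam ~3x NGCC; NGCT and wind ~0.
INTENSITY_GAL_MWH = {"coal_steam": 600.0, "ngcc": 200.0, "ngct": 0.0, "wind": 0.0}

def fleet_intensity(mix):
    """Average gal/MWh for a generation mix {technology: fraction}."""
    assert abs(sum(mix.values()) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(INTENSITY_GAL_MWH[t] * f for t, f in mix.items())

coal_only = fleet_intensity({"coal_steam": 1.0})
gas_wind = fleet_intensity({"ngcc": 0.6, "ngct": 0.2, "wind": 0.2})
print(round(coal_only), round(gas_wind))   # 600 120
```

Even a partial shift toward NGCC, NGCT and wind cuts the fleet's water intensity several-fold, which is the effect the study quantifies for Texas.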

The study focused exclusively on Texas, but the authors believe the results should be applicable to other regions of the U.S., where water consumption rates for the key technologies evaluated – hydraulic fracturing, NGCC plants with cooling towers and traditional coal steam turbine plants – are generally the same.

The Electric Reliability Council of Texas, manager of the state’s electricity grid, projects that if current market conditions continue through 2029, 65 percent of new power generation in the state will come from NGCC plants and 35 percent from natural gas combustion turbine plants, which use no water for cooling, but are less energy efficient than NGCC plants.

“Statewide, we’re on track to continue reducing our water intensity of electricity generation,” says Scanlon.

Hydraulic fracturing accounts for less than 1 percent of the water consumed in Texas. But in some areas where its use is heavily concentrated, it strains local water supplies, as documented in a 2011 study by Jean-Philippe Nicot of the Bureau of Economic Geology. Because natural gas is often used far from where it is originally produced, water savings from shifting to natural gas for electricity generation might not benefit the areas that use more water for hydraulic fracturing.

Potential well water contaminants highest near natural gas drilling

Brian Fontenot, who earned his Ph.D. in quantitative biology from UT Arlington, worked with Kevin Schug, UT Arlington associate professor of chemistry and biochemistry, and a team of researchers to analyze samples from 100 private water wells. – UT Arlington

A new study of 100 private water wells in and near the Barnett Shale showed elevated levels of potential contaminants such as arsenic and selenium closest to natural gas extraction sites, according to a team of researchers that was led by UT Arlington associate professor of chemistry and biochemistry Kevin Schug.

The results of the North Texas well study were published online by the journal Environmental Science & Technology Thursday. The peer-reviewed paper focuses on the presence of metals such as arsenic, barium, selenium and strontium in water samples. Many of these heavy metals occur naturally at low levels in groundwater, but disturbances from natural gas extraction activities could cause them to occur at elevated levels.

“This study alone can’t conclusively identify the exact causes of elevated levels of contaminants in areas near natural gas drilling, but it does provide a powerful argument for continued research,” said Brian Fontenot, a UT Arlington graduate with a doctorate in quantitative biology and lead author on the new paper.

He added: “We expect this to be the first of multiple projects that will ultimately help the scientific community, the natural gas industry, and most importantly, the public, understand the effects of natural gas drilling on water quality.”

Researchers believe the increased presence of metals could be due to a variety of factors including: industrial accidents such as faulty gas well casings; mechanical vibrations from natural gas drilling activity disturbing particles in neglected water well equipment; or the lowering of water tables through drought or the removal of water used for the hydraulic fracturing process. Any of these scenarios could release dangerous compounds into shallow groundwater.

Researchers gathered samples from private water wells of varying depth within a 13-county area in or near the Barnett Shale in North Texas over four months in the summer and fall of 2011. Ninety-one samples were drawn from what they termed “active extraction areas,” or areas that had one or more gas wells within a five-kilometer radius. Another nine samples were taken from sites either inside the Barnett Shale and more than 14 kilometers from a natural gas drilling site, or from sites outside the Barnett Shale altogether. The locations of those sites were referred to as “non-active/reference areas” in the study.
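The sampling criteria above amount to a simple classification rule. A sketch (the 5 km and 14 km cutoffs come from the text; the handling of edge cases is an assumption, and the example distances are hypothetical):

```python
# Sketch of the study's site classification as described in the text.
def classify_site(dist_to_nearest_gas_well_km, inside_barnett):
    """Label a water well site per the paper's stated criteria."""
    if not inside_barnett:
        return "non-active/reference area"   # outside the Barnett Shale altogether
    if dist_to_nearest_gas_well_km <= 5.0:
        return "active extraction area"      # one or more gas wells within 5 km
    if dist_to_nearest_gas_well_km > 14.0:
        return "non-active/reference area"   # inside the shale but far from drilling
    return "unclassified"                    # 5-14 km: not part of either group

print(classify_site(3.2, True))    # active extraction area
print(classify_site(20.0, True))   # non-active/reference area
```

The gap between 5 and 14 kilometers keeps the two groups well separated, which strengthens the comparison between them.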

Researchers accepted no outside funding to ensure the integrity of the study. They compared the samples to historical data on water wells in these counties from the Texas Water Development Board groundwater database for 1989-1999, prior to the proliferation of natural gas drilling.

In addition to standard water quality tests, the researchers used gas chromatography-mass spectrometry (GC-MS), headspace gas chromatography (HS-GC) and inductively coupled plasma-mass spectrometry (ICP-MS). Many of the tests were conducted in the Shimadzu Center for Advanced Analytical Chemistry on the UT Arlington campus.

“Natural gas drilling is one of the most talked about issues in North Texas and throughout the country. This study was an opportunity for us to use our knowledge of chemistry and statistical analysis to put people’s concerns to the test and find out whether they would be backed by scientific data,” said Schug, who is also the Shimadzu Distinguished Professor of Analytical Chemistry in the UT Arlington College of Science.

On average, researchers detected the highest levels of these contaminants within 3 kilometers of natural gas wells, including several samples that had arsenic and selenium above levels considered safe by the Environmental Protection Agency. For example, 29 wells that were within the study’s active natural gas drilling area exceeded the EPA’s Maximum Contaminant Limit of 10 micrograms per liter for arsenic, a potentially dangerous situation.

The areas lying outside of active drilling areas or outside the Barnett Shale did not show the same elevated levels for most of the metals.

Other leaders of the Texas Gas Wells team were Laura Hunt, who conducted her post-doctoral research in biology at UT Arlington, and Zacariah Hildenbrand, who earned his doctorate in biochemistry from the University of Texas at El Paso and performed post-doctoral research at UT Southwestern Medical Center. Hildenbrand is also the founder of Inform Environmental, LLC. Fontenot and Hunt work for the EPA regional office in Dallas, but the study is unaffiliated with the EPA and both received permission to work on this project outside the agency.

Scientists note in the paper that they did not find uniformity among the contamination in the active natural gas drilling areas. In other words, not all gas well sites were associated with higher levels of the metals in well water.

Some of the most notable results were on the following heavy metals:

  • Arsenic occurs naturally in the region’s water and was detected in 99 of the 100 samples. But, the concentrations of arsenic were significantly higher in the active extraction areas compared to non-extraction areas and historical data. The maximum concentration from an extraction area sample was 161 micrograms per liter, or 16 times the EPA safety standard set for drinking water. According to the EPA, people who drink water containing arsenic well in excess of the safety standard for many years “could experience skin damage or problems with their circulatory system, and may have an increased risk of getting cancer.”
  • Selenium was found in 10 samples near extraction sites, and all of those samples showed selenium levels were higher than the historical average. Two samples exceeded the standard for selenium set by the EPA. Circulation problems as well as hair or fingernail loss are some possible consequences of long-term exposure to high levels of selenium, according to the EPA.
  • Strontium was also found in almost all the samples, with concentrations significantly higher than historical levels in the areas of active gas extraction. A toxicological profile by the federal government’s Agency for Toxic Substances and Disease Registry recommends no more than 4,000 micrograms of strontium per liter in drinking water. Seventeen samples from the active extraction area and one from the non-active areas exceeded that recommended limit. Exposure to high levels of stable strontium can result in impaired bone growth in children, according to the toxic substances agency.
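Screening measured concentrations against these limits is a simple lookup. A sketch using the values quoted in the article (the selenium limit of 50 micrograms per liter is an assumption drawn from EPA drinking-water standards, since the text does not state it; the sample values are hypothetical):

```python
# Screen metal concentrations (micrograms per liter) against the limits
# quoted in the article: EPA MCLs for arsenic and selenium, ATSDR
# guidance for strontium. Selenium's 50 ug/L is an assumed EPA MCL.
LIMITS_UG_L = {"arsenic": 10.0, "selenium": 50.0, "strontium": 4000.0}

def exceedances(sample):
    """Return the subset of a sample's metals that exceed their limits."""
    return {metal: conc for metal, conc in sample.items()
            if conc > LIMITS_UG_L.get(metal, float("inf"))}

# A hypothetical sample using the article's maximum arsenic reading (16x MCL):
worst = {"arsenic": 161.0, "selenium": 12.0, "strontium": 5200.0}
print(exceedances(worst))   # arsenic and strontium flagged
```

Unknown metals fall through harmlessly because the default limit is infinity, so the routine only ever flags analytes it has a quoted limit for.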

“After we put the word out about the study, we received numerous calls from landowner volunteers and their opinions about the natural gas drilling in their communities varied,” Hildenbrand said. “By participating in the study, they were able to get valuable data about their water, whether it be for household or land use.

“Their participation has been incredibly important to this study and has helped us bring to light some of the important environmental questions surrounding this highly contentious issue.”

The paper also recommends further research on levels of methanol and ethanol in water wells. Twenty-nine private water wells in the study contained methanol, with the highest concentrations in the active extraction areas. Twelve samples, four of which were from the non-active extraction sites, contained measurable ethanol. Both ethanol and methanol can occur naturally or as a result of industrial contamination.

Historical data on methanol and ethanol was not available, researchers said in the paper.

The paper is called “An evaluation of water quality in private drinking water wells near natural gas extraction sites in the Barnett Shale formation.” It is available on the Just Accepted page of the journal’s website. A YouTube interview with some of the study’s authors is available here: http://www.youtube.com/watch?v=H1_WDDtWR_k&feature=youtu.be.

Other co-authors include: Qinhong “Max” Hu, associate professor of earth and environmental sciences at UT Arlington; Doug D. Carlton Jr., a Ph.D. student in the chemistry and biochemistry department at UT Arlington; Hyppolite Oka, a recent graduate of the environmental and earth sciences master’s program at UT Arlington; Jayme L. Walton, a recent graduate of the biology master’s program at UT Arlington; and Dan Hopkins, of Carrollton-based Geotech Environmental Equipment, Inc.

Alexandria Osorio and Bryan Bjorndal of Assure Controls, Inc. in Vista, Calif., also are co-authors. The team used Assure’s Qwiklite™ system to test for toxicity in well samples, and those results are being prepared for a separate publication.

Many from the research team are now conducting well water sampling in the Permian Basin region of Texas, establishing a baseline set of data prior to gas well drilling activities there. That baseline will be used for a direct comparison to samples that will be collected during and after upcoming natural gas extraction. The team hopes that these efforts will shed further light on the relationship between natural gas extraction and ground water quality.

Widely used index may have overestimated drought

For decades, scientists have used sophisticated instruments and computer models to predict the nature of droughts. With the threat of climate change looming large, most of these models have consistently predicted increasingly frequent and severe global droughts. But a recent study from a team of researchers at Princeton University and the Australian National University suggests that one widely used tool – the Palmer Drought Severity Index (PDSI) – may be incorrect.

The PDSI was developed in the 1960s as a way to convert multiyear temperature and precipitation data into a single number representing relative wetness for each region of the United States. The original PDSI, however, estimates potential evaporation from temperature alone, ignoring the solar radiation, wind speed and humidity that also drive it. The new model developed by Justin Sheffield, a hydrologist at Princeton and the lead author of the study, and his team corrects this deficiency, and it produces notably different numbers. Has the reported increase in drought over the last 60 years been overestimated? And what might that mean for the future?
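For readers curious about the mechanics, the moisture-demand term in the original PDSI is conventionally computed with the temperature-only Thornthwaite (1948) formula, which is why radiation, wind and humidity drop out. A minimal Python sketch, simplified to omit the standard latitude/day-length correction (so the values are illustrative only):

```python
# Thornthwaite (1948) potential evapotranspiration: temperature-only,
# the formulation behind the original PDSI's moisture-demand term.
def thornthwaite_pe(monthly_temps_c):
    """Return monthly PE (mm) for 12 mean monthly temperatures (deg C).

    Simplified: assumes 12-hour days and 30-day months (no latitude or
    day-length correction), so values are illustrative only.
    """
    # Annual heat index I, summed from the 12 monthly heat indices
    I = sum((max(t, 0.0) / 5.0) ** 1.514 for t in monthly_temps_c)
    if I == 0.0:  # all months at or below freezing
        return [0.0] * 12
    # Empirical exponent fitted by Thornthwaite as a cubic in I
    a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
    return [16.0 * (10.0 * t / I) ** a if t > 0.0 else 0.0
            for t in monthly_temps_c]
```

Because temperature is the only input, two months with the same temperature but very different sunshine, wind or humidity get the same PE, which is the deficiency the Princeton model addresses.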