International research team discovers new mineral

Qingsongite is a recently discovered mineral. Coesite and osbornite are also ultra-high pressure minerals. – L. Dobrzhinetskaya, UC Riverside.

Geologists at the University of California, Riverside have discovered a new mineral, cubic boron nitride, which they have named “qingsongite.”

The discovery, made in 2009, was officially approved this week by the International Mineralogical Association.

The UC Riverside geologists, Larissa Dobrzhinetskaya and Harry Green in the Department of Earth Sciences, were joined by scientists at the Lawrence Livermore National Laboratory, the University of Maine and from institutions in China and Germany in making the discovery.

“The uniqueness of qingsongite is that it is the first boron mineral that was found to be formed at extreme conditions in deep Earth,” Dobrzhinetskaya said. “All other known boron minerals are found at Earth’s surface.”

The mineral was found in the southern Tibetan mountains of China within chromium-rich rocks of the paleo-oceanic crust that was subducted to a depth of 190 miles and recrystallized there at a temperature of about 2,372 degrees Fahrenheit and a pressure of about 118,430 atmospheres.
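
For readers who think in metric units, those conditions convert to roughly 306 kilometers, 1,300 degrees Celsius and about 12 gigapascals. A quick sketch of the arithmetic (standard conversion factors; only the figures come from the article):

```python
# Convert the article's reported formation conditions to metric units.
depth_km = 190 * 1.609344                # miles -> kilometers, ~306 km
temp_c = (2372 - 32) * 5 / 9             # deg F -> deg C, 1300 deg C
pressure_gpa = 118430 * 101325 / 1e9     # atmospheres -> pascals -> gigapascals, ~12 GPa
print(f"{depth_km:.0f} km, {temp_c:.0f} deg C, {pressure_gpa:.1f} GPa")
```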

“About 180 million years ago the rocks were returned to shallow levels of the Earth by plate tectonic processes leading to the closure of the huge Paleo-Tethys Ocean – an ancient Paleozoic ocean – and the collision of India with the Asian lithospheric plate,” Dobrzhinetskaya explained.

Until now, cubic boron nitride, first created in the laboratory in 1957, was known only as an important technological material. Because its atomic structure resembles the carbon bonding in diamond, it is dense and rivals diamond in hardness.

To date, more than 4,700 species of minerals have been recognized, with at least 100 proposals for new minerals and their names submitted each year to the International Mineralogical Association for approval.

Qingsongite was named after Qingsong Fang (1939-2010), a professor at the Institute of Geology, the Chinese Academy of Geological Sciences, who found the first diamond in the Tibetan chromium-rich rocks in the late 1970s, and contributed to the discovery of four new mineral species.

Revised location of 1906 rupture of San Andreas Fault in Portola Valley

New evidence suggests the 1906 earthquake ruptured the San Andreas Fault in a single trace through Portola Village, the present-day Town of Portola Valley, and indicates a revised location for the fault trace.

Portola Valley, south of San Francisco, has been extensively studied and was the subject of the first geologic map published in California. Yet studies have offered conflicting conclusions as to the location and nature of the 1906 surface rupture through the area, caused in part by a misprinted photograph and unpublished data.

“It is critical for the residents and leaders of Portola Valley to know the exact location of the fault – an active fault near public buildings and structures,” said co-author Chester T. Wrucke, a retired geologist with the U.S. Geological Survey and resident of Portola Valley. Independent researcher Robert T. Wrucke and engineering geologist Ted Sayre, with Cotton Shires and Associates, are co-authors of the study, published by the Bulletin of the Seismological Society of America (BSSA).

Using a new high-resolution imaging technology, known as bare-earth airborne LiDAR (Light Detection And Ranging), combined with field observations and an extensive review of archival photography, researchers reinterpreted previous documentation to locate the 1906 fault trace.

“People back then were hampered by thick vegetation to see a critical area,” said Chester Wrucke. “Modern technology – LiDAR – and modern techniques made it possible for us to see the bare ground, interpret correctly where old photographs were taken and identify the fault trace.”

The 1906 earthquake changed the landscape of Portola Valley, breaking rock formations, cracking roads, creating landslides and forcing other changes masked by vegetation. With easy access to the area, local professors and photographers from Stanford created a rich trove of field observations, photos and drawings.

J.C. Branner, then a geology professor at Stanford, was among the scientists who, along with his students, submitted their observations of the 1906 fault rupture to the California Earthquake Commission for inclusion in an official compilation of the cause and effects of the earthquake. While the compilation, published in 1908, concluded that the earthquake ruptured along a single fault trace in Portola Valley, a key map of that trace – Map 22 – included unintentional errors in the fault location.

Studies of the area resumed 50 years later, and those studies relied on literature, including Map 22. Subsequent studies published variations of Map 22, further altering the assumed location of the fault and suggesting the earthquake ruptured along multiple traces of the fault.

The authors sought to answer a seemingly simple question – where did the fault cross Alpine Road? “With variations in the literature and interpretation of the data, we decided to pay close attention to the original work,” said Robert Wrucke.

The authors relied on Branner’s description, together with 1906 photographs, a hand-drawn map, a student notebook and an analysis of changes to Alpine Road for clues to confirm the location of where the fault crossed Alpine Road.

Scanning archives to study all available photos from 1906 and notes from observers, the researchers compared geological features to LiDAR images. Their forensic analysis suggests the primary rupture in 1906 in Portola Valley was along the western of two main traces of the San Andreas Fault. Their analysis shows that there was no step-over within the town to the other trace.

“The biggest practical benefit of knowing the correct fault position is the ability to keep proposed new buildings off the critical rupture zone,” said Sayre.

“We had the luxury of LiDAR and were able to meld LiDAR data with old photos and made a breakthrough,” said Robert Wrucke. “Modern technology helps with geological interpretation. Our experience may be useful for others in situations where there’s confusion.”

Climate change occurring 10 times faster than at any time in past 65 million years

The planet is undergoing one of the largest changes in climate since the dinosaurs went extinct. But what might be even more troubling for humans, plants and animals is the speed of the change. Stanford climate scientists warn that the likely rate of change over the next century will be at least 10 times quicker than any climate shift in the past 65 million years.

If the trend continues at its current rapid pace, it will place significant stress on terrestrial ecosystems around the world, and many species will need to make behavioral, evolutionary or geographic adaptations to survive.

Although some of the changes the planet will experience in the next few decades are already “baked into the system,” how different the climate looks at the end of the 21st century will depend largely on how humans respond.

The findings come from a review of climate research by Noah Diffenbaugh, an associate professor of environmental Earth system science, and Chris Field, a professor of biology and of environmental Earth system science and the director of the Department of Global Ecology at the Carnegie Institution. The work is part of a special report on climate change in the current issue of Science.

Diffenbaugh and Field, both senior fellows at the Stanford Woods Institute for the Environment, conducted the targeted but broad review of scientific literature on aspects of climate change that can affect ecosystems, and investigated how recent observations and projections for the next century compare to past events in Earth’s history.

For instance, the planet experienced a 5 degree Celsius rise in temperature 20,000 years ago, as Earth emerged from the last ice age – a change comparable to the high end of the projections for warming over the 20th and 21st centuries.

The geologic record shows that, 20,000 years ago, as the ice sheet that covered much of North America receded northward, plants and animals recolonized areas that had been under ice. As the climate continued to warm, those plants and animals moved northward, to cooler climes.

“We know from past changes that ecosystems have responded to a few degrees of global temperature change over thousands of years,” said Diffenbaugh. “But the unprecedented trajectory that we’re on now is forcing that change to occur over decades. That’s orders of magnitude faster, and we’re already seeing that some species are challenged by that rate of change.”

Some of the strongest evidence for how the global climate system responds to high levels of carbon dioxide comes from paleoclimate studies. Fifty-five million years ago, carbon dioxide in the atmosphere was elevated to a level comparable to today. The Arctic Ocean did not have ice in the summer, and nearby land was warm enough to support alligators and palm trees.

“There are two key differences for ecosystems in the coming decades compared with the geologic past,” Diffenbaugh said. “One is the rapid pace of modern climate change. The other is that today there are multiple human stressors that were not present 55 million years ago, such as urbanization and air and water pollution.”

Record-setting heat

Diffenbaugh and Field also reviewed results from two dozen climate models to describe possible climate outcomes from present day to the end of the century. In general, extreme weather events, such as heat waves and heavy rainfall, are expected to become more severe and more frequent.

For example, the researchers note that, with continued emissions of greenhouse gases at the high end of the scenarios, annual temperatures over North America, Europe and East Asia will increase 2-4 degrees C by 2046-2065. With that amount of warming, the hottest summer of the last 20 years is expected to occur every other year, or even more frequently.

By the end of the century, should the current emissions of greenhouse gases remain unchecked, temperatures over the northern hemisphere will tip 5-6 degrees C warmer than today’s averages. In this case, the hottest summer of the last 20 years becomes the new annual norm.

“It’s not easy to intuit the exact impact from annual temperatures warming by 6 C,” Diffenbaugh said. “But this would present a novel climate for most land areas. Given the impacts those kinds of seasons currently have on terrestrial forests, agriculture and human health, we’ll likely see substantial stress from severely hot conditions.”

The scientists also projected the velocity of climate change, defined as the distance per year that species of plants and animals would need to migrate to live in annual temperatures similar to current conditions. Around the world, including much of the United States, species would need to move toward the poles or up mountainsides by at least one kilometer per year. Many parts of the world face much larger changes.
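
As a rough illustration of that definition, climate velocity is the local warming rate divided by the local spatial temperature gradient. The sketch below uses invented but plausible numbers, not values from the report:

```python
# Climate velocity: distance per year an organism must move to keep the
# annual temperature it experiences constant.
# velocity (km/yr) = warming rate (deg C/yr) / spatial gradient (deg C/km)
warming_rate = 0.04       # deg C per year, assumed high-end warming
spatial_gradient = 0.01   # deg C per km, assumed for flat terrain
velocity = warming_rate / spatial_gradient
print(f"{velocity:.1f} km per year")  # 4.0 km/yr; steep mountain gradients imply much lower velocities
```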

The human element

Some climate changes will be unavoidable because humans have already emitted greenhouse gases into the atmosphere, and the atmosphere and oceans have already warmed.

“There is already some inertia in place,” Diffenbaugh said. “If every new power plant or factory in the world produced zero emissions, we’d still see impact from the existing infrastructure, and from gases already released.”

The more dramatic changes that could occur by the end of the century, however, are not written in stone. There are many human variables at play that could slow the pace and magnitude of change – or accelerate it.

Consider the 2.5 billion people who lack access to modern energy resources. This energy poverty means they go without reliable energy for illumination, cooking and transportation, and they’re more susceptible to extreme weather disasters. Increased energy access will improve their quality of life – and in some cases their chances of survival – but will increase global energy consumption and possibly hasten warming.

Diffenbaugh said that the range of climate projections offered in the report can inform decision-makers about the risks that different levels of climate change pose for ecosystems.

“There’s no question that a climate in which every summer is hotter than the hottest of the last 20 years poses real risks for ecosystems across the globe,” Diffenbaugh said. “However, there are opportunities to decrease those risks, while also ensuring access to the benefits of energy consumption.”

‘Highway from hell’ fueled Costa Rican volcano

Volcanologist Philipp Ruprecht analyzed crystals formed as Irazú’s magma cooled to establish how fast it traveled. – Kim Martineau

If some volcanoes operate on geologic timescales, Costa Rica’s Irazú had something of a short fuse. In a new study in the journal Nature, scientists suggest that the 1960s eruption of Costa Rica’s largest stratovolcano was triggered by magma rising from the mantle over a few short months, rather than thousands of years or more, as many scientists have thought. The study is the latest to suggest that deep, hot magma can set off an eruption fairly quickly, potentially providing an extra tool for detecting an oncoming volcanic disaster.

“If we had had seismic instruments in the area at the time we could have seen these deep magmas coming,” said the study’s lead author, Philipp Ruprecht, a volcanologist at Columbia University’s Lamont-Doherty Earth Observatory. “We could have had an early warning of months, instead of days or weeks.”

Towering more than 10,000 feet and covering almost 200 square miles, Irazú erupts about every 20 years or less, with varying degrees of damage. When it awakened in 1963, it erupted for two years, killing at least 20 people and burying hundreds of homes in mud and ash. Its last eruption, in 1994, did little damage.


Irazú sits on the Pacific Ring of Fire, where oceanic crust is slowly sinking beneath the continents, producing some of Earth’s most spectacular fireworks. Conventional wisdom holds that the mantle magma feeding those eruptions rises and lingers for long periods of time in a mixing chamber several miles below the volcano. But ash from Irazú’s prolonged eruption is the latest to suggest that some magma may travel directly from the upper mantle, covering more than 20 miles in a few months.

“There has to be a conduit from the mantle to the magma chamber,” said study co-author Terry Plank, a geochemist at Lamont-Doherty. “We like to call it the highway from hell.”

Their evidence comes from crystals of the mineral olivine separated from the ashes of Irazú’s 1963-1965 eruption, collected on a 2010 expedition to the volcano. As magma rising from the mantle cools, it forms crystals that preserve the conditions in which they formed. Unexpectedly, Irazú’s crystals revealed spikes of nickel, a trace element found in the mantle. The spikes told the researchers that some of Irazú’s erupted magma was so fresh the nickel had not had a chance to diffuse.
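
The underlying logic is a standard order-of-magnitude diffusion argument: a chemical spike of length scale x is erased on a timescale of roughly x²/D, where D is the element’s diffusivity, so a preserved spike bounds how long the crystal sat at magmatic temperatures. A minimal sketch with assumed illustrative values, not the study’s measured parameters:

```python
# Diffusion timescale t ~ x^2 / D (order of magnitude only).
x = 10e-6   # preserved nickel zoning length scale in olivine, m (assumed)
D = 1e-17   # Ni diffusivity in hot olivine, m^2/s (assumed order of magnitude)
t_seconds = x**2 / D
print(f"~{t_seconds / 86400:.0f} days")  # ~116 days: months, not millennia
```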


“The study provides one more piece of evidence that it’s possible to get magma from the mantle to the surface in very short order,” said John Pallister, who heads the U.S. Geological Survey (USGS) Volcano Disaster Assistance Program in Vancouver, Wash. “It tells us there’s a potentially shorter time span we need to worry about.”

Deep, fast-rising magma has been linked to other big events. In 1991, Mount Pinatubo in the Philippines spewed so much gas and ash into the atmosphere that it cooled Earth’s climate. In the weeks before the eruption, seismographs recorded hundreds of deep earthquakes that USGS geologist Randall White later attributed to magma rising from the mantle-crust boundary. In 2010, a chain of eruptions at Iceland’s Eyjafjallajökull volcano that caused widespread flight cancellations also indicated that some magma was coming from down deep. Small earthquakes set off by the eruptions suggested that the magma in Eyjafjallajökull’s last two explosions originated 12 miles and 15 miles below the surface, according to a 2012 study by University of Cambridge researcher Jon Tarasewicz in Geophysical Research Letters.

Volcanoes give off many warning signs before a blow-up. Their cones bulge with magma. They vent carbon dioxide and sulfur into the air, and throw off enough heat that satellites can detect their changing temperature. Below ground, tremors and other rumblings can be detected by seismographs. When Indonesia’s Mount Merapi roared to life in late October 2010, officials led a mass evacuation later credited with saving as many as 20,000 lives.

Still, the forecasting of volcanic eruptions is not an exact science. Even if more seismographs could be placed along the flanks of volcanoes to detect deep earthquakes, it is unclear if scientists would be able to translate the rumblings into a projected eruption date. Most problematically, many apparent warning signs do not lead to an eruption, putting officials in a bind over whether to evacuate nearby residents.

“[Several months] leaves a lot of room for error,” said Erik Klemetti, a volcanologist at Denison University who writes the “Eruptions” blog for Wired magazine. “In volcanic hazards you have very few shots to get people to leave.”

Scientists may be able to narrow the window by continuing to look for patterns between eruptions and the earthquakes that precede them. The Nature study also provides a real-world constraint for modeling how fast magma travels to the surface.

“If this interpretation is correct, you start having a speed limit that your models of magma transport have to catch,” said Tom Sisson, a USGS volcanologist based in Menlo Park, Calif.

Olivine minerals with nickel spikes similar to Irazú’s have been found in the ashes of arc volcanoes in Mexico, Siberia and the Cascades of the U.S. Pacific Northwest, said Lamont geochemist Susanne Straub, whose ideas inspired the study. “It’s clearly not a local phenomenon,” she said. The researchers are currently analyzing crystals from past volcanic eruptions in Alaska’s Aleutian Islands, Chile and Tonga, but are unsure how many will bear Irazú’s fast-rising magma signature. “Some may be capable of producing highways from hell and some may not,” said Ruprecht.

Simulations aiding study of earthquake dampers for structures

Earthquake-engineering researchers at the Harbin Institute of Technology in China work to set up a structure on a shake table for experiments to study the effects of earthquakes. Purdue University civil engineering students are working with counterparts at the institute to study the reliability of models for testing a type of powerful damping system that might be installed in buildings and bridges to reduce structural damage and injuries during earthquakes. – Photo courtesy of Harbin Institute of Technology

Researchers have demonstrated the reliability and efficiency of “real-time hybrid simulation” for testing a type of powerful damping system that might be installed in buildings and bridges to reduce structural damage and injuries during earthquakes.

The magnetorheological-fluid dampers are shock-absorbing devices containing a liquid that becomes far more viscous when a magnetic field is applied.

“It normally feels like a thick fluid, but when you apply a magnetic field it transforms into a peanut-butter consistency, which makes it generate larger forces when pushed through a small orifice,” said Shirley Dyke, a professor of mechanical engineering and civil engineering at Purdue University.

This dramatic increase in viscosity enables the devices to exert powerful forces and to modify a building’s stiffness in response to motion during an earthquake. The magnetorheological-fluid dampers, or MR dampers, have seen limited commercial use and are not yet being used routinely in structures.

Research led by Dyke and doctoral students Gaby Ou and Ali Ozdagli has now shown real-time hybrid simulations are reliable in studying the dampers. The research is affiliated with the National Science Foundation’s George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), a shared network of laboratories based at Purdue.

Dyke and her students are working with researchers at the Harbin Institute of Technology in China, home to one of only a few large-scale shake-table facilities in the world.

Findings will be discussed during the NEES Quake Summit 2013 on Aug. 7 and 8 in Reno. A research paper also was presented in May during a meeting in Italy related to a consortium called SERIES (Seismic Engineering Research Infrastructures for European Synergies). The paper was authored by Ou, Dyke, Ozdagli, and researchers Bin Wu and Bo Li from the Harbin Institute.

“The results indicate that the real-time hybrid simulation concept can be considered as a reliable and efficient testing method,” Ou said.

The simulations are referred to as hybrid because they combine computational models with data from physical tests.

“You have physical models and computational models being combined for one test,” Dyke said.

Researchers are able to perform structural tests at slow speed, but testing in real time – at the actual speed of an earthquake – sheds new light on how the MR dampers perform in structures. Real-time testing has only recently become feasible due to technological advances in computing.
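
Conceptually, the loop steps a numerical model of the structure through a ground-motion record in real time while, at each step, the physical specimen is driven to the computed state and its measured force is fed back into the equations of motion. The sketch below is a minimal single-story illustration under assumed values; measure_damper_force() stands in for the actuator-and-load-cell exchange with the physical damper and is not NEES software:

```python
import numpy as np

m, k, c = 1000.0, 4.0e5, 500.0   # mass (kg), stiffness (N/m), damping (N*s/m), assumed
dt = 0.001                       # integration step matching the real-time clock, s
ground_accel = 0.5 * np.sin(2 * np.pi * 1.5 * np.arange(0.0, 10.0, dt))  # stand-in record, m/s^2

def measure_damper_force(x, v):
    # Placeholder for the physical substructure: impose (x, v) via the
    # actuator, then read the load cell. Modeled here as a viscous device.
    return 2000.0 * v

x, v = 0.0, 0.0
for ag in ground_accel:
    f_damper = measure_damper_force(x, v)         # physical substructure feedback
    a = (-m * ag - c * v - k * x - f_damper) / m  # numerical substructure
    v += a * dt                                   # semi-implicit Euler update
    x += v * dt
print(f"final displacement: {x:.6f} m")
```

In a real test the integration, actuator command and force measurement must all complete within each time step, which is why this approach only became practical with recent computing advances.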

“Sometimes real-time testing is necessary, and that’s where we focus our efforts,” said Dyke, who organized a workshop on the subject to be held during the NEES meeting in Reno. “This hybrid approach is taking off lately. People are getting very excited about it.”

Ozdagli also is presenting related findings next week during the 2013 Conference of the ASCE Engineering Mechanics Institute in Evanston, Ill.

The simulations can be performed in conjunction with research using full-scale building tests. However, there are very few large-scale facilities in the world, and the testing is time-consuming and expensive.

“The real-time hybrid simulations allow you to do many tests to prepare for the one test using a full-scale facility,” Dyke said. “The nice thing is that you can change the numerical model any way you want. You can make it a four-story structure one day and the next day it’s a 10-story structure. You can test an unlimited number of cases with a single physical setup.”

The researchers will present two abstracts during the Reno meeting. One focuses on how the simulation method has been improved and the other describes the overall validation of real-time hybrid simulations.

To prove the reliability of the approach, the researchers are comparing pure computational models, pure physical shake-table tests and real-time hybrid simulations. Results from this three-way comparison are demonstrating that the hybrid simulations are accurate.

Ou has developed a mathematical approach to cancel out “noise” that makes it difficult to use testing data. She combined mathematical tools for a new “integrated control strategy” for the hybrid simulation.

“She found that by integrating several techniques in the right mix you can get better performance than in prior tests,” Dyke said.

The researchers have validated the simulations.

“It’s a viable method that can be used by other researchers for many different purposes and in many different laboratories,” Dyke said.

Ice-free Arctic winters could explain amplified warming during Pliocene

Year-round ice-free conditions across the surface of the Arctic Ocean could explain why the Earth was substantially warmer during the Pliocene Epoch than it is today, despite similar concentrations of carbon dioxide in the atmosphere, according to new research carried out at the University of Colorado Boulder.

In early May, instruments at the Mauna Loa Observatory in Hawaii marked a new record: The concentration of carbon dioxide climbed to 400 parts per million for the first time in modern history.

The last time researchers believe the carbon dioxide concentration in the atmosphere reached 400 ppm – between 3 and 5 million years ago, during the Pliocene – the Earth was about 3.5 to 9 degrees Fahrenheit (2 to 5 degrees Celsius) warmer than it is today. During that time period, trees overtook the tundra, sprouting right to the edges of the Arctic Ocean, and the seas swelled, pushing ocean levels 65 to 80 feet higher.

Scientists’ understanding of the climate during the Pliocene has largely been pieced together from fossil records preserved in sediments deposited beneath lakes and on the ocean floor.

“When we put 400 ppm carbon dioxide into a model, we don’t get as warm a planet as we see when we look at paleorecords from the Pliocene,” said Jim White, director of CU-Boulder’s Institute of Arctic and Alpine Research and co-author of the new study published online in the journal Palaeogeography, Palaeoclimatology, Palaeoecology. “That tells us that there may be something missing in the climate models.”

Scientists have proposed several hypotheses in the past to explain the warmer Pliocene climate. One idea, for example, was that the formation of the Isthmus of Panama, the narrow strip of land linking North and South America, could have altered ocean circulations during the Pliocene, forcing warmer waters toward the Arctic. But many of those hypotheses, including the Panama possibility, have not proved viable.

For the new study, led by Ashley Ballantyne, a former CU-Boulder doctoral student who is now an assistant professor of bioclimatology at the University of Montana, the research team decided to see what would happen if they forced the model to assume that the Arctic was free of ice in the winter as well as the summer during the Pliocene. Without these additional parameters, climate models set to emulate atmospheric conditions during the Pliocene show ice-free summers followed by a layer of ice reforming during the sunless winters.

“We tried a simple experiment in which we said, ‘We don’t know why sea ice might be gone all year round, but let’s just make it go away,’ ” said White, who also is a professor of geological sciences. “And what we found was that we got the right kind of temperature change and we got a dampened seasonal cycle, both of which are things we think we see in the Pliocene.”

In the model simulation, year-round ice-free conditions caused warmer conditions in the Arctic because the open water surface allowed for evaporation. Evaporation requires energy, and the water vapor then stored that energy as heat in the atmosphere. The water vapor also created clouds, which trapped heat near the planet’s surface.

“Basically, when you take away the sea ice, the Arctic Ocean responds by creating a blanket of water vapor and clouds that keeps the Arctic warmer,” White said.
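
The direction of that effect shows up even in a toy zero-dimensional energy balance: open water lowers the surface albedo, and the extra water vapor and cloud act like a stronger greenhouse (a lower effective emissivity). The numbers below are assumptions, and the toy omits heat transport entirely, so it exaggerates the magnitude; it is meant only to illustrate the sign of the response:

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W/m^2/K^4
S_ARCTIC = 180.0   # assumed mean high-latitude insolation, W/m^2

def surface_temp(albedo, eps_eff):
    # eps_eff < 1 mimics greenhouse trapping; moister, cloudier air -> lower eps_eff
    return (S_ARCTIC * (1 - albedo) / (eps_eff * SIGMA)) ** 0.25

t_ice = surface_temp(albedo=0.6, eps_eff=0.62)   # ice-covered: bright surface, drier air
t_open = surface_temp(albedo=0.1, eps_eff=0.58)  # open water: dark surface, moist and cloudy
print(f"ice-covered: {t_ice:.0f} K, open water: {t_open:.0f} K")  # open water comes out much warmer
```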

White and his colleagues are now trying to understand what types of conditions could bridge the standard model simulations with the simulations in which ice-free conditions in the Arctic are imposed. If they’re successful, computer models would be able to model the transition between a time when ice reformed in the winter to a time when the ocean remained devoid of ice throughout the year.

Such a model also would offer insight into what could happen in our future. Currently, about 70 percent of sea ice disappears during the summertime before reforming in the winter.

“We’re trying to understand what happened in the past but with a very keen eye to the future and the present,” White said. “The piece that we’re looking at in the future is what is going to happen as the Arctic Ocean warms up and becomes more ice-free in the summertime.

“Will we continue to return to an ice-covered Arctic in the wintertime? Or will we start to see some of the feedbacks that now aren’t very well represented in our climate models? If we do, that’s a big game changer.”

Sequestration and fuel reserves

A technique for trapping the greenhouse gas carbon dioxide deep underground could at the same time be used to release the last fraction of natural gas liquids from ailing reservoirs, thus offsetting some of the environmental impact of burning fossil fuels. So says a paper to be published in the peer-reviewed International Journal of Oil, Gas and Coal Technology.

While so-called “fracking” as a method for extracting previously untapped fossil fuel reserves has been in the headlines recently, there are alternatives for obtaining the remaining hydrocarbons from gas/condensate reservoirs, according to Kashy Aminian of West Virginia University in Morgantown, USA, and colleagues there and at Kuwait University in Safat.

Earlier experiments suggest that using carbon dioxide instead of nitrogen or methane to blast out the hydrocarbon stock from depleted reservoirs might be highly effective, with the added benefit of trapping, or sequestering, the carbon dioxide underground. Aminian and colleagues have calculated the economic benefits associated with the enhanced liquid recovery and demonstrated that the approach is technically and financially viable.

The team explains that mixing carbon dioxide with the condensate reservoir fluid reduces the saturation pressure, the liquid drop-out and the compressibility factor, boosting recovery of useful hydrocarbons and allowing the carbon dioxide to be trapped within. The team found that the process works well regardless of the characteristics of the reservoir or the rate at which the carbon dioxide is injected; the amount recovered remains just as high. Moreover, because of the compressibility of the carbon dioxide, it is possible to squeeze out 1.5 to 2 times the volume of reservoir gas for the amount of carbon dioxide pumped in, and once as much reservoir liquid as can be retrieved has been extracted, an additional 15% of carbon dioxide can be injected.
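
A back-of-the-envelope reading of those ratios (the injected volume is an invented example, and the “additional 15%” is read here as 15 percent more carbon dioxide by volume):

```python
co2_injected = 1.0e6                  # reservoir-condition volume units, assumed
recovered_low = 1.5 * co2_injected    # lower bound: 1.5x the injected CO2 volume
recovered_high = 2.0 * co2_injected   # upper bound: 2x
extra_co2 = 0.15 * co2_injected       # further CO2 storable after liquid recovery ends
print(f"gas recovered: {recovered_low:.2e} to {recovered_high:.2e}")
print(f"additional CO2 stored: {extra_co2:.2e}")
```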

A journey through Cuba’s culture and geology

Few destinations capture the imagination like Cuba, a forbidden fruit to U.S. citizens since the 1960s. Recently, 14 earth scientists from the U.S.-based Association for Women Geoscientists travelled there to explore its geology and culture.

The expedition is chronicled in the August issue of EARTH Magazine. While Cuba is an intriguing destination as an actor on the global political stage, its geological history captures events that tell scientists even more about the history of the planet.

While there, the scientists studied rocks that record the extraterrestrial impact implicated in the demise of the dinosaurs – including shocked quartz and tsunami deposits. The scientists also learned how local limestone was used to build forts intended to protect Cuba’s harbors from pirate attacks. Their guide even took them to sites that record the breakup of the supercontinent Pangaea. The rocks observed in Cuba have been shown to be closely related to those of the Mediterranean region.

Any earth scientist would agree the geologic history contained on this island is astounding. More importantly, these scientists visited Cuba to experience UNESCO World Heritage sites, and share in “people-to-people” experiences between two cultures that continue to be divided. Read more about the geological diversity of Cuba, including miles of underground cave networks and risks posed by a San Andreas-like fault at: http://bit.ly/152DT0u.

Don’t miss other exciting stories in this month’s issue of EARTH, available at the Digital Newsstand: http://www.earthmagazine.org/digital. Read about the improvements scientists are making in hurricane forecasts, water challenges faced by a tropical paradise, and the discovery of sauropod embryos in southern China.

Online tools accelerating earthquake-engineering progress

Santiago Pujol, at far left, a Purdue associate professor of civil engineering, surveys a private residence damaged in a Haiti earthquake. The building was among 170 surveyed by civil engineers studying the effects of the January 2010 earthquake. Such photos and research-related information regarding earthquakes are part of a database maintained and serviced by the National Science Foundation’s George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), based at Purdue. (Purdue University photo/Kari T. Nasi)
A publication-quality image is available at https://news.uns.purdue.edu/images/2013/hacker-cyberinfrastructure.jpg – (Purdue University photo/Kari T. Nasi)

A new study has found that online tools, access to experimental data and other services provided through “cyberinfrastructure” are helping to accelerate progress in earthquake engineering and science.

The research is affiliated with the National Science Foundation’s George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), based at Purdue University. NEES includes 14 laboratories for earthquake engineering and tsunami research, tied together with cyberinfrastructure to provide information technology for the network.

The cyberinfrastructure includes a centrally maintained, Web-based science gateway called NEEShub, which houses experimental results and makes them available for reuse by researchers, practitioners and educational communities.

“It’s a one-stop shopping site for the earthquake-engineering community to access really valuable intellectual contributions as well as experimental data generated from projects at the NEES sites,” said Thomas Hacker, an associate professor in the Department of Computer and Information Technology at Purdue and co-leader of information technology for NEES. “The NEES cyberinfrastructure provides critical information technology services in support of earthquake engineering research and helps to accelerate science and engineering progress in a substantial way.”

Findings from a recent study about cyberinfrastructure’s impact on the field were detailed in a paper published in a special issue of the Journal of Structural Engineering, which coincides with a NEES Quake Summit 2013 on Aug. 7-8 in Reno. The paper was authored by Hacker; Rudolf Eigenmann, a professor in Purdue’s School of Electrical and Computer Engineering; and Ellen Rathje, a professor in the Department of Civil, Architectural, and Environmental Engineering at the University of Texas, Austin.

A major element of the NEES cyberinfrastructure is a “project warehouse” that provides a place for researchers to upload project data, documents, papers and dissertations containing important experimental knowledge for the NEES community to access.

“A key factor in our efforts is the very strong involvement of experts in earthquake engineering and civil engineering in every aspect of our IT,” Hacker said. “The software we develop and services we provide are driven by user requirements prioritized by the community. This is an example of a large-scale cyberinfrastructure project that is really working to address big-data needs and developing technologies and solutions that work today. It’s a good example of how cyberinfrastructure can help knit together distributed communities of researchers into something greater than the sum of its parts.”

The effort requires two key aspects: technological elements and sociological elements.

“The technological elements include high-speed networks, laptops, servers and software,” he said. “The sociology includes the software-development process, the way we gather and prioritize user requirements and needs and our work with user communities. To be successful, a cyberinfrastructure effort needs to address both the technology and social elements, which has been our approach.”

The project warehouse and NEEShub collect “metadata,” or descriptive information about research, needed to ensure that the information can be accessed in the future.

“Say you have an experiment with sensors over a structure to collect data like voltages over time or force displacements over time,” Eigenmann said. “What’s important for context is not only the data collected, but from which sensor, when the experiment was conducted, where the sensor was placed on the structure. When someone comes along later to reuse the information they need the metadata.”
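
A minimal sketch of such a record; the field names below are illustrative assumptions, not the actual NEEShub schema:

```python
# Metadata that gives a sensor's time series its context for later reuse.
sensor_record_metadata = {
    "experiment_id": "shake-table-2013-04",  # which experiment (hypothetical ID)
    "sensor_id": "LVDT-07",                  # which sensor produced the data
    "quantity": "displacement",              # what was measured
    "units": "mm",
    "location": "column C2, second floor",   # where the sensor sat on the structure
    "start_time": "2013-04-12T09:30:00Z",    # when the experiment was conducted
    "sampling_rate_hz": 200,
}
```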

The resources are curated, meaning the data are organized in a fashion that ensures they haven’t been modified and are valid for reference in the future.

“We take extra steps to ensure the long-term integrity of the data,” Hacker said.

NEEShub contains more than 1.6 million project files stored in more than 398,000 project directories and has been shown to have at least 65,000 users over the past year. Other metrics information is available at http://nees.org/usage.

“We are seeing continued growth in the number of users,” Rathje said. “We are helping to facilitate and enable the discovery process. We have earthquake engineering experts and civil engineering experts closely involved with every aspect of our IT and cyberinfrastructure, and we are constantly getting feedback and prototyping.”

To help quantify the impact on research, projects are ranked by how many times they are downloaded. One project alone has had 3.3 million files downloaded.

“We have a curation dashboard for each project, which gives the curation status of the information so that users know whether it’s ready to be cited and used,” Hacker said.

The site also has a DOI, or digital object identifier, for each project.

“It’s like a permanent identifier that goes with the data set,” he said. “It gives you a permanent link to the data.”

NEES researchers will continue to study the impact of cyberinfrastructure on engineering and scientific progress.

“The use and adoption of cyberinfrastructure by a community is a process,” Hacker said. “At the beginning of the process we can measure the number of visitors and people accessing information. The ultimate impact of the cyberinfrastructure will be reflected in outcomes such as the number of publications that have benefited from using the cyberinfrastructure. It takes several years to follow that process and we are in the middle of that right now, but evidence points to a significant impact.”

Potential well water contaminants highest near natural gas drilling

Brian Fontenot, who earned his Ph.D. in quantitative biology from UT Arlington, worked with Kevin Schug, UT Arlington associate professor of chemistry and biochemistry, and a team of researchers to analyze samples from 100 private water wells. – UT Arlington

A new study of 100 private water wells in and near the Barnett Shale showed elevated levels of potential contaminants such as arsenic and selenium closest to natural gas extraction sites, according to a team of researchers that was led by UT Arlington associate professor of chemistry and biochemistry Kevin Schug.

The results of the North Texas well study were published online by the journal Environmental Science & Technology Thursday. The peer-reviewed paper focuses on the presence of metals such as arsenic, barium, selenium and strontium in water samples. Many of these heavy metals occur naturally at low levels in groundwater, but disturbances from natural gas extraction activities could cause them to occur at elevated levels.

“This study alone can’t conclusively identify the exact causes of elevated levels of contaminants in areas near natural gas drilling, but it does provide a powerful argument for continued research,” said Brian Fontenot, a UT Arlington graduate with a doctorate in quantitative biology and lead author on the new paper.

He added: “We expect this to be the first of multiple projects that will ultimately help the scientific community, the natural gas industry, and most importantly, the public, understand the effects of natural gas drilling on water quality.”

Researchers believe the increased presence of metals could be due to a variety of factors including: industrial accidents such as faulty gas well casings; mechanical vibrations from natural gas drilling activity disturbing particles in neglected water well equipment; or the lowering of water tables through drought or the removal of water used for the hydraulic fracturing process. Any of these scenarios could release dangerous compounds into shallow groundwater.

Researchers gathered samples from private water wells of varying depth within a 13-county area in or near the Barnett Shale in North Texas over four months in the summer and fall of 2011. Ninety-one samples were drawn from what they termed “active extraction areas,” or areas that had one or more gas wells within a five-kilometer radius. Another nine samples were taken from sites either inside the Barnett Shale but more than 14 kilometers from a natural gas drilling site, or from sites outside the Barnett Shale altogether. The locations of those sites were referred to as “non-active/reference areas” in the study.
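
In code form, the sampling scheme amounts to a simple distance-based classification; the thresholds come from the article, while the function and its inputs are illustrative assumptions:

```python
ACTIVE_RADIUS_KM = 5.0    # one or more gas wells within 5 km -> active extraction area
REFERENCE_MIN_KM = 14.0   # more than 14 km from any gas well -> reference candidate

def classify_well(km_to_nearest_gas_well, inside_barnett_shale):
    if km_to_nearest_gas_well <= ACTIVE_RADIUS_KM:
        return "active extraction area"
    if not inside_barnett_shale or km_to_nearest_gas_well > REFERENCE_MIN_KM:
        return "non-active/reference area"
    return "unclassified"  # inside the shale but between 5 and 14 km

print(classify_well(3.2, True))   # active extraction area
print(classify_well(20.0, True))  # non-active/reference area
```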

Researchers accepted no outside funding to ensure the integrity of the study. They compared the samples to historical data on water wells in these counties from the Texas Water Development Board groundwater database for 1989-1999, prior to the proliferation of natural gas drilling.

In addition to standard water quality tests, the researchers used gas chromatography-mass spectrometry (GC-MS), headspace gas chromatography (HS-GC) and inductively coupled plasma-mass spectrometry (ICP-MS). Many of the tests were conducted in the Shimadzu Center for Advanced Analytical Chemistry on the UT Arlington campus.

“Natural gas drilling is one of the most talked about issues in North Texas and throughout the country. This study was an opportunity for us to use our knowledge of chemistry and statistical analysis to put people’s concerns to the test and find out whether they would be backed by scientific data,” said Schug, who is also the Shimadzu Distinguished Professor of Analytical Chemistry in the UT Arlington College of Science.

On average, researchers detected the highest levels of these contaminants within 3 kilometers of natural gas wells, including several samples that had arsenic and selenium above levels considered safe by the Environmental Protection Agency. For example, 29 wells within the study’s active natural gas drilling area exceeded the EPA’s Maximum Contaminant Level of 10 micrograms per liter for arsenic, a potentially dangerous situation.
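
Checking a sample against that threshold is simple arithmetic; the sketch below uses invented concentrations, with the arsenic limit taken from the article:

```python
ARSENIC_MCL_UG_PER_L = 10.0  # EPA Maximum Contaminant Level for arsenic

samples_ug_per_l = {"well_01": 2.1, "well_02": 18.4, "well_03": 161.0}  # assumed values
for well, conc in samples_ug_per_l.items():
    if conc > ARSENIC_MCL_UG_PER_L:
        print(f"{well}: {conc} ug/L exceeds the limit by {conc / ARSENIC_MCL_UG_PER_L:.1f}x")
```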

The areas lying outside of active drilling areas or outside the Barnett Shale did not show the same elevated levels for most of the metals.

Other leaders of the Texas Gas Wells team were Laura Hunt, who conducted her post-doctoral research in biology at UT Arlington, and Zacariah Hildenbrand, who earned his doctorate in biochemistry from the University of Texas at El Paso and performed post-doctoral research at UT Southwestern Medical Center. Hildenbrand is also the founder of Inform Environmental, LLC. Fontenot and Hunt work for the EPA regional office in Dallas, but the study is unaffiliated with the EPA and both received permission to work on this project outside the agency.

Scientists note in the paper that they did not find uniformity among the contamination in the active natural gas drilling areas. In other words, not all gas well sites were associated with higher levels of the metals in well water.

Some of the most notable results were on the following heavy metals:

  • Arsenic occurs naturally in the region’s water and was detected in 99 of the 100 samples. But the concentrations of arsenic were significantly higher in the active extraction areas compared to non-extraction areas and historical data. The maximum concentration from an extraction area sample was 161 micrograms per liter, or 16 times the EPA safety standard set for drinking water. According to the EPA, people who drink water containing arsenic well in excess of the safety standard for many years “could experience skin damage or problems with their circulatory system, and may have an increased risk of getting cancer.”
  • Selenium was found in 10 samples near extraction sites, and all of those samples showed selenium levels were higher than the historical average. Two samples exceeded the standard for selenium set by the EPA. Circulation problems as well as hair or fingernail loss are some possible consequences of long-term exposure to high levels of selenium, according to the EPA.
  • Strontium was also found in almost all the samples, with concentrations significantly higher than historical levels in the areas of active gas extraction. A toxicological profile by the federal government’s Agency for Toxic Substances and Disease Registry recommends no more than 4,000 micrograms of strontium per liter in drinking water. Seventeen samples from the active extraction area and one from the non-active areas exceeded that recommended limit. Exposure to high levels of stable strontium can result in impaired bone growth in children, according to the toxic substances agency.

“After we put the word out about the study, we received numerous calls from landowner volunteers and their opinions about the natural gas drilling in their communities varied,” Hildenbrand said. “By participating in the study, they were able to get valuable data about their water, whether it be for household or land use.

“Their participation has been incredibly important to this study and has helped us bring to light some of the important environmental questions surrounding this highly contentious issue.”

The paper also recommends further research on levels of methanol and ethanol in water wells. Twenty-nine private water wells in the study contained methanol, with the highest concentrations in the active extraction areas. Twelve samples, four of which were from the non-active extraction sites, contained measurable ethanol. Both ethanol and methanol can occur naturally or as a result of industrial contamination.

Historical data on methanol and ethanol were not available, researchers said in the paper.

The paper is called “An evaluation of water quality in private drinking water wells near natural gas extraction sites in the Barnett Shale formation.” It is available on the Just Accepted page of the journal’s website. A YouTube interview with some of the study’s authors is available here: http://www.youtube.com/watch?v=H1_WDDtWR_k&feature=youtu.be.

Other co-authors include: Qinhong “Max” Hu, associate professor of earth and environmental sciences at UT Arlington; Doug D. Carlton Jr., a Ph.D. student in the chemistry and biochemistry department at UT Arlington; Hyppolite Oka, a recent graduate of the environmental and earth sciences master’s program at UT Arlington; Jayme L. Walton, a recent graduate of the biology master’s program at UT Arlington; and Dan Hopkins, of Carrollton-based Geotech Environmental Equipment, Inc.

Alexandria Osorio and Bryan Bjorndal of Assure Controls, Inc. in Vista, Calif., also are co-authors. The team used Assure’s Qwiklite system to test for toxicity in well samples and those results are being prepared for a separate publication.

Many from the research team are now conducting well water sampling in the Permian Basin region of Texas, establishing a baseline set of data prior to gas well drilling activities there. That baseline will be used for a direct comparison to samples that will be collected during and after upcoming natural gas extraction. The team hopes that these efforts will shed further light on the relationship between natural gas extraction and ground water quality.