No laughing matter: Nitrous oxide rose at end of last ice age

Researchers measured increases in atmospheric nitrous oxide concentrations about 16,000 to 10,000 years ago using ice from Taylor Glacier in Antarctica. – Adrian Schilt

Nitrous oxide (N2O) is an important greenhouse gas that doesn’t receive as much attention as carbon dioxide or methane, but a new study confirms that atmospheric levels of N2O rose significantly as the Earth came out of the last ice age, and identifies the cause.

An international team of scientists analyzed air extracted from bubbles enclosed in ancient polar ice from Taylor Glacier in Antarctica, allowing for the reconstruction of the past atmospheric composition. The analysis documented a 30 percent increase in atmospheric nitrous oxide concentrations from 16,000 years ago to 10,000 years ago. This rise in N2O was caused by changes in environmental conditions in the ocean and on land, scientists say, and contributed to the warming at the end of the ice age and the melting of large ice sheets that then existed.

The findings add an important new element to studies of how Earth may respond to a warming climate in the future. Results of the study, which was funded by the U.S. National Science Foundation and the Swiss National Science Foundation, are being published this week in the journal Nature.

“We found that marine and terrestrial sources contributed about equally to the overall increase of nitrous oxide concentrations and generally evolved in parallel at the end of the last ice age,” said lead author Adrian Schilt, who did much of the work as a post-doctoral researcher at Oregon State University. Schilt then continued to work on the study at the Oeschger Centre for Climate Change Research at the University of Bern in Switzerland.

“The end of the last ice age represents a partial analog to modern warming and allows us to study the response of natural nitrous oxide emissions to changing environmental conditions,” Schilt added. “This will allow us to better understand what might happen in the future.”

Nitrous oxide is perhaps best known as laughing gas, but it is also produced naturally by microbes on land and in the ocean, in processes that can be enhanced by human activity. Marine nitrous oxide production is linked closely to low-oxygen conditions in the upper ocean, and global warming is predicted to intensify the low-oxygen zones in many of the world’s ocean basins. N2O also destroys ozone in the stratosphere.

“Warming makes terrestrial microbes produce more nitrous oxide,” noted co-author Edward Brook, an Oregon State paleoclimatologist whose research team included Schilt. “Greenhouse gases go up and down over time, and we’d like to know more about why that happens and how it affects climate.”

Nitrous oxide is among the most difficult greenhouse gases to study in attempting to reconstruct the Earth’s climate history through ice core analysis. The specific technique that the Oregon State research team used requires large samples of pristine ice that date back to the desired time of study – in this case, between about 16,000 and 10,000 years ago.

The unusual way in which Taylor Glacier is configured allowed the scientists to extract ice samples from the surface of the glacier instead of drilling deep in the polar ice cap because older ice is transported upward near the glacier margins, said Brook, a professor in Oregon State’s College of Earth, Ocean, and Atmospheric Sciences.

The scientists were able to discern the contributions of marine and terrestrial nitrous oxide through analysis of isotopic ratios, which fingerprint the different sources of N2O in the atmosphere.
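Source attribution of this kind is often illustrated with a simple two-endmember mass balance. The sketch below shows the general technique only, not the study’s actual model, and the per-mil values are hypothetical placeholders:

```python
# Two-endmember isotope mass balance: given the isotopic signature of
# atmospheric N2O and the signatures of the marine and terrestrial
# endmembers, solve for the fraction contributed by each source.
# The delta values below are illustrative placeholders, not the
# study's measurements.

def source_fractions(delta_atm, delta_marine, delta_terrestrial):
    """Return (marine_fraction, terrestrial_fraction) from a linear
    two-source mixing model: delta_atm = f*d_mar + (1-f)*d_terr."""
    f_marine = (delta_atm - delta_terrestrial) / (delta_marine - delta_terrestrial)
    return f_marine, 1.0 - f_marine

# Hypothetical per-mil values chosen so the sources mix ~50/50,
# mirroring the study's "about equal" finding.
f_mar, f_terr = source_fractions(delta_atm=2.0, delta_marine=6.0, delta_terrestrial=-2.0)
print(f"marine: {f_mar:.0%}, terrestrial: {f_terr:.0%}")  # → marine: 50%, terrestrial: 50%
```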

“The scientific community knew roughly what the N2O concentration trends were prior to this study,” Brook said, “but these findings confirm that and provide more exact details about changes in sources. As nitrous oxide in the atmosphere continues to increase – along with carbon dioxide and methane – we now will be able to more accurately assess where those contributions are coming from and the rate of the increase.”

Atmospheric N2O was roughly 200 parts per billion at the peak of the ice age about 20,000 years ago, then rose to 260 ppb by 10,000 years ago. As of 2014, atmospheric N2O was measured at about 327 ppb, an increase attributed primarily to agricultural influences.
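The concentration figures quoted above can be checked with simple arithmetic:

```python
# Percent changes in atmospheric N2O implied by the concentrations
# quoted in the article (all in parts per billion).
glacial = 200          # ~20,000 years ago, peak of the ice age
early_holocene = 260   # ~10,000 years ago
modern = 327           # measured in 2014

deglacial_rise = (early_holocene - glacial) / glacial
print(f"deglacial rise: {deglacial_rise:.0%}")  # → deglacial rise: 30%

anthropogenic_rise = (modern - early_holocene) / early_holocene
print(f"rise since 10,000 years ago: {anthropogenic_rise:.0%}")
```

The 30 percent deglacial rise matches the figure reported by the team.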

Although the N2O increase at the end of the last ice age was almost equally attributable to marine and terrestrial sources, the scientists say, there were some differences.

“Our data showed that terrestrial emissions changed faster than marine emissions, which was highlighted by a fast increase of emissions on land that preceded the increase in marine emissions,” Schilt pointed out. “It appears to be a direct response to a rapid temperature change between 15,000 and 14,000 years ago.”

That finding underscores the complexity of analyzing how Earth responds to changing conditions; such analyses have to account for marine and terrestrial influences, natural variability, the behavior of different greenhouse gases, and a host of other factors, Brook said.

“Natural sources of N2O are predicted to increase in the future and this study will help us test predictions on how the Earth will respond,” Brook said.

Technology-dependent emissions of gas extraction in the US

The KIT measurement instrument on board a minivan directly measures atmospheric emissions on site with high temporal resolution. – Photo: F. Geiger/KIT

Not all boreholes are the same. Scientists of the Karlsruhe Institute of Technology (KIT) used mobile measurement equipment to analyze gaseous compounds emitted during the extraction of oil and natural gas in the USA. For the first time, organic pollutants emitted during a fracking process were measured at high temporal resolution. The highest values measured exceeded typical mean values in urban air by a factor of one thousand, as reported in the journal Atmospheric Chemistry and Physics (DOI: 10.5194/acp-14-10977-2014).

Emissions of trace gases from oil and gas fields were studied by the KIT researchers in the USA (Utah and Colorado) together with US institutes. Background concentrations and the waste gas plumes of individual extraction plants and fracking facilities were analyzed. The air quality measurements, which lasted several weeks, took place under the “Uintah Basin Winter Ozone Study” coordinated by the National Oceanic and Atmospheric Administration (NOAA).

The KIT measurements focused on health-damaging aromatic hydrocarbons in air, such as carcinogenic benzene. Maximum concentrations were determined in the waste gas plumes of boreholes. Some extraction plants emitted up to about a hundred times more benzene than others. The highest values, some milligrams of benzene per cubic meter of air, were measured downstream of an open fracking facility, where returning drilling fluid is stored in open tanks and basins. Emissions were far lower at oil and gas extraction plants with closed production processes. In Germany, benzene concentration in air is subject to strict limits: the Federal Emission Control Ordinance sets an annual benzene limit of five micrograms per cubic meter for the protection of human health, which the values now measured at the open fracking facility in the US exceeded by a factor of about one thousand. The researchers published their measurements in the journal Atmospheric Chemistry and Physics (ACP).
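The factor-of-one-thousand comparison follows directly from the units involved. A quick sketch, using an assumed 5 mg/m³ as a stand-in for the article’s “some milligrams”:

```python
# Unit check on the benzene figures: the German annual limit is
# 5 micrograms per cubic meter, while the highest values measured
# downstream of the open fracking facility were "some milligrams"
# per cubic meter. The 5 mg/m3 figure below is an illustrative
# stand-in, not a reported measurement.
limit_ug_m3 = 5.0                         # Federal Emission Control Ordinance annual limit
measured_mg_m3 = 5.0                      # assumed "some milligrams" value
measured_ug_m3 = measured_mg_m3 * 1000.0  # 1 mg = 1000 micrograms

factor = measured_ug_m3 / limit_ug_m3
print(f"measured value exceeds the limit by a factor of {factor:.0f}")  # → factor of 1000
```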

“Characteristic emissions of trace gases are encountered everywhere. These are symptomatic of oil and gas extraction. But the values measured for different technologies differ considerably,” explains Felix Geiger of the Institute of Meteorology and Climate Research (IMK) of KIT, one of the first authors of the study. By means of closed collection tanks and so-called vapor capture systems, for instance, the gases released during operation can be collected and reduced significantly.

“The gas fields in the sparsely populated areas of North America are a good showcase for estimating the range of impacts of different extraction and fracking technologies,” explains Professor Johannes Orphal, Head of IMK. “In densely populated Germany, framework conditions are much stricter and much more attention is paid to reducing and monitoring emissions.”

Fracking is increasingly discussed as a technology to extract fossil resources from unconventional deposits. Hydraulic fracturing of suitable shale layers opens up the fossil fuels stored there and makes them accessible for economically efficient use. For this purpose, boreholes are drilled into these rock formations and subjected to high pressure using large amounts of water and auxiliary materials, such as sand, cement, and chemicals. The oil or gas can then flow to the surface through the opened microstructures in the rock. Typically, the return flow of the aqueous fracking liquid with the dissolved oil and gas constituents lasts several days before the actual production phase of purer oil or natural gas begins. This return flow is collected and reused until it finally has to be disposed of. Air pollution depends mainly on how this return flow is treated at the extraction plant, and in this respect currently practiced fracking technologies differ considerably.

For the first time, the resulting local atmospheric emissions have now been studied at high temporal resolution. Based on the results, emissions can be assigned directly to the different sections of an extraction plant. For the measurements, a newly developed, compact, and highly sensitive KIT instrument, a so-called proton transfer reaction mass spectrometer (PTR-MS), was installed on board a minivan and driven to within a few tens of meters of the different extraction points. In this way, the waste gas plumes of individual extraction sources and fracking processes were studied in detail.

Warneke, C., Geiger, F., Edwards, P. M., Dube, W., Pétron, G., Kofler, J., Zahn, A., Brown, S. S., Graus, M., Gilman, J. B., Lerner, B. M., Peischl, J., Ryerson, T. B., de Gouw, J. A., and Roberts, J. M.: Volatile organic compound emissions from the oil and natural gas industry in the Uintah Basin, Utah: oil and gas well pad emissions compared to ambient air composition, Atmos. Chem. Phys., 14, 10977-10988, doi:10.5194/acp-14-10977-2014, 2014.

ASU, IBM move ultrafast, low-cost DNA sequencing technology a step closer to reality

Led by ASU Regents’ professor Stuart Lindsay, a team of scientists from Arizona State University’s Biodesign Institute and IBM’s T.J. Watson Research Center have developed a prototype DNA reader that could make whole genome profiling an everyday practice in medicine. – Biodesign Institute at Arizona State University

A team of scientists from Arizona State University’s Biodesign Institute and IBM’s T.J. Watson Research Center has developed a prototype DNA reader that could make whole genome profiling an everyday practice in medicine.

“Our goal is to put cheap, simple and powerful DNA and protein diagnostic devices into every single doctor’s office,” said Stuart Lindsay, an ASU physics professor and director of Biodesign’s Center for Single Molecule Biophysics. Such technology could help usher in the age of personalized medicine, where information from an individual’s complete DNA and protein profiles could be used to design treatments specific to their individual makeup.

Such game-changing technology is needed to make routine genome sequencing a reality. The current hurdle is to sequence a whole genome for less than $1,000, an amount for which insurance companies are more likely to provide reimbursement.

In their latest research breakthrough, the team fashioned a tiny DNA-reading device thousands of times smaller than the width of a single human hair.

The device is sensitive enough to distinguish the individual chemical bases of DNA (known by their abbreviated letters of A, C, T or G) when they are pumped past the reading head.

Proof of concept was demonstrated using solutions of the individual DNA bases, which gave clear signals at tiny DNA amounts (nanomolar concentrations), a sensitivity better than today’s state-of-the-art, so-called next-generation DNA sequencing technology.

Making the solid-state device is like making a sandwich, except with ultra-high-tech semiconductor tools slicing and stacking atomically thin layers in place of meats and cheeses. The secret is to slice and stack the layers just so, turning the chemical information of the DNA into a change in the electrical signal.

First, they made a “sandwich” composed of two metal electrodes separated by a two-nanometer-thick insulating layer (a nanometer is roughly one hundred-thousandth the width of a human hair), made using a semiconductor technology called atomic layer deposition.

Then a hole is cut through the sandwich: DNA bases inside the hole are read as they pass the gap between the metal layers.

“The technology we’ve developed might just be the first big step in building a single-molecule sequencing device based on ordinary computer chip technology,” said Lindsay.

“Previous attempts to make tunnel junctions for reading DNA had one electrode facing another across a small gap between the electrodes, and the gaps had to be adjusted by hand. This made it impossible to use computer chip manufacturing methods to make devices,” said Lindsay.

“Our approach of defining the gap using a thin layer of dielectric (insulating) material between the electrodes and exposing this gap by drilling a hole through the layers is much easier,” he said. “What is more, the recognition tunneling technology we have developed allows us to make a relatively large gap (of two nanometers) compared to the much smaller gaps required previously for tunnel current read-out (which were less than a single nanometer wide). The ability to use larger gaps for tunneling makes the manufacture of the device much easier and gives DNA molecules room to pass the electrodes.”

Specifically, when a current is passed through the nanopore, as the DNA passes through, it causes a spike in the current unique to each chemical base (A, C, T or G) within the DNA molecule. A few more modifications are made to polish and finish the device manufacturing.

The team encountered considerable device-to-device variation, so calibration will be needed to make the technology more robust. And the final big step – of reducing the diameter of the hole through the device to that of a single DNA molecule – has yet to be taken.

But overall, the research team has developed a scalable manufacturing process to make a device that can work reliably for hours at a time, identifying each of the DNA chemical bases while flowing through the two-nanometer gap.

The research team is also working on modifying the technique to read other single molecules, which could be used in an important technology for drug development.

The latest developments could also bring in big business for ASU. Lindsay, dubbed a “serial entrepreneur” by the media, has a new spinout venture, called Recognition Analytix, that hopes to follow the success of Molecular Imaging Corp, a similar instrument company he co-founded in 1993, and sold to Agilent Technologies in 2005.

Geologists discover ancient buried canyon in South Tibet

This photo shows the Yarlung Tsangpo Valley close to the Tsangpo Gorge, where it is rather narrow and underlain by only about 250 meters of sediments. The mountains in the upper left corner belong to the Namche Barwa massif. Previously, scientists had suspected that the debris deposited by a glacier in the foreground was responsible for the formation of the steep Tsangpo Gorge — the new discoveries falsify this hypothesis. – Ping Wang

A team of researchers from Caltech and the China Earthquake Administration has discovered an ancient, deep canyon buried along the Yarlung Tsangpo River in south Tibet, north of the eastern end of the Himalayas. The geologists say that the ancient canyon–thousands of feet deep in places–effectively rules out a popular model used to explain how the massive and picturesque gorges of the Himalayas became so steep, so fast.

“I was extremely surprised when my colleagues, Jing Liu-Zeng and Dirk Scherler, showed me the evidence for this canyon in southern Tibet,” says Jean-Philippe Avouac, the Earle C. Anthony Professor of Geology at Caltech. “When I first saw the data, I said, ‘Wow!’ It was amazing to see that the river once cut quite deeply into the Tibetan Plateau because it does not today. That was a big discovery, in my opinion.”

Geologists like Avouac and his colleagues, who are interested in tectonics–the study of the earth’s surface and the way it changes–can use tools such as GPS and seismology to study crustal deformation that is taking place today. But if they are interested in studying changes that occurred millions of years ago, such tools are not useful because the activity has already happened. In those cases, rivers become a main source of information because they leave behind geomorphic signatures that geologists can interrogate to learn about the way those rivers once interacted with the land–helping them to pin down when the land changed and by how much, for example.

“In tectonics, we are always trying to use rivers to say something about uplift,” Avouac says. “In this case, we used a paleocanyon that was carved by a river. It’s a nice example where by recovering the geometry of the bottom of the canyon, we were able to say how much the range has moved up and when it started moving.”

The team reports its findings in the current issue of Science.

Last year, civil engineers from the China Earthquake Administration collected cores by drilling into the valley floor at five locations along the Yarlung Tsangpo River. Shortly after, former Caltech graduate student Jing Liu-Zeng, who now works for that administration, returned to Caltech as a visiting associate and shared the core data with Avouac and Dirk Scherler, then a postdoc in Avouac’s group. Scherler had previously worked in the far western Himalayas, where the Indus River has cut deeply into the Tibetan Plateau, and immediately recognized that the new data suggested the presence of a paleocanyon.

Liu-Zeng and Scherler analyzed the core data and found that at several locations there were sedimentary conglomerates (rounded gravel and larger rocks cemented together, of the kind associated with flowing rivers) down to a depth of about 800 meters, at which point the record clearly indicated bedrock. This suggested that the river once carved deeply into the plateau.

To establish when the river switched from incising bedrock to depositing sediments, they measured two isotopes, beryllium-10 and aluminum-26, in the lowest sediment layer. These isotopes are produced when rocks and sediment are exposed to cosmic rays at the surface, and they decay at different rates once buried; this allowed the geologists to determine that the paleocanyon started to fill with sediment about 2.5 million years ago.
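Burial dating with this isotope pair is a standard technique: aluminum-26 decays roughly twice as fast as beryllium-10, so their ratio falls predictably after burial. The sketch below uses published half-lives and a typical surface production ratio, but the measured ratio is an illustrative placeholder, not a value from the study:

```python
import math

# Cosmogenic-nuclide burial dating: 26Al decays faster than 10Be, so
# the 26Al/10Be ratio in buried sediment falls predictably with time.
# Half-lives are published values; the measured ratio passed in below
# is an illustrative placeholder, not data from the study.
T_HALF_BE10 = 1.387e6   # years
T_HALF_AL26 = 0.705e6   # years
SURFACE_RATIO = 6.75    # typical 26Al/10Be production ratio at the surface

def burial_age(measured_ratio):
    """Years of burial implied by a measured 26Al/10Be ratio."""
    lam_be = math.log(2) / T_HALF_BE10
    lam_al = math.log(2) / T_HALF_AL26
    return math.log(SURFACE_RATIO / measured_ratio) / (lam_al - lam_be)

# A measured ratio around 2.0 would imply roughly 2.5 Myr of burial,
# the age the study reports for the canyon fill.
print(f"{burial_age(2.0)/1e6:.1f} Myr")  # → 2.5 Myr
```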

The researchers’ reconstruction of the former valley floor showed that the slope of the river once increased gradually from the Gangetic Plain to the Tibetan Plateau, with no sudden changes, or knickpoints. Today, the river, like most others in the area, has a steep knickpoint where it meets the Himalayas, at a place known as the Namche Barwa massif. There, the uplift of the mountains is extremely rapid (on the order of 1 centimeter per year, whereas in other areas 5 millimeters per year is more typical) and the river drops by 2 kilometers in elevation as it flows through the famous Tsangpo Gorge, known by some as the Yarlung Tsangpo Grand Canyon because it is so deep and long.

Combining the depth and age of the paleocanyon with the geometry of the valley, the geologists surmised that the river existed in this location prior to about 3 million years ago, but at that time, it was not affected by the Himalayas. However, as the Indian and Eurasian plates continued to collide and the mountain range pushed northward, it began impinging on the river. Suddenly, about 2.5 million years ago, a rapidly uplifting section of the mountain range got in the river’s way, damming it, and the canyon subsequently filled with sediment.

“This is the time when the Namche Barwa massif started to rise, and the gorge developed,” says Scherler, one of two lead authors on the paper and now at the GFZ German Research Center for Geosciences in Potsdam, Germany.

That picture of the river and the Tibetan Plateau, which involves the river incising deeply into the plateau millions of years ago, differs quite a bit from the typically accepted geologic vision. Typically, geologists believe that when rivers start to incise into a plateau, they eat at the edges, slowly making their way into the plateau over time. However, the rivers flowing across the Himalayas all have strong knickpoints and have not incised much at all into the Tibetan Plateau. Therefore, the thought has been that the rapid uplift of the Himalayas has pushed the rivers back, effectively pinning them, so that they have not been able to make their way into the plateau. But that explanation does not work with the newly discovered paleocanyon.

The team’s new hypothesis also rules out a model that has been around for about 15 years, called tectonic aneurysm, which suggests that the rapid uplift seen at the Namche Barwa massif was triggered by intense river incision. In tectonic aneurysm, a river cuts down through the earth’s crust so fast that it causes the crust to heat up, making a nearby mountain range weaker and facilitating uplift.

The model is popular among geologists, and indeed Avouac himself published a modeling paper in 1996 that showed the viability of the mechanism. “But now we have discovered that the river was able to cut into the plateau way before the uplift happened,” Avouac says, “and this shows that the tectonic aneurysm model was actually not at work here. The rapid uplift is not a response to river incision.”

###

The other lead author on the paper, “Tectonic control of the Yarlung Tsangpo Gorge, revealed by a 2.5 Myr old buried canyon in Southern Tibet,” is Ping Wang of the State Key Laboratory of Earthquake Dynamics, in Beijing, China. Additional authors include Jürgen Mey, of the University of Potsdam, in Germany; and Yunda Zhang and Dingguo Shi of the Chengdu Engineering Corporation, in China. The work was supported by the National Natural Science Foundation of China, the State Key Laboratory for Earthquake Dynamics, and the Alexander von Humboldt Foundation.

Climate capers of the past 600,000 years

The researchers remove samples from a core segment taken from Lake Van at the center for Marine environmental sciences MARUM in Bremen, where all of the cores from the PALEOVAN project are stored. – Photo: Nadine Pickarski/Uni Bonn

If you want to see into the future, you have to understand the past. An international consortium of researchers under the auspices of the University of Bonn has drilled into deposits on the bed of Lake Van (Eastern Turkey) that provide unique insights into the last 600,000 years. The samples reveal that the climate has done its fair share of mischief-making in the past, and that there have been numerous earthquakes and volcanic eruptions. The results of the drilling project also provide a basis for assessing how dangerous such natural hazards are for today’s population. In a special edition of the highly regarded publication Quaternary Science Reviews, the scientists have now published their findings in a number of journal articles.

In the sediments of Lake Van, the lighter-colored, lime-containing summer layers are clearly distinguishable from the darker, clay-rich winter layers; together these annual pairs are called varves. In 2010, an international consortium of researchers drilled from a floating platform a 220-meter-deep sediment profile from the lake floor at a water depth of 360 meters and analyzed the varves. The samples they recovered are a unique scientific treasure, because the climate conditions, earthquakes and volcanic eruptions of the past 600,000 years can be read in outstanding quality from the cores.

The team of scientists under the auspices of the University of Bonn has analyzed some 5,000 samples in total. “The results show that the climate over the past hundred thousand years has been a roller coaster. Within just a few decades, the climate could tip from an ice age into a warm period,” says Doctor Thomas Litt of the University of Bonn’s Steinmann Institute and spokesman for the PALEOVAN international consortium of researchers. Unbroken continental climate archives from the ice age which encompass several hundred thousand years are extremely rare on a global scale. “There has never before in all of the Middle East and Central Asia been a continental drilling operation going so far back into the past,” says Doctor Litt. In the northern hemisphere, climate data from ice-cores drilled in Greenland encompass the last 120,000 years. The Lake Van project closes a gap in the scientific climate record.

The sediments reveal six cycles of cold and warm periods


Scientists found evidence for a total of six cycles of warm and cold periods in the sediments of Lake Van. The University of Bonn paleoecologist and his colleagues analyzed the pollen preserved in the sediments. Under a microscope they were able to determine which plants around the eastern Anatolian Lake the pollen came from. “Pollen is amazingly durable and is preserved over very long periods when protected in the sediments,” Doctor Litt explained. Insight into the age of the individual layers was gleaned through radiometric age measurements that use the decay of radioactive elements as a geologic clock. Based on the type of pollen and the age, the scientists were able to determine when oak forests typical of warm periods grew around Lake Van and when ice-age steppe made up of grasses, mugwort and goosefoot surrounded the lake.

Once they determine the composition of the vegetation present and the requirements of the plants, the scientists can reconstruct with a high degree of accuracy the temperature and amount of rainfall during different epochs. These analyses enable the team of researchers to read the varves of Lake Van like thousands of pages of an archive. With these data, the team was able to demonstrate that fluctuations in climate were due in large part to periodic changes in the Earth’s orbit parameters and the commensurate changes in solar insolation levels. However, the influence of North Atlantic currents was also evident. “The analysis of the Lake Van sediments has presented us with an image of how an ecosystem reacts to abrupt changes in climate. This fundamental data will help us to develop potential scenarios of future climate effects,” says Doctor Litt.

Risks of earthquakes and volcanic eruptions in the region of Van

Such risk assessments can also be made for other natural forces. “Deposits of volcanic ash with thicknesses of up to 10 m in the Lake Van sediments show us that approximately 270,000 years ago there was a massive eruption,” the University of Bonn paleoecologist said. The team identified some 300 distinct volcanic events in its drill cores. Statistically, that corresponds to one explosive volcanic eruption in the region every 2,000 years. Deformations in the sediment layers show that the area is subject to frequent, strong earthquakes. “The area around Lake Van is very densely populated. The data from the core samples show that volcanic activity and earthquakes present a relatively high risk for the region,” Doctor Litt says. According to media reports, in 2011 a magnitude-7.2 earthquake in Van province claimed the lives of more than 500 people and injured more than 2,500.
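The recurrence estimate behind the “one eruption every 2,000 years” statement is straightforward arithmetic:

```python
# Recurrence-interval arithmetic: ~300 volcanic event layers in a
# sediment record spanning ~600,000 years.
record_years = 600_000
events = 300
recurrence = record_years / events
print(f"about one explosive eruption every {recurrence:.0f} years")  # → every 2000 years
```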

Publication: “Results from the PALEOVAN drilling project: A 600,000 year long continental archive in the Near East”, Quaternary Science Reviews, Volume 104, online publication: (http://dx.doi.org/10.1016/j.quascirev.2014.09.026)

Rare 2.5-billion-year-old rocks reveal hot spot of sulfur-breathing bacteria

Gold miners prospecting in a mountainous region of Brazil drilled this 590-foot cylinder of bedrock from the Neoarchaean Eon, which provides rare evidence of conditions on Earth 2.5 billion years ago. – Alan J. Kaufman

Wriggle your toes in a marsh’s mucky bottom sediment and you’ll probably inhale a rotten egg smell, the distinctive odor of hydrogen sulfide gas. That’s the biochemical signature of sulfur-using bacteria, one of Earth’s most ancient and widespread life forms.

Among scientists who study the early history of our 4.5 billion-year-old planet, there is a vigorous debate about the evolution of sulfur-dependent bacteria. These simple organisms arose at a time when oxygen levels in the atmosphere were less than one-thousandth of what they are now. Living in ocean waters, they respired (or breathed in) sulfate, a form of sulfur, instead of oxygen. But how did that sulfate reach the ocean, and when did it become abundant enough for living things to use it?

New research by University of Maryland geology doctoral student Iadviga Zhelezinskaia offers a surprising answer. Zhelezinskaia is the first researcher to analyze the biochemical signals of sulfur compounds found in 2.5 billion-year-old carbonate rocks from Brazil. The rocks were formed on the ocean floor in a geologic time known as the Neoarchaean Eon. They surfaced when prospectors drilling for gold in Brazil punched a hole into bedrock and pulled out a 590-foot-long core of ancient rocks.

In research published Nov. 7, 2014 in the journal Science, Zhelezinskaia and three co-authors–physicist John Cliff of the University of Western Australia and geologists Alan Kaufman and James Farquhar of UMD–show that bacteria dependent on sulfate were plentiful in some parts of the Neoarchaean ocean, even though sea water typically contained about 1,000 times less sulfate than it does today.

“The samples Iadviga measured carry a very strong signal that sulfur compounds were consumed and altered by living organisms, which was surprising,” says Farquhar. “She also used basic geochemical models to give an idea of how much sulfate was in the oceans, and finds the sulfate concentrations are very low, much lower than previously thought.”

Geologists study sulfur because it is abundant and combines readily with other elements, forming compounds stable enough to be preserved in the geologic record. Sulfur has four naturally occurring stable isotopes–atomic signatures left in the rock record that scientists can use to identify the element’s different forms. Researchers measuring sulfur isotope ratios in a rock sample can learn whether the sulfur came from the atmosphere, weathering rocks or biological processes. From that information about the sulfur sources, they can deduce important information about the state of the atmosphere, oceans, continents and biosphere when those rocks formed.
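The isotope-ratio measurements described above are conventionally reported in "delta" notation: the per-mil deviation of a sample's ratio from a reference standard. A minimal sketch of that arithmetic, with a made-up sample ratio for illustration (the reference value is the commonly cited 34S/32S ratio of the Vienna-Canyon Diablo Troilite standard; this is not the study's own code):

```python
# Delta notation for sulfur isotopes: per-mil deviation of a sample's
# 34S/32S ratio from the V-CDT reference standard.
VCDT_34S_32S = 0.0441626  # 34S/32S of the V-CDT standard

def delta34S(ratio_sample, ratio_standard=VCDT_34S_32S):
    """Per-mil (parts per thousand) deviation from the standard."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

# Microbial sulfate reducers preferentially consume the lighter 32S,
# leaving the sulfide they produce depleted in 34S (negative delta).
# The sample ratio below is hypothetical, chosen only to illustrate
# the strongly negative values that signal biological fractionation.
print(delta34S(0.0428))  # roughly -31 per mil: a biological signature
```

Large spreads between the delta values of coexisting sulfate and sulfide are exactly the kind of "very strong signal" of biological processing that Farquhar describes.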

Farquhar and other researchers have used sulfur isotope ratios in Neoarchaean rocks to show that soon after this period, Earth’s atmosphere changed. Oxygen levels soared from just a few parts per million to almost their current level, which is around 21 percent of all the gases in the atmosphere. The Brazilian rocks Zhelezinskaia sampled show only trace amounts of oxygen, a sign they were formed before this atmospheric change.

With very little oxygen, the Neoarchaean Earth was a forbidding place for most modern life forms. The continents were probably much drier and dominated by volcanoes that released sulfur dioxide, carbon dioxide, methane and other greenhouse gases. Temperatures probably ranged between 0 and 100 degrees Celsius (32 to 212 degrees Fahrenheit), warm enough for liquid oceans to form and microbes to grow in them.

Rocks 2.5 billion years old or older are extremely rare, so geologists’ understanding of the Neoarchaean is based on a handful of samples from a few small areas, such as Western Australia, South Africa and Brazil. Geologists theorize that Western Australia and South Africa were once part of an ancient supercontinent called Vaalbara. The Brazilian rock samples are comparable in age, but they may not be from the same supercontinent, Zhelezinskaia says.

Most of the Neoarchaean rocks studied are from Western Australia and South Africa and are black shale, which forms when fine dust settles on the sea floor. The Brazilian prospector’s core contains plenty of black shale and a band of carbonate rock, formed below the surface of shallow seas, in a setting that probably resembled today’s Bahama Islands. Black shale usually contains sulfur-bearing pyrite, but carbonate rock typically does not, so geologists have not focused on sulfur signals in Neoarchaean carbonate rocks until now.

Zhelezinskaia “chose to look at a type of rock that others generally avoided, and what she saw was spectacularly different,” said Kaufman. “It really opened our eyes to the implications of this study.”

The Brazilian carbonate rocks’ isotopic ratios showed they formed in ancient seabed containing sulfate from atmospheric sources, not continental rock. The isotopic ratios also showed that Neoarchaean bacteria were plentiful in the sediment, respiring sulfate and emitting hydrogen sulfide–the same process that goes on today as bacteria recycle decaying organic matter into minerals and gases.

How could the sulfur-dependent bacteria have thrived during a geologic time when sulfur levels were so low? “It seems that they were in shallow water, where evaporation may have been high enough to concentrate the sulfate, and that would make it abundant enough to support the bacteria,” says Zhelezinskaia.

Zhelezinskaia is now analyzing carbonate rocks of the same age from Western Australia and South Africa, to see if the pattern holds true for rocks formed in other shallow water environments. If it does, the results may change scientists’ understanding of one of Earth’s earliest biological processes.

“There is an ongoing debate about when sulfate-reducing bacteria arose and how that fits into the evolution of life on our planet,” says Farquhar. “These rocks are telling us the bacteria were there 2.5 billion years ago, and they were doing something significant enough that we can see them today.”

###

This research was supported by the Fulbright Program (Grantee ID 15110620), the NASA Astrobiology Institute (Grant No. NNA09DA81A) and the National Science Foundation Frontiers in Earth-System Dynamics program (Grant No. 432129). The content of this article does not necessarily reflect the views of these organizations.

“Large sulfur isotope fractionations associated with Neoarchaean microbial sulfate reductions,” Iadviga Zhelezinskaia, Alan J. Kaufman, James Farquhar and John Cliff, was published Nov. 7, 2014 in Science. Download the abstract after 2 p.m. U.S. Eastern time, Nov. 6, 2014: http://www.sciencemag.org/lookup/doi/10.1126/science.1256211

James Farquhar home page

http://www.geol.umd.edu/directory.php?id=13

Alan J. Kaufman home page

http://www.geol.umd.edu/directory.php?id=15

Iadviga Zhelezinskaia home page

http://www.geol.umd.edu/directory.php?id=66

Media Relations Contact: Abby Robinson, 301-405-5845, abbyr@umd.edu

Writer: Heather Dewar

Space-based methane maps find largest US signal in Southwest

An unexpectedly high amount of the climate-changing gas methane, the main component of natural gas, is escaping from the Four Corners region in the U.S. Southwest, according to a new study by the University of Michigan and NASA.

The researchers mapped satellite data to uncover the nation’s largest methane signal seen from space. They measured levels of the gas emitted from all sources, and found more than half a teragram per year coming from the area where Arizona, New Mexico, Colorado and Utah meet. That’s about as much methane as the entire coal, oil, and gas industries of the United Kingdom give off each year.

Four Corners sits on North America’s most productive coalbed methane basin. Coalbed methane is a variety of the gas that’s stuck to the surface of coal. It is dangerous to miners (not to mention canaries), but in recent decades, it’s been tapped as a resource.

“There’s so much coalbed methane in the Four Corners area, it doesn’t need to be that crazy of a leak rate to produce the emissions that we see. A lot of the infrastructure is likely contributing,” said Eric Kort, assistant professor of atmospheric, oceanic and space sciences at the U-M College of Engineering.

Kort, first author of a paper on the findings published in Geophysical Research Letters, says the controversial natural gas extraction technique of hydraulic fracturing is not the main culprit.

“We see this large signal and it’s persistent since 2003,” Kort said. “That’s a pre-fracking timeframe in this region. While fracking has become a focal point in conversations about methane emissions, it certainly appears from this and other studies that in the U.S., fossil fuel extraction activities across the board likely emit higher than inventory estimates.”

While the signal represents the highest concentration of methane seen from space, the researchers caution that Four Corners isn’t necessarily the highest emitting region.

“One has to be somewhat careful in equating abundances with emissions,” said study contributor Christian Frankenberg at Jet Propulsion Laboratory. “The Four Corners methane source is in a relatively isolated area with little other methane emissions, hence causing a well distinguishable hot-spot in methane abundances. Local or more diffuse emissions in other areas, such as the eastern U.S., may be convoluted with other nearby sources.”

Natural gas is often touted as more sustainable than coal and oil because it releases fewer pollutants when it burns. But when it leaks into the air before it gets to the pilot light, methane has 30 times the short-term heat-trapping effects of carbon dioxide. Policymakers, energy companies and environmentalists alike are aiming to reduce methane emissions as a way to curb climate change. But pinpointing plumes—a first step to stopping them—has been a difficult task with today’s tools.
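The scale of the Four Corners signal can be put in CO2-equivalent terms with back-of-envelope arithmetic, using the article's two figures: "more than half a teragram" of methane per year (0.5 Tg taken here as a lower bound) and methane's roughly 30-fold short-term heat-trapping advantage over carbon dioxide. This is illustrative arithmetic, not the study's own calculation:

```python
# Rough lower-bound estimate of the Four Corners methane signal in
# CO2-equivalent terms. Both inputs come from the article; the choice
# of 0.5 Tg as a lower bound is this sketch's assumption.
METHANE_TG_PER_YEAR = 0.5   # "more than half a teragram per year"
SHORT_TERM_GWP = 30         # methane vs. CO2, short-term, per the article

co2_equiv_tg = METHANE_TG_PER_YEAR * SHORT_TERM_GWP
print(co2_equiv_tg)  # 15.0 Tg (15 million metric tons) CO2-equivalent/year
```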

The research team demonstrated a new approach to finding leaks. They used a satellite instrument—the European Space Agency’s SCIAMACHY—to get regional methane measurements over the entire United States. They ran the data through a mathematical model to account for mountains and valleys, which can trap methane. That’s how they identified the anomaly at Four Corners. Then they zoomed in on that region and ran another mathematical model to control for wind, to make sure that didn’t negate the original signal. It didn’t.

“We didn’t know this was a region we should look at. We found it from space,” Kort said. “We’ve demonstrated that satellite measurements can help identify, locate and quantify anomalous methane emissions in regions that are unexpected.”

Methane gets into the atmosphere from both natural and human-made sources. Wetlands and landfills release it, as do certain bacteria. Agriculture is a big contributor. So are gas and oil drilling and distribution. Inventories such as those the EPA compiles make estimates based on measurements from a sampling of these sources. In previous work, air measurements from planes and a sparse network of monitoring towers have revealed that the inventory-based numbers are coming in low—roughly 50 percent low. But towers and planes can’t see everywhere to figure out exactly where all the methane is coming from. With limited observations there can be blind spots, the researchers say.

This study used satellite data from 2003 to 2009. In later years, they were able to validate the satellite measurements with a year of ground-based data.

SCIAMACHY is no longer operating, so there aren’t equivalent satellites to provide this information for other parts of the world. For the Four Corners region, Kort will be taking readings from an airplane next year, to get even closer to identifying the leaks.

###

The study is titled “Four Corners: the largest US methane anomaly viewed from space.” The research was funded by NASA and Los Alamos National Lab.

Eric Kort: http://aoss.engin.umich.edu/people/eakort

Abstract: http://onlinelibrary.wiley.com/doi/10.1002/2014GL061503/abstract

Team advances understanding of the Greenland Ice Sheet’s meltwater channels

An international team of researchers deployed to western Greenland to study the melt rates of the Greenland Ice Sheet. – Matt Hoffman, Los Alamos National Laboratory

An international research team’s field work, drilling and measuring melt rates and ice sheet movement in Greenland is showing that things are, in fact, more complicated than we thought.

“Although the Greenland Ice Sheet initially speeds up each summer in its slow-motion race to the sea, the network of meltwater channels beneath the sheet is not necessarily forming the slushy racetrack that had been previously considered,” said Matthew Hoffman, a Los Alamos National Laboratory scientist on the project.

A high-profile paper appearing in Nature this week notes that observations of moulins (vertical conduits connecting water on top of the glacier down to the bed of the ice sheet) and boreholes in Greenland show that subglacial channels ameliorate the speedup caused by water delivery to the base of the ice sheet in the short term. By midsummer, however, the channels stabilize and are unable to grow any larger. In a previous paper appearing in Science, researchers had posited that the undersheet channels were not even a consideration in Greenland, but, as often happens in science, more data fill in the complex mosaic of facts and clarify the evolution of meltwater flow rates over the seasons.

In reality, the two papers are not inconsistent – they examine different places at different times – and both agree that channelization is less important than previously assumed, said Hoffman.

The Greenland Ice Sheet’s movement speeds up each summer as melt from the surface penetrates kilometer-thick ice through moulins, lubricating the bed of the ice sheet. Greater melt is predicted for Greenland in the future, but its impact on ice sheet flux and associated sea level rise is uncertain: direct observations of the subglacial drainage system are lacking and its evolution over the melt season is poorly understood.

“Everyone wants to know what’s happening under Greenland as it experiences more and more melt,” said study coauthor Ginny Catania, a research scientist at the institute and an associate professor in the University of Texas at Austin’s Jackson School of Geosciences. “This subglacial plumbing may or may not be critical for sea level rise in the next 100 years, but we don’t really know until we fully understand it.”

To resolve these unknowns, the research team drilled and instrumented 13 boreholes through 700-meter-thick ice in west Greenland. There they performed the first combined analysis of Greenland ice velocity and water pressure in moulins and boreholes, and they determined that moulin water pressure does not lower over the latter half of the melt season, indicating a limited role of high-efficiency channels in subglacial drainage.

Instead, they found that boreholes monitor a hydraulically isolated region of the bed, but decreasing water pressure seen in some boreholes can explain the decreasing ice velocity seen over the melt season.

“Like loosening the seal of a bathtub drain, the hydrologic changes that occur each summer may cause isolated pockets of pressurized water to slowly drain out from under the ice sheet, resulting in more friction,” said Hoffman.

Their observations identify a previously unrecognized role of changes in hydraulically isolated regions of the bed in controlling evolution of subglacial drainage over summer. Understanding this process will be crucial for predicting the effect of increasing melt on summer speedup and associated autumn slowdown of the ice sheet into the future.

###

The research letter is published in this week’s Nature magazine as “Direct observations of evolving subglacial drainage beneath the Greenland Ice Sheet.” The project was an international collaboration between the University of Texas at Austin, Los Alamos National Laboratory, NASA Goddard Space Flight Center, Michigan Technological University, University of Zurich, the Swiss Federal Institute of Technology and Dartmouth College.

This project was supported by United States National Science Foundation, the Swiss National Science Foundation and the National Geographic Society. The work at Los Alamos was supported by NASA Cryospheric Sciences, and through climate modeling programs within the US Department of Energy, Office of Science.

Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS for the Department of Energy’s National Nuclear Security Administration.

Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

Fracking’s environmental impacts scrutinized

Greenhouse gas emissions from the production and use of shale gas would be comparable to conventional natural gas, but the controversial energy source actually fared better than renewables on some environmental impacts, according to new research.

The UK holds enough shale gas to supply its entire gas demand for 470 years, promising to solve the country’s energy crisis and end its reliance on fossil-fuel imports from unstable markets. But for many, including climate scientists and environmental groups, shale gas exploitation is viewed as environmentally dangerous and would result in the UK reneging on its greenhouse gas reduction obligations under the Climate Change Act.

University of Manchester scientists have now conducted one of the most thorough examinations of the likely environmental impacts of shale gas exploitation in the UK in a bid to inform the debate. Their research has just been published in the leading academic journal Applied Energy and study lead author, Professor Adisa Azapagic, will outline the findings at the Labour Party Conference in Manchester, England, today (Monday, 22 September).

“While exploration is currently ongoing in the UK, commercial extraction of shale gas has not yet begun, yet its potential has stirred controversy over its environmental impacts, its safety and the difficulty of justifying its use to a nation conscious of climate change,” said Professor Azapagic.

“There are many unknowns in the debate surrounding shale gas, so we have attempted to address some of these unknowns by estimating its life cycle environmental impacts from ‘cradle to grave’. We looked at 11 different impacts from the extraction of shale gas using hydraulic fracturing – known as ‘fracking’ – as well as from its processing and use to generate electricity.”

The researchers compared shale gas to other fossil-fuel alternatives, such as conventional natural gas and coal, as well as low-carbon options, including nuclear, offshore wind and solar power (solar photovoltaics).

The results of the research suggest that the average emissions of greenhouse gases from shale gas over its entire life cycle are about 460 grams of carbon dioxide-equivalent per kilowatt-hour of electricity generated. This, the authors say, is comparable to the emissions from conventional natural gas. For most of the other life-cycle environmental impacts considered by the team, shale gas was also comparable to conventional natural gas.

But the study also found that shale gas was better than offshore wind and solar for four of the 11 impacts: depletion of natural resources, toxicity to humans, and the impacts on freshwater and marine organisms. Additionally, shale gas was better than solar (but not wind) for ozone layer depletion and eutrophication (the effect of nutrients such as phosphates on natural ecosystems).

On the other hand, shale gas was worse than coal for three impacts: ozone layer depletion, summer smog and terrestrial eco-toxicity.

Professor Azapagic said: “Some of the impacts of solar power are actually relatively high, so it is not a complete surprise that shale gas is better in a few cases. This is mainly because manufacturing solar panels is very energy and resource-intensive, while their electrical output is quite low in a country like the UK, as we don’t have as much sunshine. However, our research shows that the environmental impacts of shale gas can vary widely, depending on the assumptions for various parameters, including the composition and volume of the fracking fluid used, disposal routes for the drilling waste and the amount of shale gas that can be recovered from a well.

“Assuming the worst case conditions, several of the environmental impacts from shale gas could be worse than from any other options considered in the research, including coal. But, under the best-case conditions, shale gas may be preferable to imported liquefied natural gas.”

The authors say their results highlight the need for tight regulation of shale gas exploration – weak regulation, they claim, may result in shale gas having higher impacts than coal power, resulting in a failure to meet climate change and sustainability imperatives and undermining the deployment of low-carbon technologies.

Professor Azapagic added: “Whether shale gas is an environmentally sound option depends on the perceived importance of different environmental impacts and the regulatory structure under which shale gas operates.

“From the government policy perspective – focusing mainly on economic growth and energy security – it appears likely that shale gas represents a good option for the UK energy sector, assuming that it can be extracted at reasonable cost.

“However, a wider view must also consider other aspects of widespread use of shale gas, including the impact on climate change, as well as many other environmental considerations addressed in our study. Ultimately, the environmental impacts from shale gas will depend on which options it is displacing and how tight the regulation is.”

Study co-author Dr Laurence Stamford, from Manchester’s School of Chemical Engineering and Analytical Science, said: “Appropriate regulation should introduce stringent controls on the emissions from shale gas extraction and disposal of drilling waste. It should also discourage extraction from sites where there is little shale gas in order to avoid the high emissions associated with a low-output well.”

He continued: “If shale gas is extracted under tight regulations and is reasonably cheap, there is no obvious reason, as yet, why it should not make some contribution to our energy mix. However, regulation should also ensure that investment in sustainable technologies is not reduced at the expense of shale gas.”

Contaminated water in 2 states linked to faulty shale gas wells

Faulty well integrity, not hydraulic fracturing deep underground, is the primary cause of drinking water contamination from shale gas extraction in parts of Pennsylvania and Texas, according to a new study by researchers from five universities.

The scientists from Duke, Ohio State, Stanford, Dartmouth and the University of Rochester published their peer-reviewed study Sept. 15 in the Proceedings of the National Academy of Sciences. Using noble gas and hydrocarbon tracers, they analyzed the gas content of more than 130 drinking water wells in the two states.

“We found eight clusters of wells — seven in Pennsylvania and one in Texas — with contamination, including increased levels of natural gas from the Marcellus shale in Pennsylvania and from shallower, intermediate layers in both states,” said Thomas H. Darrah, assistant professor of earth science at Ohio State, who led the study while he was a research scientist at Duke.

“Our data clearly show that the contamination in these clusters stems from well-integrity problems such as poor casing and cementing,” Darrah said.

“These results appear to rule out the possibility that methane has migrated up into drinking water aquifers because of horizontal drilling or hydraulic fracturing, as some people feared,” said Avner Vengosh, professor of geochemistry and water quality at Duke.

In four of the affected clusters, the team’s noble gas analysis shows that methane from drill sites escaped into drinking water wells from shallower depths through faulty or insufficient rings of cement surrounding a gas well’s shaft. In three clusters, the tests suggest the methane leaked through faulty well casings. In one cluster, it was linked to an underground well failure.

“People’s water has been harmed by drilling,” said Robert B. Jackson, professor of environmental and earth sciences at Stanford and Duke. “In Texas, we even saw two homes go from clean to contaminated after our sampling began.”

“The good news is that most of the issues we have identified can potentially be avoided by future improvements in well integrity,” Darrah stressed.

Using both noble gas and hydrocarbon tracers — a novel combination that enabled the researchers to identify and distinguish between the signatures of naturally occurring methane and stray gas contamination from shale gas drill sites — the team analyzed gas content in 113 drinking-water wells and one natural methane seep overlying the Marcellus shale in Pennsylvania, and in 20 wells overlying the Barnett shale in Texas. Sampling was conducted in 2012 and 2013. Sampling sites included wells where contamination had been debated previously; wells known to have naturally high levels of methane and salts, which tend to co-occur in areas overlying shale gas deposits; and wells located both within and beyond a one-kilometer distance from drill sites.

Noble gases such as helium, neon or argon are useful for tracing fugitive methane because although they mix with natural gas and can be transported with it, they are inert and are not altered by microbial activity or oxidation. By measuring changes in ratios in these tag-along noble gases, researchers can determine the source of fugitive methane and the mechanism by which it was transported into drinking water aquifers — whether it migrated there as a free gas or was dissolved in water.
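The core of the tracer logic described above can be sketched very simply: because noble gases are inert, a water well whose helium-to-methane ratio departs strongly from the local background is consistent with an influx of gas from a different source. The ratios and cutoff below are invented for illustration; the study's actual classification uses multiple noble gases, their isotopes, and hydrocarbon fingerprints together:

```python
# Hypothetical illustration of noble-gas tracing, not the study's method.
# Noble gases travel with methane but are not altered by microbes or
# oxidation, so a large shift in a well's 4He/CH4 ratio relative to the
# regional background suggests stray gas rather than natural methane.

def likely_stray_gas(he_ch4_sample, he_ch4_background, tolerance=0.5):
    """Flag a sample whose 4He/CH4 ratio departs from background by more
    than the given relative tolerance (both thresholds are illustrative)."""
    relative_shift = abs(he_ch4_sample - he_ch4_background) / he_ch4_background
    return relative_shift > tolerance

background = 1.2e-3  # hypothetical regional background 4He/CH4 ratio
print(likely_stray_gas(1.1e-3, background))  # False: within natural variation
print(likely_stray_gas(2.0e-4, background))  # True: strongly shifted ratio
```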

“This is the first study to provide a comprehensive analysis of noble gases and their isotopes in groundwater near shale gas wells,” said Darrah, who is continuing the analysis in his lab at Ohio State. “Using these tracers, combined with the isotopic and chemical fingerprints of hydrocarbons in the water and its salt content, we can pinpoint the sources and pathways of methane contamination, and determine if it is natural or not.”