Adjusting Earth’s thermostat, with caution

David Keith, Gordon McKay Professor of Applied Physics at Harvard SEAS and professor of public policy at Harvard Kennedy School, coauthored several papers on climate engineering with colleagues at Harvard and beyond. – Eliza Grinnell, SEAS Communications.

A vast majority of scientists believe that the Earth is warming at an unprecedented rate and that human activity is almost certainly the dominant cause. But on the topics of response and mitigation, there is far less consensus.

One of the most controversial propositions for slowing the increase in temperatures here on Earth is to manipulate the atmosphere above. Specifically, some scientists believe it should be possible to offset the warming effect of greenhouse gases by reflecting more of the sun’s energy back into space.

The potential risks–and benefits–of solar radiation management (SRM) are substantial. So far, however, all of the serious testing has been confined to laboratory chambers and theoretical models. While those approaches are valuable, they do not capture the full range of interactions among chemicals, the impact of sunlight on these reactions, or multiscale variations in the atmosphere.

Now, a team of researchers from the Harvard School of Engineering and Applied Sciences (SEAS) has outlined how a small-scale “stratospheric perturbation experiment” could work. By proposing, in detail, a way to take the science of geoengineering to the skies, they hope to stimulate serious discussion of the practice by policymakers and scientists.

Ultimately, they say, informed decisions on climate policy will need to rely on the best information available from controlled and cautious field experiments.

The paper is among several published today in a special issue of the Philosophical Transactions of the Royal Society A that examine the nuances, the possible consequences, and the current state of scientific understanding of climate engineering. David Keith, whose work features prominently in the issue, is Gordon McKay Professor of Applied Physics at Harvard SEAS and a professor of public policy at Harvard Kennedy School. His coauthors on the topic of field experiments include James Anderson, Philip S. Weld Professor of Applied Chemistry at Harvard SEAS and in Harvard’s Department of Chemistry and Chemical Biology; and other colleagues at Harvard SEAS.

“The idea of conducting experiments to alter atmospheric processes is justifiably controversial, and our experiment, SCoPEx, is just a proposal,” Keith emphasizes. “It will continue to evolve until it is funded, and we will only move ahead if the funding is substantially public, with a formal approval process and independent risk assessment.”

With so much at stake, Keith believes transparency is essential. But the science of climate engineering is also widely misunderstood.

“People often claim that you cannot test geoengineering except by doing it at full scale,” says Keith. “This is nonsense. It is possible to do a small-scale test, with quite low risks, that measures key aspects of the risk of geoengineering–in this case the risk of ozone loss.”

Such controlled experiments, targeting key questions in atmospheric chemistry, Keith says, would reduce the number of “unknown unknowns” and help to inform science-based policy.

The experiment Keith and Anderson’s team is proposing would involve only a tiny amount of material–a few hundred grams of sulfuric acid, an amount Keith says is roughly equivalent to what a typical commercial aircraft releases in a few minutes while flying in the stratosphere. It would provide important insight into how much SRM would reduce radiative heating, the concentration of water vapor in the stratosphere, and the processes that determine water vapor transport–which affects the concentration of ozone.

In addition to the experiment proposed in that publication, another paper coauthored by Keith and collaborators at the California Institute of Technology (CalTech) collects and reviews a number of other experimental methods, to demonstrate the diversity of possible approaches.

“There is a wide range of experiments that could be done that would significantly reduce our uncertainty about the risks and effectiveness of solar geoengineering,” Keith says. “Many could be done with very small local risks.”

A third paper explores how solar geoengineering might actually be implemented, if an international consensus were reached, and suggests that a gradual implementation that aims to limit the rate of climate change would be a plausible strategy.

“Many people assume that solar geoengineering would be used to suddenly restore the Earth’s climate to preindustrial temperatures,” says Keith, “but it’s very unlikely that it would make any policy sense to try to do so.”

Keith also points to another paper in the Royal Society’s special issue–one by Andy Parker at the Belfer Center for Science and International Affairs at Harvard Kennedy School. Parker’s paper furthers the discussion of governance and good practices in geoengineering research in the absence of both national legislation and international agreement, a topic raised last year in Science by Keith and Edward Parson of UCLA.

“The scientific aspects of geoengineering research must, by necessity, advance in tandem with a thorough discussion of the social science and policy,” Keith warns. “Of course, these risks must also be weighed against the risk of doing nothing.”

For further information, see: “Stratospheric controlled perturbation experiment (SCoPEx): A small-scale experiment to improve understanding of the risks of solar geoengineering” doi: 10.1098/rsta.2014.0059

By John Dykema, project scientist at Harvard SEAS; David Keith, Gordon McKay Professor of Applied Physics at Harvard SEAS and professor of public policy at Harvard Kennedy School; James Anderson, Philip S. Weld Professor of Applied Chemistry at Harvard SEAS and in Harvard’s Department of Chemistry and Chemical Biology; and Debra Weisenstein, research management specialist at Harvard SEAS.

“Field experiments on solar geoengineering: Report of a workshop exploring a representative research portfolio” doi: 10.1098/rsta.2014.0175

By David Keith; Riley Duren, chief systems engineer at the NASA Jet Propulsion Laboratory at CalTech; and Douglas MacMartin, senior research associate and lecturer at CalTech.

“Solar geoengineering to limit the rate of temperature change” doi: 10.1098/rsta.2014.0134

By Douglas MacMartin; Ken Caldeira, senior scientist at the Carnegie Institute for Science and professor of environmental Earth system sciences at Stanford University; and David Keith.

“Governing solar geoengineering research as it leaves the laboratory” doi: 10.1098/rsta.2014.0173

By Andy Parker, associate of the Belfer Center at Harvard Kennedy School.

Groundwater warming up in synch

For their study, the researchers were able to draw on uninterrupted long-term temperature measurements of groundwater around the cities of Cologne and Karlsruhe, where the operators of the local waterworks have been measuring the temperature of the groundwater, which is largely uninfluenced by humans, for forty years. Such records are unique and rare. “For us, the data was a godsend,” stresses Peter Bayer, a senior assistant at ETH Zurich’s Geological Institute. Even with intensive searching, the team would not have been able to find a comparable series of measurements. Evidently, measuring groundwater temperatures systematically over long periods is of little interest to waterworks, or too costly. “Or the data isn’t digitised and only archived on paper,” suspects the hydrogeologist.

A damped echo of atmospheric warming

Based on the readings, the researchers were able to demonstrate that the groundwater is not just warming up; it also echoes the warming stages observed in the atmosphere. “Global warming is reflected directly in the groundwater, albeit damped and with a certain time lag,” says Bayer, summarising the project’s main result. The researchers published their study in the journal Hydrology and Earth System Sciences.

The data also reveal that groundwater close to the surface, down to a depth of around sixty metres, has warmed statistically significantly over the last forty years. This warming follows the pattern of the local and regional climate, which in turn mirrors that of global warming.

The groundwater reveals how the atmosphere has made several temperature leaps at irregular intervals. These “regime shifts” can also be observed in the global climate, as the researchers write in their study. Bayer was surprised at how quickly the groundwater responded to climate change.

Heat exchange with the subsoil


The Earth’s atmosphere has warmed by an average of 0.13 degrees Celsius per decade over the last fifty years. This warming does not stop at the subsoil, as other climate scientists have demonstrated over the last two decades with boreholes drilled all over the world. However, those studies tended to consider only soils that contained no water or had no groundwater flow.

Researchers from Eawag and ETH Zurich showed three years ago that groundwater has not escaped climate change, but that study concerned only “artificial” groundwater, which is recharged by infiltrating river water in certain areas. The temperature profile of groundwater generated in this way therefore matches that of the river water.

The new study, however, examines groundwater that has barely been influenced by humans. According to Bayer, it is plausible that the natural groundwater flow is also warming up in the course of climate change. “The difference in temperature between the atmosphere and the subsoil balances out naturally.” The energy transfer takes place via thermal conduction and groundwater flow, which acts much like a heat exchanger, allowing the transported heat to spread through the subsoil and level out.
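
As a rough illustration of why a surface warming signal arrives at depth both damped and delayed, the minimal sketch below evaluates the textbook solution for purely conductive propagation of a periodic surface temperature into the ground. It ignores the advection by groundwater flow described above, which speeds up the real response; the diffusivity and the assumed 30-year warming period are illustrative values, not numbers from the study.

```python
import numpy as np

# Minimal sketch (not the authors' model): purely conductive propagation of a slow
# surface warming signal into the subsurface; advection by groundwater flow is ignored.
kappa = 1e-6                         # thermal diffusivity of wet sediment, m^2/s (assumed)
period = 30 * 365.25 * 86400         # assume a ~30-year warming "stage", in seconds
omega = 2 * np.pi / period
d = np.sqrt(2 * kappa / omega)       # damping depth, m

for z in (10.0, 30.0, 60.0):         # depths down to ~60 m, as in the study
    attenuation = np.exp(-z / d)                     # how strongly the signal is damped
    lag_years = (z / d) / omega / (365.25 * 86400)   # phase lag converted to years
    print(f"z = {z:4.0f} m: amplitude x{attenuation:.2f}, lag ~{lag_years:.1f} yr")
```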

The consequences of these findings, however, are difficult to gauge. Warmer temperatures might influence subterranean ecosystems on the one hand and groundwater-dependent habitats on the other, including cold spots in streams and rivers where groundwater discharges. For cryophilic organisms such as certain fish, groundwater warming could have negative consequences.

Consequences difficult to gauge

Higher groundwater temperatures also influence the water’s chemical composition, especially the chemical equilibria of nitrate and carbonate. After all, chemical reactions usually take place more quickly at higher temperatures. Bacterial activity might also increase as water temperatures rise. If the groundwater becomes warmer, undesirable bacteria such as gastro-intestinal disease pathogens might multiply more effectively. However, the scientists can also imagine positive effects. “The groundwater’s excess heat could be used geothermally, for instance,” adds Kathrin Menberg, the first author of the study.

Volcano hazards and the role of westerly wind bursts in El Niño

On June 27, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in a community in the District of Puna. – USGS

On 27 June, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in Pāhoa, a community in the District of Puna, as well as the only highway accessible to this area. Scientists from the U.S. Geological Survey’s Hawaiian Volcano Observatory (HVO) and the Hawai’i County Civil Defense have been monitoring the volcano’s lava flow and communicating with affected residents through public meetings since 24 August. Eos recently spoke with Michael Poland, a geophysicist at HVO and a member of the Eos Editorial Advisory Board, to discuss how he and his colleagues communicated this threat to the public.

Drilling a Small Basaltic Volcano to Reveal Potential Hazards


Drilling into the Rangitoto Island Volcano in the Auckland Volcanic Field in New Zealand offers insight into a small monogenetic volcano, and may improve understanding of future hazards.

From AGU’s journals: El Niño fades without westerly wind bursts

The warm and wet winter of 1997 brought California floods, Florida tornadoes, and an ice storm in the American northeast, prompting climatologists to dub it the El Niño of the century. Earlier this year, climate scientists thought the coming winter might bring similar extremes, as equatorial Pacific Ocean conditions resembled those seen in early 1997. But the signals weakened by summer, and the El Niño predictions were downgraded. Menkes et al. used simulations to examine the differences between the two years.

The El Niño-Southern Oscillation is defined by abnormally warm sea surface temperatures in the eastern Pacific Ocean and weaker than usual trade winds. In a typical year, southeast trade winds push surface water toward the western Pacific “warm pool”–a region essential to Earth’s climate. The trade winds dramatically weaken or even reverse in El Niño years, and the warm pool extends its reach east.

Scientists have struggled to predict El Niño due to irregularities in the shape, amplitude, and timing of the surges of warm water. Previous studies suggested that short-lived westerly wind pulses (i.e. one to two weeks long) could contribute to this irregularity by triggering and sustaining El Niño events.

To understand the vanishing 2014 El Niño, the authors used computer simulations and examined the wind’s role. The researchers found pronounced differences between 1997 and 2014. Both years saw strong westerly wind events between January and March, but those disappeared this year as spring approached. In contrast, the westerly winds persisted through summer in 1997.

In the past, it was thought that westerly wind pulses were three times as likely to form if the warm pool extended east of the dateline. That did not occur this year. The team says their analysis shows that El Niño’s strength might depend on these short-lived and possibly unpredictable pulses.

###

The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing more than 62,000 members in 144 countries. Join our conversation on Facebook, Twitter, YouTube, and other social media channels.

Worldwide retreat of glaciers confirmed in unprecedented detail

The worldwide retreat of glaciers is confirmed in unprecedented detail. This new book presents an overview and detailed assessment of changes in the world’s glaciers by using satellite imagery. – Springer

In 1994, a team of scientists began developing a worldwide initiative to study glaciers using satellite data, taking its name from the old Scottish term glim, meaning a passing look or glance. Now, 20 years later, the international GLIMS (Global Land Ice Measurements from Space) initiative observes the world’s glaciers primarily using data from optical satellite instruments such as ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) and Landsat.

More than 150 scientists from all over the world have contributed to the new book Global Land Ice Measurements from Space, the most comprehensive report to date on global glacier changes. While the shrinking of glaciers on all continents is already known from ground observations of individual glaciers, by using repeated satellite observations GLIMS has firmly established that glaciers are shrinking globally. Although some glaciers are maintaining their size, most glaciers are dwindling. The foremost cause of the worldwide reductions in glaciers is global warming, the team writes.

Full color throughout, the book has 25 regional chapters that illustrate glacier changes from the Arctic to the Antarctic. Other chapters provide a thorough theoretical background on glacier monitoring and mapping, remote sensing techniques, uncertainties, and interpretation of the observations in a climatic context. The book highlights many other glacier research applications of satellite data, including measurement of glacier thinning from repeated satellite-based digital elevation models (DEMs) and calculation of surface flow velocities from repeated satellite images.
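
As a hedged illustration of the DEM-differencing idea mentioned above, the sketch below computes a mean elevation-change rate from two hypothetical, already co-registered elevation grids. The arrays, dates and mask are made up for illustration; real GLIMS-style workflows also involve co-registration, outlier filtering and uncertainty analysis.

```python
import numpy as np

# Minimal sketch (hypothetical data): estimate a glacier's mean thinning rate
# by differencing two co-registered digital elevation models (DEMs).
dem_2000 = np.array([[1520.0, 1534.0], [1498.0, 1510.0]])  # elevations in m (assumed)
dem_2014 = np.array([[1511.0, 1526.0], [1492.0, 1503.0]])  # same grid, 14 years later
glacier_mask = np.array([[True, True], [True, False]])      # cells that are glacier ice

dh = dem_2014 - dem_2000                    # elevation change per cell, m
years = 2014 - 2000
rate = dh[glacier_mask].mean() / years      # mean elevation change rate, m/yr
print(f"mean elevation change: {rate:.2f} m/yr")   # negative values indicate thinning
```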

These tools are key to understanding local and regional variations in glacier behavior, the team writes. The high sensitivity of glaciers to climate change has substantially decreased their volume and changed the landscape over the past decades, affecting both regional water availability and the hazard potential of glaciers. The growing GLIMS database about glaciers also contributed to the Intergovernmental Panel on Climate Change (IPCC)’s Fifth Assessment Report issued in 2013. The IPCC report concluded that most of the world’s glaciers have been losing ice at an increasing rate in recent decades.

More than 60 institutions across the globe are involved in GLIMS. Jeffrey S. Kargel of the Department of Hydrology and Water Resources at the University of Arizona coordinates the project. The GLIMS glacier database and GLIMS web site are developed and maintained by the National Snow and Ice Data Center (NSIDC) at the University of Colorado in Boulder.

Global Land Ice Measurements from Space

Hardcover: $279.00 / £180.00 / €199.99

Springer and Praxis Publishing (2014) ISBN 978-3-540-79817-0

Also available as an eBook

Massive geographic change may have triggered explosion of animal life

A new analysis from The University of Texas at Austin’s Institute for Geophysics suggests a deep oceanic gateway, shown in blue, developed between the Pacific and Iapetus oceans immediately before the Cambrian sea level rise and explosion of life in the fossil record, isolating Laurentia from the supercontinent Gondwanaland. – Ian Dalziel

A new analysis of geologic history may help solve the riddle of the “Cambrian explosion,” the rapid diversification of animal life in the fossil record 530 million years ago that has puzzled scientists since the time of Charles Darwin.

A paper by Ian Dalziel of The University of Texas at Austin’s Jackson School of Geosciences, published in the November issue of Geology, a journal of the Geological Society of America, suggests a major tectonic event may have triggered the rise in sea level and other environmental changes that accompanied the apparent burst of life.

The Cambrian explosion is one of the most significant events in Earth’s 4.5-billion-year history. The surge of evolution led to the sudden appearance of almost all modern animal groups. Fossils from the Cambrian explosion document the rapid evolution of life on Earth, but its cause has been a mystery.

The sudden burst of new life is also called “Darwin’s dilemma” because it appears to contradict Charles Darwin’s hypothesis of gradual evolution by natural selection.

“At the boundary between the Precambrian and Cambrian periods, something big happened tectonically that triggered the spreading of shallow ocean water across the continents, which is clearly tied in time and space to the sudden explosion of multicellular, hard-shelled life on the planet,” said Dalziel, a research professor at the Institute for Geophysics and a professor in the Department of Geological Sciences.

Beyond the sea level rise itself, the ancient geologic and geographic changes probably led to a buildup of oxygen in the atmosphere and a change in ocean chemistry, allowing more complex life-forms to evolve, he said.

The paper is the first to integrate geological evidence from five present-day continents — North America, South America, Africa, Australia and Antarctica — in addressing paleogeography at that critical time.

Dalziel proposes that present-day North America was still attached to the southern continents until sometime into the Cambrian period. Current reconstructions of the globe’s geography during the early Cambrian show the ancient continent of Laurentia — the ancestral core of North America — as already having separated from the supercontinent Gondwanaland.

In contrast, Dalziel suggests the development of a deep oceanic gateway between the Pacific and Iapetus (ancestral Atlantic) oceans isolated Laurentia in the early Cambrian, a geographic makeover that immediately preceded the global sea level rise and apparent explosion of life.

“The reason people didn’t make this connection before was because they hadn’t looked at all the rock records on the different present-day continents,” he said.

The rock record in Antarctica, for example, comes from the very remote Ellsworth Mountains.

“People have wondered for a long time what rifted off there, and I think it was probably North America, opening up this deep seaway,” Dalziel said. “It appears ancient North America was initially attached to Antarctica and part of South America, not to Europe and Africa, as has been widely believed.”

Although the new analysis adds to evidence suggesting a massive tectonic shift caused the seas to rise more than half a billion years ago, Dalziel said more research is needed to determine whether this new chain of paleogeographic events can truly explain the sudden rise of multicellular life in the fossil record.

“I’m not claiming this is the ultimate explanation of the Cambrian explosion,” Dalziel said. “But it may help to explain what was happening at that time.”

###

To read the paper go to http://geology.gsapubs.org/content/early/2014/09/25/G35886.1.abstract

New study finds oceans arrived early to Earth

In this illustration of the early solar system, the dashed white line represents the snow line — the transition from the hotter inner solar system, where water ice is not stable (brown), to the outer solar system, where water ice is stable (blue). Two possible ways that the inner solar system received water are: water molecules sticking to dust grains inside the ‘snow line’ (as shown in the inset) and carbonaceous chondrite material flung into the inner solar system by the effect of gravity from protoJupiter. With either scenario, water must accrete to the inner planets within the first ca. 10 million years of solar system formation. – Illustration by Jack Cook, Woods Hole Oceanographic Institution

Earth is known as the Blue Planet because of its oceans, which cover more than 70 percent of the planet’s surface and are home to the world’s greatest diversity of life. While water is essential for life on the planet, the answers to two key questions have eluded us: where did Earth’s water come from and when?

While some hypothesize that water came late to Earth, well after the planet had formed, findings from a new study led by scientists at the Woods Hole Oceanographic Institution (WHOI) significantly move back the clock for the first evidence of water on Earth and in the inner solar system.

“The answer to one of the basic questions is that our oceans were always here. We didn’t get them from a late process, as was previously thought,” said Adam Sarafian, the lead author of the paper published Oct. 31, 2014, in the journal Science and an MIT/WHOI Joint Program student in the Geology and Geophysics Department.

One school of thought was that planets originally formed dry, due to the high-energy, high-impact process of planet formation, and that the water came later from sources such as comets or “wet” asteroids, which are largely composed of ices and gases.

“With giant asteroids and meteors colliding, there’s a lot of destruction,” said Horst Marschall, a geologist at WHOI and coauthor of the paper. “Some people have argued that any water molecules that were present as the planets were forming would have evaporated or been blown off into space, and that surface water, as it exists on our planet today, must have come much, much later—hundreds of millions of years later.”

The study’s authors turned to another potential source of Earth’s water—carbonaceous chondrites. The most primitive known meteorites, carbonaceous chondrites were formed in the same swirl of dust, grit, ice and gases that gave rise to the sun some 4.6 billion years ago, well before the planets were formed.

“These primitive meteorites resemble the bulk solar system composition,” said WHOI geologist and coauthor Sune Nielsen. “They have quite a lot of water in them, and have been thought of before as candidates for the origin of Earth’s water.”

In order to determine the source of water in planetary bodies, scientists measure the ratio between the two stable isotopes of hydrogen: deuterium and hydrogen. Different regions of the solar system are characterized by highly variable ratios of these isotopes. The study’s authors knew the ratio for carbonaceous chondrites and reasoned that if they could compare that to an object that was known to crystallize while Earth was actively accreting then they could gauge when water appeared on Earth.
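
For readers unfamiliar with the notation, hydrogen isotope compositions are conventionally reported as delta values: per-mil deviations of a sample’s D/H ratio from a reference standard (VSMOW). The short sketch below shows that arithmetic; the sample ratio used is illustrative, not a measurement from the study.

```python
# Minimal sketch: hydrogen isotope ratios reported in delta notation, as per-mil
# deviations of the sample D/H ratio from the VSMOW reference standard.
VSMOW_D_H = 155.76e-6   # D/H ratio of the VSMOW standard

def delta_D(sample_D_H, standard_D_H=VSMOW_D_H):
    """Return deltaD in per mil (parts per thousand)."""
    return (sample_D_H / standard_D_H - 1.0) * 1000.0

# Illustrative value only (not a measurement from the study):
print(f"deltaD = {delta_D(140.0e-6):+.0f} per mil")
```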

To test this hypothesis, the research team, which also includes Francis McCubbin from the Institute of Meteoritics at the University of New Mexico and Brian Monteleone of WHOI, utilized meteorite samples provided by NASA from the asteroid 4-Vesta. The asteroid 4-Vesta, which formed in the same region of the solar system as Earth, has a surface of basaltic rock—frozen lava. These basaltic meteorites from 4-Vesta are known as eucrites and carry a unique signature of one of the oldest hydrogen reservoirs in the solar system. Their age—approximately 14 million years after the solar system formed—makes them ideal for determining the source of water in the inner solar system at a time when Earth was in its main building phase. The researchers analyzed five different samples at the Northeast National Ion Microprobe Facility—a state-of-the-art national facility housed at WHOI that utilizes secondary ion mass spectrometers. This is the first time hydrogen isotopes have been measured in eucrite meteorites.

The measurements show that 4-Vesta contains the same hydrogen isotopic composition as carbonaceous chondrites, which is also that of Earth. That, combined with nitrogen isotope data, points to carbonaceous chondrites as the most likely common source of water.

“The study shows that Earth’s water most likely accreted at the same time as the rock. The planet formed as a wet planet with water on the surface,” Marschall said.

While the findings don’t preclude a late addition of water on Earth, they show that it wasn’t necessary, since the right amount and composition of water was present at a very early stage.

“An implication of that is that life on our planet could have started to begin very early,” added Nielsen. “Knowing that water came early to the inner solar system also means that the other inner planets could have been wet early and evolved life before they became the harsh environments they are today.”

Rare 2.5-billion-year-old rocks reveal hot spot of sulfur-breathing bacteria

Gold miners prospecting in a mountainous region of Brazil drilled this 590-foot cylinder of bedrock from the Neoarchaean Eon, which provides rare evidence of conditions on Earth 2.5 billion years ago. – Alan J. Kaufman

Wriggle your toes in a marsh’s mucky bottom sediment and you’ll probably inhale a rotten egg smell, the distinctive odor of hydrogen sulfide gas. That’s the biochemical signature of sulfur-using bacteria, one of Earth’s most ancient and widespread life forms.

Among scientists who study the early history of our 4.5 billion-year-old planet, there is a vigorous debate about the evolution of sulfur-dependent bacteria. These simple organisms arose at a time when oxygen levels in the atmosphere were less than one-thousandth of what they are now. Living in ocean waters, they respired (or breathed in) sulfate, a form of sulfur, instead of oxygen. But how did that sulfate reach the ocean, and when did it become abundant enough for living things to use it?

New research by University of Maryland geology doctoral student Iadviga Zhelezinskaia offers a surprising answer. Zhelezinskaia is the first researcher to analyze the biochemical signals of sulfur compounds found in 2.5 billion-year-old carbonate rocks from Brazil. The rocks were formed on the ocean floor in a geologic time known as the Neoarchaean Eon. They surfaced when prospectors drilling for gold in Brazil punched a hole into bedrock and pulled out a 590-foot-long core of ancient rocks.

In research published Nov. 7, 2014 in the journal Science, Zhelezinskaia and three co-authors–physicist John Cliff of the University of Western Australia and geologists Alan Kaufman and James Farquhar of UMD–show that bacteria dependent on sulfate were plentiful in some parts of the Neoarchaean ocean, even though sea water typically contained about 1,000 times less sulfate than it does today.

“The samples Iadviga measured carry a very strong signal that sulfur compounds were consumed and altered by living organisms, which was surprising,” says Farquhar. “She also used basic geochemical models to give an idea of how much sulfate was in the oceans, and finds the sulfate concentrations are very low, much lower than previously thought.”

Geologists study sulfur because it is abundant and combines readily with other elements, forming compounds stable enough to be preserved in the geologic record. Sulfur has four naturally occurring stable isotopes, whose atomic signatures in the rock record allow scientists to identify the element’s different forms. Researchers measuring sulfur isotope ratios in a rock sample can learn whether the sulfur came from the atmosphere, weathering rocks or biological processes. From that information about the sulfur sources, they can deduce important information about the state of the atmosphere, oceans, continents and biosphere when those rocks formed.
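
One common way such source fingerprinting is expressed, sketched below, is the mass-independent signal Δ33S: the deviation of δ33S from what ordinary mass-dependent chemistry would predict from δ34S (both deltas are per-mil deviations from the V-CDT standard). A clearly nonzero Δ33S in Archaean rocks is generally interpreted as an atmospheric, photochemical origin for the sulfur. The input values here are illustrative, not data from the paper.

```python
# Minimal sketch: the mass-independent sulfur signal Delta-33S is the deviation of
# delta-33S from what mass-dependent fractionation would predict from delta-34S.
def capital_delta_33s(delta33, delta34):
    """Delta-33S in per mil, using the conventional mass-dependent exponent 0.515."""
    return delta33 - 1000.0 * ((1.0 + delta34 / 1000.0) ** 0.515 - 1.0)

# Illustrative input values, not measurements from the study:
print(f"Delta-33S = {capital_delta_33s(delta33=3.0, delta34=4.0):.2f} per mil")
```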

Farquhar and other researchers have used sulfur isotope ratios in Neoarchaean rocks to show that soon after this period, Earth’s atmosphere changed. Oxygen levels soared from just a few parts per million to almost their current level, which is around 21 percent of all the gases in the atmosphere. The Brazilian rocks Zhelezinskaia sampled show only trace amounts of oxygen, a sign they were formed before this atmospheric change.

With very little oxygen, the Neoarchaean Earth was a forbidding place for most modern life forms. The continents were probably much drier and dominated by volcanoes that released sulfur dioxide, carbon dioxide, methane and other greenhouse gases. Temperatures probably ranged between 0 and 100 degrees Celsius (32 to 212 degrees Fahrenheit), warm enough for liquid oceans to form and microbes to grow in them.

Rocks 2.5 billion years old or older are extremely rare, so geologists’ understanding of the Neoarchaean is based on a handful of samples from a few small areas, such as Western Australia, South Africa and Brazil. Geologists theorize that Western Australia and South Africa were once part of an ancient supercontinent called Vaalbara. The Brazilian rock samples are comparable in age, but they may not be from the same supercontinent, Zhelezinskaia says.

Most of the Neoarchaean rocks studied are from Western Australia and South Africa and are black shale, which forms when fine dust settles on the sea floor. The Brazilian prospector’s core contains plenty of black shale and a band of carbonate rock, formed below the surface of shallow seas, in a setting that probably resembled today’s Bahama Islands. Black shale usually contains sulfur-bearing pyrite, but carbonate rock typically does not, so geologists have not focused on sulfur signals in Neoarchaean carbonate rocks until now.

Zhelezinskaia “chose to look at a type of rock that others generally avoided, and what she saw was spectacularly different,” said Kaufman. “It really opened our eyes to the implications of this study.”

The Brazilian carbonate rocks’ isotopic ratios showed they formed in ancient seabed containing sulfate from atmospheric sources, not continental rock. The isotopic ratios also showed that Neoarchaean bacteria were plentiful in the sediment, respiring sulfate and emitting hydrogen sulfide–the same process that goes on today as bacteria recycle decaying organic matter into minerals and gases.

How could the sulfur-dependent bacteria have thrived during a geologic time when sulfur levels were so low? “It seems that they were in shallow water, where evaporation may have been high enough to concentrate the sulfate, and that would make it abundant enough to support the bacteria,” says Zhelezinskaia.

Zhelezinskaia is now analyzing carbonate rocks of the same age from Western Australia and South Africa, to see if the pattern holds true for rocks formed in other shallow water environments. If it does, the results may change scientists’ understanding of one of Earth’s earliest biological processes.

“There is an ongoing debate about when sulfate-reducing bacteria arose and how that fits into the evolution of life on our planet,” says Farquhar. “These rocks are telling us the bacteria were there 2.5 billion years ago, and they were doing something significant enough that we can see them today.”

###

This research was supported by the Fulbright Program (Grantee ID 15110620), the NASA Astrobiology Institute (Grant No. NNA09DA81A) and the National Science Foundation Frontiers in Earth-System Dynamics program (Grant No. 432129). The content of this article does not necessarily reflect the views of these organizations.

“Large sulfur isotope fractionations associated with Neoarchaean microbial sulfate reductions,” Iadviga Zhelezinskaia, Alan J. Kaufman, James Farquhar and John Cliff, was published Nov. 7, 2014 in Science. Download the abstract after 2 p.m. U.S. Eastern time, Nov. 6, 2014: http://www.sciencemag.org/lookup/doi/10.1126/science.1256211

James Farquhar home page

http://www.geol.umd.edu/directory.php?id=13

Alan J. Kaufman home page

http://www.geol.umd.edu/directory.php?id=15

Iadviga Zhelezinskaia home page

http://www.geol.umd.edu/directory.php?id=66

Media Relations Contact: Abby Robinson, 301-405-5845, abbyr@umd.edu

Writer: Heather Dewar

Synthetic biology for space exploration

Synthetic biology could be a key to manned space exploration of Mars. – Photo courtesy of NASA

Does synthetic biology hold the key to manned space exploration of Mars and the Moon? Berkeley Lab researchers have used synthetic biology to produce an inexpensive and reliable microbial-based alternative to the world’s most effective anti-malaria drug, and to develop clean, green and sustainable alternatives to gasoline, diesel and jet fuels. In the future, synthetic biology could also be used to make manned space missions more practical.

“Not only does synthetic biology promise to make the travel to extraterrestrial locations more practical and bearable, it could also be transformative once explorers arrive at their destination,” says Adam Arkin, director of Berkeley Lab’s Physical Biosciences Division (PBD) and a leading authority on synthetic and systems biology.

“During flight, the ability to augment fuel and other energy needs, to provide small amounts of needed materials, plus renewable, nutritional and taste-engineered food, and drugs-on-demand can save costs and increase astronaut health and welfare,” Arkin says. “At an extraterrestrial base, synthetic biology could even make more effective use of the catalytic activities of diverse organisms.”

Arkin is the senior author of a paper in the Journal of the Royal Society Interface that reports on a techno-economic analysis demonstrating “the significant utility of deploying non-traditional biological techniques to harness available volatiles and waste resources on manned long-duration space missions.” The paper is titled “Towards Synthetic Biological Approaches to Resource Utilization on Space Missions.” The lead and corresponding author is Amor Menezes, a postdoctoral scholar in Arkin’s research group at the University of California (UC) Berkeley. Other co-authors are John Cumbers and John Hogan with the NASA Ames Research Center.

One of the biggest challenges to manned space missions is the expense. The NASA rule-of-thumb is that every unit mass of payload launched requires the support of an additional 99 units of mass, with “support” encompassing everything from fuel to oxygen to food and medicine for the astronauts, etc. Most of the current technologies now deployed or under development for providing this support are abiotic, meaning non-biological. Arkin, Menezes and their collaborators have shown that providing this support with technologies based on existing biological processes is a more than viable alternative.

“Because synthetic biology allows us to engineer biological processes to our advantage, we found in our analysis that technologies, when using common space metrics such as mass, power and volume, have the potential to provide substantial cost savings, especially in mass,” Menezes says.

In their study, the authors looked at four target areas: fuel generation, food production, biopolymer synthesis, and pharmaceutical manufacture. They showed that for a 916-day manned mission to Mars, the use of microbial biomanufacturing capabilities could reduce the mass of fuel manufacturing by 56 percent, the mass of food shipments by 38 percent, and the shipped mass needed to 3D-print a habitat for six by a whopping 85 percent. In addition, microbes could also completely replenish expired or irradiated stocks of pharmaceuticals, which would provide independence from unmanned resupply spacecraft that take up to 210 days to arrive.
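
To see how the rule of thumb quoted above leverages such reductions, the sketch below multiplies an assumed shipped-mass saving by the roughly 100-fold launch-mass gearing. The baseline shipped masses are hypothetical stand-ins, not figures from the paper.

```python
# Minimal sketch (hypothetical baselines): every kilogram that no longer has to be
# shipped avoids roughly 99 further kilograms of supporting launch mass (see above).
SUPPORT_PER_PAYLOAD = 99   # NASA rule of thumb quoted in the article

def launch_mass_saved(baseline_shipped_kg, reduction_fraction):
    """Launch mass avoided when a fraction of the shipped mass is made in situ."""
    return baseline_shipped_kg * reduction_fraction * (1 + SUPPORT_PER_PAYLOAD)

# Baseline shipped masses below are assumptions for illustration, not paper values:
for item, baseline_kg, cut in [("fuel", 10_000, 0.56),
                               ("food", 5_000, 0.38),
                               ("habitat feedstock", 8_000, 0.85)]:
    print(f"{item}: ~{launch_mass_saved(baseline_kg, cut):,.0f} kg less at launch")
```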

“Space has always provided a wonderful test of whether technology can meet strict engineering standards for both effect and safety,” Arkin says. “NASA has worked decades to ensure that the specifications that new technologies must meet are rigorous and realistic, which allowed us to perform up-front techno-economic analysis.”

The big advantage biological manufacturing holds over abiotic manufacturing is the remarkable ability of natural and engineered microbes to transform very simple starting substrates, such as carbon dioxide, water, biomass or minerals, into materials that astronauts on long-term missions will need. This capability should prove especially useful for future extraterrestrial settlements.

“The mineral and carbon composition of other celestial bodies is different from the bulk of Earth, but the Earth is diverse, with many extreme environments that have some relationship to those that might be found at possible bases on the Moon or Mars,” Arkin says. “Microbes could be used to greatly augment the materials available at a landing site, enable the biomanufacturing of food and pharmaceuticals, and possibly even modify and enrich local soils for agriculture in controlled environments.”

The authors acknowledge that much of their analysis is speculative and that their calculations show a number of significant challenges to making biomanufacturing a feasible augmentation and replacement for abiotic technologies. However, they argue that the investment to overcome these barriers offers dramatic potential payoff for future space programs.

“We’ve got a long way to go since experimental proof-of-concept work in synthetic biology for space applications is just beginning, but long-duration manned missions are also a ways off,” says Menezes. “Abiotic technologies were developed for many, many decades before they were successfully utilized in space, so of course biological technologies have some catching-up to do. However, this catching-up may not be that much, and in some cases, the biological technologies may already be superior to their abiotic counterparts.”

###

This research was supported by the National Aeronautics and Space Administration (NASA) and the University of California, Santa Cruz.

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit http://www.lbl.gov.

Offshore islands amplify, rather than dissipate, a tsunami’s power

This model shows the impact of coastal islands on a tsunami’s height. – Courtesy of Jose Borrero/eCoast/USC

A long-held belief that offshore islands protect the mainland from tsunamis turns out to be the exact opposite of the truth, according to a new study.

Common wisdom — from Southern California to the South Pacific — for coastal residents and scientists alike has long been that offshore islands would create a buffer that blocked the power of a tsunami. In fact, computer modeling of tsunamis striking a wide variety of different offshore island geometries yielded no situation in which the mainland behind them fared better.

Instead, islands focused the energy of the tsunami, increasing flooding on the mainland by up to 70 percent.

“This is where many fishing villages are located, behind offshore islands, in the belief that they will be protected from wind waves. Even Southern California residents believe that the Channel Islands and Catalina will protect them,” said Costas Synolakis of the USC Viterbi School of Engineering, a member of the multinational team that conducted the research.

The research was inspired by a field survey of the impact of the 2010 tsunami on the Mentawai Islands off of Sumatra. The survey data showed that villages located in the shadow of small offshore islets suffered some of the strongest tsunami impacts, worse than villages located along open coasts.

Subsequent computer modeling by Jose Borrero, adjunct assistant research professor at the USC Viterbi Tsunami Research Center, showed that the offshore islands had actually contributed to — not diminished — the tsunami’s impact.

Synolakis then teamed up with researchers Emile Contal and Nicolas Vayatis of Ecoles Normales de Cachan in Paris; and Themistoklis S. Stefanakis and Frederic Dias, who both have joint appointments at Ecoles Normales de Cachan and University College Dublin to determine whether that was a one-of-a-kind situation, or the norm.

Their study, of which Dias was the corresponding author, was published in Proceedings of the Royal Society A on Nov. 5.

The team designed a computer model that took into consideration various island slopes, beach slopes, water depths, distance between the island and the beach, and wavelength of the incoming tsunami.

“Even a casual analysis of these factors would have required hundreds of thousands of computations, each of which could take up to half a day,” Synolakis said. “So instead, we used machine learning.”

Machine learning is a mathematical process that makes it easier to identify the maximum values of interdependent processes with multiple parameters by allowing the computer to “learn” from previous results.

The computer starts to understand how various tweaks to the parameters affect the overall outcome and finds the best answer more quickly. As a result, findings that would traditionally have required hundreds of thousands of model runs were obtained with about 200.
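
The article does not spell out the algorithm, but the general idea of surrogate-assisted search can be sketched as follows: fit a cheap statistical stand-in (here a Gaussian-process regressor) to a handful of expensive runs, then keep sampling where a large amplification looks plausible. The "simulation" below is a made-up one-parameter function, not a tsunami model, and the whole block is a sketch of the concept rather than the authors' code.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Minimal sketch of surrogate-assisted search (not the authors' method): replace the
# expensive tsunami simulation with a cheap stand-in, fit a Gaussian-process surrogate
# to a few runs, and keep evaluating where the surrogate expects large amplification.
def expensive_simulation(x):
    # Stand-in for a half-day tsunami run; x is a normalized island/beach parameter.
    return 1.0 + 0.7 * np.exp(-((x - 0.62) ** 2) / 0.02)   # peak amplification ~1.7

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(5, 1))             # a few initial "simulations"
y = np.array([expensive_simulation(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)
grid = np.linspace(0, 1, 201).reshape(-1, 1)

for _ in range(15):                             # ~20 runs total instead of a dense sweep
    gp.fit(X, y)
    mean, std = gp.predict(grid, return_std=True)
    x_next = grid[np.argmax(mean + std)]        # sample where a large value is plausible
    X = np.vstack([X, [x_next]])
    y = np.append(y, expensive_simulation(x_next[0]))

print(f"largest amplification found: {y.max():.2f} at x = {X[np.argmax(y), 0]:.2f}")
```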

“This work is applicable to some of our tsunami study sites in New Zealand,” said Borrero, who is producing tsunami hazard maps for regions of the New Zealand coast. “The northeast coast of New Zealand has many small islands offshore, similar to those in Indonesia, and our modeling suggests that this results in areas of enhanced tsunami heights.”

“Substantial public education efforts are needed to help better explain to coastal residents tsunami hazards, and whenever they need to be extra cautious and responsive with evacuations during actual emergencies,” Synolakis said.

###

The research was funded by EDSP of ENS-Cachan; the Cultural Service of the French Embassy in Dublin; the ERC; SFI; University College Dublin; and the EU FP7 program ASTARTE. The study can be found online at http://rspa.royalsocietypublishing.org/content/470/2172/20140575.

The breathing sand

An Eddy Correlation Lander analyzes the strength of the oxygen fluxes at the bottom of the North Sea. – Photo: ROV-Team, GEOMAR

A desert at the bottom of the sea? Although the waters of the North Sea exchange about every two to three years, there is evidence of decreasing oxygen content. If lower amounts of this gas are dissolved in seawater, organisms on and in the seabed produce less energy – with implications for larger creatures and the biogeochemical cycling in the marine ecosystem. Since nutrients, carbon and oxygen circulate very well and are processed quickly in the permeable, sandy sediments that make up two-thirds of the North Sea, measurements of metabolic rates are especially difficult here. Using the new Aquatic Eddy Correlation technique, scientists from GEOMAR Helmholtz Centre for Ocean Research Kiel, Leibniz Institute of Freshwater Ecology and Inland Fisheries, the University of Southern Denmark, the University of Koblenz-Landau, the Scottish Marine Institute and Aarhus University were able to demonstrate how oxygen flows at the ground of the North Sea. Their methods and results are presented in the Journal of Geophysical Research: Oceans.

“The so-called ‘Eddy Correlation’ technique detects the flow of oxygen through these small turbulences over an area of several square meters. It considers both the mixing of sediments by organisms living in it and the hydrodynamics of the water above the rough sea floor”, Dr. Peter Linke, a marine biologist at GEOMAR, explains. “Previous methods overlooked only short periods or disregarded important parameters. Now we can create a more realistic picture.” The new method also takes into account the fact that even small objects such as shells or ripples shaped by wave action or currents are able to impact the oxygen exchange in permeable sediments.
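
At its core, the eddy correlation flux is the time-averaged product of the fluctuations of vertical velocity and oxygen concentration measured at high frequency just above the seabed. The sketch below applies that covariance formula to synthetic data; the sampling rate, noise levels and imposed correlation are assumptions for illustration, not values from the study.

```python
import numpy as np

# Minimal sketch (synthetic data): the eddy correlation flux is the time-averaged
# product of fluctuations in vertical velocity (w') and oxygen concentration (C')
# measured at high frequency just above the seabed.
rng = np.random.default_rng(1)
n = 8 * 3600 * 64                            # assume 8 hours of data sampled at 64 Hz
w = 0.002 * rng.standard_normal(n)           # vertical velocity fluctuations, m/s
c = 250.0 + 20.0 * rng.standard_normal(n)    # oxygen concentration, mmol/m^3
c -= 100.0 * w                               # impose a downward (into-sediment) correlation

w_prime = w - w.mean()
c_prime = c - c.mean()
flux = (w_prime * c_prime).mean()            # mmol O2 m^-2 s^-1; negative = sediment uptake
print(f"O2 flux ~ {flux * 86400:.0f} mmol m^-2 d^-1")
```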

On the expedition CE0913 with the Irish research vessel CELTIC EXPLORER, scientists used the underwater robot ROV KIEL 6000 to place three different instruments within the “Tommeliten” area belonging to Norway: Two “Eddy Correlation Landers” recorded the strength of oxygen fluxes over three tidal cycles. Information about the distribution of oxygen in the sediment was collected with a “Profiler Lander”, a seafloor observatory with oxygen sensors and flow meters. A “Benthic chamber” isolated 314 square centimetres of sediment and took samples from the overlying water over a period of 24 hours to determine the oxygen consumption of the sediment.

“The combination of traditional tools with the ‘Eddy Correlation’ technique has given us new insights into the dynamics of the exchange of substances between the sea water and the underlying sediment. A variety of factors determine the timing and amount of oxygen available. Currents that provide the sandy sediment with oxygen, but also the small-scale morphology of the seafloor, ensure that small benthic organisms are able to process carbon or other nutrients. The dependencies are so complex that they can be decrypted only by using special methods”, Dr. Linke summarizes. Therefore, detailed measurements in the water column and at the boundary to the seafloor as well as model calculations are absolutely necessary to understand basic functions and better estimate future changes in the cycle of materials. “With conventional methods, for example, we would never have been able to find that the loose sandy sediment stores oxygen brought in by the currents for periods of less water movement and less oxygen introduction.”

Original publication:
McGinnis, D. F., S. Sommer, A. Lorke, R. N. Glud, P. Linke (2014): Quantifying tidally driven benthic oxygen exchange across permeable sediments: An aquatic eddy correlation study. Journal of Geophysical Research: Oceans, doi:10.1002/2014JC010303.

Links:

GEOMAR Helmholtz Centre for Ocean Research Kiel

Eddy correlation information page

Leibniz Institute of Freshwater Ecology and Inland Fisheries, IGB

University of Southern Denmark

University of Koblenz-Landau

Scottish Marine Institute

Aarhus University

Images:
High resolution images can be downloaded at http://www.geomar.de/n2110-e.

Video footage is available on request.

Contact:
Dr. Peter Linke (GEOMAR FB2-MG), Tel. 0431 600-2115, plinke@geomar.de

Maike Nicolai (GEOMAR, Kommunikation & Medien), Tel. 0431 600-2807, mnicolai@geomar.de