Study hints that ancient Earth made its own water — geologically

A new study is helping to answer a longstanding question that has recently moved to the forefront of earth science: Did our planet make its own water through geologic processes, or did water come to us via icy comets from the far reaches of the solar system?

The answer is likely “both,” according to researchers at The Ohio State University – and the same amount of water that currently fills the Pacific Ocean could be buried deep inside the planet right now.

At the American Geophysical Union (AGU) meeting on Wednesday, Dec. 17, they report the discovery of a previously unknown geochemical pathway by which the Earth can sequester water in its interior for billions of years and still release small amounts to the surface via plate tectonics, feeding our oceans from within.

In trying to understand the formation of the early Earth, some researchers have suggested that the planet was dry and inhospitable to life until icy comets pelted it and deposited water on the surface.

Wendy Panero, associate professor of earth sciences at Ohio State, and doctoral student Jeff Pigott are pursuing a different hypothesis: that Earth was formed with entire oceans of water in its interior, and has been continuously supplying water to the surface via plate tectonics ever since.

Researchers have long accepted that the mantle contains some water, but how much water is a mystery. And, if some geological mechanism has been supplying water to the surface all this time, wouldn’t the mantle have run out of water by now?

Because there’s no way to directly study deep mantle rocks, Panero and Pigott are probing the question with high-pressure physics experiments and computer calculations.

“When we look into the origins of water on Earth, what we’re really asking is, why are we so different than all the other planets?” Panero said. “In this solar system, Earth is unique because we have liquid water on the surface. We’re also the only planet with active plate tectonics. Maybe this water in the mantle is key to plate tectonics, and that’s part of what makes Earth habitable.”

Central to the study is the idea that rocks that appear dry to the human eye can actually contain water – in the form of hydrogen atoms trapped inside natural voids and crystal defects. Oxygen is plentiful in minerals, so when a mineral contains some hydrogen, certain chemical reactions can free the hydrogen to bond with the oxygen and make water.

Stray atoms of hydrogen could make up only a tiny fraction of mantle rock, the researchers explained. Given that the mantle is more than 80 percent of the planet’s total volume, however, those stray atoms add up to a lot of potential water.
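The scale of "stray atoms adding up" is easy to check with a rough, back-of-envelope estimate. The masses and concentration below are illustrative round numbers, not figures from the study, which derives its estimate from mineral-physics constraints:

```python
# Rough, illustrative estimate: how much water could trace hydrogen in
# the mantle add up to? The masses below are approximate round numbers.

MANTLE_MASS_KG = 4.0e24   # approximate mass of Earth's mantle
OCEAN_MASS_KG = 1.4e21    # approximate mass of Earth's surface oceans

def mantle_water_mass(ppm_water: float) -> float:
    """Mass of water stored in the mantle at a given concentration,
    in parts per million by weight."""
    return MANTLE_MASS_KG * ppm_water * 1e-6

def ocean_equivalents(ppm_water: float) -> float:
    """Mantle water expressed as multiples of the surface-ocean mass."""
    return mantle_water_mass(ppm_water) / OCEAN_MASS_KG

# Even ~175 ppm water by weight -- an invisible trace in any one rock --
# sums to roughly half an ocean's worth over the whole mantle.
print(ocean_equivalents(175))
```

At about 175 parts per million by weight, the mantle would hold roughly half an ocean mass of water, consistent in spirit with the Pacific-Ocean-sized reservoir the researchers describe.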

In a lab at Ohio State, the researchers compress different minerals that are common to the mantle and subject them to high pressures and temperatures using a diamond anvil cell – a device that squeezes a tiny sample of material between two diamonds and heats it with a laser – to simulate conditions in the deep Earth. They examine how the minerals’ crystal structures change as they are compressed, and use that information to gauge the minerals’ relative capacities for storing hydrogen. Then, they extend their experimental results using computer calculations to uncover the geochemical processes that would enable these minerals to rise through the mantle to the surface – a necessary condition for water to escape into the oceans.

In a paper now submitted to a peer-reviewed academic journal, they reported their recent tests of the mineral bridgmanite, a dense magnesium silicate with a perovskite structure. While bridgmanite is the most abundant mineral in the lower mantle, they found that it contains too little hydrogen to play an important role in Earth’s water supply.

Another research group recently found that ringwoodite, a high-pressure form of olivine, does contain enough hydrogen to make it a good candidate for deep-Earth water storage. So Panero and Pigott focused their study on the depth where ringwoodite is found – a place 325-500 miles below the surface that researchers call the “transition zone” – as the most likely region that can hold a planet’s worth of water. From there, the same convection of mantle rock that produces plate tectonics could carry the water to the surface.

One problem: If all the water in ringwoodite is continually drained to the surface via plate tectonics, how could the planet hold any in reserve?

For the research presented at AGU, Panero and Pigott performed new computer calculations of the geochemistry in the lowest portion of the mantle, some 500 miles deep and more. There, another mineral, garnet, emerged as a likely water-carrier – a go-between that could deliver some of the water from ringwoodite down into the otherwise dry lower mantle.

If this scenario is accurate, the Earth may today hold half as much water in its depths as is currently flowing in oceans on the surface, Panero said – an amount that would approximately equal the volume of the Pacific Ocean. This water is continuously cycled through the transition zone as a result of plate tectonics.

“One way to look at this research is that we’re putting constraints on the amount of water that could be down there,” Pigott added.

Panero called the complex relationship between plate tectonics and surface water “one of the great mysteries in the geosciences.” But this new study supports researchers’ growing suspicion that mantle convection somehow regulates the amount of water in the oceans. It also vastly expands the timeline for Earth’s water cycle.

“If all of the Earth’s water is on the surface, that gives us one interpretation of the water cycle, where we can think of water cycling from oceans into the atmosphere and into the groundwater over millions of years,” she said. “But if mantle circulation is also part of the water cycle, the total cycle time for our planet’s water has to be billions of years.”

Volcano hazards and the role of westerly wind bursts in El Niño

On June 27, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in a community in the District of Puna. – USGS

On 27 June, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in Pāhoa, a community in the District of Puna, as well as the only highway accessible to this area. Scientists from the U.S. Geological Survey’s Hawaiian Volcano Observatory (HVO) and the Hawai’i County Civil Defense have been monitoring the volcano’s lava flow and communicating with affected residents through public meetings since 24 August. Eos recently spoke with Michael Poland, a geophysicist at HVO and a member of the Eos Editorial Advisory Board, to discuss how he and his colleagues communicated this threat to the public.

Drilling a Small Basaltic Volcano to Reveal Potential Hazards


Drilling into the Rangitoto Island Volcano in the Auckland Volcanic Field in New Zealand offers insight into a small monogenetic volcano, and may improve understanding of future hazards.

From AGU’s journals: El Niño fades without westerly wind bursts

The warm and wet winter of 1997 brought California floods, Florida tornadoes, and an ice storm in the American northeast, prompting climatologists to dub it the El Niño of the century. Earlier this year, climate scientists thought the coming winter might bring similar extremes, as equatorial Pacific Ocean conditions resembled those seen in early 1997. But the signals weakened by summer, and the El Niño predictions were downgraded. Menkes et al. used simulations to examine the differences between the two years.

The El Niño-Southern Oscillation is defined by abnormally warm sea surface temperatures in the eastern Pacific Ocean and weaker than usual trade winds. In a typical year, southeast trade winds push surface water toward the western Pacific “warm pool” – a region essential to Earth’s climate. The trade winds dramatically weaken or even reverse in El Niño years, and the warm pool extends its reach east.

Scientists have struggled to predict El Niño due to irregularities in the shape, amplitude, and timing of the surges of warm water. Previous studies suggested that short-lived westerly wind pulses (i.e. one to two weeks long) could contribute to this irregularity by triggering and sustaining El Niño events.

To understand the vanishing 2014 El Niño, the authors used computer simulations to examine the wind’s role. The researchers found pronounced differences between 1997 and 2014. Both years saw strong westerly wind events between January and March, but in 2014 those disappeared as spring approached. In contrast, the westerly winds persisted through summer in 1997.

In the past, it was thought that westerly wind pulses were three times as likely to form if the warm pool extended east of the dateline. That did not occur this year. The team says their analysis shows that El Niño’s strength might depend on these short-lived and possibly unpredictable pulses.

The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing more than 62,000 members in 144 countries.

Ancient shellfish remains rewrite 10,000-year history of El Niño cycles

The middens are ancient dumping sites that typically contain a mix of mollusk shells, fish and bird bones, ceramics, cloth, charcoal, maize and other plants. – M. Carré / Univ. of Montpellier

The El Niño Southern Oscillation in the tropical Pacific Ocean, the planet’s largest and most powerful driver of year-to-year climate change, was widely thought to have been weaker in ancient times because of a different configuration of the Earth’s orbit. But scientists analyzing 25-foot piles of ancient shells have found that the El Niños 10,000 years ago were as strong and frequent as the ones we experience today.

The results, from the University of Washington and University of Montpellier, question how well computer models can reproduce historical El Niño cycles, or predict how they could change under future climates. The paper is now online and will appear in an upcoming issue of Science.

“We thought we understood what influences the El Niño mode of climate variation, and we’ve been able to show that we actually don’t understand it very well,” said Julian Sachs, a UW professor of oceanography.

The ancient shellfish feasts also upend a widely held interpretation of past climate.

“Our data contradicts the hypothesis that El Niño activity was very reduced 10,000 years ago, and then slowly increased since then,” said first author Matthieu Carré, who did the research as a UW postdoctoral researcher and now holds a faculty position at the University of Montpellier in France.

In 2007, while at the UW-based Joint Institute for the Study of the Atmosphere and Ocean, Carré accompanied archaeologists to seven sites in coastal Peru. Together they sampled 25-foot-tall piles of shells from Mesodesma donacium clams eaten and then discarded over centuries into piles that archaeologists call middens.

While in graduate school, Carré had developed a technique to analyze shell layers to get ocean temperatures, using carbon dating of charcoal from fires to get the year, and the ratio of oxygen isotopes in the growth layers to get the water temperatures as the shell was forming.

The shells provide 1- to 3-year-long records of monthly temperature of the Pacific Ocean along the coast of Peru. Combining layers of shells from each site gives water temperatures for intervals spanning 100 to 1,000 years during the past 10,000 years.
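The isotope-to-temperature step of this technique can be sketched in a few lines. The coefficients below are from one classic carbonate calibration (Shackleton, 1974); they illustrate the form of the relationship and are not necessarily the calibration Carré used:

```python
# Sketch of the oxygen-isotope paleothermometer described above.
# delta_shell is the shell carbonate's d18O and delta_water the ambient
# seawater's d18O, both in per mil against a common reference standard.
# Coefficients: Shackleton (1974) carbonate calibration (illustrative).

def carbonate_temperature_c(delta_shell: float, delta_water: float) -> float:
    """Growth temperature (deg C) implied by a shell's oxygen isotopes."""
    d = delta_shell - delta_water
    return 16.9 - 4.38 * d + 0.10 * d * d

# Colder water leaves the carbonate isotopically heavier (higher d18O),
# so sampling successive growth layers traces the seasonal cycle.
```

Applying this layer by layer, with charcoal radiocarbon dates anchoring each midden level in time, yields the monthly temperature records the study relies on.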

The new record shows that 10,000 years ago the El Niño cycles were strong, contradicting the current leading interpretations. Roughly 7,000 years ago, the shells show the most severe El Niño impacts shifting to the central Pacific, followed by a lull in the strength and frequency of El Niño from about 6,000 to 4,000 years ago.

One possible explanation for the surprising finding of a strong El Niño 10,000 years ago was that some other factor was compensating for the dampening effect expected from cyclical changes in Earth’s orbit around the sun during that period.

“The best candidate is the polar ice sheet, which was melting very fast in this period and may have increased El Niño activity by changing ocean currents,” Carré said.

Around 6,000 years ago most of the ice-age ice sheets would have finished melting, so the effect of Earth’s orbital geometry might have taken over then to cause the period of weak El Niños.

In previous studies, warm-water shells and evidence of flooding in Andean lakes had been interpreted as signs of a much weaker El Niño around 10,000 years ago.

The new data is more reliable, Carré said, for three reasons: the Peruvian coast is strongly affected by El Niño; the shells record ocean temperature, the most important parameter for the El Niño cycles; and the shells capture seasonal changes, the timescale at which El Niño can be observed.

“Climate models and a variety of datasets had concluded that El Niños were essentially nonexistent, did not occur, before 6,000 to 8,000 years ago,” Sachs said. “Our results very clearly show that this is not the case, and suggest that current understanding of the El Niño system is incomplete.”

Research provides new theory on cause of ice age 2.6 million years ago

New research published today (Friday, 27 June 2014) in the journal Scientific Reports has provided a major new theory on the cause of the ice age that covered large parts of the Northern Hemisphere 2.6 million years ago.

The study, co-authored by Dr Thomas Stevens, from the Department of Geography at Royal Holloway, University of London, found a previously unknown mechanism by which the joining of North and South America changed the salinity of the Pacific Ocean and caused major ice sheet growth across the Northern Hemisphere.

The change in salinity encouraged sea ice to form which in turn created a change in wind patterns, leading to intensified monsoons. These provided moisture that caused an increase in snowfall and the growth of major ice sheets, some of which reached 3km thick.

The team of researchers analysed deposits of wind-blown dust called red clay that accumulated between six million and two and a half million years ago in north central China, adjacent to the Tibetan plateau, and used them to reconstruct changing monsoon precipitation and temperature.

“Until now, the cause of the Quaternary ice age had been a hotly debated topic”, said Dr Stevens. “Our findings suggest a significant link between ice sheet growth, the monsoon and the closing of the Panama Seaway, as North and South America drifted closer together. This provides us with a major new theory on the origins of the ice age, and ultimately our current climate system.”

Surprisingly, the researchers found that the monsoon strengthened during global cooling, even though intense monsoon rainfall is normally associated with warmer climates.

Dr Stevens added: “This led us to discover a previously unknown interaction between plate tectonic movements in the Americas and dramatic changes in global temperature. The intensified monsoons created a positive feedback cycle, promoting more global cooling, more sea ice and even stronger precipitation, culminating in the spread of huge glaciers across the Northern Hemisphere.”

How productive are the ore factories in the deep sea?

About ten years after the first moon landing, scientists made a discovery showing that our home planet still holds plenty of surprises. Looking through the portholes of the submersible ALVIN near the bottom of the Pacific Ocean in 1979, American scientists saw for the first time chimneys, several meters tall, from which black, mineral-saturated water at about 300 degrees Celsius shot out. We have since learned that these “black smokers,” also called hydrothermal vents, exist in all oceans. They occur along submarine volcanic chains at the boundaries of tectonic plates. To date, however, many details of these systems remain unexplained.

One question that has long been intensively discussed in research is: Where and how deep does seawater penetrate into the seafloor to take up heat and minerals before it leaves the ocean floor at hydrothermal vents? This is of enormous importance both for the cooling of the underwater volcanoes and for the amount of dissolved material. Using a complex 3-D computer model, scientists at GEOMAR Helmholtz Centre for Ocean Research Kiel have now been able to trace the paths of the water toward the black smokers. The study appears in the current issue of the scientific journal Nature.

In general, it is well known that seawater penetrates into the Earth’s interior through cracks and crevices along the plate boundaries. The seawater is heated by the magma; the hot water rises again, leaches metals and other elements from the ground and is released as a black colored solution. “However, in detail it is somewhat unclear whether the water enters the ocean floor in the immediate vicinity of the vents and flows upward immediately, or whether it travels long distances underground before venting,” explains Dr. Jörg Hasenclever from GEOMAR.

This question is not only important for the fundamental understanding of processes on our planet; it also has very practical implications. Some of the materials leached from the underground are deposited on the seabed and form ore deposits that may be of economic interest. There is a major debate, however, over how large the resource potential of these deposits might be. “When we know which paths the water travels underground, we can better estimate the quantities of materials released by black smokers over thousands of years,” says Hasenclever.

Hasenclever and his colleagues used, for the first time, a high-resolution computer model of the seafloor to simulate a section of a mid-ocean ridge in the Pacific six kilometers long, six kilometers deep, and 16 kilometers wide. Among the data used by the model was the heat distribution in the oceanic crust, which is known from seismic studies. In addition, the model also considered the permeability of the rock and the special physical properties of water.

The simulation required several weeks of computing time. The result: “There are actually two different flow paths. About half the water seeps in near the vents, where the ground is very warm. The other half seeps in at greater distances and migrates for kilometers through the seafloor before exiting years later.” Thus, the current study partially confirms results from a computer model published in 2008 in the scientific journal Science. “However, the colleagues back then were able to simulate only a much smaller region of the ocean floor and therefore identified only the short paths near the black smokers,” says Hasenclever.

The current study is based on fundamental work on the modeling of the seafloor, which was conducted in the group of Professor Lars Rüpke within the framework of the Kiel Cluster of Excellence “The Future Ocean”. It provides scientists worldwide with the basis for further investigations to see how much ore is actually on and in the seabed, and whether or not deep-sea mining on a large scale could ever become worthwhile. “So far, we only know the surface of the ore deposits at hydrothermal vents. Nobody knows exactly how much metal is really deposited there. All the discussions about the pros and cons of deep-sea ore mining are based on a very thin database,” says co-author Prof. Dr. Colin Devey from GEOMAR. “We need to collect a lot more data on hydrothermal systems before we can make reliable statements”.

Today’s Antarctic region once as hot as California, Florida

Parts of ancient Antarctica were as warm as today’s California coast, and polar regions of the southern Pacific Ocean registered 21st-century Florida heat, according to scientists using a new way to measure past temperatures.

The findings, published the week of April 21 in the Proceedings of the National Academy of Sciences, underscore the potential for increased warmth at Earth’s poles and the associated risk of melting polar ice and rising sea levels, the researchers said.

Led by scientists at Yale, the study focused on Antarctica during the Eocene epoch, 40-50 million years ago, a period with high concentrations of atmospheric CO2 and consequently a greenhouse climate. Today, Antarctica is one of the coldest places on Earth year-round, and its interior is colder still, with annual average land temperatures far below zero degrees Fahrenheit.

But it wasn’t always that way, and the new measurements can help improve climate models used for predicting future climate, according to co-author Hagit Affek of Yale, associate professor of geology & geophysics.

“Quantifying past temperatures helps us understand the sensitivity of the climate system to greenhouse gases, and especially the amplification of global warming in polar regions,” Affek said.

The paper’s lead author, Peter M.J. Douglas, performed the research as a graduate student in Affek’s Yale laboratory. He is now a postdoctoral scholar at the California Institute of Technology. The research team included paleontologists, geochemists, and a climate physicist.

By measuring concentrations of rare isotopes in ancient fossil shells, the scientists found that temperatures in parts of Antarctica reached as high as 17 degrees Celsius (63F) during the Eocene, with an average of 14 degrees Celsius (57F) – similar to the average annual temperature off the coast of California today.

Eocene temperatures in parts of the southern Pacific Ocean measured 22 degrees Celsius (about 72F), researchers said – similar to seawater temperatures near Florida today.

Today the average annual South Pacific sea temperature near Antarctica is about 0 degrees Celsius.

These ancient ocean temperatures were not uniformly distributed throughout the Antarctic ocean regions – they were higher on the South Pacific side of Antarctica – and researchers say this finding suggests that ocean currents led to a temperature difference.

“By measuring past temperatures in different parts of Antarctica, this study gives us a clearer perspective of just how warm Antarctica was when the Earth’s atmosphere contained much more CO2 than it does today,” said Douglas. “We now know that it was warm across the continent, but also that some parts were considerably warmer than others. This provides strong evidence that global warming is especially pronounced close to the Earth’s poles. Warming in these regions has significant consequences for climate well beyond the high latitudes due to ocean circulation and melting of polar ice that leads to sea level rise.”

To determine the ancient temperatures, the scientists measured the abundance of two rare isotopes bound to each other in fossil bivalve shells collected by co-author Linda Ivany of Syracuse University at Seymour Island, a small island off the northeast side of the Antarctic Peninsula. The concentration of bonds between carbon-13 and oxygen-18 reflects the temperature at which the shells grew, the researchers said. They combined these results with other geothermometers and model simulations.

The new measurement technique is called carbonate clumped isotope thermometry.
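Clumped-isotope calibrations typically take the form Δ47 = a/T² + b, with Δ47 the excess abundance of carbon-13/oxygen-18 bonds (in per mil) and T in kelvin. The sketch below shows that functional form with illustrative placeholder coefficients, not the calibration used in this study (real coefficients depend on the laboratory reference frame):

```python
import math

# Sketch of a clumped-isotope thermometer of the usual form
# Delta47 = a / T**2 + b, with T in kelvin. The coefficients below
# are illustrative placeholders, not this study's calibration.

A = 0.0392e6   # per mil * K^2  (illustrative)
B = 0.004      # per mil        (illustrative)

def delta47(temp_k: float) -> float:
    """Predicted clumped-isotope excess at a given growth temperature."""
    return A / temp_k**2 + B

def temperature_k(d47: float) -> float:
    """Invert the calibration: growth temperature from measured Delta47."""
    return math.sqrt(A / (d47 - B))

# Warmer water scrambles the rare-isotope pairs, so Delta47 falls as
# temperature rises -- the basis for reading Eocene warmth from shells.
```

Because the bond ordering depends only on the temperature at which the carbonate formed, the method needs no independent estimate of seawater composition, which is part of its appeal for deep-time work like this.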

“We managed to combine data from a variety of geochemical techniques on past environmental conditions with climate model simulations to learn something new about how the Earth’s climate system works under conditions different from its current state,” Affek said. “This combined result provides a fuller picture than either approach could on its own.”

Rising mountains dried out Central Asia, scientists say

A record of ancient rainfall teased from long-buried sediments in Mongolia is challenging the popular idea that the arid conditions prevalent in Central Asia today were caused by the ancient uplift of the Himalayas and the Tibetan Plateau.

Instead, Stanford scientists say the formation of two lesser mountain ranges, the Hangay and the Altai, may have been the dominant drivers of climate in the region, leading to the expansion of Asia’s largest desert, the Gobi. The findings will be presented on Thursday, Dec. 12, at the annual meeting of the American Geophysical Union (AGU) in San Francisco.

“These results have major implications for understanding the dominant factors behind modern-day Central Asia’s extremely arid climate and the role of mountain ranges in altering regional climate,” said Page Chamberlain, a professor of environmental Earth system science at Stanford.

Scientists previously thought that the formation of the Himalayan mountain range and the Tibetan plateau around 45 million years ago shaped Asia’s driest environments.

“The traditional explanation has been that the uplift of the Himalayas blocked air from the Indian Ocean from reaching central Asia,” said Jeremy Caves, a doctoral student in Chamberlain’s terrestrial paleoclimate research group who was involved in the study.

This process was thought to have created a distinct rain shadow that led to wetter climates in India and Nepal and drier climates in Central Asia. Similarly, the elevation of the Tibetan Plateau was thought to have triggered an atmospheric process called subsidence, in which a mass of air heated by a high elevation slowly sinks into Central Asia.

“The falling air suppresses convective systems such as thunderstorms, and the result is you get really dry environments,” Caves said.

This long-accepted model of how Central Asia’s arid environments were created mostly ignores, however, the existence of the Altai and Hangay, two northern mountain ranges.

Searching for answers


To investigate the effects of the smaller ranges on the regional climate, Caves and his colleagues from Stanford and Rocky Mountain College in Montana traveled to Mongolia in 2011 and 2012 and collected samples of ancient soil, as well as stream and lake sediments from remote sites in the central, southwestern and western parts of the country.

The team carefully chose its sites by scouring the scientific literature for studies of the region conducted by pioneering researchers in past decades.

“A lot of the papers were by Polish and Russian scientists who went there to look for dinosaur fossils,” said Hari Mix, a doctoral student at Stanford who also participated in the research. “Indeed, at many of the sites we visited, there were dinosaur fossils just lying around.”

The earlier researchers recorded the ages and locations of the rocks they excavated as part of their own investigations; Caves and his team used those age estimates to select the most promising sites for their own study.

At each site, the team bagged sediment samples that were later analyzed to determine their carbon isotope content. The relative level of carbon isotopes present in a soil sample is related to the productivity of plants growing in the soil, which is itself dependent on the annual rainfall. Thus, by measuring carbon isotope amounts from different sediment samples of different ages, the team was able to reconstruct past precipitation levels.

An ancient wet period


The new data suggest that rainfall in central and southwestern Mongolia decreased by 50 to 90 percent over the last several tens of millions of years.

“Right now, precipitation in Mongolia is about 5 inches annually,” Caves said. “To explain our data, rainfall had to decrease from 10 inches a year or more to its current value over the last 10 to 30 million years.”

That means that much of Mongolia and Central Asia were still relatively wet even after the formation of the Himalayas and the Tibetan Plateau 45 million years ago. The data show that it wasn’t until about 30 million years ago, when the Hangay Mountains first formed, that rainfall started to decrease. The region began drying out even faster about 5 million to 10 million years ago, when the Altai Mountains began to rise.

The scientists hypothesize that once they formed, the Hangay and Altai ranges created rain shadows of their own that blocked moisture from entering Central Asia.

“As a result, the northern and western sides of these ranges are wet, while the southern and eastern sides are dry,” Caves said.

The team is not discounting the effect of the Himalayas and the Tibetan Plateau entirely, because portions of the Gobi Desert likely already existed before the Hangay or Altai began forming.

“What these smaller mountains did was expand the Gobi north and west into Mongolia,” Caves said.

The uplift of the Hangay and Altai may have had other, more far-reaching implications as well, Caves said. For example, westerly winds in Asia slam up against the Altai today, creating strong cyclonic winds in the process. Under the right conditions, the cyclones pick up large amounts of dust as they snake across the Gobi Desert. That dust can be lofted across the Pacific Ocean and even reach California, where it serves as microscopic seeds for developing raindrops.

The origins of these cyclonic winds, as well as substantial dust storms in China today, may correlate with uplift of the Altai, Caves said. His team plans to return to Mongolia and Kazakhstan next summer to collect more samples and to use climate models to test whether the Altai are responsible for the start of the large dust storms.

“If the Altai are a key part of regulating Central Asia’s climate, we can go and look for evidence of it in the past,” Caves said.

3-D Earth model developed at Sandia Labs more accurately pinpoints source of earthquakes, explosions

Sandia National Laboratories researcher Sandy Ballard and colleagues from Sandia and Los Alamos National Laboratory have developed SALSA3D, a 3-D model of the Earth’s mantle and crust designed to help pinpoint the location of all types of explosions. – Photo by Randy Montoya, Sandia National Laboratories

During the Cold War, U.S. and international monitoring agencies could spot nuclear tests and focused on measuring their size. Today, they’re looking around the globe to pinpoint much smaller explosives tests.

Under the sponsorship of the National Nuclear Security Administration’s Office of Defense Nuclear Nonproliferation R&D, Sandia National Laboratories and Los Alamos National Laboratory have partnered to develop a 3-D model of the Earth’s mantle and crust called SALSA3D, or Sandia-Los Alamos 3D. The purpose of this model is to help the U.S. Air Force and the international Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna, Austria, more accurately locate all types of explosions.

The model uses a scalable triangular tessellation and seismic tomography to map the Earth’s “compressional wave seismic velocity” – a property of the rocks and other materials inside the Earth that indicates how quickly compressional waves travel through them, and one way to accurately locate seismic events, Sandia geophysicist Sandy Ballard said. Compressional waves – the first waves measured after seismic events – move the particles in rocks and other materials minute distances backward and forward between the location of the event and the station detecting it.
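The location principle the model supports can be sketched with a toy example: find the source whose predicted travel times best match the observed arrivals. This illustration assumes a uniform velocity on a flat 2-D grid; SALSA3D's contribution is precisely to replace that assumption with travel times computed through a 3-D velocity model of the mantle and crust.

```python
import math

# Toy event location by travel-time misfit. Uniform P-wave speed on a
# flat grid is an illustrative assumption only; a real system predicts
# travel times through a 3-D Earth model such as SALSA3D.

VELOCITY_KM_S = 6.0  # assumed uniform P-wave speed (illustrative)

def travel_time(src, sta):
    return math.dist(src, sta) / VELOCITY_KM_S

def locate(stations, arrivals, grid_step=1.0, extent=100.0):
    """Grid-search the (x, y) source and origin time minimizing RMS residual."""
    best = None
    steps = int(extent / grid_step) + 1
    for i in range(steps):
        for j in range(steps):
            src = (i * grid_step, j * grid_step)
            tt = [travel_time(src, s) for s in stations]
            # The best origin time for this trial point is the mean residual.
            t0 = sum(a - t for a, t in zip(arrivals, tt)) / len(arrivals)
            rms = math.sqrt(sum((a - (t0 + t)) ** 2
                                for a, t in zip(arrivals, tt)) / len(arrivals))
            if best is None or rms < best[0]:
                best = (rms, src, t0)
    return best[1], best[2]

# Synthetic check: an event at (40, 60) km, origin time 5 s, recorded
# at four corner stations.
stations = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
true_src, true_t0 = (40.0, 60.0), 5.0
arrivals = [true_t0 + travel_time(true_src, s) for s in stations]
print(locate(stations, arrivals))   # recovers (40.0, 60.0) and ~5.0 s
```

A wrong velocity model biases every predicted travel time, which smears the recovered location; that is why a more accurate 3-D model shrinks the uncertainty region around an event.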

SALSA3D also reduces the uncertainty in the model’s predictions, an important feature for decision-makers who must take action when suspicious activity is detected, he added.

“When you have an earthquake or nuclear explosion, not only do you need to know where it happened, but also how well you know that. That’s a difficult problem for these big 3-D models. It’s mainly a computational problem,” Ballard said. “The math is not so tough, just getting it done is hard, and we’ve accomplished that.”

A Sandia team has been writing and refining code for the model since 2007 and is now demonstrating that SALSA3D is more accurate than current models.

In recent tests, SALSA3D located seismic events within a geographic area 26 percent smaller than the traditional one-dimensional model and 9 percent smaller than a recently developed Regional Seismic Travel Time (RSTT) model used in combination with the one-dimensional model.

GeoTess software release

Sandia recently released SALSA3D’s framework – the triangular tessellated grid on which the model is built – to other Earth scientists, seismologists and the public. By standardizing the framework, the seismological research community can more easily share models of the Earth’s structure and global monitoring agencies can better test different models. Both activities are hampered by the plethora of models available today, Ballard said. (See box.)

“GeoTess makes models compatible and standardizes everything,” he said. “This would really facilitate sharing of different models, if everyone agreed on it.”

Seismologists and researchers worldwide can now download GeoTess, which provides a common model parameterization for multidimensional Earth models and a software support system that addresses the construction, population, storage and interrogation of data stored in the model. GeoTess is not specific to any particular data, so users have considerable flexibility in how they store information in the model. The free package, including source code, is being released under the permissive BSD open-source license. The code is available in Java and C++, with interfaces to the C++ version written in C and Fortran90. GeoTess has been tested on multiple platforms, including Linux, SunOS, Mac OS X and Windows.

When an explosion goes off, the energy travels through the Earth as waves that are picked up by seismometers at U.S. and international ground monitoring stations associated with nuclear explosion monitoring organizations worldwide. Scientists use these signals to determine the location.

They first predict the time taken for the waves to travel from their source through the Earth to each station. To calculate that, they have to know the seismic velocity of the Earth’s materials from the crust to the inner core, Ballard said.

“If you have material that has very high seismic velocity, the waves travel very quickly, but the energy travels less quickly through other kinds of materials, so it takes the signals longer to travel from the source to the receiver,” he said.
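The relationship Ballard describes can be sketched as a simple path sum: travel time is the total, over segments of the ray path, of segment length divided by the local seismic velocity. A minimal illustration (the segment lengths and velocities below are invented for the example, not taken from any real Earth model):

```python
def travel_time(segment_lengths_km, velocities_km_s):
    """Total travel time: each path segment contributes length / local velocity."""
    return sum(length / v for length, v in zip(segment_lengths_km, velocities_km_s))

# The same 700 km path through fast (8 km/s) versus slower (6 km/s) material:
t_fast = travel_time([350.0, 350.0], [8.0, 8.0])  # 87.5 s
t_slow = travel_time([350.0, 350.0], [6.0, 6.0])  # ~116.7 s
```

The slower material delays the arrival by roughly half a minute over this path, which is exactly the kind of difference the monitoring stations measure.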

For the past 100 years, seismologists have predicted the travel time of seismic energy from source to receiver using one-dimensional models. These models, which are still widely used today, account only for radial variations in seismic velocity and ignore variations in geographic directions. They yield seismic event locations that are reasonably accurate, but not nearly as precise as locations calculated with high fidelity 3-D models.

Modern 3-D models of the Earth, like SALSA3D, account for distortions of the seismic wavefronts caused by minor lateral differences in the properties of rocks and other materials.

For example, waves are distorted when they move through a geological feature called a subduction zone, such as the one beneath the west coast of South America where one tectonic plate under the Pacific Ocean is diving underneath the Andes Mountains. This happens at about the rate at which fingernails grow, but, geologically speaking, that’s fast, Ballard said.

One-dimensional models, like the widely used ak135 developed in the 1990s, are good at predicting the travel time of waves when the distance from the source to the receiver is large, because these waves spend most of their time traveling through the deepest, most homogeneous parts of the Earth. They don’t do so well at predicting travel time for nearby events, where the waves spend most of their time in the Earth’s crust or the shallowest parts of the mantle, both of which contain a larger variety of materials than the lower mantle and the Earth’s core.

RSTT, a previous model developed jointly by Sandia, Los Alamos and Lawrence Livermore national laboratories, tried to solve that problem and works best at ranges of about 60-1,200 miles (100-2,000 kilometers).

Still, “the biggest errors we get are close to the surface of the Earth. That’s where the most variability in materials is,” Ballard said.

Seismic tomography gives SALSA3D accuracy

Today, Earth scientists are mapping three dimensions: the radius, latitude and longitude.

Anyone who’s studied a globe or world atlas knows that the traditional grid of latitude and longitude lines works well near the equator, but at the poles the lines crowd too close together. For nuclear explosion monitoring, Earth models must accurately characterize the polar regions, remote as they are, because seismic waves travel under them, Ballard said.

Triangular tessellation solves that problem: the nodes, or intersections of the triangles, can be modeled accurately even at the poles. The triangles can be smaller where more detail is needed and larger in areas that require less detail, such as the oceans. The model also extends into the Earth like columns of stacked pieces of pie, without the rounded crust edges.
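The variable-resolution idea behind such a grid can be illustrated with the standard midpoint subdivision of a triangle: where more detail is needed, a triangle is split into four smaller ones; elsewhere it is left alone. This is only a generic sketch of the technique, not GeoTess code, and the projection of midpoints back onto the sphere is omitted:

```python
def midpoint(p, q):
    """Midpoint of two vertices (re-projection onto the sphere omitted here)."""
    return tuple((a + b) / 2.0 for a, b in zip(p, q))

def subdivide(tri):
    """Split one triangle into four by connecting its edge midpoints."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

# One refinement pass quadruples the triangle count only where it is applied.
coarse = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
fine = [small for tri in coarse for small in subdivide(tri)]
```

Repeating the pass only over triangles flagged as "interesting" yields exactly the adaptive grid described above: dense under continents and monitoring networks, coarse under the open ocean.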

The way Sandia calculates the seismic velocities uses the same math that is used to detect a tumor in an MRI, except on a global, rather than a human, scale.

Sandia uses historical data from 118,000 earthquakes and 13,000 current and former monitoring stations worldwide collected by Los Alamos Lab’s Ground Truth catalog.

“We apply a process called seismic tomography, where we take millions of observed travel times and invert them for the seismic velocities that would create that data set. It’s mathematically similar to doing linear regression, but on steroids,” Ballard said. Linear regression is a simple mathematical way to model the relationship between a known variable and one or more unknown variables. Because the Sandia team models hundreds of thousands of unknown variables, they apply a mathematical method called least squares to minimize the discrepancies between the data from previous seismic events and the model’s predictions.
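In miniature, the inversion looks like this: each ray contributes one equation, total time = Σ (path length in cell) × (slowness, i.e. 1/velocity), and least squares picks the slowness values that best fit all the equations at once. The two-cell, three-ray numbers below are invented for illustration; the real problem has hundreds of thousands of unknowns and runs on a cluster:

```python
# G[i][j]: length (km) of ray i inside cell j; times: observed travel times (s).
G = [[100.0,   0.0],
     [  0.0, 100.0],
     [ 70.0,  70.0]]
times = [12.5, 16.0, 19.95]

# Least squares via the normal equations (G^T G) s = G^T t,
# solved by Cramer's rule since there are only two unknowns.
GtG = [[sum(row[i] * row[j] for row in G) for j in range(2)] for i in range(2)]
Gtt = [sum(row[i] * t for row, t in zip(G, times)) for i in range(2)]
det = GtG[0][0] * GtG[1][1] - GtG[0][1] * GtG[1][0]
s = [(Gtt[0] * GtG[1][1] - GtG[0][1] * Gtt[1]) / det,
     (GtG[0][0] * Gtt[1] - Gtt[0] * GtG[1][0]) / det]

velocities = [1.0 / si for si in s]  # recovered cell velocities, km/s
```

With these consistent synthetic times the inversion recovers the cell velocities (8.0 and 6.25 km/s) exactly; with millions of noisy real observations, least squares instead finds the model that minimizes the total misfit.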

With 10 million data points, Sandia uses a distributed computer network with about 400 processor cores to characterize the seismic velocity at every node.

Monitoring agencies could use SALSA3D to precompute the travel time from each station in their network to every point on Earth. When the location of a new seismic event must be computed in real time, each source-to-receiver travel time can then be looked up in about a millisecond, and the energy’s source pinpointed in about a second, he said.
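Precomputation turns location into a fast table lookup plus a search: for each candidate grid point, compare the stored travel times against the observed arrivals and keep the best match. The station names, grid points, and times below are entirely hypothetical:

```python
# Hypothetical precomputed travel times, in seconds: table[station][grid_point].
table = {
    "STA1": {"A": 10.0, "B": 14.0, "C": 20.0},
    "STA2": {"A": 18.0, "B": 14.0, "C": 11.0},
    "STA3": {"A": 25.0, "B": 16.0, "C": 24.0},
}
observed = {"STA1": 14.1, "STA2": 13.8, "STA3": 16.2}  # arrival minus origin time

def locate(table, observed):
    """Grid point whose precomputed times best match the observed arrivals."""
    points = next(iter(table.values())).keys()
    def misfit(p):
        return sum((table[sta][p] - t) ** 2 for sta, t in observed.items())
    return min(points, key=misfit)
```

Here `locate(table, observed)` selects grid point "B", whose stored times differ from the observations by only fractions of a second, while "A" and "C" miss by several seconds at every station.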

Uncertainty modeling a SALSA3D feature

But no model is perfect, so Sandia has developed a way to measure the uncertainty in each prediction SALSA3D makes, based on uncertainty in the velocity at each node and how that uncertainty affects the travel time prediction of each wave from a seismic event to each monitoring station.
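One standard way to carry node uncertainty through to a travel-time uncertainty is linearized error propagation: since the travel time is t = Σ Lᵢsᵢ, its variance is Σ Lᵢ² · var(sᵢ). The sketch below makes the simplifying assumption that per-node errors are independent; a full treatment like SALSA3D's would account for correlations between nodes as well:

```python
def travel_time_variance(path_lengths_km, slowness_variance):
    """Linearized propagation: t = sum(L_i * s_i) gives var(t) = sum(L_i^2 * var(s_i)),
    assuming the per-node slowness errors are independent."""
    return sum(L ** 2 * v for L, v in zip(path_lengths_km, slowness_variance))

# A 170 km ray crossing two cells with different slowness uncertainties (s/km)^2:
var_t = travel_time_variance([100.0, 70.0], [1e-6, 4e-6])  # variance in s^2
sigma_t = var_t ** 0.5                                     # one-sigma, in seconds
```

Repeating this for every station-event pair, and mapping the timing uncertainties into space, yields the uncertainty ellipse around each estimated location.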

For users at monitoring stations, SALSA3D estimates the most likely location of a seismic event along with the amount of uncertainty in that answer, helping to inform their decisions.

International test ban treaties require that an on-site inspection be confined to a 1,000-square-kilometer (385-square-mile) area surrounding a suspected nuclear test site. Today, 3-D Earth models like SALSA3D are helping to meet, and sometimes significantly beat, that threshold in most parts of the world.

“It’s extremely difficult to do because the problem is so large,” Ballard said. “But we’ve got to know it within 1,000 square kilometers or they might search in the wrong place.”

Scientists solve a 14,000-year-old ocean mystery

At the end of the last Ice Age, as the world began to warm, a swath of the North Pacific Ocean came to life. During a brief pulse of biological productivity 14,000 years ago, this stretch of the sea teemed with phytoplankton, amoeba-like foraminifera and other tiny creatures, which thrived in large numbers until the productivity ended, as mysteriously as it began, just a few hundred years later.

Researchers have hypothesized that iron sparked this surge of ocean life, but a new study led by Woods Hole Oceanographic Institution (WHOI) scientists and colleagues at the University of Bristol (UK), the University of Bergen (Norway), Williams College and the Lamont-Doherty Earth Observatory of Columbia University suggests iron may not have played an important role after all, at least in some settings. The study, published in the journal Nature Geoscience, determines that a different mechanism, a transient “perfect storm” of nutrients and light, spurred life in the post-Ice Age Pacific. Its findings resolve conflicting ideas about the relationship between iron and biological productivity during this time period in the North Pacific, with potential implications for geoengineering efforts to curb climate change by seeding the ocean with iron.

“A lot of people have put a lot of faith into iron, and, in fact, as a modern ocean chemist, I’ve built my career on the importance of iron, but it may not always have been as important as we think,” says WHOI Associate Scientist Phoebe Lam, a co-author of the study.

Because iron is known to cause blooms of biological activity in today’s North Pacific Ocean, researchers have assumed it played a key role in the past as well. They have hypothesized that as Ice Age glaciers began to melt and sea levels rose, they submerged the surrounding continental shelf, washing iron into the rising sea and setting off a burst of life.

Past studies using sediment cores (long cylinders drilled into the ocean floor that offer scientists a look back through time at what has accumulated there) have repeatedly found evidence of this burst, in the form of a layer of increased opal and calcium carbonate, the materials that made up phytoplankton and foraminifera shells. But no one had searched the fossil record specifically for signs that iron from the continental shelf played a part in the bloom.

Lam and an international team of colleagues revisited the sediment core data to directly test this hypothesis. They sampled GGC-37, a core taken from a site near Russia’s Kamchatka Peninsula, about every 5 centimeters, moving back through time to before the biological bloom began. Then they analyzed the chemical composition of their samples, measuring the relative abundance of neodymium and strontium isotopes, which indicate the source of the iron present. The isotope abundance ratios were a particularly important clue because they could reveal where the iron came from: one variant pointed to iron from the ancient Loess Plateau of northern China, a frequent source of iron-rich dust in the northwest Pacific, while another suggested the younger, more volcanic continental shelf was the iron source.

What the researchers found surprised them.

“We saw the flux of iron was really high during glacial times, and that it dropped during deglaciation,” Lam says. “We didn’t see any evidence of a pulse of iron right before this productivity peak.”

The iron the researchers did find during glacial times appeared to be supplemented by a third source, possibly in the Bering Sea area, but it didn’t have a significant effect on the productivity peak. Instead, the data suggest that iron levels were declining when the peak began.

Based on the sediment record, the researchers propose a different cause for the peak: a chain of events that created ideal conditions for sea life to briefly flourish. The changing climate triggered deep mixing in the North Pacific Ocean, which stirred nutrients that the tiny plankton depend on up into the sea’s surface layers, but in doing so also mixed the plankton down into deep, dark waters, where light for photosynthesis was too scarce for them to thrive. Then a pulse of fresh water from melting glaciers (evidenced by a change in the amount of a certain oxygen isotope in the foraminifera shells found in the core) stopped the mixing, trapping the phytoplankton and other small creatures in a thin, bright, nutrient-rich top layer of ocean. With greater exposure to light and nutrients, and iron levels that were still relatively high, the creatures flourished.

“We think that ultimately this is what caused the productivity peak-that all these things happened all at once,” Lam says. “And it was a transient thing, because the iron continued to drop and eventually the nutrients ran out.”

The study’s findings argue against iron as the trigger of this ancient bloom, but they also raise questions about a very modern idea. Some scientists have proposed seeding the world’s oceans with iron to trigger phytoplankton blooms that could trap some of the atmosphere’s carbon dioxide and help stall climate change. This idea, sometimes referred to as the “Iron Hypothesis,” has met with considerable controversy, and scientific evidence of its potential effectiveness to sequester carbon, and of its impact on ocean life, has been mixed.

“This study shows how there are multiple controls on ocean phytoplankton blooms, not just iron,” says Ken Buesseler, a WHOI marine chemist who led a workshop in 2007 to discuss modern iron fertilization. “Certainly before we think about adding iron to the ocean to sequester carbon as a geoengineering tool, we should encourage studies like this of natural systems where the conditions of adding iron, or not, on longer and larger time scales have already been done for us and we can study the consequences.”

Sea level influenced tropical climate during the last ice age

The exposed Sunda Shelf during glacial times greatly affected the atmospheric circulation. The shelf is shown on the left for present-day as the light-blue submerged areas between Java, Sumatra, Borneo, and Thailand, and on the right for the last ice age as the green exposed area. – Pedro DiNezio

Scientists look at past climates to learn about climate change and the ability to simulate it with computer models. One region that has received a great deal of attention is the Indo-Pacific warm pool, the vast pool of warm water stretching along the equator from Africa to the western Pacific Ocean.

In a new study, Pedro DiNezio of the International Pacific Research Center, University of Hawaii at Manoa, and Jessica Tierney of Woods Hole Oceanographic Institution investigated preserved geological clues (called “proxies”) of rainfall patterns during the last ice age when the planet was dramatically colder than today. They compared these patterns with computer model simulations in order to find a physical explanation for the patterns inferred from the proxies.

Their study, which appears in the May 19 online edition of Nature Geoscience, not only reveals unique patterns of rainfall change over the Indo-Pacific warm pool, but also shows that these changes were caused by the effect of lowered sea level on the configuration of the Indonesian archipelago.

“For our research,” explains lead-author Pedro DiNezio at the International Pacific Research Center, “we compared the climate of the ice age with our recent warmer climate. We analyzed about 100 proxy records of rainfall and salinity stretching from the tropical western Pacific to the western Indian Ocean and eastern Africa. Rainfall and salinity signals recorded in geological sediments can tell us much about past changes in atmospheric circulation over land and the ocean respectively.”

“Our comparisons show that, as many scientists expected, much of the Indo-Pacific warm pool was drier during this glacial period compared with today. But, counter to some theories, several regions, such as the western Pacific and the western Indian Ocean, especially eastern Africa, were wetter,” adds co-author Jessica Tierney of the Woods Hole Oceanographic Institution.

In the second step, the scientists matched these rainfall and salinity patterns with simulations from 12 state-of-the-art climate models that are also used to predict future climate change. For this matching they applied a method of categorical data comparison called the Cohen’s kappa statistic. Though widely used in the medical field, this method had not previously been used to match geological climate signals with climate model simulations.
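Cohen’s kappa measures agreement between two categorical ratings after subtracting the agreement expected by chance: κ = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement and p_e the agreement two independent raters with the same category frequencies would reach at random. A minimal sketch, with invented “wet”/“dry” labels standing in for the proxy-inferred and model-simulated sign of rainfall change at a set of sites:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two equal-length categorical ratings."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n      # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in ca) / n ** 2    # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical labels: sign of glacial rainfall change at eight sites.
proxy = ["dry", "dry", "wet", "dry", "wet", "wet", "dry", "dry"]
model = ["dry", "dry", "wet", "wet", "wet", "dry", "dry", "dry"]
kappa = cohens_kappa(proxy, model)
```

For these invented labels the raw agreement is 75 percent, but because both sequences are mostly “dry,” much of that is expected by chance, and kappa drops to about 0.47. That chance correction is what makes kappa a stricter test of model-proxy agreement than simple percent match.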

“We were taken aback that only one model out of the 12 showed statistical agreement with the proxy-inferred patterns of the rainfall changes. This model, though, agrees well with both the rainfall and salinity indicators – two entirely independent sets of proxy data covering distinct areas of the tropics,” says DiNezio.

The model reveals that the dry climate during the glacial period was driven by reduced convection over a region of the warm pool called the Sunda Shelf. Today the shelf is submerged beneath the Gulf of Thailand, but was above sea level during the glacial period, when sea level was about 120 m lower.

“The exposure of the Sunda Shelf greatly weakened convection over the warm pool, with far-reaching impacts on the large-scale circulation and on rainfall patterns from Africa to the western Pacific and northern Australia,” explains DiNezio.

The main weakness of the other models, according to the authors, is their limited ability to simulate convection, the vertical air motions that lift humid air into the atmosphere. Differences in the way each model simulates convection may explain why the results for the glacial period are so different.

“Our research resolves a decades-old question of what the response of tropical climate was to glaciation,” concludes DiNezio. “The study, moreover, presents a fine benchmark for assessing the ability of climate models to simulate the response of tropical convection to altered land masses and global temperatures.”