Today’s Antarctic region once as hot as California, Florida

Parts of ancient Antarctica were as warm as today’s California coast, and polar regions of the southern Pacific Ocean registered 21st-century Florida heat, according to scientists using a new way to measure past temperatures.

The findings, published the week of April 21 in the Proceedings of the National Academy of Sciences, underscore the potential for increased warmth at Earth’s poles and the associated risk of melting polar ice and rising sea levels, the researchers said.

Led by scientists at Yale, the study focused on Antarctica during the Eocene epoch, 40-50 million years ago, a period with high concentrations of atmospheric CO2 and consequently a greenhouse climate. Today, Antarctica is one of the coldest places on Earth year-round, and its interior is the coldest of all, with annual average land temperatures far below zero degrees Fahrenheit.

But it wasn’t always that way, and the new measurements can help improve climate models used for predicting future climate, according to co-author Hagit Affek of Yale, associate professor of geology & geophysics.

“Quantifying past temperatures helps us understand the sensitivity of the climate system to greenhouse gases, and especially the amplification of global warming in polar regions,” Affek said.

The paper’s lead author, Peter M.J. Douglas, performed the research as a graduate student in Affek’s Yale laboratory. He is now a postdoctoral scholar at the California Institute of Technology. The research team included paleontologists, geochemists, and a climate physicist.

By measuring concentrations of rare isotopes in ancient fossil shells, the scientists found that temperatures in parts of Antarctica reached as high as 17 degrees Celsius (63F) during the Eocene, with an average of 14 degrees Celsius (57F) – similar to the average annual temperature off the coast of California today.

Eocene temperatures in parts of the southern Pacific Ocean measured 22 degrees Celsius (about 72F), researchers said – similar to seawater temperatures near Florida today.

Today the average annual South Pacific sea temperature near Antarctica is about 0 degrees Celsius.

These ancient ocean temperatures were not uniformly distributed throughout the Antarctic ocean regions – they were higher on the South Pacific side of Antarctica – and researchers say this finding suggests that ocean currents led to a temperature difference.

“By measuring past temperatures in different parts of Antarctica, this study gives us a clearer perspective of just how warm Antarctica was when the Earth’s atmosphere contained much more CO2 than it does today,” said Douglas. “We now know that it was warm across the continent, but also that some parts were considerably warmer than others. This provides strong evidence that global warming is especially pronounced close to the Earth’s poles. Warming in these regions has significant consequences for climate well beyond the high latitudes due to ocean circulation and melting of polar ice that leads to sea level rise.”

To determine the ancient temperatures, the scientists measured the abundance of two rare isotopes bound to each other in fossil bivalve shells collected by co-author Linda Ivany of Syracuse University at Seymour Island, a small island off the northeast side of the Antarctic Peninsula. The concentration of bonds between carbon-13 and oxygen-18 reflects the temperature at which the shells grew, the researchers said. They combined these results with other geo-thermometers and model simulations.

The new measurement technique is called carbonate clumped isotope thermometry.
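In clumped isotope thermometry, the measured abundance of carbon-13/oxygen-18 bonds (usually reported as a Δ47 value, in permil) is converted to a growth temperature through a calibration of the general form Δ47 = a/T² + b. A minimal sketch of that conversion, with illustrative constants that are not the calibration used in the study:

```python
import math

def clumped_isotope_temperature(delta47, a=0.0392e6, b=0.153):
    """Convert a clumped-isotope value (Δ47, permil) to temperature in °C.

    Assumes a calibration of the form Δ47 = a / T**2 + b with T in kelvin;
    the constants a and b here are illustrative placeholders only.
    """
    t_kelvin = math.sqrt(a / (delta47 - b))
    return t_kelvin - 273.15

# A cooler shell records a higher Δ47 (more clumping), hence lower temperature.
```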

“We managed to combine data from a variety of geochemical techniques on past environmental conditions with climate model simulations to learn something new about how the Earth’s climate system works under conditions different from its current state,” Affek said. “This combined result provides a fuller picture than either approach could on its own.”

Rising mountains dried out Central Asia, scientists say

A record of ancient rainfall teased from long-buried sediments in Mongolia is challenging the popular idea that the arid conditions prevalent in Central Asia today were caused by the ancient uplift of the Himalayas and the Tibetan Plateau.

Instead, Stanford scientists say the formation of two lesser mountain ranges, the Hangay and the Altai, may have been the dominant drivers of climate in the region, leading to the expansion of Asia’s largest desert, the Gobi. The findings will be presented on Thursday, Dec. 12, at the annual meeting of the American Geophysical Union (AGU) in San Francisco.

“These results have major implications for understanding the dominant factors behind modern-day Central Asia’s extremely arid climate and the role of mountain ranges in altering regional climate,” said Page Chamberlain, a professor of environmental Earth system science at Stanford.

Scientists previously thought that the formation of the Himalayan mountain range and the Tibetan plateau around 45 million years ago shaped Asia’s driest environments.

“The traditional explanation has been that the uplift of the Himalayas blocked air from the Indian Ocean from reaching central Asia,” said Jeremy Caves, a doctoral student in Chamberlain’s terrestrial paleoclimate research group who was involved in the study.

This process was thought to have created a distinct rain shadow that led to wetter climates in India and Nepal and drier climates in Central Asia. Similarly, the elevation of the Tibetan Plateau was thought to have triggered an atmospheric process called subsidence, in which a mass of air heated by a high elevation slowly sinks into Central Asia.

“The falling air suppresses convective systems such as thunderstorms, and the result is you get really dry environments,” Caves said.

This long-accepted model of how Central Asia’s arid environments were created mostly ignores, however, the existence of the Altai and Hangay, two northern mountain ranges.

Searching for answers


To investigate the effects of the smaller ranges on the regional climate, Caves and his colleagues from Stanford and Rocky Mountain College in Montana traveled to Mongolia in 2011 and 2012 and collected samples of ancient soil, as well as stream and lake sediments from remote sites in the central, southwestern and western parts of the country.

The team carefully chose its sites by scouring the scientific literature for studies of the region conducted by pioneering researchers in past decades.

“A lot of the papers were by Polish and Russian scientists who went there to look for dinosaur fossils,” said Hari Mix, a doctoral student at Stanford who also participated in the research. “Indeed, at many of the sites we visited, there were dinosaur fossils just lying around.”

The earlier researchers recorded the ages and locations of the rocks they excavated as part of their own investigations; Caves and his team used those age estimates to select the most promising sites for their own study.

At each site, the team bagged sediment samples that were later analyzed to determine their carbon isotope content. The relative level of carbon isotopes present in a soil sample is related to the productivity of plants growing in the soil, which is itself dependent on the annual rainfall. Thus, by measuring carbon isotope amounts from different sediment samples of different ages, the team was able to reconstruct past precipitation levels.
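The proxy chain described above (carbon isotopes → plant productivity → rainfall) can be illustrated with a toy calibration that linearly maps soil-carbonate δ13C between wet and dry end-members and compares samples of different ages. Every number below (end-member δ13C values, precipitation bounds, and the sample list) is hypothetical, not the team's data or calibration:

```python
def precip_from_d13c(d13c, d13c_wet=-12.0, d13c_dry=-4.0,
                     precip_wet=400.0, precip_dry=100.0):
    """Interpolate mean annual precipitation (mm/yr) between hypothetical
    wet and dry soil-carbonate δ13C end-members (permil)."""
    frac = (d13c - d13c_dry) / (d13c_wet - d13c_dry)
    return precip_dry + frac * (precip_wet - precip_dry)

# Hypothetical samples: (age in millions of years, δ13C in permil).
# Less-negative δ13C indicates lower plant productivity, i.e. drier conditions.
samples = [(30.0, -11.0), (10.0, -8.0), (1.0, -5.0)]
precips = [precip_from_d13c(d) for _, d in samples]
percent_decrease = 100.0 * (precips[0] - precips[-1]) / precips[0]
```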

An ancient wet period


The new data suggest that rainfall in central and southwestern Mongolia decreased by 50 to 90 percent over the last several tens of millions of years.

“Right now, precipitation in Mongolia is about 5 inches annually,” Caves said. “To explain our data, rainfall had to decrease from 10 inches a year or more to its current value over the last 10 to 30 million years.”

That means that much of Mongolia and Central Asia were still relatively wet even after the formation of the Himalayas and the Tibetan Plateau 45 million years ago. The data show that it wasn’t until about 30 million years ago, when the Hangay Mountains first formed, that rainfall started to decrease. The region began drying out even faster about 5 million to 10 million years ago, when the Altai Mountains began to rise.

The scientists hypothesize that once they formed, the Hangay and Altai ranges created rain shadows of their own that blocked moisture from entering Central Asia.

“As a result, the northern and western sides of these ranges are wet, while the southern and eastern sides are dry,” Caves said.

The team is not discounting the effect of the Himalayas and the Tibetan Plateau entirely, because portions of the Gobi Desert likely already existed before the Hangay or Altai began forming.

“What these smaller mountains did was expand the Gobi north and west into Mongolia,” Caves said.

The uplift of the Hangay and Altai may have had other, more far-reaching implications as well, Caves said. For example, westerly winds in Asia slam up against the Altai today, creating strong cyclonic winds in the process. Under the right conditions, the cyclones pick up large amounts of dust as they snake across the Gobi Desert. That dust can be lofted across the Pacific Ocean and even reach California, where it serves as microscopic seeds for developing raindrops.

The origins of these cyclonic winds, as well as substantial dust storms in China today, may correlate with uplift of the Altai, Caves said. His team plans to return to Mongolia and Kazakhstan next summer to collect more samples and to use climate models to test whether the Altai are responsible for the start of the large dust storms.

“If the Altai are a key part of regulating Central Asia’s climate, we can go and look for evidence of it in the past,” Caves said.

3-D Earth model developed at Sandia Labs more accurately pinpoints source of earthquakes, explosions

Sandia National Laboratories researcher Sandy Ballard and colleagues from Sandia and Los Alamos National Laboratory have developed SALSA3D, a 3-D model of the Earth’s mantle and crust designed to help pinpoint the location of all types of explosions. – Photo by Randy Montoya, Sandia National Laboratories

During the Cold War, U.S. and international monitoring agencies could spot nuclear tests and focused on measuring their sizes. Today, they’re looking around the globe to pinpoint much smaller explosives tests.

Under the sponsorship of the National Nuclear Security Administration’s Office of Defense Nuclear Nonproliferation R&D, Sandia National Laboratories and Los Alamos National Laboratory have partnered to develop a 3-D model of the Earth’s mantle and crust called SALSA3D, or Sandia-Los Alamos 3D. The purpose of this model is to assist the US Air Force and the international Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna, Austria, in locating all types of explosions more accurately.

The model uses a scalable triangular tessellation and seismic tomography to map the Earth’s “compressional wave seismic velocity,” a property of the rocks and other materials inside the Earth that indicates how quickly compressional waves travel through them and is one way to accurately locate seismic events, Sandia geophysicist Sandy Ballard said. Compressional waves – measured first after seismic events – move the particles in rocks and other materials minute distances backward and forward between the location of the event and the station detecting it.

SALSA3D also reduces the uncertainty in the model’s predictions, an important feature for decision-makers who must take action when suspicious activity is detected, he added.

“When you have an earthquake or nuclear explosion, not only do you need to know where it happened, but also how well you know that. That’s a difficult problem for these big 3-D models. It’s mainly a computational problem,” Ballard said. “The math is not so tough, just getting it done is hard, and we’ve accomplished that.”

A Sandia team has been writing and refining code for the model since 2007 and is now demonstrating that SALSA3D is more accurate than current models.

In recent tests, SALSA3D was able to predict the source of seismic events over a geographical area that was 26 percent smaller than the traditional one-dimensional model and 9 percent smaller than a recently developed Regional Seismic Travel Time (RSTT) model used with the one-dimensional model.

GeoTess software release

Sandia recently released SALSA3D’s framework – the triangular tessellated grid on which the model is built – to other Earth scientists, seismologists and the public. By standardizing the framework, the seismological research community can more easily share models of the Earth’s structure and global monitoring agencies can better test different models. Both activities are hampered by the plethora of models available today, Ballard said.

“GeoTess makes models compatible and standardizes everything,” he said. “This would really facilitate sharing of different models, if everyone agreed on it.”

Seismologists and researchers worldwide can now download GeoTess, which provides a common model parameterization for multidimensional Earth models and a software support system that addresses the construction, population, storage and interrogation of data stored in the model. GeoTess is not specific to any particular data, so users have considerable flexibility in how they store information in the model. The free package, including source code, is being released under the liberal BSD Open Source License. The code is available in Java and C++, with interfaces to the C++ version written in C and Fortran90. GeoTess has been tested on multiple platforms, including Linux, SunOS, MacOSX and Windows.

When an explosion goes off, the energy travels through the Earth as waves that are picked up by seismometers at U.S. and international ground monitoring stations associated with nuclear explosion monitoring organizations worldwide. Scientists use these signals to determine the location.

They first predict the time taken for the waves to travel from their source through the Earth to each station. To calculate that, they have to know the seismic velocity of the Earth’s materials from the crust to the inner core, Ballard said.

“If you have material that has very high seismic velocity, the waves travel very quickly, but the energy travels less quickly through other kinds of materials, so it takes the signals longer to travel from the source to the receiver,” he said.
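The relationship in that quote reduces to summing segment length divided by velocity along the ray path. A toy vertical-ray example through stacked layers, where all thicknesses and velocities are made up for illustration:

```python
def travel_time(layers):
    """Total travel time (s) for a vertical ray through stacked layers.

    layers: list of (thickness_km, velocity_km_per_s) pairs; t = sum(h / v).
    """
    return sum(h / v for h, v in layers)

# Made-up crust + upper-mantle columns: the slower crustal velocity on the
# left yields a longer source-to-receiver travel time.
slow_path = [(35.0, 6.0), (600.0, 8.0)]
fast_path = [(35.0, 7.0), (600.0, 8.5)]
```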

For the past 100 years, seismologists have predicted the travel time of seismic energy from source to receiver using one-dimensional models. These models, which are still widely used today, account only for radial variations in seismic velocity and ignore variations in geographic directions. They yield seismic event locations that are reasonably accurate, but not nearly as precise as locations calculated with high fidelity 3-D models.

Modern 3-D models of the Earth, like SALSA3D, account for distortions of the seismic wavefronts caused by minor lateral differences in the properties of rocks and other materials.

For example, waves are distorted when they move through a geological feature called a subduction zone, such as the one beneath the west coast of South America where one tectonic plate under the Pacific Ocean is diving underneath the Andes Mountains. This happens at about the rate at which fingernails grow, but, geologically speaking, that’s fast, Ballard said.

One-dimensional models, like the widely used ak135 developed in the 1990s, are good at predicting the travel time of waves when the distance from the source to the receiver is large because these waves spend most of their time traveling through the deepest, most homogenous parts of the Earth. They don’t do so well at predicting travel time to nearby events where the waves spend most of their time in the Earth’s crust or the shallowest parts of the mantle, both of which contain a larger variety of materials than the lower mantle and the Earth’s core.

RSTT, a previous model developed jointly by Sandia, Los Alamos and Lawrence Livermore national laboratories, tried to solve that problem and works best at ranges of about 60-1,200 miles (100-2,000 kilometers).

Still, “the biggest errors we get are close to the surface of the Earth. That’s where the most variability in materials is,” Ballard said.

Seismic tomography gives SALSA3D accuracy

Today, Earth scientists are mapping three dimensions: the radius, latitude and longitude.

Anyone who’s studied a globe or world atlas knows that the traditional grid of latitude and longitude lines works well near the equator, but at the poles the lines crowd too close together. For nuclear explosion monitoring, Earth models must accurately characterize the polar regions, remote as they are, because seismic waves travel under them, Ballard said.

Triangular tessellation solves that with nodes, or intersections of the triangles, that can be accurately modeled even at the poles. The triangles can be smaller where more detail is needed and larger in areas that require less detail, like the oceans. Plus the model extends into the Earth like columns of stacked pieces of pie without the rounded crust edges.
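A standard way to build such a grid is recursive midpoint subdivision of spherical triangles: each refinement level splits a triangle into four smaller ones, with the new midpoints projected back onto the sphere, so node spacing stays roughly uniform from equator to pole and can be refined locally. A sketch of the idea (not the GeoTess implementation):

```python
import math

def normalize(p):
    """Project a 3-D point onto the unit sphere."""
    n = math.sqrt(sum(c * c for c in p))
    return tuple(c / n for c in p)

def midpoint(a, b):
    """Spherical midpoint: average the endpoints, then renormalize."""
    return normalize(tuple((x + y) / 2.0 for x, y in zip(a, b)))

def subdivide(tri, depth):
    """Recursively split one spherical triangle into 4**depth triangles."""
    if depth == 0:
        return [tri]
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    tris = []
    for t in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):
        tris.extend(subdivide(t, depth - 1))
    return tris

# One octant of the sphere, refined three levels deep.
octant = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
triangles = subdivide(octant, 3)
```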

The way Sandia calculates the seismic velocities uses the same math that is used to detect a tumor in an MRI, except on a global, rather than a human, scale.

Sandia uses historical data from 118,000 earthquakes and 13,000 current and former monitoring stations worldwide collected by Los Alamos Lab’s Ground Truth catalog.

“We apply a process called seismic tomography where we take millions of observed travel times and invert them for the seismic velocities that would create that data set. It’s mathematically similar to doing linear regression, but on steroids,” Ballard said. Linear regression is a simple mathematical way to model the relationship between a known variable and one or more unknown variables. Because the Sandia team models hundreds of thousands of unknown variables, they apply a mathematical method called least squares to minimize the discrepancies between the data from previous seismic events and the predictions.
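In matrix form, this inversion solves t = G s by least squares, where each row of G holds the path lengths of one ray through the model's cells, s is the unknown slowness (the reciprocal of velocity), and t holds the observed travel times. A tiny, noiseless sketch with entirely made-up ray geometry:

```python
import numpy as np

# Each row: path lengths (km) of one ray through three model cells (made up).
G = np.array([[100.0,  50.0,   0.0],
              [  0.0, 120.0,  80.0],
              [ 60.0,  60.0,  60.0],
              [ 90.0,   0.0, 110.0]])

# "True" slowness (s/km) we hope to recover, i.e. 1/velocity.
true_slowness = np.array([1 / 6.0, 1 / 7.0, 1 / 8.0])

# Synthetic observed travel times, then the least-squares inversion.
t_obs = G @ true_slowness
s_est, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
velocities = 1.0 / s_est  # km/s per cell
```

With noiseless, consistent data the inversion recovers the cell velocities exactly; real tomography adds noise, regularization, and hundreds of thousands of unknowns.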

With 10 million data points, Sandia uses a distributed computer network with about 400 core processors to characterize the seismic velocity at every node.

Monitoring agencies could use SALSA3D to precompute the travel time from each station in their network to every point on Earth. When it comes time to compute the location of a new seismic event in real-time, source-to-receiver travel times can be computed in a millisecond and pinpoint the energy’s source in about a second, he said.

Uncertainty modeling a SALSA3D feature

But no model is perfect, so Sandia has developed a way to measure the uncertainty in each prediction SALSA3D makes, based on uncertainty in the velocity at each node and how that uncertainty affects the travel time prediction of each wave from a seismic event to each monitoring station.
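Because travel time is linear in the per-cell slowness (t = Σ lᵢ sᵢ), independent slowness uncertainties propagate to the travel-time prediction as σ_t = sqrt(Σ (lᵢ σᵢ)²). A minimal sketch of that propagation, with hypothetical path lengths and uncertainties:

```python
import math

def travel_time_sigma(path_lengths_km, slowness_sigmas):
    """1-sigma travel-time uncertainty from independent per-cell
    slowness uncertainties (s/km), for t = sum(l_i * s_i)."""
    return math.sqrt(sum((l * s) ** 2
                         for l, s in zip(path_lengths_km, slowness_sigmas)))

# Two 100-km segments, each with 0.001 s/km slowness uncertainty.
sigma_t = travel_time_sigma([100.0, 100.0], [0.001, 0.001])
```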

SALSA3D estimates for the users at monitoring stations the most likely location of a seismic event and the amount of uncertainty in the answer to help inform their decisions.

International test ban treaties require that on-site inspections can only occur within a 1,000-square-kilometer (385-square-mile) area surrounding a suspected nuclear test site. Today, 3-D Earth models like SALSA3D are helping to meet and sometimes significantly exceed this threshold in most parts of the world.

“It’s extremely difficult to do because the problem is so large,” Ballard said. “But we’ve got to know it within 1,000 square kilometers or they might search in the wrong place.”

3-D Earth model developed at Sandia Labs more accurately pinpoints source of earthquakes, explosions

Sandia National Laboratories researcher Sandy Ballard and colleagues from Sandia and Los Alamos National Laboratory have developed SALSA3D, a 3-D model of the Earth's mantle and crust designed to help pinpoint the location of all types of explosions. -  Photo by Randy Montoya, Sandia National Laboratories
Sandia National Laboratories researcher Sandy Ballard and colleagues from Sandia and Los Alamos National Laboratory have developed SALSA3D, a 3-D model of the Earth’s mantle and crust designed to help pinpoint the location of all types of explosions. – Photo by Randy Montoya, Sandia National Laboratories

During the Cold War, U.S. and international monitoring agencies could spot nuclear tests and focused on measuring their sizes. Today, they’re looking around the globe to pinpoint much smaller explosives tests.

Under the sponsorship of the National Nuclear Security Administration’s Office of Defense Nuclear Nonproliferation R&D, Sandia National Laboratories and Los Alamos National Laboratory have partnered to develop a 3-D model of the Earth’s mantle and crust called SALSA3D, or Sandia-Los Alamos 3D. The purpose of this model is to assist the US Air Force and the international Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna, Austria, more accurately locate all types of explosions.

The model uses a scalable triangular tessellation and seismic tomography to map the Earth’s “compressional wave seismic velocity,” a property of the rocks and other materials inside the Earth that indicates how quickly compressional waves travel through them and is one way to accurately locate seismic events, Sandia geophysicist Sandy Ballard said. Compressional waves – measured first after seismic events – move the particles in rocks and other materials minute distances backward and forward between the location of the event and the station detecting it.

SALSA3D also reduces the uncertainty in the model’s predictions, an important feature for decision-makers who must take action when suspicious activity is detected, he added.

“When you have an earthquake or nuclear explosion, not only do you need to know where it happened, but also how well you know that. That’s a difficult problem for these big 3-D models. It’s mainly a computational problem,” Ballard said. “The math is not so tough, just getting it done is hard, and we’ve accomplished that.”

A Sandia team has been writing and refining the model’s code since 2007 and is now demonstrating that SALSA3D is more accurate than current models.

In recent tests, SALSA3D predicted the sources of seismic events within a geographic area 26 percent smaller than the traditional one-dimensional model could manage, and 9 percent smaller than a recently developed Regional Seismic Travel Time (RSTT) model used with the one-dimensional model.

GeoTess software release

Sandia recently released SALSA3D’s framework, the triangular tessellated grid on which the model is built, to other Earth scientists, seismologists and the public. By standardizing the framework, the seismological research community can more easily share models of the Earth’s structure, and global monitoring agencies can better test different models. Both activities are hampered by the plethora of models available today, Ballard said.

“GeoTess makes models compatible and standardizes everything,” he said. “This would really facilitate sharing of different models, if everyone agreed on it.”

Seismologists and researchers worldwide can now download GeoTess, which provides a common model parameterization for multidimensional Earth models and a software support system that addresses the construction, population, storage and interrogation of the data stored in a model. GeoTess is not specific to any particular data, so users have considerable flexibility in how they store information. The free package, including source code, is released under the permissive BSD open-source license. The code is available in Java and C++, with interfaces to the C++ version written in C and Fortran90, and has been tested on multiple platforms, including Linux, SunOS, Mac OS X and Windows.

When an explosion goes off, the energy travels through the Earth as waves that are picked up by seismometers at U.S. and international ground monitoring stations associated with nuclear explosion monitoring organizations worldwide. Scientists use these signals to determine the location.

They first predict the time taken for the waves to travel from their source through the Earth to each station. To calculate that, they have to know the seismic velocity of the Earth’s materials from the crust to the inner core, Ballard said.

“If you have material that has very high seismic velocity, the waves travel very quickly, but the energy travels less quickly through other kinds of materials, so it takes the signals longer to travel from the source to the receiver,” he said.
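The arithmetic behind such a travel-time prediction can be sketched for the simplest case, a 1-D layered model: the time spent in each layer is its thickness divided by its seismic velocity. The layer thicknesses and velocities below are invented for illustration and bear no relation to any real Earth model:

```python
# Illustrative sketch: predicted travel time of a compressional (P) wave
# descending vertically through a simple 1-D layered velocity model.
# All numbers are made up for illustration, not taken from SALSA3D.

def travel_time(layers):
    """Sum thickness / velocity over each layer the wave crosses."""
    return sum(thickness_km / velocity_km_s
               for thickness_km, velocity_km_s in layers)

# (thickness in km, P-wave velocity in km/s): crust, upper mantle, lower mantle
model = [(35.0, 6.0), (375.0, 8.5), (2480.0, 12.0)]

print(f"one-way vertical travel time: {travel_time(model):.1f} s")
```

Faster material contributes less time per kilometer, which is exactly why uncertain velocities translate into uncertain arrival times.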

For the past 100 years, seismologists have predicted the travel time of seismic energy from source to receiver using one-dimensional models. These models, which are still widely used today, account only for radial variations in seismic velocity and ignore variations in geographic directions. They yield seismic event locations that are reasonably accurate, but not nearly as precise as locations calculated with high fidelity 3-D models.

Modern 3-D models of the Earth, like SALSA3D, account for distortions of the seismic wavefronts caused by minor lateral differences in the properties of rocks and other materials.

For example, waves are distorted when they move through a geological feature called a subduction zone, such as the one beneath the west coast of South America where one tectonic plate under the Pacific Ocean is diving underneath the Andes Mountains. This happens at about the rate at which fingernails grow, but, geologically speaking, that’s fast, Ballard said.

One-dimensional models, like the widely used ak135 developed in the 1990s, are good at predicting the travel time of waves when the distance from the source to the receiver is large because these waves spend most of their time traveling through the deepest, most homogenous parts of the Earth. They don’t do so well at predicting travel time to nearby events where the waves spend most of their time in the Earth’s crust or the shallowest parts of the mantle, both of which contain a larger variety of materials than the lower mantle and the Earth’s core.

RSTT, a previous model developed jointly by Sandia, Los Alamos and Lawrence Livermore national laboratories, tried to solve that problem and works best at ranges of about 60-1,200 miles (100-2,000 kilometers).

Still, “the biggest errors we get are close to the surface of the Earth. That’s where the most variability in materials is,” Ballard said.

Seismic tomography gives SALSA3D accuracy

Today, Earth scientists map the Earth in three dimensions: radius, latitude and longitude.

Anyone who’s studied a globe or world atlas knows that the traditional grid of latitude and longitude lines works well near the equator but crowds the lines ever closer together toward the poles. For nuclear explosion monitoring, Earth models must accurately characterize the polar regions, remote as they are, because seismic waves travel under them, Ballard said.
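The crowding is easy to quantify: the ground distance spanned by one degree of longitude shrinks with the cosine of latitude, so a lat/lon grid wastes resolution near the poles. A quick illustration:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def km_per_degree_longitude(lat_deg):
    """East-west ground distance spanned by one degree of longitude."""
    return math.radians(1.0) * EARTH_RADIUS_KM * math.cos(math.radians(lat_deg))

for lat in (0, 45, 80, 89):
    print(f"lat {lat:2d} deg: {km_per_degree_longitude(lat):6.1f} km per degree")
```

At the equator a degree of longitude spans about 111 km; at 89 degrees latitude it spans under 2 km, forcing uniform lat/lon grids to over-resolve the poles.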

Triangular tessellation solves that problem with nodes, the intersections of the triangles, that can be modeled accurately even at the poles. The triangles can be made smaller where more detail is needed and larger in areas that require less detail, such as the oceans. The model also extends down into the Earth, like columns of stacked pieces of pie without the rounded crust edges.
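The refinement step of such a tessellated grid can be sketched as repeated midpoint subdivision of spherical triangles, with each new vertex projected back onto the sphere. This is a generic illustration of the idea, not SALSA3D's actual grid code:

```python
import math

def normalize(v):
    """Project a 3-D point back onto the unit sphere."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def midpoint(a, b):
    """Midpoint of two sphere points, re-projected onto the sphere."""
    return normalize(tuple((x + y) / 2.0 for x, y in zip(a, b)))

def subdivide(tri):
    """Split one spherical triangle into four smaller ones."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

# One face of an octahedron, covering an eighth of the globe:
tri = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
level1 = subdivide(tri)                                   # 4 triangles
level2 = [t for small in level1 for t in subdivide(small)]  # 16 triangles
print(len(level1), len(level2))
```

Applying the subdivision only where detail is needed yields exactly the variable-resolution grid described above: small triangles under regions of interest, large ones under the oceans.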

The way Sandia calculates the seismic velocities uses the same math that is used to detect a tumor in an MRI, except on a global, rather than a human, scale.

Sandia uses historical data from 118,000 earthquakes and 13,000 current and former monitoring stations worldwide, collected in Los Alamos National Laboratory’s Ground Truth catalog.

“We apply a process called seismic tomography where we take millions of observed travel times and invert them for the seismic velocities that would create that data set. It’s mathematically similar to doing linear regression, but on steroids,” Ballard said. Linear regression is a simple mathematical way to model the relationship between a known variable and one or more unknown variables. Because the Sandia team models hundreds of thousands of unknown variables, they apply a mathematical method called least squares to minimize the discrepancies between the data from previous seismic events and the model’s predictions.
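The inversion Ballard describes can be caricatured in a few lines: stack the path length each ray spends in each cell into a matrix, then solve by least squares for the cell slownesses (reciprocal velocities) that best reproduce the observed travel times. The toy geometry and velocities below are invented; the real problem has millions of observations and hundreds of thousands of unknowns:

```python
import numpy as np

# True slownesses (s/km), used only to manufacture synthetic "observed" times:
true_slowness = np.array([1 / 6.0, 1 / 8.0, 1 / 12.0])

# Each row: path length (km) one ray spends in each of three cells.
G = np.array([
    [100.0,  50.0,   0.0],
    [ 30.0, 120.0,  40.0],
    [  0.0,  60.0, 150.0],
    [ 80.0,  80.0,  80.0],
])
observed_times = G @ true_slowness  # travel time = sum of length * slowness

# Least squares: find the slowness model that best explains the times.
estimated, *_ = np.linalg.lstsq(G, observed_times, rcond=None)
print(np.round(1.0 / estimated, 2))  # recovered cell velocities, km/s
```

With noise-free synthetic data the recovered velocities match the true ones exactly; the real inversion differs mainly in scale and in the noise and regularization it must handle.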

With 10 million data points, Sandia uses a distributed computer network with about 400 core processors to characterize the seismic velocity at every node.

Monitoring agencies could use SALSA3D to precompute the travel time from each station in their network to every point on Earth. When the location of a new seismic event must be computed in real time, each source-to-receiver travel time can then be retrieved in about a millisecond and the energy’s source pinpointed in about a second, he said.
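The precompute-then-look-up strategy can be sketched in miniature: build a table of predicted travel times from every station to every candidate grid point once, then locate a new event by finding the grid point whose predicted times best fit the observations. The flat 2-D geometry and single uniform velocity below are purely illustrative, nothing like a real 3-D Earth model:

```python
import math

VELOCITY = 8.0  # km/s, a single made-up P-wave speed

stations = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0)]
grid = [(x, y) for x in range(0, 401, 50) for y in range(0, 401, 50)]

# Precompute once: predicted travel time from every station to every grid point.
table = {
    point: [math.dist(point, s) / VELOCITY for s in stations]
    for point in grid
}

def locate(observed_times):
    """Return the grid point minimizing squared travel-time misfit."""
    return min(grid, key=lambda p: sum(
        (t_pred - t_obs) ** 2
        for t_pred, t_obs in zip(table[p], observed_times)))

# An event at (200, 150): manufacture its true travel times, then locate it.
event = (200.0, 150.0)
obs = [math.dist(event, s) / VELOCITY for s in stations]
print(locate(obs))
```

All the expensive model evaluation happens when the table is built; locating a new event is reduced to lookups and a cheap search, which is what makes sub-second real-time location possible.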

Uncertainty modeling a SALSA3D feature

But no model is perfect, so Sandia has developed a way to measure the uncertainty in each prediction SALSA3D makes, based on uncertainty in the velocity at each node and how that uncertainty affects the travel time prediction of each wave from a seismic event to each monitoring station.
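A standard first-order way to propagate such uncertainty, shown here only as a generic sketch and not as SALSA3D's actual covariance machinery, treats the travel time as a sum of segment length times slowness, so independent slowness uncertainties combine in quadrature:

```python
import math

# First-order propagation: if travel time t = sum(length_i * slowness_i)
# and each cell's slowness is uncertain and independent, then
#     var(t) = sum((length_i * sigma_slowness_i) ** 2).
# Segment lengths and uncertainties below are invented for illustration.

segments = [
    # (path length in cell, km;  slowness standard deviation, s/km)
    (120.0, 0.002),
    (300.0, 0.001),
    ( 80.0, 0.004),
]

sigma_t = math.sqrt(sum((length * sigma) ** 2 for length, sigma in segments))
print(f"travel-time standard deviation: {sigma_t:.3f} s")
```

Repeating this for every station and combining the results is what turns per-node velocity uncertainty into an uncertainty ellipse around the estimated event location.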

SALSA3D gives users at monitoring stations both the most likely location of a seismic event and the amount of uncertainty in that answer, helping to inform their decisions.

International test ban treaties limit on-site inspections to a 1,000-square-kilometer (385-square-mile) area surrounding a suspected nuclear test site. Today, 3-D Earth models like SALSA3D are helping to meet, and sometimes comfortably beat, this threshold in most parts of the world.

“It’s extremely difficult to do because the problem is so large,” Ballard said. “But we’ve got to know it within 1,000 square kilometers or they might search in the wrong place.”

Scientists solve a 14,000-year-old ocean mystery

At the end of the last Ice Age, as the world began to warm, a swath of the North Pacific Ocean came to life. During a brief pulse of biological productivity 14,000 years ago, this stretch of the sea teemed with phytoplankton, amoeba-like foraminifera and other tiny creatures, which thrived in large numbers until the productivity ended, as mysteriously as it began, just a few hundred years later.

Researchers have hypothesized that iron sparked this surge of ocean life, but a new study led by Woods Hole Oceanographic Institution (WHOI) scientists and colleagues at the University of Bristol (UK), the University of Bergen (Norway), Williams College and the Lamont-Doherty Earth Observatory of Columbia University suggests iron may not have played an important role after all, at least in some settings. The study, published in the journal Nature Geoscience, finds that a different mechanism, a transient “perfect storm” of nutrients and light, spurred life in the post-Ice Age Pacific. Its findings resolve conflicting ideas about the relationship between iron and biological productivity during this time period in the North Pacific, with potential implications for geo-engineering efforts to curb climate change by seeding the ocean with iron.

“A lot of people have put a lot of faith into iron, and, in fact, as a modern ocean chemist, I’ve built my career on the importance of iron, but it may not always have been as important as we think,” says WHOI Associate Scientist Phoebe Lam, a co-author of the study.

Because iron is known to cause blooms of biological activity in today’s North Pacific Ocean, researchers have assumed it played a key role in the past as well. They have hypothesized that as Ice Age glaciers began to melt and sea levels rose, they submerged the surrounding continental shelf, washing iron into the rising sea and setting off a burst of life.

Past studies using sediment cores, long cylinders drilled from the ocean floor that offer scientists a look back through time at what has accumulated there, have repeatedly found evidence of this burst in the form of a layer of increased opal and calcium carbonate, the materials that made up phytoplankton and foraminifera shells. But no one had searched the fossil record specifically for signs that iron from the continental shelf played a part in the bloom.

Lam and an international team of colleagues revisited the sediment core data to test this hypothesis directly. They sampled GGC-37, a core taken from a site near Russia’s Kamchatka Peninsula, about every 5 centimeters, moving back through time to before the biological bloom began. They then analyzed the chemical composition of the samples, measuring the relative abundance of neodymium and strontium isotopes, which indicates which variant of iron was present. The isotope abundance ratios were a particularly important clue because they could reveal where the iron came from: one variant pointed to iron from the ancient Loess Plateau of northern China, a frequent source of iron-rich dust in the northwest Pacific, while another suggested the younger, more volcanic continental shelf was the iron source.
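The logic of using isotope ratios as a source fingerprint can be illustrated with a simple two-endmember mixing calculation. The endmember and sample values below are invented for illustration and are not the study's neodymium or strontium data:

```python
# Two-endmember mixing sketch: given the isotope signatures of two possible
# iron sources and a measured sediment value, estimate each source's share,
# assuming the measured value is a linear blend of the two endmembers.

def mixing_fraction(sample, endmember_a, endmember_b):
    """Fraction contributed by source A under linear two-endmember mixing."""
    return (sample - endmember_b) / (endmember_a - endmember_b)

loess_dust = -10.0     # hypothetical isotope signature of Loess Plateau dust
volcanic_shelf = 6.0   # hypothetical signature of the volcanic shelf
measured = -6.0        # hypothetical measured sediment value

f_dust = mixing_fraction(measured, loess_dust, volcanic_shelf)
print(f"dust fraction: {f_dust:.2f}, shelf fraction: {1 - f_dust:.2f}")
```

A measured value close to one endmember implies that source dominated, which is how the team could tell dust-borne iron from shelf-derived iron through time.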

What the researchers found surprised them.

“We saw the flux of iron was really high during glacial times, and that it dropped during deglaciation,” Lam says. “We didn’t see any evidence of a pulse of iron right before this productivity peak.”

The iron the researchers did find during glacial times appeared to be supplemented by a third source, possibly in the Bering Sea area, but it didn’t have a significant effect on the productivity peak. Instead, the data suggest that iron levels were declining when the peak began.

Based on the sediment record, the researchers propose a different cause for the peak: a chain of events that created ideal conditions for sea life to briefly flourish. The changing climate triggered deep mixing in the North Pacific Ocean, which stirred the nutrients that tiny plankton depend on up into the sea’s surface layers, but in doing so also mixed the plankton down into deep, dark waters, where light for photosynthesis was too scarce for them to thrive. Then a pulse of freshwater from melting glaciers, evidenced by a change in the amount of a certain oxygen isotope in the foraminifera shells found in the core, stopped the mixing, trapping the phytoplankton and other small creatures in a thin, bright, nutrient-rich top layer of ocean. With greater exposure to light and nutrients, and iron levels that were still relatively high, the creatures flourished.

“We think that ultimately this is what caused the productivity peak-that all these things happened all at once,” Lam says. “And it was a transient thing, because the iron continued to drop and eventually the nutrients ran out.”

The study’s findings challenge the idea that iron caused this ancient bloom, and they also raise questions about a very modern idea. Some scientists have proposed seeding the world’s oceans with iron to trigger phytoplankton blooms that could trap some of the atmosphere’s carbon dioxide and help stall climate change. This idea, sometimes referred to as the “Iron Hypothesis,” has met with considerable controversy, and scientific evidence of its effectiveness at sequestering carbon and its impact on ocean life has been mixed.

“This study shows how there are multiple controls on ocean phytoplankton blooms, not just iron,” says Ken Buesseler, a WHOI marine chemist who led a workshop in 2007 to discuss modern iron fertilization. “Certainly before we think about adding iron to the ocean to sequester carbon as a geoengineering tool, we should encourage studies like this of natural systems where the conditions of adding iron, or not, on longer and larger time scales have already been done for us and we can study the consequences.”

Sea level influenced tropical climate during the last ice age

The exposed Sunda Shelf during glacial times greatly affected the atmospheric circulation. The shelf is shown on the left for present-day as the light-blue submerged areas between Java, Sumatra, Borneo, and Thailand, and on the right for the last ice age as the green exposed area. – Pedro DiNezio

Scientists look at past climates to learn about climate change and the ability to simulate it with computer models. One region that has received a great deal of attention is the Indo-Pacific warm pool, the vast pool of warm water stretching along the equator from Africa to the western Pacific Ocean.

In a new study, Pedro DiNezio of the International Pacific Research Center, University of Hawaii at Manoa, and Jessica Tierney of Woods Hole Oceanographic Institution investigated preserved geological clues (called “proxies”) of rainfall patterns during the last ice age when the planet was dramatically colder than today. They compared these patterns with computer model simulations in order to find a physical explanation for the patterns inferred from the proxies.

Their study, which appears in the May 19 online edition of Nature Geoscience, not only reveals unique patterns of rainfall change over the Indo-Pacific warm pool, but also shows that they were caused by the effect of lowered sea level on the configuration of the Indonesian archipelago.

“For our research,” explains lead author Pedro DiNezio of the International Pacific Research Center, “we compared the climate of the ice age with our recent warmer climate. We analyzed about 100 proxy records of rainfall and salinity stretching from the tropical western Pacific to the western Indian Ocean and eastern Africa. Rainfall and salinity signals recorded in geological sediments can tell us much about past changes in atmospheric circulation over land and the ocean respectively.”

“Our comparisons show that, as many scientists expected, much of the Indo-Pacific warm pool was drier during this glacial period compared with today. But, counter to some theories, several regions, such as the western Pacific and the western Indian Ocean, especially eastern Africa, were wetter,” adds co-author Jessica Tierney of the Woods Hole Oceanographic Institution.

In a second step, the scientists matched these rainfall and salinity patterns against simulations from 12 state-of-the-art climate models, the same models that are also used to predict future climate change. For this matching they applied a method of categorical data comparison called the Cohen’s kappa statistic. Though widely used in the medical field, this method had not previously been used to match geological climate signals with climate model simulations.
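Cohen's kappa itself is simple to compute: it compares the observed agreement between two categorical series with the agreement expected by chance alone. A minimal sketch on invented wetter/drier classifications (these toy labels are not the study's proxy or model data):

```python
# Cohen's kappa on toy data: each site classified as "wetter" or "drier"
# than today, once by proxies and once by a model. Values are invented.

def cohens_kappa(a, b):
    """Agreement between two categorical series, corrected for chance."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    categories = set(a) | set(b)
    # Chance agreement: product of each rater's marginal frequencies.
    expected = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

proxy = ["drier", "drier", "wetter", "drier", "wetter", "drier"]
model = ["drier", "drier", "wetter", "wetter", "wetter", "drier"]
print(round(cohens_kappa(proxy, model), 2))
```

Kappa is 1 for perfect agreement and 0 for agreement no better than chance, which is what makes it a natural yardstick for asking whether a model's wet/dry map genuinely matches the proxies.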

“We were taken aback that only one model out of the 12 showed statistical agreement with the proxy-inferred patterns of the rainfall changes. This model, though, agrees well with both the rainfall and salinity indicators – two entirely independent sets of proxy data covering distinct areas of the tropics,” says DiNezio.

The model reveals that the dry climate during the glacial period was driven by reduced convection over a region of the warm pool called the Sunda Shelf. Today the shelf is submerged beneath the Gulf of Thailand, but was above sea level during the glacial period, when sea level was about 120 m lower.

“The exposure of the Sunda Shelf greatly weakened convection over the warm pool, with far-reaching impacts on the large-scale circulation and on rainfall patterns from Africa to the western Pacific and northern Australia,” explains DiNezio.

The main weakness of the other models, according to the authors, is their limited ability to simulate convection, the vertical air motions that lift humid air into the atmosphere. Differences in the way each model simulates convection may explain why the results for the glacial period are so different.

“Our research resolves a decades-old question of what the response of tropical climate was to glaciation,” concludes DiNezio. “The study, moreover, presents a fine benchmark for assessing the ability of climate models to simulate the response of tropical convection to altered land masses and global temperatures.”
