‘Array of arrays’ coaxing secrets from unfelt seismic tremor events

A seismic sensor (orange device) and datalogger (black box) are installed as part of one station in the – Lisa Foley, IRIS/PASSCAL Instrument Center

Every 15 months or so, an unfelt earthquake occurs in western Washington and travels northward to Canada’s Vancouver Island. The episode typically releases as much energy as a magnitude 6.5 earthquake, but it does so gradually over a month.
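To make the energy comparison concrete, the standard Gutenberg-Richter energy relation, log10 E = 1.5M + 4.8 (E in joules), gives a rough figure; the sketch below is illustrative and not part of the UW analysis, and the 30-day averaging window is an assumption.

```python
# Gutenberg-Richter energy relation: log10(E) = 1.5*M + 4.8 (E in joules).
# The 30-day averaging window is an assumption for illustration.
def seismic_energy_joules(magnitude):
    """Radiated seismic energy for a given moment magnitude."""
    return 10.0 ** (1.5 * magnitude + 4.8)

total_j = seismic_energy_joules(6.5)  # energy of the whole month-long episode
per_day_j = total_j / 30.0            # averaged over ~30 days
print(f"total: {total_j:.2e} J, average per day: {per_day_j:.2e} J")
```

Spread over a month, the energy of a magnitude-6.5 shock arrives at only about a thirtieth of that total per day, which is consistent with the episodes passing unfelt.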

New technology is letting University of Washington researchers get a much better picture of how these episodic tremor events relate to potentially catastrophic earthquakes, perhaps as powerful as magnitude 9, that occur every 300 to 500 years in the Cascadia subduction zone in western Washington, Oregon and British Columbia.

“Depending on where the tremor is, a different part of the fault is being loaded,” said Abhijit Ghosh, a UW doctoral student in Earth and space sciences, who is presenting the most recent findings Monday (Dec. 13) at the annual meeting of the American Geophysical Union in San Francisco.

Scientists discovered episodic tremor about a decade ago and have been trying to understand how it figures in the seismic hierarchy of the earthquake-prone Pacific Northwest. In 2008 on Washington’s Olympic Peninsula, UW scientists deployed an array of 80 seismic sensors that act something like a radio antenna, except that instead of bringing in distant radio waves it collects signals from tremor events. Now there are eight such arrays, each armed with 20 to 30 sensors, a complex the scientists call the “array of arrays.”

It was known that tremor events generally start near Olympia, Wash., and march slowly northward on the Olympic Peninsula, eventually reaching Canada’s Vancouver Island and running their course in several weeks.

But Ghosh has found the tremor movement to be far more complex. The source of the tremor generates streaks that travel 60 miles per hour back and forth along a southwest-northeast track. Several hours of this activity produces what shows up as bands of tremor that steadily migrate northward at a much slower speed, about 6 miles per day.

The effect is similar to someone painting a wall, with the wall representing the area where the tremor occurs and paint representing tremor streaks. Eventually the brush strokes will cover the wall.

The arrays are producing enough data for scientists to locate the precise latitude and longitude where a signal originates, Ghosh said, but more work must be done to determine precise depths. It could be that the signal comes from the same depth, about 25 miles, as the subduction fault zone, but that is unclear.

“Because the signal is very different from our garden variety earthquakes, we need new techniques to determine the source of the signal, and this is one step toward that,” Ghosh said. “With the array of arrays we should be able to see a greater quantity of clear signal, and we do. We see more tremor – way more tremor – than with conventional methods.”

Researchers have known for several years that these tremor events add to the fault stress in the Cascadia subduction zone, where the Juan de Fuca tectonic plate dives beneath the North American plate that is directly under the most populous areas of Washington, Oregon and British Columbia. The last great Cascadia subduction zone earthquake, estimated at magnitude 9, occurred in January 1700 and generated a tsunami that traveled to Japan.

The arrays are beginning to produce a better understanding of how tremor events are related to the Cascadia fault zone. For example, the southwest-northeast angle of the tremor streaks and bands matches almost exactly the angle, about 54 degrees, at which the Juan de Fuca plate meets the North American plate.

“We have already seen different types of tremor migration in Cascadia, and there might be even more,” Ghosh said. “With high-precision locating technology, we are getting a clearer picture.”

Geologist’s discoveries resolve debate about oxygen in Earth’s mantle

While there continues to be considerable debate among geologists about the availability of oxygen in the Earth’s mantle, recent discoveries by a University of Rhode Island scientist are bringing resolution to the question.

Analysis of erupted rock from Agrigan volcano in the western Pacific near Guam found it to be highly oxidized as a result of its exposure to oxygen when it formed in the Earth’s mantle. When, over millions of years, seafloor rocks are transported back into the Earth’s mantle at subduction zones – sites on the seafloor where tectonic plates have collided, forcing one plate beneath the other – they deliver more oxygen into the mantle.

The results of the research were presented today at a meeting of the American Geophysical Union in San Francisco.

“The cycling of oxygen at the Earth’s surface is central to the life and activity that takes place at the surface, but it is equally essential in the Earth’s mantle,” said URI Assistant Professor Katherine Kelley. “The availability of oxygen to the mantle is in part controlled by the oxygen at the surface.”

Kelley said that this discovery is important because the availability of oxygen to the mantle controls what minerals are found there, how certain elements behave, and what kind of gasses might be expelled from volcanoes.

“The most primitive samples of lava we can identify are the most oxidized,” she said. “That oxidation comes off the subducted plate at depth in the mantle and makes its way into volcanic magma sources that then erupt.”

According to Kelley, some scientists have argued that the availability of oxygen to the mantle hasn’t changed since the Earth was formed. However, if plate tectonics carry this oxidized material into the mantle, as she has demonstrated, then it is adding oxygen to the mantle. It also suggests that what takes place at the surface of the Earth probably influences what happens deep beneath the surface as well.

At Brookhaven National Laboratory, Kelley analyzed tiny olivine crystals containing naturally formed glass from the early histories of magmas, in which dissolved gases from volcanic eruptions are preserved. By analyzing the glass she determined the oxidation state of iron in the rocks and related it to the dissolved gases, which are elevated in subduction zone magmas.

This work follows a related study by Kelley, which found that material from subduction zones is more oxidized than material from mid-ocean ridges, where the plates are pulling apart. That study was published in the journal Science in 2009.

“These are important processes to understand, but they are hard to get a clear picture of because they take place over such long periods of time,” Kelley said. “It’s one piece of the big puzzle of Earth’s evolution and how it continues to change.”

Assessing the seismic hazard of the central eastern United States

Virginia Tech associate professor of civil and environmental engineering Russell Green focuses on the study of paleoseismology to achieve a greater understanding of the probability of seismic events. – Provided by Virginia Tech

As U.S. policy makers renew their emphasis on nuclear energy in an effort to reduce the country’s oil dependence, other factors come into play. One concern of paramount importance is the seismic hazard at the sites where nuclear reactors are located.

Russell A. Green, associate professor of civil and environmental engineering at Virginia Tech, spent five years as an earthquake engineer for the U.S. Defense Nuclear Facilities Safety Board in Washington, D.C., prior to becoming a university professor. Part of his responsibility at the safety board was to perform seismic safety analyses on the nation’s defense nuclear facilities.

“I found the greatest uncertainty in seismic analyses was related to the ground motions used in the analyses. Many of the facilities being analyzed were already built and operating, and the facilities were already heavily contaminated with radioactive material,” Green said.

An immediate concern then became how and which buildings to retrofit. The balance in the decision-making process was between using overly conservative ground motions and potentially wasting “hundreds of millions of dollars in unnecessary retrofits” versus using less demanding motions and potentially “placing facility workers, neighboring towns, and cities at risk,” Green added.

Green’s concerns and expertise in earthquake engineering earned him a National Science Foundation CAREER Award in 2006 valued at more than $400,000. He has used this support to develop procedures for collecting and analyzing the data required to assess seismic hazard in regions where moderate to large earthquakes would have significant consequences yet remain low-probability events.

Green said a “huge shift” in the engineering profession’s approach to reducing seismic risk has occurred during the past decade. Building codes have been modified to include performance-based earthquake engineering (PBEE) concepts. This differs from the previous traditional design approach that used “life safety as the primary design goal,” Green explained. “PBEE is based on the premise that performance can be predicted and evaluated with quantifiable confidence, allowing the engineer, together with the client, to make intelligent and informed trade-offs based on life-cycle considerations rather than construction costs alone.”

To implement PBEE and to calculate the annual probability of specific losses due to seismic events, engineers need to know the fragility of structural systems and the probabilistically quantified seismic hazard.

To conduct his research, Green is focusing on paleoseismology, the study of the timing, location, and size of prehistoric/pre-instrumental earthquakes, ranging from those that occurred hundreds to tens of thousands of years ago.

“I believe that earthquake engineering encompasses geology, seismology, geotechnical engineering, structural engineering, urban planning, and emergency response,” Green said.

“The appropriate selection of ground motions is particularly difficult because many critical facilities are located in the central and eastern U.S. and in the Pacific Northwest,” Green said. “We know moderate to large earthquakes have occurred in these regions. We just do not know how large the events were, how often they occurred, or the characteristics of the associated ground shaking, such as duration, amplitude, and frequency content.”

Unlike many places in the western U.S. where excavations can be used to determine the past movement on earthquake faults, in the central-eastern U.S. the locations of most faults are unknown and/or the faults are too deep to excavate. As a result, Green is concentrating his work on the development and validation of paleoliquefaction procedures. Soil liquefaction is the transition of soil from a solid to a liquefied state. Earthquakes are one cause of liquefaction, with the evidence of liquefaction often remaining in the soil profile for many thousands of years after the earthquake.

“Paleoliquefaction investigations are the most plausible way to determine the recurrence time of moderate to large earthquakes in the central-eastern U.S.,” Green said. “By extending the earthquake record into prehistoric times, paleoseismic investigations remove one of the major obstacles to implementing PBEE across the U.S.”

To determine the age of a paleoliquefaction feature, researchers might use any one of a number of techniques, including: radiocarbon dating, optically stimulated luminescence, or archeological evidence.

Green said his work will address the “gaps in knowledge that typically stem from uncertainties related to analytical techniques used in back-calculations, the amount and quantity of paleoliquefaction data, and the significance of changes in the geotechnical properties of post-liquefied sediments such as aging and density changes.”

In addition to his work studying paleoearthquakes, Green has also been involved in performing field studies of several recent earthquakes: the 2008 Mt. Carmel, Ill., magnitude-5.2 earthquake; the 2008 Iwate-Miyagi Nairiku, Japan, magnitude-6.9 earthquake; the 2010 Haiti magnitude-7.0 earthquake; and the 2010 Darfield, New Zealand, magnitude-7.1 earthquake. The latter two field studies were National Science Foundation-sponsored Geotechnical Extreme Events Reconnaissance (GEER) investigations, with Green serving as the U.S. team leader for the Darfield earthquake study.

New way found of monitoring volcanic ash cloud

The eruption of the Icelandic volcano Eyjafjallajökull in April this year resulted in a giant ash cloud, which – at one point covering most of Europe – brought international aviation to a temporary standstill, resulting in travel chaos for tens of thousands.

New research, to be published today, Friday 10 December, in IOP Publishing’s Environmental Research Letters, shows that lightning could be used as part of an integrated approach to estimate volcanic plume properties.

The scientists found that during many of the periods of significant volcanic activity, the ash plume was sufficiently electrified to generate lightning, which was measured by the UK Met Office’s long range lightning location network (ATDnet), operating in the Very Low Frequency radio spectrum.

The measurements suggest a general correlation between lightning frequency and plume height, and the method has the advantage that lightning can be detected many thousands of kilometers away, by day and night, and in all weather conditions.

As the researchers write, “When a plume becomes sufficiently electrified to produce lightning, the rate of lightning generation provides a method of remotely monitoring the plume height, offering clear benefits to the volcanic monitoring community.”

The end of planet formation, as told by trace elements from the mantles of Earth, the moon and Mars

New research reveals that the abundances of so-called highly siderophile, or metal-loving, elements such as gold and platinum found in the mantles of Earth, the Moon and Mars were delivered by massive impactors during the final phase of planet formation over 4.5 billion years ago. The predicted sizes of the projectiles, which hit within tens of millions of years of the giant impact that produced our Moon, are consistent with current planet formation models as well as physical evidence such as the size distributions of asteroids and ancient Martian impact scars. The models predict that the largest of the late impactors on Earth, at 1,500-2,000 miles in diameter, potentially modified Earth’s obliquity by approximately 10 degrees, while those that hit the Moon, at approximately 150-200 miles, may have delivered water to its mantle.

The team that conducted this study comprises solar system dynamicists, such as Dr. William Bottke and Dr. David Nesvorny from the Southwest Research Institute, and geophysical-geochemical modelers, such as Prof. Richard J. Walker from the University of Maryland, Prof. James Day from the University of Maryland and Scripps Institution of Oceanography, and Prof. Linda Elkins-Tanton, from the Massachusetts Institute of Technology. Together, they represent three teams within the NASA Lunar Science Institute (NLSI).

A fundamental problem in planetary science is to determine how Earth, the Moon, and other inner solar system planets formed and evolved. This is a difficult question to answer given that billions of years of history have steadily erased evidence for these early events. Despite this, critical clues can still be found to help determine what happened, provided one knows where to look.

For instance, careful study of lunar samples brought back by the Apollo astronauts, combined with numerical modeling work, indicates that the Moon formed as a result of a collision between a Mars-sized body and the early Earth about 4.5 billion years ago. While the idea that the Earth-Moon system owes its existence to a single, random event was initially viewed as radical, it is now believed that such large impacts were commonplace during the end stages of planet formation. The giant impact is believed to have led to a final phase of core formation and global magma oceans on both the Earth and Moon.

For the giant impact hypothesis to be correct, one might expect samples from the Earth and Moon’s mantle, brought to the surface by volcanic activity, to back it up. In particular, scientists have examined the abundance in these rocks of so-called highly siderophile, or metal-loving, elements: Re, Os, Ir, Ru, Pt, Rh, Pd, Au. These elements should have followed the iron and other metals to the core in the aftermath of the Moon-forming event, leaving the rocky crusts and mantles of these bodies void of these elements. Accordingly, their near-absence from mantle rocks should provide a key test of the giant impact model.

However, as described by team member Walker, “The big problem for the modelers is that these metals are not missing at all, but instead are modestly plentiful.” Team member Day adds, “This is a good thing for anyone who likes their gold wedding rings or the cleaner air provided by the palladium in their car’s catalytic convertors.”

A proposed solution to this conundrum is that highly siderophile elements were indeed stripped from the mantle by the effects of the giant impact, but were then partially replenished by later impacts from the original building blocks of the planets, called planetesimals. This is not a surprise – planet formation models predict such late impacts – but their nature, numbers, and especially the sizes of the accreting bodies are unknown. Presumably, they could have represented the accretion of many small bodies or a few large events. To match observations, the late-arriving planetesimals needed to deliver 0.5 percent of the Earth’s mass to Earth’s mantle, equivalent to one-third of the mass of the Moon, and about 1,200 times less mass to the Moon’s mantle.
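Those figures can be roughly cross-checked with standard textbook values for the planetary masses (the masses below are reference values, not numbers from the study); the check lands near the quoted one-third of a lunar mass.

```python
# Rough consistency check of the quoted masses. The planetary masses are
# standard textbook values, not numbers from the study itself.
M_EARTH = 5.97e24  # kg
M_MOON = 7.35e22   # kg

late_accretion_earth = 0.005 * M_EARTH          # 0.5 percent of Earth's mass
ratio_to_moon = late_accretion_earth / M_MOON   # compare with the Moon's mass
late_accretion_moon = late_accretion_earth / 1200.0  # ~1,200 times less to the Moon

print(f"Earth's late accretion: {late_accretion_earth:.2e} kg "
      f"(~{ratio_to_moon:.2f} lunar masses)")
```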

Using numerical models, the team showed that they could reproduce these amounts if the late accretion population was dominated by massive projectiles. Their results indicate the largest Earth impactor was 1,500-2,000 miles in diameter, roughly the size of Pluto, while those hitting the Moon were only 150-200 miles across. Lead author Bottke says, “These impactors are thought to be large enough to produce the observed enrichments in highly siderophile elements, but not so large that their fragmented cores joined with the planet’s core. They probably represent the largest objects to hit those worlds since the giant impact that formed our Moon.”

Intriguingly, the predicted distribution of projectile sizes, where most of the mass of the population is found among the largest objects, is consistent with other evidence.

  • New models describing how planetesimals form and evolve suggest the biggest ones efficiently gobble up the smaller ones and run away in terms of size, leaving behind a population of enormous objects largely resistant to collisional erosion.

  • The last surviving planetesimal populations in the inner solar system are the asteroids. In the inner asteroid belt, the asteroids Ceres, Pallas and Vesta, at 600, 300 and 300 miles across, respectively, dwarf the next largest asteroids at 150 miles across. No asteroids with “in-between” sizes are observed in this region.

  • The sizes of the oldest and largest craters on Mars, many of which are thousands of miles across, are consistent with it being bombarded by an inner asteroid belt-like population dominated by large bodies early in its history.

These results make it possible to make some interesting predictions about the evolution of the Earth, Mars and the Moon. For example:

  • The largest projectiles that struck Earth were capable of modifying its spin axis, on average, by approximately 10 degrees.

  • The largest impactor to strike Mars, according to this work and the abundance of highly siderophile elements found in Martian meteorites, was 900-1,100 miles across. This is approximately the projectile size needed to create the proposed Borealis basin that may have produced Mars’ global hemispheric dichotomy.

  • For the Moon, the projectiles would have been large enough to have created the South-Pole-Aitkin basin or perhaps a comparable-sized early basin. Moreover, if they contained even a trace amount of volatiles, then the same processes that brought highly siderophile elements to the Moon’s mantle may have also delivered its observed abundance of water.

Changes in solar activity affect local climate

Solar Filament exploding out into space.

Raimund Muscheler is a researcher at the Department of Earth and Ecosystem Sciences at Lund University in Sweden. In the latest issue of the journal Science, he and his colleagues have described how the surface water temperature in the tropical parts of the eastern Pacific varied with the sun’s activity between 7,000 and 11,000 years ago (early Holocene). Contrary to what one might intuitively believe, high solar activity had a cooling effect in this region.

“It is perhaps a similar phenomenon that we are seeing here today”, says Raimund Muscheler. “Last year’s cold winter in Sweden could intuitively be seen to refute global warming. But the winter in Greenland was exceptionally mild. Both phenomena coincide with low solar activity and the sun’s activity probably influences the local climate variations.”

Today there is a lot of debate about whether the sun’s activity could have influenced the earth’s climate over thousands or millions of years.

“The key processes in this influence are still mostly unclear. This is why the present climate models probably do not include the full effect of solar activity”, says Raimund Muscheler.

By reconstructing surface water temperatures from plankton stored in a sediment core taken from the seabed off the west coast of Baja California Sur, Mexico, researchers have now made new findings. The results suggest that solar activity has influenced the sea’s surface water temperature by changing local circulation processes in the sea. Previous studies have shown that the surface water temperature in the tropical Pacific Ocean is linked to atmospheric and seawater circulation through the regional weather phenomena El Niño and La Niña.

“We know that El Niño brings a warmer climate, while La Niña brings a cooler climate in the eastern part of the Pacific Ocean”, says Raimund Muscheler. “If we presume that this connection existed during the early Holocene, this means that there could be a link between solar activity and El Niño/La Niña on long time scales.”

In his research, Raimund Muscheler works to reconstruct past changes in solar activity by studying how cosmogenic isotopes, for example beryllium-10 and carbon-14, have been stored in ice cores and in the annual rings of trees. Cosmogenic isotopes are formed in the atmosphere by cosmic radiation from space. When solar activity is high, less cosmic radiation reaches the atmosphere, so fewer cosmogenic isotopes are formed and stored.
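The inverse relation Muscheler relies on can be sketched in a few lines; the functional form below is a toy assumption chosen only to illustrate the direction of the effect, not the published production model.

```python
# Toy sketch of the inverse relation: stronger solar activity shields
# Earth from cosmic rays, so fewer cosmogenic isotopes (10Be, 14C) form
# and get stored in ice cores and tree rings. The functional form is an
# illustrative assumption, not the published production model.
def relative_production(solar_modulation, k=1.0):
    """Normalized isotope production; higher solar modulation -> lower production."""
    return 1.0 / (1.0 + k * solar_modulation)

quiet_sun = relative_production(0.2)   # low solar activity -> more isotopes
active_sun = relative_production(1.0)  # high solar activity -> fewer isotopes
print(f"quiet sun: {quiet_sun:.2f}, active sun: {active_sun:.2f}")
```

Reading the stored isotope record backward through a relation of this kind is what lets researchers infer past solar activity.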

“This is the best and most reliable method we have to reconstruct solar activity”, says Raimund Muscheler.

Using chaos to model geophysical phenomena

These two images show that the most ‘coherent set,’ the most nondispersive transport time from Sept. 1 to Sept. 14, is in fact the vortex itself over this domain — demonstrating that the new technique very accurately pinpoints the polar vortex at specific times. – American Institute of Physics

Geophysical phenomena such as the dynamics of the atmosphere and ocean circulation are typically modeled mathematically by tracking the motion of air or water particles. These mathematical models define velocity fields that, given (i) a position in three-dimensional space and (ii) a time instant, provide a speed and direction for a particle at that position and time instant.

“Geophysical phenomena are still not fully understood, especially in turbulent regimes,” explains Gary Froyland at the School of Mathematics and Statistics and the Australian Research Council Centre of Excellence for Mathematics and Statistics of Complex Systems (MASCOS) at the University of New South Wales in Australia.

“Nevertheless, it is very important that scientists can quantify the ‘transport’ properties of these geophysical systems: Put very simply, how does a packet of air or water get from A to B, and how large are these packets? An example of one of these packets is the Antarctic polar vortex, a rotating mass of air in the stratosphere above Antarctica that traps chemicals such as ozone and chlorofluorocarbons (CFCs), exacerbating the effect of the CFCs on the ozone hole,” Froyland says.

In the American Institute of Physics’ journal CHAOS, Froyland and his research team, including colleague Adam Monahan from the School of Earth and Ocean Sciences at the University of Victoria in Canada, describe how they developed the first direct approach for identifying these packets, called “coherent sets” due to their nondispersive properties.

This technique is based on so-called “transfer operators,” which represent a complete description of the ensemble evolution of the fluid. The transfer operator approach is very simple to implement, they say, requiring only singular vector computations of a matrix of transitions induced by the dynamics.
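A minimal sketch of that recipe, on a toy flow rather than the team's ECMWF data: discretize the domain into grid boxes, build a row-stochastic matrix of box-to-box transitions from particle start and end positions, and take its singular vectors. Every parameter here (grid size, toy vortex, noise level) is an illustrative assumption.

```python
import numpy as np

# Toy illustration of the transfer-operator approach (not the team's
# ECMWF computation): particles inside a "vortex" rotate coherently,
# background particles disperse. All parameters here are assumptions.
rng = np.random.default_rng(0)
n = 2000
start = rng.uniform(-1.0, 1.0, size=(n, 2))
in_vortex = np.linalg.norm(start, axis=1) < 0.4

theta = np.pi / 3
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
end = start.copy()
end[in_vortex] = start[in_vortex] @ rot.T                              # coherent rotation
end[~in_vortex] += rng.normal(0.0, 0.5, size=((~in_vortex).sum(), 2))  # dispersion

def box(pts, nbins=8, lo=-2.0, hi=2.0):
    """Map 2-D points to flat indices of an nbins x nbins grid."""
    ij = np.clip(((pts - lo) / (hi - lo) * nbins).astype(int), 0, nbins - 1)
    return ij[:, 0] * nbins + ij[:, 1]

nboxes = 64
P = np.zeros((nboxes, nboxes))
np.add.at(P, (box(start), box(end)), 1.0)           # count box-to-box transitions
P /= np.maximum(P.sum(axis=1, keepdims=True), 1.0)  # row-stochastic transition matrix

# Singular vectors of the transition matrix: the sub-dominant left/right
# singular vectors flag the most coherent (nondispersive) sets of boxes.
U, s, Vt = np.linalg.svd(P)
coherent_indicator = U[:, 1]
```

Thresholding a sub-dominant singular vector of this kind is one way to delineate a coherent set, in the spirit of how the method pinpoints the Antarctic polar vortex.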

When tested using European Centre for Medium Range Weather Forecasting (ECMWF) data, they found that their new methodology was significantly better than existing technologies for identifying the location and transport properties of the vortex.

The transfer operator methodology has myriad applications in atmospheric science and physical oceanography to discover the main transport pathways in the atmosphere and oceans, and to quantify the transport. “As atmosphere-ocean models continue to increase in resolution with improved computing power, the analysis and understanding of these models with techniques such as transfer operators must be undertaken beyond pure simulation,” says Froyland.

Their next application will be the Agulhas rings off the South African coast, because the rings are responsible for a significant amount of transport of warm water and salt between the Indian and Atlantic Oceans.

New research shows rivers cut deep notches in the Alps’ broad glacial valleys

A train crosses a river gorge in the Swiss Alps that drops steeply from the floor of the broad glacial valley above it. – Oliver Korup

For years, geologists have argued about the processes that formed steep inner gorges in the broad glacial valleys of the Swiss Alps.

The U-shaped valleys were created by slow-moving glaciers that behaved something like road graders, eroding the bedrock over hundreds or thousands of years. When the glaciers receded, rivers carved V-shaped notches, or inner gorges, into the floors of the glacial valleys. But scientists disagreed about whether those notches were erased by subsequent glaciers and then formed all over again as the second round of glaciers receded.

New research led by a University of Washington scientist indicates that the notches endure, at least in part, from one glacial episode to the next. The glaciers appear to fill the gorges with ice and rock, protecting them from being scoured away as the glaciers move.

When the glaciers receded, the resulting rivers returned to the gorges and easily cleared out the debris deposited there, said David Montgomery, a UW professor of Earth and space sciences.

“The alpine inner gorges appear to lay low and endure glacial attack. They are topographic survivors,” Montgomery said.

“The answer is not so simple that the glaciers always win. The river valleys can hide under the glaciers and when the glaciers melt the rivers can go back to work.”

Montgomery is lead author of a paper describing the research, published online Dec. 5 in Nature Geoscience. Co-author is Oliver Korup of the University of Potsdam in Germany, who did the work while with the Swiss Federal Research Institutes in Davos, Switzerland.

The researchers used topographic data taken from laser-based (LIDAR) measurements to determine that, if the gorges were erased with each glacial episode, the rivers would have had to erode the bedrock from one-third to three-quarters of an inch per year since the last glacial period to get gorges as deep as they are today.

“That is screamingly fast. It’s really too fast for the processes,” Montgomery said. Such erosion rates would exceed those in all areas of the world except the most tectonically active regions, the researchers said, and they would have to maintain those rates for 1,000 years.
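The back-calculation behind such a rate is simply gorge depth divided by elapsed time; the depth and time span in the sketch below are illustrative assumptions, not values from the paper.

```python
# Back-calculation sketch: the average incision rate needed to cut a
# gorge of depth D in the time since the last glaciation. The depth and
# time span below are illustrative assumptions, not the paper's data.
INCH_IN_METERS = 0.0254

def required_rate_inches_per_year(depth_m, years):
    """Average incision rate (inches/year) to carve depth_m in `years`."""
    return depth_m / years / INCH_IN_METERS

rate = required_rate_inches_per_year(200.0, 15_000)  # e.g. a 200 m gorge over 15,000 yr
print(f"required rate: {rate:.2f} inches/year")
```

With these assumed numbers the answer falls inside the one-third to three-quarters of an inch per year range quoted above; preserving the gorges between glaciations removes the need for such extreme rates.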

Montgomery and Korup found other telltale evidence, sediment from much higher elevations and older than the last glacial deposits, at the bottom of the river gorges. That material likely was pushed into the gorges as glaciers moved down the valleys, indicating the gorges formed before the last glaciers.

“That means the glaciers aren’t cutting down the bedrock as fast as the rivers do. If the glaciers were keeping up, each time they’d be able to erase the notch left by the river,” Montgomery said.

“They’re locked in this dance, working together to tear the mountains down.”

The work raises questions about how common the preservation of gorges might be in other mountainous regions of the world.

“It shows that inner gorges can persist, and so the question is, ‘How typical is that?’ I don’t think every inner gorge in the world survives multiple glaciations like that, but the Swiss Alps are a classic case. That’s where mountain glaciation was first discovered.”

What can ice reveal about fire?

This is a general view of the drill site D47 in Antarctica where the LGGE (Laboratoire du Glaciologie et Geophysique de l’Environnement) team drilled one of two cores used in the recent Southern Hemisphere biomass-burning study. The thermal drilling method the researchers used allowed them to collect a 12-centimeter-diameter core, from which carbon monoxide and its isotopes were measured at Stony Brook University. These analyses required close to one kilogram of ice per sample. – Jerome Chappellaz, CNRS/LGGE

Scientists studying a column of Antarctic ice spanning 650 years have found evidence for fluctuations in biomass burning–the consumption of wood, peat and other materials in wildfires, cooking fires and communal fires–in the Southern Hemisphere.

The record, focused primarily on carbon monoxide (CO), differs substantially from the record in the Northern Hemisphere, suggesting changes may be necessary for several leading climate models.

The research appears in Science on Dec. 2, 2010, in an early online release.

The scientists studied variations in stable (non-radioactive, non-decaying) isotopes of carbon and oxygen, the first such measurements for carbon monoxide collected from ice-core samples.

“Combined with concentration measurements of CO, this record allows us to constrain the relative strength of biomass burning activity over the 650-year period in the Southern Hemisphere,” said co-author and research lead John Mak, a geoscientist at SUNY Stony Brook.

“What we find is that the amount of biomass burning has changed significantly over that time period,” Mak added, “and that biomass burning was in fact a significant source of CO during pre-industrial times.”

The biomass burning trends indicated by the CO largely agree with Southern Hemisphere records tracking charcoal particles in sediments and with measurements of methane from trapped ice.

Unexpectedly, the researchers found that biomass burning appears to have been more prevalent 100 to 150 years ago than it was during the 20th century.

“While this is consistent with previous findings,” added Mak, “there is still a common mis-perception that biomass burning rates are much higher today than in the past. This is significant since many researchers assume that human-induced biomass burning is much greater than ‘naturally’ occurring biomass burning. While this may still be the case–there were people around in the 18th century–the fact that today’s rates of [Southern Hemisphere] biomass burning seem to be lower than one to two centuries ago calls for a re-evaluation of sources.”

Global sea-level rise at the end of the last Ice Age

Southampton researchers have estimated that sea level rose by an average of about 1 metre per century at the end of the last Ice Age, interrupted by rapid ‘jumps’ during which it rose by up to 2.5 metres per century. The findings, published in Global and Planetary Change, will help unravel the responses of ocean circulation and climate to large inputs of ice-sheet meltwater to the world ocean.

Global sea level rose by a total of more than 120 metres as the vast ice sheets of the last Ice Age melted back. This melt-back lasted from about 19,000 to about 6,000 years ago, meaning that the average rate of sea-level rise was roughly 1 metre per century.
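The quoted average rate follows from simple division; a minimal arithmetic check, using only the figures stated in the text (roughly 120 metres of rise between about 19,000 and 6,000 years ago), can be sketched as:

```python
# Illustrative check of the average rate quoted above.
total_rise_m = 120.0      # approximate total post-glacial sea-level rise, metres
start_yr_bp = 19_000      # melt-back begins, years before present
end_yr_bp = 6_000         # melt-back ends, years before present

duration_centuries = (start_yr_bp - end_yr_bp) / 100   # 130 centuries
avg_rate = total_rise_m / duration_centuries           # metres per century

print(f"{avg_rate:.2f} m per century")  # prints 0.92 m per century
```

The result, just under a metre per century, is what the article rounds to "roughly 1 metre per century".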

Previous studies of sea-level change at individual locations have suggested that the gradual rise may have been marked by abrupt ‘jumps’ of sea-level rise at rates that approached 5 metres per century. These estimates were based on analyses of the distribution of fossil corals around Barbados and coastal drowning along the Sunda Shelf, an extension of the continental shelf of East Asia.

However, uncertainties in fossil dating, scarcity of sea-level markers, and the specific characteristics of individual sites can make it difficult to reconstruct global sea level with a high degree of confidence using evidence from any one site.

“Rather than relying on individual sites that may not be representative, we have compared large amounts of data from many different sites, taking into account all potential sources of uncertainty,” said Professor Eelco Rohling of the University of Southampton’s School of Ocean and Earth Science (SOES) based at the National Oceanography Centre (NOC) in Southampton.

The researchers brought together about 400 high-quality sea-level markers from study sites around the globe, concentrating on locations far removed from the distorting effects of the past massive ice sheets.

Using an extensive series of sophisticated statistical tests, they then reconstructed the sea-level history of the last 21,000 years with a high degree of statistical confidence.

Their analysis indicates that the gradual rise at an average rate of 1 metre per century was interrupted by two periods with rates of rise of up to 2.5 metres per century: one between 15,000 and 13,000 years ago, and another between 11,000 and 9,000 years ago.

The first of these jumps in the amount of ice-sheet meltwater entering the world ocean coincides with the beginning of a period of global climate warming called the Bølling-Allerød period. The second jump appears to have happened shortly after the end of the ‘big freeze’ called the Younger Dryas that brought the Bølling-Allerød period to an abrupt end.

“Our estimates of rates of sea-level rise are lower than those estimated from individual study sites, but they are statistically robust and therefore greatly improve our understanding of loss of ice volume due to the melting of the ice sheets at the end of the last Ice Age,” said lead author Dr Jennifer Stanford of SOES.

“The new findings will be used to refine models of the Earth climate system, and will thus help to improve forecasts of future sea-level responses to global climate change,” added Rohling.