2010 spike in Greenland ice loss lifted bedrock, GPS reveals

This is a composite photograph of a GNET GPS unit implanted in the southeastern Greenland bedrock. – Image by Dana Caccamise, courtesy of Ohio State University

An unusually hot melting season in 2010 accelerated ice loss in southern Greenland by 100 billion tons – and large portions of the island’s bedrock rose an additional quarter of an inch in response.

That’s the finding from a network of nearly 50 GPS stations planted along the Greenland coast to measure the bedrock’s natural response to the ever-diminishing weight of ice above it.

Every year as the Greenland Ice Sheet melts, the rocky coast rises, explained Michael Bevis, Ohio Eminent Scholar in Geodynamics and professor in the School of Earth Sciences at Ohio State University. Some GPS stations around Greenland routinely detect uplift of 15 mm (0.59 inches) or more, year after year. But a temperature spike in 2010 lifted the bedrock a detectably higher amount over a short five-month period – as high as 20 mm (0.79 inches) in some locations.

In a presentation Friday at the American Geophysical Union meeting in San Francisco, Bevis described the study’s implications for climate change.

“Pulses of extra melting and uplift imply that we’ll experience pulses of extra sea level rise,” he said. “The process is not really a steady process.”

Because the solid earth is elastic, Bevis and his team can use the natural flexure of the Greenland bedrock to measure the weight of the ice sheet, just like the compression of a spring in a bathroom scale measures the weight of the person standing on it.
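The bathroom-scale analogy can be made concrete with a toy calculation. This is a purely illustrative linear (Hooke's-law) sketch, not the GNET team's actual method, which models the elastic response of the whole solid Earth; the effective "spring constant" below is calibrated from the article's own figures of roughly 100 billion tons of extra melt and up to 20 mm of uplift near the heaviest loss zones.

```python
# Toy "bathroom scale" for elastic bedrock rebound.
# ASSUMPTION: a single linear spring constant, calibrated from the numbers in
# the article; real GNET analyses use full elastic Earth models.

mass_loss_kg = 1.0e14   # ~100 billion tons of extra 2010 melt
uplift_m = 0.020        # up to ~20 mm of uplift near zones of heavy loss

k_eff = mass_loss_kg / uplift_m   # kg of load change per meter of uplift

def load_change_from_uplift(uplift_mm):
    """Estimate the ice-mass change (kg) implied by a measured uplift,
    under the toy linear model."""
    return k_eff * (uplift_mm / 1000.0)

# A far-field station rising 5 mm implies a smaller local load change:
print(load_change_from_uplift(5.0))   # ~2.5e13 kg in this toy model
```

A single-constant inversion like this ignores the strong distance dependence the article describes, which is exactly why real analyses model the elastic flexure of the crust rather than one spring.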

Bevis is the principal investigator for the Greenland GPS Network (GNET), and he’s confident that the anomalous 2010 uplift that GNET detected is due to anomalous ice loss during 2010: “Really, there is no other explanation. The uplift anomaly correlates with maps of the 2010 melting day anomaly. In locations where there were many extra days of melting in 2010, the uplift anomaly is highest.”

In scientific parlance, a melting day “anomaly” refers to the number of extra melting days – that is, days that were warm enough to melt ice – relative to the average number of melting days per year over several decades.

In 2010, the southern half of Greenland lost an extra 100 billion tons of ice under conditions that scientists would consider anomalously warm.

GNET measurements indicate that as that ice melted away, the bedrock beneath it rose. The amount of uplift differed from station to station, depending on how close the station was to regions where ice loss was greatest.

Southern Greenland stations that were very close to zones of heavy ice loss rose as much as 20 mm (about 0.79 inches) over the five months. Even stations that were located far away typically rose at least 5 mm (0.2 inches) during the course of the 2010 melting season. But stations in northern Greenland barely moved at all.

From 2007 to 2009, GNET installed GPS stations in the bedrock that lay exposed around the ice sheet margins along the Greenland coast. The research team is using the earth’s natural elasticity to “weigh” the ice. As previous Ohio State studies of Antarctica revealed, ice weighs down bedrock, and when the ice melts away, the bedrock rises measurably in response.

GNET and similar GPS networks around the world could thus allow scientists to continue to measure ice loss after the Gravity Recovery and Climate Experiment (GRACE) satellites are retired in 2015. (GRACE is a joint project of NASA and the German Aerospace Center.)

Paleoclimate record points toward potential rapid climate changes

The average global surface temperature of Earth has risen by 0.8 degrees Celsius since 1880, and is now increasing at a rate of about 0.1 degree Celsius per decade. This image shows how 2010 temperatures compare to average temperatures from a baseline period of 1951-1980, as analyzed by scientists at NASA’s Goddard Institute for Space Studies. – NASA GISS

New research into the Earth’s paleoclimate history by NASA’s Goddard Institute for Space Studies director James E. Hansen suggests the potential for rapid climate changes this century, including multiple meters of sea level rise, if global warming is not abated.

By looking at how the Earth’s climate responded to past natural changes, Hansen sought insight into a fundamental question raised by ongoing human-caused climate change: “What is the dangerous level of global warming?” Some international leaders have suggested a goal of limiting warming to 2 degrees Celsius from pre-industrial times in order to avert catastrophic change. But Hansen said at a press briefing at a meeting of the American Geophysical Union in San Francisco on Tuesday, Dec. 6, that warming of 2 degrees Celsius would lead to drastic changes, such as significant ice sheet loss in Greenland and Antarctica.

Based on Hansen’s temperature analysis work at the Goddard Institute for Space Studies, the Earth’s average global surface temperature has already risen 0.8 degrees Celsius since 1880, and is now warming at a rate of more than 0.1 degree Celsius every decade. This warming is largely driven by increased greenhouse gases in the atmosphere, particularly carbon dioxide, emitted by the burning of fossil fuels at power plants, in cars and in industry. At the current rate of fossil fuel burning, the concentration of carbon dioxide in the atmosphere will have doubled from pre-industrial times by the middle of this century. A doubling of carbon dioxide would cause an eventual warming of several degrees, Hansen said.

In recent research, Hansen and co-author Makiko Sato, also of Goddard Institute for Space Studies, compared the climate of today, the Holocene, with previous similar “interglacial” epochs – periods when polar ice caps existed but the world was not dominated by glaciers. In studying cores drilled from both ice sheets and deep ocean sediments, Hansen found that global mean temperatures during the Eemian period, which began about 130,000 years ago and lasted about 15,000 years, were less than 1 degree Celsius warmer than today. If temperatures were to rise 2 degrees Celsius over pre-industrial times, global mean temperature would far exceed that of the Eemian, when sea level was four to six meters higher than today, Hansen said.

“The paleoclimate record reveals a more sensitive climate than thought, even as of a few years ago. Limiting human-caused warming to 2 degrees is not sufficient,” Hansen said. “It would be a prescription for disaster.”

Hansen focused much of his new work on how the polar regions and in particular the ice sheets of Antarctica and Greenland will react to a warming world.

Two degrees Celsius of warming would make Earth much warmer than during the Eemian, and would move Earth closer to Pliocene-like conditions, when sea level was in the range of 25 meters higher than today, Hansen said. Using Earth’s climate history to learn more about the sensitivity that governs our planet’s response to warming today, Hansen said the paleoclimate record suggests that every degree Celsius of global temperature rise will ultimately equate to 20 meters of sea level rise. However, that sea level increase due to ice sheet loss would be expected to occur over centuries, and large uncertainties remain in predicting how that ice loss would unfold.

Hansen notes that ice sheet disintegration will not be a linear process. This non-linear deterioration has already been seen in vulnerable places such as Pine Island Glacier in West Antarctica, where the rate of ice mass loss has continued accelerating over the past decade. Data from NASA’s Gravity Recovery and Climate Experiment (GRACE) satellite is already consistent with a rate of ice sheet mass loss in Greenland and West Antarctica that doubles every ten years. The GRACE record is too short to confirm this with great certainty; however, the trend in the past few years does not rule it out, Hansen said. This continued rate of ice loss could cause multiple meters of sea level rise by 2100, Hansen said.
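The "multiple meters by 2100" figure follows from simple geometric growth. As a rough sketch, assume a starting contribution of about 1 mm per year of sea-level rise from Greenland plus West Antarctica in 2011 (this starting value is an illustrative assumption, not a number from the article) and let the rate double every decade, as the GRACE trend allows:

```python
# Rough arithmetic behind "multiple meters by 2100" under a non-linear,
# doubling-per-decade scenario.
# ASSUMPTION: the 1 mm/yr starting contribution is illustrative.

doubling_time_yr = 10.0     # mass-loss rate doubles every ten years (per Hansen)
start_rate_mm_per_yr = 1.0  # assumed initial sea-level contribution in 2011

total_mm = 0.0
for year in range(2011, 2100):
    t = year - 2011
    total_mm += start_rate_mm_per_yr * 2.0 ** (t / doubling_time_yr)

# The geometric sum yields multiple meters, consistent with Hansen's statement.
print(round(total_mm / 1000.0, 1), "meters by 2100")
```

The point of the sketch is that a steadily doubling rate is dominated by its final decades, which is why a short GRACE record cannot yet confirm or rule the scenario out.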

Ice and ocean sediment cores from the polar regions indicate that temperatures at the poles during previous epochs – when sea level was tens of meters higher – were not far removed from the temperatures Earth could reach this century on a “business as usual” trajectory.

“We don’t have a substantial cushion between today’s climate and dangerous warming,” Hansen said. “Earth is poised to experience strong amplifying feedbacks in response to moderate additional global warming.”

Detailed considerations of a new warming target and how to get there are beyond the scope of this research, Hansen said. But this research is consistent with Hansen’s earlier findings that carbon dioxide in the atmosphere would need to be rolled back from about 390 parts per million in the atmosphere today to 350 parts per million in order to stabilize the climate in the long term. While leaders continue to discuss a framework for reducing emissions, global carbon dioxide emissions have remained stable or increased in recent years.

Hansen and others noted that while the paleoclimate evidence paints a clear picture of what Earth’s earlier climate looked like, using it to predict precisely how the climate might change on much smaller timescales in response to human-induced rather than natural climate change remains difficult. But, Hansen noted, the Earth system is already showing signs of responding, even in the cases of “slow feedbacks” such as ice sheet changes.

The human-caused release of increased carbon dioxide into the atmosphere also presents climate scientists with something they’ve never seen in the 65-million-year record of carbon dioxide levels – a drastic rate of increase that makes it difficult to predict how rapidly the Earth will respond. In periods when carbon dioxide has increased due to natural causes, the rate of increase averaged about 0.0001 parts per million per year – in other words, one hundred parts per million every million years. Fossil fuel burning is now causing carbon dioxide concentrations to increase at two parts per million per year.
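The rate comparison in that paragraph, made explicit:

```python
# Comparing the natural and modern rates of atmospheric CO2 increase
# quoted in the text.
natural_rate = 0.0001   # ppm per year (about 100 ppm per million years)
modern_rate = 2.0       # ppm per year from fossil fuel burning

# Modern fossil-fuel-driven rise is about 20,000 times faster.
print(round(modern_rate / natural_rate))
```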

“Humans have overwhelmed the natural, slow changes that occur on geologic timescales,” Hansen said.

Study shows link between earthquakes and tropical cyclones

A new study presented at AGU by University of Miami professor Shimon Wdowinski may help scientists identify regions at high risk for earthquakes. Wdowinski shows that earthquakes, including the recent 2010 temblors in Haiti and Taiwan, may be triggered by tropical cyclones and the heavy rains that accompany them. – Estelle Chaussard/RSMAS

A groundbreaking study led by University of Miami (UM) scientist Shimon Wdowinski shows that earthquakes, including the recent 2010 temblors in Haiti and Taiwan, may be triggered by tropical cyclones (hurricanes and typhoons). Wdowinski will discuss his findings during a presentation at the 2011 AGU Fall Meeting in San Francisco.

“Very wet rain events are the trigger,” said Wdowinski, associate research professor of marine geology and geophysics at the UM Rosenstiel School of Marine and Atmospheric Science. “The heavy rain induces thousands of landslides and severe erosion, which removes ground material from the Earth’s surface, releasing the stress load and encouraging movement along faults.”

Wdowinski and a colleague from Florida International University analyzed data from magnitude-6 and greater earthquakes in Taiwan and Haiti and found a strong temporal relationship between the two natural hazards: large earthquakes occurred within four years after a very wet tropical cyclone season.

During the last 50 years, three very wet tropical cyclone events – Typhoons Morakot, Herb and Flossie – were followed within four years by major earthquakes in Taiwan’s mountainous regions. Typhoon Morakot in 2009 was followed by a M-6.2 earthquake later that year and a M-6.4 in 2010; Typhoon Herb in 1996 was followed by a M-6.2 in 1998 and a M-7.6 in 1999; and Typhoon Flossie in 1969 was followed by a M-6.2 in 1972.

The 2010 M-7 earthquake in Haiti occurred in the mountainous region one-and-a-half years after two hurricanes and two tropical storms drenched the island nation within 25 days.

The researchers suggest that rain-induced landslides and runoff from excess rain carry eroded material downstream. As a result, the surface load above the fault is lessened.

“The reduced load unclamps the faults, which can promote an earthquake,” said Wdowinski.

Fractures in Earth’s bedrock from the movement of tectonic plates, known as faults, build up stress as they attempt to slide past each other, periodically releasing the stress in the form of an earthquake.

According to the scientists, this earthquake-triggering mechanism is only viable on inclined faults, where rupture involves significant vertical movement.
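A back-of-the-envelope calculation shows why erosion can matter to a fault: removing a surface layer of thickness h reduces the vertical stress on the rock below by rho * g * h. The layer thickness and rock density below are illustrative values, not numbers from the study.

```python
# Unloading stress from surface erosion.
# ASSUMPTION: illustrative thickness and density; the study itself does not
# quote these numbers.

def unloading_stress_pa(thickness_m, density_kg_m3=2500.0, g=9.81):
    """Reduction in vertical stress (Pa) from removing a surface layer:
    rho * g * h."""
    return density_kg_m3 * g * thickness_m

# Eroding roughly 1 m of material reduces the load by about 25 kPa:
print(unloading_stress_pa(1.0) / 1000.0, "kPa")
```

For scale, static stress changes on the order of 10 kPa (0.1 bar) are commonly cited in the triggering literature as capable of promoting earthquakes on faults already close to failure.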

Wdowinski also showed that the tropical cyclone-earthquake pattern holds for earthquakes of magnitude 5 and above. The researchers plan to analyze patterns in other seismically active mountainous regions – such as the Philippines and Japan – that are subjected to tropical cyclone activity.

Landsat satellites track Yellowstone’s underground heat

Yellowstone National Park is outlined in red in each of the Landsat scenes. On the left is a true color image with vegetation shown in green. On the right is the thermal image, with higher emitted heat shown in white. – NASA’s Goddard Space Flight Center

Yellowstone National Park sits on top of a vast, ancient, and still active volcano. Heat pours off its underground magma chamber, and is the fuel for Yellowstone’s famous features — more than 10,000 hot springs, mud pots, terraces and geysers, including Old Faithful.

But expected development by energy companies right outside Yellowstone’s borders has some fearing that Old Faithful could be cheated out of its energy.

“If that geothermal development outside of the park begins, we need to know whether that’s going to cause Old Faithful to suddenly stop spewing,” says Rick Lawrence of Montana State University.

Geothermal energy development is here to stay, says Yellowstone Park geologist Cheryl Jaworowski, but it has also raised some big questions for the National Park Service, which is tasked by Congress to monitor and protect Yellowstone’s unique landscape.

The park funded a study by Lawrence and his co-author Shannon Savage to apply a new perspective to the problem of tracking geothermal activity. Their work is being presented at the American Geophysical Union conference in San Francisco on Friday, December 9. Lawrence and Savage used both visible light and heat-sensitive Landsat data channels to get a broad view of the park’s geothermal activity.

Their project is part of a new monitoring plan the park implemented in 2005. The plan uses remote sensing and airborne reconnaissance to observe geothermal changes across all of Yellowstone in a systematic and scientific manner. In the past, scientific studies on the ground tended to focus on individual features, and the only park-wide estimate of Yellowstone’s heat was derived from a chemical product of geothermal systems that appears in the river system. But with different technology available today, says Jaworowski, the park wants to expand its monitoring options.

To understand Yellowstone’s geothermal system, “we need to start looking at the forest rather than the individual trees,” says Jaworowski. And one way to see Yellowstone’s geothermal “forest” is to get a view from space.

Circling Earth from a height of 438 miles, the Landsat satellites have been gathering a huge amount of data about the land surface for decades. A single scene can take in the entirety of Yellowstone National Park, and the data it gathers is much more than a pretty picture. In addition to measuring the visible light in the electromagnetic spectrum — what we can see — the Landsat satellites each have an instrument that detects waves in the thermal band — heat energy.

Earth radiates heat all the time because it is warmed by the sun. Like a sponge, the ground absorbs solar energy, and just as a squeezed sponge gives up its excess water, the Earth reemits some of that solar energy back into space at a longer wavelength. But in Yellowstone, the total energy picked up by the satellite includes energy produced by the Earth itself, geothermal energy.

“It’s very hard to tease out the geothermal energy,” says Lawrence. The amount of solar energy reemitted depends on air temperature, vegetation cover, and soil moisture among other variables, and geothermal energy is only a small fraction of the total. To estimate changes in the geothermal system, Lawrence and Savage looked back in time and selected one image per year from 1986 to 2007 (with a few gaps due to cloudy days). Because solar effects vary from year to year and with weather conditions, they subtracted out the average heat emitted from the surface of Yellowstone for each year. The observed changes from year to year would then be primarily attributable to geothermal changes. The scientists then compared these images with known geothermal events during that time period.
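The year-to-year differencing described above can be sketched in a few lines: subtract each scene's spatial mean (dominated by solar heating and weather) and compare the residual patterns between years. The 2x2 "scenes" here are synthetic toy data; real Landsat thermal processing also involves calibration, atmospheric correction, and cloud masking.

```python
import numpy as np

def thermal_anomaly(scene):
    """Subtract the scene-wide mean emitted heat, leaving the relative
    thermal pattern, per the method described in the text."""
    return scene - scene.mean()

# ASSUMPTION: synthetic toy scenes, arbitrary units.
scene_1998 = np.array([[300., 301.], [305., 302.]])  # hot pixel: active feature
scene_1999 = np.array([[301., 302.], [302., 303.]])  # the feature has gone quiet

# Year-to-year change in the anomaly pattern, attributed to geothermal change:
change = thermal_anomaly(scene_1999) - thermal_anomaly(scene_1998)
print(change)  # strongly negative where the geothermal feature cooled
```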

Minerva Terraces in the Mammoth Basin was one of those geothermal events. In 1998, mineral-rich, near-boiling water bubbled over the Minerva’s broad steps, depositing calcite on the face of each terrace. Heat-loving organisms colored the white surface a dozen shades of pink, yellow, and green. A year later, the Terraces were a ghost town. “There was no steam, no color, and then it started crumbling away because it was very soft calcite,” says Savage. Minerva’s colorful ecosystem collapsed when the hot water stopped flowing.

That collapse was reflected in the satellite data, Lawrence says. In the Landsat scenes from 1998 and 1999, “the amount of energy coming off the Minerva Terrace area went down.”

But not all the changes were expected. A lone hiker on a boardwalk at Jewel Geyser snapped a picture of rocks flying everywhere in a geothermal explosion. But in the Landsat data, where the scientists would have expected more heat, “the temperature actually went down, then it went back up afterward,” says Lawrence. At the time no ground temperature measurements were made, so the science team doesn’t know why.

What this means for real-time monitoring at this stage of the project, says Lawrence, is that the satellite data can tell Park managers when big changes occur in a geothermal area, but not necessarily what is happening, or exactly where. Landsat thermal pixels used in this study are 120 meters on a side, much bigger than many of Yellowstone’s geothermal landmarks, some of which measure a meter or less across.

This relatively large pixel size is one of the limiting factors on Landsat’s usefulness, says Savage. Many small events, like a hiking trail to Beaver Ponds that disappeared by Narrow Gauge Spring in the summer of 1998, are too small to appear in the Landsat data.

Despite these uncertainties, Landsat data’s long-term record, going back to 1984, gives scientists clues as to how geothermal events could be interconnected underground. If two areas tend to change in similar patterns, that suggests they might share the same plumbing. While geothermal sites outside the park were outside Lawrence and Savage’s study area, by using this type of analysis, scientists may be able to see if there are — or are not — any connections to areas inside the park. For example, two areas that were long thought to be connected, the Norris Geyser Basin and Mammoth Hot Springs, did not show any similar trends and so may not be connected underground in any way.
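The "shared plumbing" test amounts to asking whether two areas' thermal histories rise and fall together. A minimal sketch, using made-up annual anomaly values rather than the study's actual Norris and Mammoth time series:

```python
import numpy as np

# ASSUMPTION: invented annual thermal-anomaly series (arbitrary units); the
# real comparison uses decades of Landsat scenes.
norris = np.array([0.2, 0.5, -0.1, 0.8, 0.3, -0.4])
mammoth = np.array([-0.3, 0.1, 0.4, -0.2, 0.0, 0.5])

# Pearson correlation between the two areas' histories: a strong positive
# value would suggest shared plumbing; near zero or negative suggests none.
r = np.corrcoef(norris, mammoth)[0, 1]
print(round(r, 2))  # negative here, so no evidence of a shared trend
```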

Using satellites to monitor changes in Yellowstone’s geothermal activity is still in its early stages, says Park geologist Cheryl Jaworowski. “We have some initial numbers but a lot more work needs to be done,” she says, particularly in further resolving geothermal from solar energy, which remains one of the biggest challenges. One thing she wants to try is to take Landsat thermal scenes at night to try to reduce the amount of solar energy obscuring the geothermal signal.

If they can resolve that problem and perhaps eventually get higher resolution thermal data in the future, Savage says that Landsat as a monitoring tool has a lot of potential. The next Landsat satellite, the Landsat Data Continuity Mission, is scheduled to launch in early 2013, and it has a new thermal instrument that will add to the Yellowstone geothermal record in the coming decade.

Path to oxygen in Earth’s atmosphere: long series of starts and stops

This is a panorama of Russia’s Imandra/Varzuga Greenstone Belt, where FAR DEEP drilling took place. – Victor Melezhik, Geological Survey of Norway/University of Bergen

The appearance of oxygen in the Earth’s atmosphere probably did not occur as a single event, but as a long series of starts and stops, according to geoscientists who investigated rock cores from the FAR DEEP project.

The Fennoscandia Arctic Russia – Drilling Early Earth Project–FAR DEEP–took place during the summer of 2007 near Murmansk in Northwest Russia.

The project, part of the International Continental Scientific Drilling Program, drilled a series of shallow, two-inch diameter cores and, by overlapping them, created a record of stone deposited during the Proterozoic Eon–2,500 million to 542 million years ago.

“We’ve always thought that oxygen came into the atmosphere really quickly during an event,” said Lee Kump, a geoscientist at Penn State University.

“We are no longer looking for an event. Now we’re looking for when and why oxygen became a stable part of the Earth’s atmosphere.”

The researchers report in this week’s issue of the journal Science Express that evaluation of these cores, in comparison with cores from Gabon previously analyzed by others, supports the conclusion that the Great Oxidation Event, the appearance of free oxygen in Earth’s atmosphere, played out over hundreds of millions of years. Kump is the lead author of the Science Express paper.

Oxygen levels gradually crossed the low atmospheric threshold for oxidizing pyrite – an iron sulfide mineral – by 2,500 million years ago, and the threshold that eliminates what scientists call mass-independently fractionated (MIF) sulfur by 2,400 million years ago.

Then oxygen levels rose at an ever-increasing rate through the Paleoproterozoic, achieving about one percent of the present atmospheric level.

“The definition of when an oxygen atmosphere occurred depends on which threshold you are looking for,” said Kump. “It could be when pyrite becomes oxidized, when sulfur MIF disappears, or when deep crustal oxidation occurs.”

When the MIF sulfur disappeared, the air on Earth was still not breathable by animal standards.

When red rocks containing iron oxides appeared 2,300 million years ago, the air was still unbreathable.

“At about one percent oxygen, the groundwater became strongly oxidized, making it possible for water seeping through rocks to oxidize organic materials,” said Kump.

Initially, any oxygen in the atmosphere, produced by the photosynthesis of single-celled organisms, was used up when sulfur, iron and other elements oxidized.

When sufficient oxygen accumulated in the atmosphere, it permeated the groundwater and began oxidizing buried organic material, oxidizing carbon to create carbon dioxide.

“Insights into Earth’s carbon cycle offer tantalizing clues to the history of atmospheric oxygen levels, and Kump and others have revealed unrecognized details of the timing and mechanism of the Great Oxidation Event,” said Enriqueta Barrera, program director in the National Science Foundation’s Division of Earth Sciences, which funded the research.

The cores from the FAR-DEEP project were compared with samples from Gabon using the ratio of the carbon isotopes carbon-13 and carbon-12 to see if the evidence for high rates of oxygen accumulation existed worldwide.
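Carbon-isotope comparisons like this are conventionally reported in delta notation, which expresses a sample's 13C/12C ratio relative to a reference standard in parts per thousand. A sketch using the standard VPDB reference ratio; the sample value is hypothetical and the paper's actual measurements are not reproduced here.

```python
# delta-13C in standard delta notation.
R_STANDARD = 0.0112372   # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample):
    """delta13C in per mil: (Rsample / Rstandard - 1) * 1000."""
    return (r_sample / R_STANDARD - 1.0) * 1000.0

# ASSUMPTION: a hypothetical sample slightly enriched in 13C:
print(round(delta13C(0.011250), 1))  # positive per-mil value
```

Worldwide excursions in this quantity through time, matched between Fennoscandia and Gabon, are what the study reads as a global oxygen signal.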

Both the FAR-DEEP project’s cores and the Gabon cores show large deposits of carbon in the form of fossilized petroleum.

Both sets of cores also show similar changes in carbon 13 through time, indicating that the changes in carbon isotopes occurred worldwide and that oxygen levels throughout the atmosphere were high.

“Although others have documented huge carbon isotope variations at later times in Earth history associated with stepwise increases in atmospheric oxygen, our results are less equivocal because we have many lines of data all pointing to the same thing,” said Kump.

“These indications include not only carbon 13 isotope profiles in organic matter from two widely separated locations, but also supporting profiles in limestones and no indication that processes occurring since that time have altered the signal.”

Working with Kump on the project were geoscientists Michael Arthur of Penn State; Christopher Junium of Syracuse University; Alex Brasier and Anthony Fallick of the Scottish Universities Environmental Research Centre; Victor Melezhik, Aivo Lepland and Alenka Crne at the Norwegian Geological Survey; and Genming Luo, China University of Geosciences.

The NASA Astrobiology Institute also supported the research.

Early Earth may have been prone to deep freezes

Two University of Colorado Boulder researchers have adapted a three-dimensional general circulation model of Earth’s climate to a time some 2.8 billion years ago, when the sun was significantly fainter than it is today, and they think the planet may have been more prone to catastrophic glaciation than previously believed.

The new 3-D model of the Archean Eon – which lasted from about 3.8 billion to 2.5 billion years ago – incorporates interactions between the atmosphere, ocean, land, ice and hydrological cycles, said CU-Boulder doctoral student Eric Wolf of the atmospheric and oceanic sciences department. Wolf has been using the new climate model — which is based on the Community Earth System Model maintained by the National Center for Atmospheric Research in Boulder — in part to solve the “faint young sun paradox”: several billion years ago the sun’s output was only 70 to 80 percent of today’s, yet geologic evidence shows the climate was as warm or warmer than now.

In the past, scientists have used several types of one-dimensional climate models — none of which included clouds or dynamic sea ice — in an attempt to understand the conditions on early Earth that kept it warm and hospitable for primitive life forms. But the 1-D model most commonly used by scientists fixes Earth’s sea ice extent at one specific level through time despite periodic temperature fluctuations on the planet, said Wolf.

“The inclusion of dynamic sea ice makes it harder to keep the early Earth warm in our 3-D model,” Wolf said. “Stable, global mean temperatures below 55 degrees Fahrenheit are not possible, as the system will slowly succumb to expanding sea ice and cooling temperatures. As sea ice expands, the planet surface becomes highly reflective and less solar energy is absorbed, temperatures cool, and sea ice continues to expand.”
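The runaway Wolf describes can be reproduced in miniature with a zero-dimensional energy-balance model: a step-function albedo stands in for dynamic sea ice, and a fixed effective emissivity stands in for greenhouse gases. This toy sketch is emphatically not the CU-Boulder 3-D model, and its parameter values are illustrative, but it exhibits the same behavior: the planet stays warm under the modern sun yet slides into a snowball at 80 percent solar output.

```python
# Toy 0-D energy-balance model of the ice-albedo feedback.
# ASSUMPTIONS: step-function albedo, fixed effective emissivity tuned to give
# ~288 K for the modern Earth; all values illustrative.

SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S_MODERN = 1361.0  # modern solar constant, W m^-2
EMISSIVITY = 0.61  # effective emissivity standing in for greenhouse gases

def equilibrium_temp(solar_fraction, t_init=290.0, steps=200):
    """Iterate absorbed vs. emitted energy to an equilibrium temperature (K)."""
    t = t_init
    for _ in range(steps):
        # Ice-covered planet reflects more sunlight (higher albedo).
        albedo = 0.3 if t > 273.0 else 0.6
        absorbed = solar_fraction * S_MODERN * (1.0 - albedo) / 4.0
        t = (absorbed / (EMISSIVITY * SIGMA)) ** 0.25
    return t

print(round(equilibrium_temp(1.0)))   # modern sun: stays warm
print(round(equilibrium_temp(0.8)))   # faint Archean sun: runs away to a snowball
```

The cold solution is self-sustaining: once temperature drops below freezing, the higher albedo locks in less absorbed energy, which is the feedback loop described in the quote above.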

Wolf and CU-Boulder Professor Brian Toon are continuing to search for the heating mechanism that apparently kept Earth warm and habitable back then, as evidenced by liquid oceans and primordial life forms. While their calculations show an atmosphere containing 6 percent carbon dioxide could have done the trick by keeping the mean temperatures at 57 degrees F, geological evidence from ancient soils on early Earth indicates such high concentrations of CO2 were not present at the time.

The CU-Boulder researchers are now looking at cloud composition and formation, the hydrological cycle, movements of continental masses over time and heat transport through Earth’s system as other possible modes of keeping early Earth warm enough for liquid water to exist. Wolf gave a presentation on the subject at the annual American Geophysical Union meeting held Dec. 5-9 in San Francisco.

Toon said 1-D models essentially balance the amount of sunshine reaching the atmosphere, clouds, and Earth’s terrestrial and aquatic surfaces with the amount of “earthshine” being emitted back into the atmosphere, clouds, and space, primarily in the infrared portion of the electromagnetic spectrum. “The advantage of a 3-D model is that the transport of energy across the planet and changes in all the components of the climate system can be considered in addition to the basic planetary energy balance.”

In the new 3-D model, preventing a planet-wide glaciation requires about three times more CO2 than predicted by the 1-D models, said Wolf. For all warm climate scenarios generated by the 3-D model, Earth’s mean temperature about 2.8 billion years ago was 5 to 10 degrees F warmer than in the 1-D model, given the same abundance of greenhouse gases. “Nonetheless, the 3-D model indicates a roughly 55 degrees F mean temperature was still low enough to trigger a slide by early Earth into a runaway glacial event, causing what some scientists call a ‘Snowball Earth,'” said Wolf.

“The ultimate point of this study is to determine what Earth was like around the time that life arose and during the first half of the planet’s history,” said Toon. “It would have been shrouded by a reddish haze that would have been difficult to see through, and the ocean probably was a greenish color caused by dissolved iron in the oceans. It wasn’t a blue planet by any means.” By the end of the Archean Eon some 2.5 billion years ago, oxygen levels rose quickly, creating an explosion of new life on the planet, he said.

Testing the new 3-D model has required huge amounts of supercomputer computation time, said Toon, who also is affiliated with CU-Boulder’s Laboratory for Atmospheric and Space Physics. A single calculation for the study run on CU-Boulder’s powerful new Janus supercomputer can take up to three months.

Ancient dry spells offer clues about the future of drought

New climate modeling shows that widespread deforestation in pre-Columbian Central America corresponded with decreased levels of precipitation. This image shows how much precipitation declined from normal across the region between 800 C.E. and 950 C.E. It was during this period of time that the Mayan civilization reached its peak population and abruptly collapsed. – Ben Cook, NASA’s Goddard Institute for Space Studies

As parts of Central America and the U.S. Southwest endure some of the worst droughts to hit those areas in decades, scientists have unearthed new evidence about ancient dry spells that suggest the future could bring even more serious water shortages. Three researchers speaking at the annual meeting of the American Geophysical Union in San Francisco on Dec. 5, 2011, presented new findings about the past and future of drought.

Pre-Columbian Collapse

Ben Cook, a climatologist affiliated with NASA’s Goddard Institute for Space Studies (GISS) and Columbia University’s Lamont-Doherty Earth Observatory in New York City, highlighted new research that indicates the ancient Meso-American civilizations of the Mayans and Aztecs likely amplified droughts in the Yucatán Peninsula and southern and central Mexico by clearing rainforests to make room for pastures and farmland.

Converting forest to farmland can increase the reflectivity, or albedo, of the land surface in ways that affect precipitation patterns. “Farmland and pastures absorb slightly less energy from the sun than the rainforest because their surfaces tend to be lighter and more reflective,” explained Cook. “This means that there’s less energy available for convection and precipitation.”
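Cook’s albedo argument is simple surface energy accounting: a brighter land surface reflects more sunlight, leaving less energy to drive convection. A back-of-the-envelope sketch of that reasoning — the insolation and albedo values below are illustrative assumptions, not figures from the study:

```python
# Illustrative surface energy budget: how a higher albedo after
# deforestation reduces the energy available for convection.
solar_in = 200.0        # incident solar flux at the surface, W/m^2 (illustrative)
albedo_forest = 0.12    # typical rainforest albedo (assumed for illustration)
albedo_cropland = 0.18  # typical cropland/pasture albedo (assumed for illustration)

absorbed_forest = (1 - albedo_forest) * solar_in      # energy absorbed by forest
absorbed_cropland = (1 - albedo_cropland) * solar_in  # energy absorbed by cleared land
deficit = absorbed_forest - absorbed_cropland         # energy lost to convection

print(f"forest absorbs:   {absorbed_forest:.0f} W/m^2")
print(f"cropland absorbs: {absorbed_cropland:.0f} W/m^2")
print(f"deficit after clearing: {deficit:.0f} W/m^2")
```

Even a modest albedo change of a few percentage points removes watts per square meter from the surface energy budget, which is the mechanism by which Cook argues deforestation biases the regional climate toward drought.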

Cook and colleagues used a high-resolution climate model developed at GISS to run simulations that compared how patterns of vegetation cover during pre-Columbian (before 1492 C.E.) and post-Columbian periods affected precipitation and drought in Central America. The pre-Columbian era saw widespread deforestation on the Yucatán Peninsula and throughout southern and central Mexico. During the post-Columbian period, forests regenerated as native populations declined and farmlands and pastures were abandoned.

Cook’s simulations include input from a newly published land-cover reconstruction that is one of the most complete and accurate records of human vegetation changes available. The results are unmistakable: Precipitation levels declined by a considerable amount — generally 10 to 20 percent — when deforestation was widespread. Precipitation records from stalagmites, a type of cave formation affected by moisture levels that paleoclimatologists use to deduce past climate trends, in the Yucatán agree well with Cook’s model results.

The effect is most noticeable over the Yucatán Peninsula and southern Mexico, areas that overlapped with the centers of the Mayan and Aztec civilizations and had high levels of deforestation and the most densely concentrated populations. Rainfall levels declined, for example, by as much as 20 percent over parts of the Yucatán Peninsula between 800 C.E. and 950 C.E.

Cook’s study supports previous research that suggests drought, amplified by deforestation, was a key factor in the rapid collapse of the Mayan empire around 950 C.E. In 2010, Robert Oglesby, a climate modeler based at the University of Nebraska, published a study in the Journal of Geophysical Research that showed that deforestation likely contributed to the Mayan collapse. Though Oglesby and Cook’s modeling reached similar conclusions, Cook had access to a more accurate and reliable record of vegetation changes.

During the peak of Mayan civilization between 800 C.E. and 950 C.E., the land cover reconstruction Cook based his modeling on indicates that the Maya had left only a tiny percentage of the forests on the Yucatán Peninsula intact. By the period between 1500 C.E. and 1650 C.E., in contrast, after the arrival of Europeans had decimated native populations, natural vegetation covered nearly all of the Yucatán. In modern times, deforestation has altered some areas near the coast, but a large majority of the peninsula’s forests remain intact.

“I wouldn’t argue that deforestation causes drought or that it’s entirely responsible for the decline of the Maya, but our results do show that deforestation can bias the climate toward drought and that about half of the dryness in the pre-Colonial period was the result of deforestation,” Cook said.

Northeastern Megadroughts

The last major drought to affect the Northeast occurred in the 1960s, persisted for about three years and took a major toll on the region. Dorothy Peteet, a paleoclimatologist also affiliated with NASA GISS and Columbia University, has uncovered evidence that shows far more severe droughts have occurred in the Northeast.

By analyzing sediment cores collected from several tidal marshes in the Hudson River Valley, Peteet and her colleagues at Lamont-Doherty have found evidence that at least three major dry spells have occurred in the Northeast within the last 6,000 years. The longest, which corresponds with a span of time known as the Medieval Warm Period, lasted some 500 years and began around 850 C.E. The other two took place more than 5,000 years ago. They were shorter, only about 20 to 40 years, but likely more severe.

“People don’t generally think about the Northeast as an area that can experience drought, but there’s geologic evidence that shows major droughts can and do occur,” Peteet said. “It’s something scientists can’t ignore. What we’re finding in these sediment cores has big implications for the region.”

Peteet’s team detected all three droughts using a method called X-ray fluorescence spectroscopy. They used the technique on a core collected at Piermont Marsh in New York to search for characteristic elements — such as bromine and calcium — that are more likely to occur at the marsh during droughts.

Fresh water from the Hudson River and salty water from the Atlantic Ocean were both predominant in Piermont Marsh at different time periods, but saltwater moves upriver during dry periods as the amount of fresh water entering the marsh declines. Peteet’s team detected extremely high levels of both bromine and calcium, both of them indicators of the presence of saltwater and the existence of drought, in sections of the sediment cores corresponding to 5,745 and 5,480 years ago.

During the Medieval Warm Period, the researchers also found striking increases in the abundance of certain types of pollen species, especially pine and hickory, that indicate a dry climate. Before the Medieval Warm Period, in contrast, there were more oaks, which prefer wetter conditions. They also found a thick layer of charcoal demonstrating that wildfires, which are more frequent during droughts, were common during the Medieval Warm Period.

“We still need to do more research before we can say with confidence how widespread or frequent droughts in the Northeast have been,” Peteet said. There are certain gaps in the cores Peteet’s team studied, for example, that she plans to investigate in greater detail. She also expects to expand the scope of the project to other marshes and estuaries in the Northeast and to collaborate with climate modelers to begin teasing out the factors that cause droughts to occur in the region.

The Future of Food

Climate change, with its potential to redistribute water availability around the globe by increasing rainfall in some areas while worsening drought in others, might negatively impact crop yields in certain regions of the world.

New research conducted by Princeton University hydrologist Justin Sheffield shows that drought-prone areas of the developing world with growing populations and limited capacity to store water, such as sub-Saharan Africa, will be the ones most at risk of declining crop yields in the future.

Sheffield and his team ran hydrological model simulations for the 20th and 21st centuries and looked at how drought might change in the future according to different climate change scenarios. They found that, globally, the total area affected by drought has not changed significantly over the past 50 years.

However, the models project that reductions in precipitation and increases in evaporative demand will raise the frequency of short-term droughts. They also found that the area of sub-Saharan Africa experiencing drought could increase as much as twofold by the mid-21st century and threefold by the end of the century.

When the team analyzed what these changes would mean for future agricultural productivity around the globe, they found that the impact on sub-Saharan Africa would be especially strong.

Agricultural productivity depends on a number of factors beyond water availability including soil conditions, available technologies and crop varieties. For some regions of sub-Saharan Africa, the researchers found that agricultural productivity will likely decline by over 20 percent by mid-century due to drying and warming.

‘Double tsunami’ doubled Japan destruction

A 3-D ocean model created by Y. Tony Song of NASA’s Jet Propulsion Laboratory and colleagues replicates the March 11, 2011 tsunami. The red line shows the path of the Jason-1 satellite, which crossed the wave front at the site of a double-high wave. – Image courtesy of NASA

Researchers have discovered that the destructive tsunami generated by the March 2011 Tōhoku earthquake was a long-hypothesized “merging tsunami” that doubled in intensity over rugged ocean ridges, amplifying its destructive power before reaching shore.

Satellites captured not just one wave front that day, but at least two, which merged to form a single double-high wave far out at sea – one capable of traveling long distances without losing its power. Ocean ridges and undersea mountain chains pushed the waves together, but only along certain directions from the tsunami’s origin.

The discovery helps explain how tsunamis can cross ocean basins to cause massive destruction at some locations while leaving others unscathed, and raises hope that scientists may be able to improve tsunami forecasts.

At a news conference Monday at the American Geophysical Union meeting in San Francisco, Y. Tony Song, a research scientist at NASA’s Jet Propulsion Laboratory (JPL); and C.K. Shum, professor and Distinguished University Scholar in the Division of Geodetic Science, School of Earth Sciences at Ohio State University, discussed the satellite data and simulations that enabled them to piece the story together.

“It was a one-in-ten-million chance that we were able to observe this double wave with satellites,” said Song, the study’s principal investigator. “Researchers have suspected for decades that such ‘merging tsunamis’ might have been responsible for the 1960 Chilean tsunami that killed many in Japan and Hawaii, but nobody had definitively observed a merging tsunami until now.”

“It was like looking for a ghost,” he continued. “A NASA/French Space Agency satellite altimeter happened to be in the right place at the right time to capture the double wave and verify its existence.”

Shum agreed. “We were very lucky, not only in the timing of the satellite, but also to have access to such detailed GPS-observed ground motion data from Japan to initiate Tony’s tsunami model, and to validate the model results using the satellite data. Now we can use what we learned to make better forecasts of tsunami danger in specific coastal regions anywhere in the world, depending on the location and the mechanism of an undersea quake.”

The NASA/Centre National d’Etudes Spatiales Jason-1 satellite passed over the tsunami on March 11, as did two other satellites: the NASA/European Jason-2 and the European Space Agency’s EnviSAT. All three carry a radar altimeter, which measures sea level changes to an accuracy of a few centimeters.

Each satellite crossed the tsunami at a different location. Jason-2 and EnviSAT measured wave heights of 20 cm (8 inches) and 30 cm (12 inches), respectively. But as Jason-1 passed over the undersea Mid-Pacific Mountains to the east, it captured a wave front measuring 70 cm (28 inches).
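For linear ocean waves, the amplitudes of coinciding fronts add, while wave energy scales with amplitude squared — so a merged front carries roughly four times the local energy density of either front alone. A toy sketch of that superposition with idealized Gaussian fronts (all amplitudes, positions and widths below are illustrative, not the satellite measurements):

```python
import numpy as np

x = np.linspace(-50, 50, 2001)  # distance along a notional satellite track, km

def wave_front(x, center_km, amplitude_m, width_km=10.0):
    """Idealized Gaussian wave front of given amplitude and width."""
    return amplitude_m * np.exp(-((x - center_km) / width_km) ** 2)

# Two well-separated fronts: each peak stays at its own amplitude.
separate = wave_front(x, -30, 0.35) + wave_front(x, 30, 0.35)

# The same two fronts pushed together by topography: amplitudes add.
merged = wave_front(x, 0, 0.35) + wave_front(x, 0, 0.35)

print(f"peak of separate fronts: {separate.max():.2f} m")
print(f"peak of merged front:    {merged.max():.2f} m")
# Energy density scales with amplitude squared, so the merged front
# carries ~4x the local energy of either front alone.
```

This is only the linear-superposition picture; Song’s full 3-D model also captures how sea floor topography steers and focuses the jets before they merge.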

The researchers conjectured that ridges and undersea mountain chains on the ocean floor deflected parts of the initial tsunami wave away from each other, forming independent jets shooting off in different directions, each with its own wave front.

The sea floor topography nudges tsunami waves in varying directions and can make a tsunami’s destruction appear random. For that reason, hazard maps that try to predict where tsunamis will strike rely on sub-sea topography. Previously, these maps only considered topography near a particular shoreline. This study suggests scientists may be able to create maps that take into account all undersea topography, even sub-sea ridges and mountains far from shore.

Song and his team were able to verify the satellite data through model simulations based on independent data, including the GPS data from Japan and buoy data from the National Oceanic and Atmospheric Administration’s Deep-ocean Assessment and Reporting of Tsunamis program.

“Tools based on this research could help officials forecast the potential for tsunami jets to merge,” said Song. “This, in turn, could lead to more accurate coastal tsunami hazard maps to protect communities and critical infrastructure.”

Song and Shum’s collaborators include Ichiro Fukumori, an oceanographer and supervisor in JPL’s Ocean Circulation Group; and Yuchan Yi, a research scientist in the Division of Geodetic Science, School of Earth Sciences at Ohio State.

Researcher finds key to ancient weather patterns in Florida’s caves

This is Darrel Tremaine, a Florida State doctoral student. – FSU Photography Services, Bill Lax

Darrel Tremaine has been known to go to extremes for his research, such as crawling on his hands and knees through a dark, muddy limestone cave in Northwest Florida to learn more about the weather thousands of years ago.

His goal? To compare ancient meteorological patterns with modern ones in the northern Gulf of Mexico region and ultimately inform policymakers on how to build a sustainable water supply.

On a recent morning, the Florida State University doctoral student in oceanography huddled with artisan Charlie Scott-Smith at Florida State’s Master Craftsman Studios (http://craft.fsu.edu/). The two were making molds of stalagmites, the natural formations rising from the floor of limestone caves that are formed by the dripping of water containing calcium carbonate. (Their counterparts, stalactites, hang from the ceilings of such caves.)
Video: Digging deep to study ancient climate

Surrounded by the studio’s eye-catching artifacts – models of architectural fittings, an ancient ship, even a copy of the sculpture “Winged Victory” from the Louvre Museum – Tremaine and Scott-Smith worked with a rubbery urethane compound to create stalagmite molds that resembled giant beeswax candles. Next, they filled the molds with cement and glass.

After the cement thoroughly dried, Tremaine returned the reproduction stalagmites to the cave, where, over time, dripping water will coat them with calcite and they will start growing again. The remote, Northwest Florida cave maintains a constant, year-round temperature of 72 degrees and a 100 percent relative humidity level, which means, as Tremaine likes to joke, “that if you start to sweat, you stay wet.”

As part of a three-year climate research project, he harvested the two stalagmites – one 4,000 years old, the other 25,000 years old – from the cave to analyze them for isotopic and trace element variations in an effort to build a 4,000-year paleo-rainfall record for North Florida. In a unique arrangement with the Southeastern Cave Conservancy Inc. (SCCI), a nonprofit group that owns dozens of caves in the southeastern United States, he was allowed to take the stalagmites as long as he made duplicates of them and placed the duplicates back in the cave – a measure of cave conservation.

The real stalagmites will be studied at the National High Magnetic Field Laboratory (http://www.magnet.fsu.edu/) at Florida State, where Tremaine, a graduate research assistant, is currently stationed in the geochemistry lab.

So far, Tremaine has been scrutinizing carbon, oxygen and strontium isotopes in modern calcite grown in the cave on glass microscope slides – what he calls “modern calibrations of ancient proxies.” Isotopes of an element are atoms with the same number of protons but a different number of neutrons, and thus a slightly different atomic mass. He will use that data to get a better idea of ancient ventilation patterns, the temperature inside the cave when the stalagmites were forming, what type of vegetation was growing above the cave, and whether the weather was cold, warm or hot during a particular span of time.
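Speleothem isotope measurements like these are conventionally reported in delta notation: the per mil (parts per thousand) deviation of a sample’s isotope ratio from a reference standard such as VSMOW for oxygen. A minimal sketch of that convention — the sample ratio below is hypothetical:

```python
def delta_permil(r_sample, r_standard):
    """Standard delta notation: deviation of an isotope ratio from a
    reference standard, expressed in parts per thousand (per mil)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# 18O/16O ratio of the VSMOW (Vienna Standard Mean Ocean Water) standard
R_VSMOW = 0.0020052

# A hypothetical cave-calcite sample slightly depleted in 18O
r_sample = 0.0019992

print(f"d18O = {delta_permil(r_sample, R_VSMOW):+.2f} per mil")
```

Negative values mean the sample holds less of the heavy isotope than the standard; shifts of even a few per mil down a stalagmite’s growth axis are the signal paleoclimatologists read as changes in rainfall and temperature.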

“By looking at trace elements we can get an idea of very wet and very dry rainfall patterns and cycles,” Tremaine said. “We’ll better understand severe weather patterns.”

Tremaine, along with a six-member team of scientists, researchers and graduate students, will cut the stalagmites in half and then use a 50-micron laser to vaporize calcite that they will then measure with a spectrometer. The laser will allow them to study monthly weather patterns in the Northern Gulf Region thousands of years ago. By extracting calcite powders with a half-millimeter drill bit, they will examine the region’s wet and dry seasons in five-year increments. Eventually, they hope to create a high-resolution time series, analyzing monthly weather patterns over thousands of years.

“We will be the first to do this in the southeastern United States,” he explained. “The research is very important because we will be able to study our monsoonal weather patterns, which are much like those of India and China, with very wet and dry seasons.”

Tremaine’s six-member climate research team includes a wide swath of experts, from a retired professor to a Russian mathematician and an undergraduate cave researcher. They are Florida State faculty members Philip “Flip” Froelich, retired FSU Francis Eppes Professor of Oceanography; Bill Burnett, the Carl Henry Oppenheimer Professor of Oceanography; and Doron Nof, Distinguished Nansen Professor of Physical Oceanography. In addition, Guy “Harley” Means, assistant state geologist at the Florida Geological Survey; Brian Kilgore, a Florida State undergraduate majoring in biochemistry; and Karina Khazmutdinova, a mathematician and doctoral student at the FSU Geophysical Fluid Dynamics Institute, served on the team.

Their research work on isotopes was recently published in the journal Geochimica et Cosmochimica Acta. Tremaine and his team’s research on trace elements also will soon be published in the same research journal. They are also in the process of writing an article for the Journal of Hydrology.

“Records of past climates can be found in the ice caps and in the deep sea,” said Jeff Chanton, FSU’s John W. Winchester Professor of Oceanography, who has worked with Tremaine. “The unique aspect of Darrel’s work is that it will give us a record of local climate right here on the Northern Gulf Coast. This is important because a record of past climate in our region would help to predict what’s to come in response to human disturbances of atmospheric greenhouse gas concentrations.”

The 32-year-old Tremaine, who holds a master’s degree in oceanography from Florida State and an undergraduate degree in engineering from the University of Cincinnati, dreams of someday starting his own groundwater research lab, where he would also teach middle and high school students to do research.

“Working with younger kids and teaching them to do research early makes sense, because if we inform them, they will someday inform us,” he said.

First, though, Tremaine and his team had to negotiate permission from the state of Florida to move their cave-monitoring equipment into one of the most pristine and highly guarded caves in Florida Caverns State Park, located near the Panhandle town of Marianna. Tremaine has already been in the cave for preliminary investigations, and the team began installing equipment in November.

Not easy work, by any means: “No one,” Tremaine explained one morning while he helped Florida State Master Craftsman artisans put the final touches on the stalagmite molds, “has been in that cave since 2006.”

Unique geologic insights from ‘non-unique’ gravity and magnetic interpretation


The December GSA TODAY science article, “Unique geologic insights from “non-unique” gravity and magnetic interpretation,” is now online at http://www.geosociety.org/gsatoday/. The article is open-access.

In many fields of applied science, such as geology, there are often tensions and disagreements between scientists who specialize in analyzing problems with mathematical models of collected data and those who rely on on-the-ground observations and empirical analyses. One common source of these disagreements arises from applications of geophysics — studies of variations in gravity or Earth’s magnetic field — that use models that are, in a strict mathematical sense, non-unique. For example, using theory derived from Isaac Newton’s studies of gravitational attraction, a geophysicist who measures local variations in gravitational acceleration produced by contrasts in the density of rocks below Earth’s surface can calculate an infinite set of mathematically valid sources (with different shapes, depths, and contrasts in density) that would explain the measured gravity difference (or anomaly). This theoretical non-uniqueness leads many geologists to conclude that such geophysical information is of limited value, given the infinite number of possible correct answers to those numerical problems.
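The classic buried-sphere example makes this non-uniqueness concrete: the peak anomaly directly above a sphere depends only on its excess mass divided by depth squared, so very different bodies produce identical gravity signals. A sketch, with depths and density contrasts chosen purely for illustration:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_anomaly(depth_m, radius_m, density_contrast):
    """Peak surface gravity anomaly (m/s^2) directly above a buried sphere:
    g = G * M_excess / depth^2, with M_excess = (4/3) pi R^3 * delta-rho."""
    mass_excess = (4.0 / 3.0) * math.pi * radius_m**3 * density_contrast
    return G * mass_excess / depth_m**2

# Tune three very different bodies to produce the same ~1 mGal anomaly.
target = 1e-5  # 1 mGal expressed in m/s^2
sources = []
for depth, contrast in [(100.0, 500.0), (200.0, 500.0), (400.0, 250.0)]:
    # Solve the anomaly formula for the radius that yields the target.
    radius = (target * depth**2
              / ((4.0 / 3.0) * math.pi * G * contrast)) ** (1.0 / 3.0)
    sources.append((depth, radius, contrast))

for depth, radius, contrast in sources:
    g = sphere_anomaly(depth, radius, contrast)
    print(f"depth {depth:5.0f} m, radius {radius:6.1f} m, "
          f"contrast {contrast:4.0f} kg/m^3 -> {g * 1e5:.3f} mGal")
```

All three sources reproduce the measured anomaly exactly, which is why, as Saltus and Blakely argue, independent geological constraints are needed to single out the interpretation that is actually correct.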

In the December 2011 issue of GSA Today, Richard Saltus and Richard Blakely, two U.S. Geological Survey scientists with extensive experience using gravity and magnetic field models to improve the understanding of geological problems, present several excellent examples of unique interpretations that can be made from “non-unique” models. Their motivation is to improve communication among geologists about the ability (and the limitations) of gravity and magnetic field data to yield important information about the subsurface geology of an area or region.

This communication barrier is an important issue, because a great deal of our understanding of the geology of Earth and the planets is derived from these types of geophysical measurements. More practically, geophysical tools such as gravity and magnetic field measurements are used in mineral and hydrocarbon exploration, so these methods can aid economic development by locating subsurface mineral resources more efficiently than other techniques (such as drilling and excavating).

In their article, Saltus and Blakely advocate a holistic approach to geological studies. By incorporating other observations — such as the surface location of a fault or the likely density contrast between different rock units based on their composition — the infinite array of theoretical solutions to these potential-field geophysical models can be narrowed down to a few best interpretations, or even a single one. They present a number of examples where this approach has successfully resolved important geological questions; one of the best is an analysis of magnetic anomaly data from the Puget Sound area that allows a detailed image of the active Seattle Fault zone to be constructed.