Airborne expedition chases Arctic sea ice questions

CU-Boulder and NASA are teaming up this summer on a series of unmanned aircraft flights to study the receding Arctic sea ice and to better understand its life cycle and the long-term stability of the Arctic ice cover. – Image courtesy James Maslanik, University of Colorado

A small NASA aircraft completed its first successful science flight Thursday in partnership with the University of Colorado at Boulder as part of an expedition to study the receding Arctic sea ice and improve understanding of its life cycle and the long-term stability of the Arctic ice cover. The mission continues through July 24.

NASA’s Characterization of Arctic Sea Ice Experiment, known as CASIE, began a series of unmanned aircraft system flights in coordination with satellites. Working with CU-Boulder and its research partners, NASA is using the remotely piloted aircraft to image thick, old slabs of ice as they drift from the Arctic Ocean south through the Fram Strait — which lies between Greenland and Svalbard, Norway — and into the North Atlantic Ocean.

NASA’s Science Instrumentation Evaluation Remote Research Aircraft, or SIERRA, will weave a pattern over open ocean and sea ice to map and measure ice conditions, flying below cloud cover at altitudes as low as 300 feet.

“Our project is attempting to answer some of the most basic questions regarding the most fundamental changes in sea-ice cover in recent years,” said CU-Boulder Research Professor James Maslanik of the aerospace engineering sciences department and principal investigator for the NASA mission. “Our analysis of satellite data shows that in 2009 the amount of older ice is just 12 percent of what it was in 1988 — a decline of 88 percent. The oldest ice types now cover only 2 percent of the Arctic Ocean as compared to 20 percent in the 1980s.”

SIERRA, laden with scientific instruments, travels long distances at low altitudes, flying below the clouds. The aircraft has high maneuverability and slow flight speed. SIERRA’s relatively large payload, approximately 100 pounds, combined with a significant range of 500 miles and a small, 20-foot wingspan makes it the ideal aircraft for the expedition.

The mission is conducted from the Ny-Ålesund research base in Norway’s Svalbard archipelago, located east of northern Greenland. Mission planners are using satellite data to direct flights of the aircraft.

“We are demonstrating the utility of small- to medium-class unmanned aircraft systems for gathering science data in remote, harsh environments during the CASIE mission,” said Matt Fladeland, CASIE project and SIERRA manager at NASA’s Ames Research Center in Moffett Field, Calif.

The aircraft observations will be complemented by large-scale views of many different features of the Arctic ice from NASA satellites. The Moderate Resolution Imaging Spectroradiometer aboard NASA’s Aqua satellite will be used to identify the ice edge location, ice features of interest and cloud cover. Other sensors, such as the Advanced Microwave Scanning Radiometer-Earth Observing System on Aqua and the Quick Scatterometer satellite, can penetrate cloud cover and analyze the physical properties of the ice.

By using multiple types of satellite data in conjunction with high-resolution aircraft products, researchers can learn more about ice conditions than is possible with only one or two data sources.

NASA’s CASIE mission supports a larger NASA-funded research effort titled “Sea Ice Roughness as an Indicator of Fundamental Changes in the Arctic Ice Cover: Observations, Monitoring, and Relationships to Environmental Factors.” The project also supports the goals of the International Polar Year, a major international scientific research effort involving many NASA research efforts to study large-scale environmental changes in Earth’s polar regions.

Solar cycle linked to global climate, drives events similar to El Nino, La Nina

Sunrise over the ocean (©UCAR, photo by Carlye Calvin.)

Establishing a key link between the solar cycle and global climate, new research led by the National Center for Atmospheric Research (NCAR) shows that maximum solar activity and its aftermath have impacts on Earth that resemble La Nina and El Nino events in the tropical Pacific Ocean. The research may pave the way toward better predictions of temperature and precipitation patterns at certain times during the Sun’s cycle, which lasts approximately 11 years.

The total energy reaching Earth from the Sun varies by only 0.1 percent across the solar cycle. Scientists have sought for decades to link these ups and downs to natural weather and climate variations and distinguish their subtle effects from the larger pattern of human-caused global warming.

Building on previous work, NCAR researchers used computer models of global climate and more than a century of ocean temperature data to answer longstanding questions about the connection between solar activity and global climate. Changes in greenhouse gases were also included in the model, but the main focus of the study is to examine the role of solar variability in climate change.

The research, published this month in the Journal of Climate, was funded by the National Science Foundation, NCAR’s sponsor, and by the Department of Energy.

“We have fleshed out the effects of a new mechanism to understand what happens in the tropical Pacific when there is a maximum of solar activity,” says NCAR scientist Gerald Meehl, the lead author. “When the Sun’s output peaks, it has far-ranging and often subtle impacts on tropical precipitation and on weather systems around much of the world.”

The new paper, along with an earlier one by Meehl and colleagues, shows that as the Sun reaches maximum activity, it heats cloud-free parts of the Pacific Ocean enough to increase evaporation, intensify tropical rainfall and the trade winds, and cool the eastern tropical Pacific. The result of this chain of events is similar to a La Nina event, although the cooling of about 1-2 degrees Fahrenheit is focused further east and is only about half as strong as for a typical La Nina.

Over the following year or two, the La Nina-like pattern triggered by the solar maximum tends to evolve into an El Nino-like pattern, as slow-moving currents replace the cool water over the eastern tropical Pacific with warmer-than-usual water. Again, the ocean response is only about half as strong as with El Nino.

True La Nina and El Nino events are associated with changes in the temperatures of surface waters of the eastern Pacific Ocean. They can affect weather patterns worldwide.

The new paper does not analyze the weather impacts of the solar-driven events. But Meehl and his co-author, Julie Arblaster of both NCAR and the Australian Bureau of Meteorology, found that the solar-driven La Nina tends to cause relatively warm and dry conditions across parts of western North America. More research will be needed to determine the additional impacts of these events on weather across the world.

“Building on our understanding of the solar cycle, we may be able to connect its influences with weather probabilities in a way that can feed into longer-term predictions, a decade at a time,” Meehl says.

An elusive puzzle

Scientists have known for years that long-term solar variations affect certain weather patterns, including droughts and regional temperatures. But establishing a physical connection between the decadal solar cycle and global climate patterns has proven elusive. One reason is that only in recent years have computer models been able to realistically simulate the processes underlying tropical Pacific warming and cooling associated with El Nino and La Nina. With those models now in hand, scientists can reproduce the last century’s solar behavior and see how it affects the Pacific.

To tease out these sometimes subtle connections between the Sun and Earth, Meehl and his colleagues analyzed sea surface temperatures from 1890 to 2006. They then used two computer models based at NCAR to simulate the response of the oceans to changes in solar output.

They found that, as the Sun’s output reaches a peak, the small amount of extra sunshine over several years causes a slight increase in local atmospheric heating, especially across parts of the tropical and subtropical Pacific where Sun-blocking clouds are normally scarce. That small amount of extra heat leads to more evaporation, producing extra water vapor. In turn, the moisture is carried by trade winds to the normally rainy areas of the western tropical Pacific, fueling heavier rains.

As this climatic loop intensifies, the trade winds strengthen. That keeps the eastern Pacific even cooler and drier than usual, producing La Nina-like conditions.

Although this Pacific pattern is produced by the solar maximum, the authors found that its switch to an El Nino-like state is likely triggered by the same kind of processes that normally lead from La Nina to El Nino. The transition starts when changes in the strength of the trade winds produce slow-moving off-equatorial pulses known as Rossby waves in the upper ocean, which take about a year to travel back west across the Pacific.

The energy then reflects from the western boundary of the tropical Pacific and ricochets eastward along the equator, deepening the upper layer of water and warming the ocean surface. As a result, the Pacific experiences an El Nino-like event about two years after solar maximum. The event settles down after about a year, and the system returns to a neutral state.

“El Nino and La Nina seem to have their own separate mechanisms,” says Meehl, “but the solar maximum can come along and tilt the probabilities toward a weak La Nina. If the system were heading toward a La Nina anyway,” he adds, “it would presumably be a larger one.”

Researchers to participate in seismic test of 7-story building

Rensselaer Associate Professor Michael Symans and incoming Dean of Engineering David Rosowsky are among the team of researchers who will converge in Japan next week to perform the largest earthquake simulation ever attempted on a wooden structure. The multi-university team has placed this seven-story building on the world’s largest shake table, and will expose it to the force of an earthquake that hits only once every 2,500 years. – Colorado State University

A destructive earthquake will strike a lone, wooden condominium in Japan next week, and Rensselaer Polytechnic Institute Professor Michael Symans will be on site to watch it happen.

Symans is among the team of researchers who will converge in the Japanese city of Miki to perform the largest earthquake simulation ever attempted on a wooden structure. The multi-university team, led by Colorado State University, has placed a seven-story building – loaded with sensing equipment and video cameras – on a massive shake table, and will expose the building to the force of an earthquake that hits once every 2,500 years.

The experiment will be webcast live on Tuesday, July 14, at 11 a.m. EDT, and should yield critical data and insight on how to make wooden structures stronger and better able to withstand major earthquakes.

“Right now, wood can’t compete with steel and concrete as building materials for mid-rise buildings, partly because we don’t have a good understanding of how taller wood-framed structures will perform in a strong earthquake,” said Symans, associate professor in Rensselaer’s Department of Civil and Environmental Engineering. “With this shaking table test, we’ll be collecting data that will help us to further the development of design approaches for such structures, which is one of the major goals of the project.”

The 1994 magnitude 6.7 earthquake in Northridge, Calif., and the 1995 magnitude 6.9 earthquake in Kobe, Japan, clearly demonstrated the seismic vulnerability of wood-framed construction, Symans said. The shake table experiment will offer researchers a chance to better understand how wood reacts in an earthquake, he said, and the resulting data could lead to the advancement of engineering techniques for mitigating earthquake damage.

As the ground shakes, the energy that goes into a building needs to flow somewhere, Symans said. Typically, a large portion of this energy is spent moving – and damaging – the building. There are proven engineering techniques for absorbing or displacing some of this energy in order to minimize damage, but the technology for doing so has not yet been thoroughly evaluated for wooden structures. Next week’s shake should produce sufficient data to allow the research team to develop accurate computer models of mid-rise wood buildings, which can subsequently be used to advance and validate some of these seismic protection techniques.

As one example, Symans is working on the application of seismic damping systems for wooden buildings. These systems, which can be installed inside the walls of most wooden buildings, include metal bracing and dampers filled with viscous fluid. A portion of the energy generated by the earthquake is spent shaking the fluid back and forth in the dampers, which in turn reduces the energy available to damage the wall or building structure. Recently completed shaking table tests at Rensselaer on wooden walls outfitted with such a damping system have demonstrated the viability of such an approach to mitigating damage in wooden buildings.
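The energy-dissipation idea can be illustrated with a toy simulation (a minimal sketch with arbitrary illustrative parameters, not the NEESWood models): a single mass-spring "wall" driven by sinusoidal base shaking, integrated with a small and then a large viscous damping coefficient. Increasing the damping term soaks up shaking energy and cuts the peak displacement.

```python
import math

def peak_displacement(c):
    """Peak |displacement| of a mass-spring-damper under base shaking.

    Solves m*x'' + c*x' + k*x = -m*a_ground(t) with semi-implicit Euler.
    All parameter values are arbitrary illustrations, not measured data.
    """
    m, k = 1000.0, 40000.0        # mass (kg) and stiffness (N/m)
    dt, x, v, peak = 0.001, 0.0, 0.0, 0.0
    for i in range(20000):        # simulate 20 s of shaking
        t = i * dt
        a_ground = 3.0 * math.sin(2 * math.pi * t)   # 1 Hz base motion (m/s^2)
        a = (-m * a_ground - c * v - k * x) / m      # Newton's second law
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

bare = peak_displacement(c=500.0)      # lightly damped frame
damped = peak_displacement(c=5000.0)   # with a supplemental viscous damper
```

With the larger damping coefficient, the peak displacement drops by roughly an order of magnitude in this toy setup, mirroring the qualitative claim that dampers divert shaking energy away from the structure.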

“The system allows a significant portion of the wood-frame displacement to be transferred to the dampers where the energy can be harmlessly dissipated,” Symans said. “With dampers in place, we have a better ability to predict how a structure will react to and perform during an earthquake.”

In the 1994 Northridge earthquake, all but one of the 25 fatalities caused by building damage occurred in wooden buildings, and at least half of the $40 billion in property damage was attributed to wood buildings. The quake left nearly 50,000 housing units uninhabitable, most of them wood-framed. The advancement of seismic protection systems could help to save lives and prevent or limit damage in similar future earthquakes, Symans said. This is particularly important considering that most residential structures in the United States, even in seismically active areas, have wooden frames.

The Miki shake is the capstone experiment of the four-year NEESWood project, which receives its primary support from the U.S. National Science Foundation Network for Earthquake Engineering Simulation (NEES) Program. NEESWood is led by Colorado State University, in collaboration with Rensselaer, the University at Buffalo, the University of Delaware, and Texas A&M University. One intended end result of NEESWood is the development of new tools, software, and best practices that result in building code revisions and allow engineers and architects to design wooden structures which can better withstand earthquakes.

The seven-story structure has been built with new seismic design methods informed by NEESWood research for mid-rise wood frame construction. The tests in Miki, to be performed at the Hyogo Earthquake Engineering Research Center, home of the world’s largest seismic shaking table, will be used to evaluate the performance of the building and, in turn, the new design methods.

David Rosowsky, who will join Rensselaer in August as the new dean of engineering, is also a co-investigator of the NEESWood project and will attend the shake in Miki next week.

“NEESWood aims to develop a new seismic design philosophy that will provide the necessary mechanisms to safely increase the height of wood-frame structures in active seismic zones of the United States, as well as mitigate earthquake damage to low-rise wood-frame structures. When this challenge is successfully met, mid-rise wood-frame construction will be an economic option in seismic regions in the United States and around the world,” said Rosowsky, currently the head of the Department of Civil Engineering at Texas A&M.

“It’s exciting for Rensselaer to be a part of the international team participating in the NEESWood project. This project has already brought tremendous visibility to the School of Engineering at Rensselaer which, with its geotechnical centrifuge facility, already is a part of the NEES network of world-class laboratories for earthquake engineering,” Rosowsky said.

Tremors on southern San Andreas Fault may mean increased earthquake risk

Parkfield is at the northern end of a locked segment of the San Andreas Fault (SAF) that, in 1857, ruptured south from Monarch Peak (MP) in the great 7.8 magnitude Ft. Tejon quake. As a result of nearby earthquakes in 2003 and 2004, tremors developed under Cholame and Monarch Peak. The black dots pinpoint 1250 well-located tremors. The square boxes are 30 kilometers (19 miles) on a side. Color contours give regional shear-stress change at 20 km depth from the Parkfield earthquake (green segment) along the SAF. The thrust-type San Simeon earthquake rupture is represented by the gray rectangle and line with triangles labeled SS. The currently locked Cholame segment is about 63 km long (solid portion of the arrow) and is believed capable of rupturing on its own in a magnitude 7 earthquake. The gray lines within the Cholame box bound the west quadrant, where quasiperiodic episodes predominate. (Robert Nadeau/UC Berkeley, courtesy Science magazine)

Increases in mysterious underground tremors observed in several active earthquake fault zones around the world could signal a build-up of stress at locked segments of the faults and presumably an increased likelihood of a major quake, according to a new University of California, Berkeley, study.

Seismologist Robert M. Nadeau and graduate student Aurélie Guilhem of UC Berkeley draw these conclusions from a study of tremors along a heavily instrumented segment of the San Andreas Fault near Parkfield, Calif. The research is reported in the July 10 issue of Science.

They found that after the 6.5-magnitude San Simeon quake in 2003 and the 6.0-magnitude Parkfield quake in 2004, underground stress increased at the end of a locked segment of the San Andreas Fault near Cholame, Calif., at the same time as tremors became more frequent. The tremors have continued to this day at a rate significantly higher than the rate before the two quakes.

The researchers conclude that the increased rate of tremors may indicate that stress is accumulating more rapidly than in the past along this segment of the San Andreas Fault, which is at risk of breaking like it did in 1857 to produce the great 7.8 magnitude Fort Tejon earthquake. Strong quakes have also occurred just to the northwest along the Parkfield segment of the San Andreas about every 20 to 30 years.

“We’ve shown that earthquakes can stimulate tremors next to a locked zone, but we don’t yet have evidence that this tells us anything about future quakes,” Nadeau said. “But if earthquakes trigger tremors, the pressure that stimulates tremors may also stimulate earthquakes.”

While earthquakes are brief events typically originating no deeper than 15 kilometers (10 miles) underground in California, tremors are an ongoing, low-level rumbling from perhaps 15 to 30 kilometers (10 to 20 miles) below the surface. They are common near volcanoes as a result of underground fluid movement, but they came as a surprise when discovered in 2002 at a subduction zone in Japan, a region where a piece of ocean floor is sliding under a continent.

Tremors were subsequently detected at the Cascadia subduction zone in Washington, Oregon and British Columbia, where several Pacific Ocean plates dive under the North American continental plate. In 2005, Nadeau identified mysterious “noise” detected by the Parkfield borehole seismometers as tremor activity, and he has focused on the tremors ever since. Unlike the Japanese and Cascadia tremor sites, however, the Parkfield area is a strike-slip fault, where the Pacific plate is moving horizontally against the North American plate.

“The Parkfield tremors are smaller versions of the Cascadia and Japanese tremors,” Nadeau said. “Most last between three and 21 minutes, while some Cascadia tremors go on for days.”

Because in nearly all known instances the tremors originate from the edge of a locked zone – a segment of a fault that hasn’t moved in years and is at high risk of a major earthquake – seismologists have thought that increases in their activity may forewarn of stress build-up just before an earthquake.

The new report strengthens that association, Nadeau said.

For the new study, Nadeau and Guilhem pinpointed the location of nearly 2,200 tremors recorded between 2001 and 2009 by borehole seismometers implanted along the San Andreas Fault as part of UC Berkeley’s High-Resolution Seismic Network. During this period, two nearby earthquakes occurred: one in San Simeon, 60 kilometers from Parkfield, on Dec. 22, 2003, and one in Parkfield on the San Andreas Fault on Sept. 28, 2004.

Before the San Simeon quake, tremor activity was low beneath the Parkfield and Cholame segments of the San Andreas Fault, but it doubled in frequency afterward and was six times more frequent after the Parkfield quake. Most of the activity occurred along a 25-kilometer (16-mile) segment of the San Andreas Fault south of Parkfield, around the town of Cholame. Fewer than 10 percent of the tremors occurred at an equal distance above Parkfield, near Monarch Peak. While Cholame is at the northern end of a long-locked and hazardous segment of the San Andreas Fault, Monarch Peak is not. However, Nadeau noted, Monarch Peak is an area of relative complexity on the San Andreas Fault and also ruptured in 1857 in the Fort Tejon 7.8 earthquake.

The tremor activity remains about twice as high today as before the San Simeon quake, while periodic peaks of activity have emerged that started to repeat about every 50 days and are now repeating about every 100-110 days.

“What’s surprising is that the activity has not gone down to its old level,” Nadeau said. The continued activity is worrisome because of the history of major quakes along this segment of the fault, and the long-ago Fort Tejon quake, which ruptured southward from Monarch Peak along 350 kilometers (220 miles) of the San Andreas Fault.

A flurry of pre-tremors was detected a few days before the Parkfield quake, which makes Nadeau hopeful of seeing similar tremors preceding future quakes.

He noted that the source of tremors is still somewhat of a mystery. Some scientists think fluids moving underground generate the tremors, just as movement of underground magma, water and gas causes volcanic tremors. Nadeau leans toward an alternative theory: that non-volcanic tremors are generated in a deep region of hot, soft rock that normally flows without generating earthquakes, somewhat like Silly Putty, except for a few hard rocks embedded in it like peanuts in peanut brittle. The fracturing of these brittle inclusions may generate swarms of many small quakes that combine into a faint rumble.

“If tremors are composed of a lot of little earthquakes, each should have a primary and secondary wave just like large quakes,” but they would overlap and produce a rumble, said Guilhem.
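The overlapping-wavelets picture can be sketched numerically (a hypothetical illustration, not the authors' analysis): superpose many small, randomly timed, decaying wave packets standing in for micro-quakes, and the discrete arrivals blur into a nearly continuous rumble.

```python
import math, random

random.seed(0)
dt, n = 0.01, 6000                      # 60 s of "seismogram" at 100 samples/s
signal = [0.0] * n

# Superpose 300 tiny micro-quake wavelets with random onsets and frequencies.
for _ in range(300):
    onset = random.uniform(0.0, 50.0)   # onset time in seconds
    freq = random.uniform(2.0, 8.0)     # dominant frequency in Hz
    i0 = int(onset / dt)
    for i in range(i0, min(i0 + 500, n)):          # 5 s decaying packet
        t = (i - i0) * dt
        signal[i] += math.exp(-t) * math.sin(2 * math.pi * freq * t)

# Fraction of samples above a small threshold: with enough overlap,
# the summed record is active almost everywhere, i.e. a sustained rumble.
active = sum(1 for s in signal if abs(s) > 0.05) / n
```

In this sketch roughly thirty packets overlap at any instant, so the summed trace stays above the noise threshold for most of the record, which is the qualitative point of the many-small-quakes hypothesis.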

The stimulation of tremors by shear (tearing) stress rather than by compressional (opening and closing) stress is more consistent with deformation in the fault zone than with underground fluid movement, Nadeau said. The researchers’ mapping of the underground tremors also shows that the tremors are not restricted to the plane of the fault, suggesting that faults spread out as they dive into the deeper crust.

Whatever their cause, tremors “are not relieving a lot of stress or making the fault less hazardous; they just indicate a change in stress next to locked faults,” said Nadeau.

Seismologists around the world are searching for tremors along other fault systems, Guilhem noted, although tremors can be hard to detect because of noise from oceans as well as from civilization. Brief tremor activity has been observed on a few faults, triggered by huge quakes far away, and these may be areas to focus on. Tremors were triggered on Northern California’s Calaveras Fault by Alaska’s Denali quake of 2002, Nadeau said.

Arctic climate under greenhouse conditions in the Late Cretaceous

Fossil diatom algae of Cretaceous age from the Alpha Ridge of the Arctic Ocean

New evidence for ice-free summers with intermittent winter sea ice in the Arctic Ocean during the Late Cretaceous – a period of greenhouse conditions – gives a glimpse of how the Arctic is likely to respond to future global warming.

Records of past environmental change in the Arctic should help predict its future behaviour. The Late Cretaceous, the period between 100 and 65 million years ago leading up to the extinction of the dinosaurs, is crucial in this regard because levels of carbon dioxide (CO2) were high, driving greenhouse conditions. But scientists have disagreed about the climate of this period, with some arguing for low Arctic winter temperatures (when sunlight is absent during the polar night) and others more recently suggesting a milder 15°C mean annual temperature.

Writing in Nature, Dr Andrew Davies and Professor Alan Kemp of the University of Southampton’s School of Ocean and Earth Science based at the National Oceanography Centre, Southampton, along with Dr Jennifer Pike of Cardiff University take this debate a step forward by presenting the first seasonally resolved Cretaceous sedimentary record from the Alpha Ridge of the Arctic Ocean.

The scientists analysed the remains of diatoms – tiny free-floating plant-like organisms – preserved in late Cretaceous marine sediments. In modern oceans, diatoms play a dominant role in the ‘biological carbon pump’ by which carbon dioxide is drawn down from the atmosphere through photosynthesis and a proportion of it exported to the deep ocean. Unfortunately, the role of diatoms in the Cretaceous oceans has until now been unclear, in part because they are often poorly preserved in sediments.

But the researchers struck lucky. “With remarkable serendipity,” they explain, “successive US and Canadian expeditions that occupied floating ice islands above the Alpha Ridge of the Arctic Ocean recovered cores containing shallowly buried upper Cretaceous diatom ooze with superbly preserved diatoms.” This has allowed them to conduct a detailed study of the diatom fossils using sophisticated electron microscopy techniques. In the modern ocean, scientists use floating sediment traps to collect and study settling material. The electron microscope techniques pioneered by Professor Kemp’s group at Southampton have unlocked a ‘palaeo-sediment trap’, revealing information about Late Cretaceous environmental conditions.

They find that the most informative sediment core samples display a regular alternation of microscopically thin layers composed of two distinctly different diatom assemblages, reflecting seasonal changes. Their analysis clearly demonstrates that seasonal blooming of diatoms was not related to the upwelling of nutrients, as had been previously suggested. Rather, production occurred within a stratified water column, indicative of ice-free summers. These summer blooms comprised specially adapted species resembling those of the modern North Pacific Subtropical Gyre, or those preserved in relatively recent, organically rich Mediterranean sediments called ‘sapropels’.

The sheer number of diatoms found in the Late Cretaceous sediment cores indicates exceptional abundances equalling modern values for the most productive areas of the Southern Ocean. “This Cretaceous production, dominated by diatoms adapted to stratified conditions of the polar summer may also be a pointer to future trends in the modern ocean,” say the researchers: “With increasing CO2 levels and global warming giving rise to increased ocean stratification, this style of (marine biological) production may become of increasing importance.”

However, thin accumulations of land-derived (terrigenous) sediment within the diatom ooze are consistent with the presence of intermittent sea ice in the winter, a finding that supports “a wide body of evidence for low Arctic late Cretaceous winter temperatures rather than recent suggestions of a 15°C mean annual temperature at this time.” The size distribution of clay and sand grains in the sediment points to the formation of sea ice in shallow coastal seas during autumn storms, while the absence of larger dropstones suggests that the winters, although cold, were not cold enough to support thick glacial ice or large areas of anchored ice.

Commenting on the findings, Professor Kemp said: “Although seasonally-resolved records are rarely preserved, our research shows that they can provide a unique window into past Earth system behaviour on timescales immediately comparable and relevant to those of modern concern.”

CO2 higher today than last 2.1 million years

This is Bärbel Hönisch with a mass spectrometer used to measure boron isotopes to reconstruct past CO2. – Lamont-Doherty Earth Observatory

Researchers have reconstructed atmospheric carbon dioxide levels over the past 2.1 million years in the sharpest detail yet, shedding new light on its role in the earth’s cycles of cooling and warming.

The study, in the June 19 issue of the journal Science, is the latest to rule out a drop in CO2 as the cause for earth’s ice ages growing longer and more intense some 850,000 years ago. But it also confirms many researchers’ suspicion that higher carbon dioxide levels coincided with warmer intervals during the study period.

The authors show that peak CO2 levels over the last 2.1 million years averaged only 280 parts per million; but today, CO2 is at 385 parts per million, or 38% higher. This finding means that researchers will need to look back further in time for an analog to modern day climate change.
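The comparison is simple percent-change arithmetic, shown here as a quick check using the two values cited in the article:

```python
past_peak_ppm = 280.0   # average peak CO2 over the last 2.1 million years
today_ppm = 385.0       # modern value cited in the article

# Percent increase relative to the past peak average.
increase_pct = (today_ppm - past_peak_ppm) / past_peak_ppm * 100
# (385 - 280) / 280 * 100 = 37.5, which rounds to the article's "38% higher"
```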

In the study, Bärbel Hönisch, a geochemist at Lamont-Doherty Earth Observatory, and her colleagues reconstructed CO2 levels by analyzing the shells of single-celled plankton buried under the Atlantic Ocean, off the coast of Africa. By dating the shells and measuring their ratio of boron isotopes, they were able to estimate how much CO2 was in the air when the plankton were alive. This method allowed them to see further back in time than the precise records preserved in cores of polar ice, which go back only 800,000 years.

The planet has undergone cyclic ice ages for millions of years, but about 850,000 years ago, the cycles of ice grew longer and more intense, a shift that some scientists have attributed to falling CO2 levels. But the study found that CO2 was flat during this transition and unlikely to have triggered the change.

“Previous studies indicated that CO2 did not change much over the past 20 million years, but the resolution wasn’t high enough to be definitive,” said Hönisch. “This study tells us that CO2 was not the main trigger, though our data continues to suggest that greenhouse gases and global climate are intimately linked.”

The timing of the ice ages is believed to be controlled mainly by the earth’s orbit and tilt, which determines how much sunlight falls on each hemisphere. Two million years ago, the earth underwent an ice age every 41,000 years. But some time around 850,000 years ago, the cycle grew to 100,000 years, and ice sheets reached greater extents than they had in several million years, a change too great to be explained by orbital variation alone.

A global drawdown in CO2 is just one theory proposed for the transition. A second theory suggests that advancing glaciers in North America stripped away soil in Canada, causing thicker, longer lasting ice to build up on the remaining bedrock. A third theory challenges how the cycles are counted, and questions whether a transition happened at all.

The low carbon dioxide levels outlined by the study through the last 2.1 million years make modern day levels, caused by industrialization, seem even more anomalous, says Richard Alley, a glaciologist at Pennsylvania State University, who was not involved in the research.

“We know from looking at much older climate records that a large and rapid increase in CO2 in the past (about 55 million years ago) caused a large extinction of bottom-dwelling ocean creatures, and dissolved a lot of shells as the ocean became acidic,” he said. “We’re heading in that direction now.”

The idea to approximate past carbon dioxide levels using boron, an element released by erupting volcanoes and used in household soap, was pioneered over the last decade by the paper’s coauthor Gary Hemming, a researcher at Lamont-Doherty and Queens College. The study’s other authors are Jerry McManus, also at Lamont; David Archer at the University of Chicago; and Mark Siddall, at the University of Bristol, UK.

Earth’s most prominent rainfall feature creeping northward

The band of heavy precipitation indicates the intertropical convergence zone. The new findings are based on sediment cores from lakes and lagoons on Palau, Washington, Christmas and Galapagos islands. -  University of Washington

The rain band near the equator that determines the supply of freshwater to nearly a billion people throughout the tropics and subtropics has been creeping north for more than 300 years, probably because of a warmer world, according to research published in the July issue of Nature Geoscience.

If the band continues to migrate at just less than a mile (1.4 kilometers) a year, which is the average for all the years it has been moving north, then some Pacific islands near the equator – even those that currently enjoy abundant rainfall – may be drier within decades and starved of freshwater by midcentury. The prospect of additional warming from greenhouse gases means that situation could arrive even sooner.

The findings suggest “that increasing greenhouse gases could potentially shift the primary band of precipitation in the tropics with profound implications for the societies and economies that depend on it,” the article says.

“We’re talking about the most prominent rainfall feature on the planet, one that many people depend on as the source of their freshwater because there is no groundwater to speak of where they live,” says Julian Sachs, associate professor of oceanography at the University of Washington and lead author of the paper. “In addition many other people who live in the tropics but farther afield from the Pacific could be affected because this band of rain shapes atmospheric circulation patterns throughout the world.”

The band of rainfall occurs at what is called the intertropical convergence zone. There, just north of the equator, trade winds from the northern and southern hemispheres collide while heat pours into the atmosphere from the tropical sun. Rain clouds up to 30,000 feet thick dump as much as 13 feet (4 meters) of rain a year in some places. The band stretching across the Pacific generally lies between 3 degrees and 10 degrees north of the equator, depending on the time of year. It has recently been hypothesized that the intertropical convergence zone does not reside in the southern hemisphere for reasons having to do with the distribution of land masses and the locations of major mountain ranges – particularly the Andes – that have not changed for millions of years.

The new article presents surprising evidence that the intertropical convergence zone hugged the equator some 3 ½ centuries ago during Earth’s little ice age, which lasted from 1400 to 1850.

The authors analyzed the record of rainfall in lake and lagoon sediments from four Pacific islands at or near the equator.

One of the islands they studied, Washington Island, is about 5 degrees north of the equator. Today it is at the southern edge of the intertropical convergence zone and receives nearly 10 feet (2.9 meters) of rain a year. But cores reveal a very different Washington Island in the past: It was arid, especially during the little ice age.

Among other things, the scientists looked for evidence in sediment cores of salt-tolerant microbes. On Washington Island they found that evidence in 400- to 1,000-year-old sediment underlying what is now a freshwater lake. Such organisms could only have thrived if rainfall was much reduced from today’s high levels on the island. Additional evidence for changes in rainfall was provided by hydrogen isotope ratios of material in the sediments, which can only be explained by large changes in precipitation.

Sediment cores from Palau, which lies about 7 degrees north of the equator and in the heart of the modern convergence zone, also revealed arid conditions during the little ice age.

In contrast, the researchers present evidence that the Galapagos Islands, today an arid place on the equator in the Eastern Pacific, had a wet climate during the little ice age.

They write, “The observations of dry climates on Washington Island and Palau and a wet climate in the Galapagos between about 1420-1560/1640 provide strong evidence for an intertropical convergence zone located perennially south of Washington Island (5 degrees north) during that time and perhaps until the end of the eighteenth century.”

If the zone at that time experienced seasonal variations of 7 degrees latitude, as it does today, then during some seasons it would have extended southward to at least the equator, Sachs says. This has been inferred previously from studies of the intertropical convergence zone on or near the continents, but the new data from the Pacific Ocean region is clearer because the feature is so easy to identify there.

The remarkable southward shift in the location of the intertropical convergence zone during the little ice age cannot be explained by changes in the distribution of continents and mountain ranges because they were in the same places in the little ice age as they are now. Instead, the co-authors point out that the Earth received less solar radiation during the little ice age, about 0.1 percent less than today, and speculate that may have caused the zone to hover closer to the equator until solar radiation picked back up.

“If the intertropical convergence zone was 550 kilometers, or 5 degrees, south of its present position as recently as 1630, it must have migrated north at an average rate of 1.4 kilometers – just less than a mile – a year,” Sachs says. “Were that rate to continue, the intertropical convergence zone will be 126 kilometers – or more than 75 miles – north of its current position by the latter part of this century.”
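Sachs’s rate arithmetic can be reproduced directly from the figures quoted in the article (years are approximate):

```python
# Reproduce the migration arithmetic quoted by Sachs (figures from the article).
rate_km_per_yr = 1.4        # average northward migration rate of the ITCZ
shift_since_1630_km = 550   # displacement south of today's position, ca. 1630

# Years needed to cover 550 km at 1.4 km/yr -- consistent with a start in the 1600s:
years_to_cover = shift_since_1630_km / rate_km_per_yr
print(f"{years_to_cover:.0f} years to migrate 550 km")

# Additional shift at the same rate over the ~90 years to the late 21st century:
projected_km = rate_km_per_yr * 90
print(f"{projected_km:.0f} km further north by the latter part of this century")
# -> 126 km, matching the article's projection
```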

Plants save the earth from an icy doom

Mark Pagani is a researcher at Yale University. -  Pagani

Fifty million years ago, the North and South Poles were ice-free and crocodiles roamed the Arctic. Since then, a long-term decrease in the amount of CO2 in the atmosphere has cooled the Earth. Researchers at Yale University, the Carnegie Institution of Washington and the University of Sheffield now show that land plants saved the Earth from a deep frozen fate by buffering the removal of atmospheric CO2 over the past 24 million years.

While the upper limit for atmospheric CO2 levels has been a focus for discussions of global warming and the quality of life on Earth, this study points to the dynamics that maintain the lower sustainable limits of atmospheric CO2.

Volcanic gases naturally add CO2 to the atmosphere, and over millions of years CO2 is removed by the weathering of silica-based rocks like granite and then locked up in carbonates on the floor of the world’s oceans. The more these rocks are weathered, the more CO2 is removed from the atmosphere.
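The weathering sink described above is conventionally summarized by the idealized Urey reaction for a generic calcium silicate (a textbook simplification, not an equation taken from the study itself): CO2 drawn from the atmosphere converts silicate rock into carbonate that is eventually buried on the sea floor.

```latex
\underbrace{\mathrm{CaSiO_3}}_{\text{silicate rock}} + \mathrm{CO_2}
\;\longrightarrow\;
\underbrace{\mathrm{CaCO_3}}_{\text{buried carbonate}} + \mathrm{SiO_2}
```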

“Mountain building in places like Tibet and South America during the past 25 million years created conditions that should have sucked nearly all the CO2 out of the atmosphere, throwing the Earth into a deep freeze,” said senior author Mark Pagani, associate professor of geology and geophysics and a member of the Yale Climate and Energy Institute’s executive committee. “But as the CO2 concentration of Earth’s atmosphere decreased to about 200 to 250 parts per million, CO2 levels stabilized.”

The study, published in the XX issue of Nature, looked for a possible explanation. The researchers used simulations of the global carbon cycle, together with observations from plant growth experiments, to show that as atmospheric CO2 concentrations dropped toward near-starvation levels for land plants, the capacity of plants and vegetation to weather silicate rocks greatly diminished, slowing the drawdown of atmospheric CO2.

“When CO2 levels become suffocatingly low, plant growth is compromised and the health of forest ecosystems suffers,” said Pagani. “When this happens, plants can no longer help remove CO2 from the atmosphere faster than volcanoes and other sources can supply it.”

“Ultimately, we owe another large debt to plants,” said co-author Ken Caldeira from the Carnegie Institution of Washington at Stanford University. “Aside from providing zesty dishes like eggplant parmesan, plants have also stabilized Earth’s climate by preventing CO2 from falling to critically low levels that would have thrown Earth spinning into space like a frozen ice ball.”

Co-author David Beerling from Sheffield University adds, “Our research supports the emerging view that plants should be recognized as a geologic force of nature, with important consequences for all life on Earth.”

The least sea ice in 800 years

There has never been so little sea ice in the area between Svalbard and Greenland in the last 800 years. -  NASA/GSFC.

New research reconstructing the extent of ice in the sea between Greenland and Svalbard from the 13th century to the present indicates that there has never been as little sea ice as there is now. The results, from the Niels Bohr Institute among others, are published in the scientific journal Climate Dynamics.

There are, of course, neither satellite images nor instrumental records of the climate all the way back to the 13th century, but nature has its own ‘archive’ of the climate in ice cores and the annual growth rings of trees, and we humans have recorded a great many things over the years – such as observations in the logbooks of ships and in harbour records. Piece all of this information together and you get a picture of how much sea ice there has been through time.

Modern research and historic records

“We have combined information about the climate found in ice cores from an ice cap on Svalbard and from the annual growth rings of trees in Finland, and this gave us a curve of the past climate,” explains Aslak Grinsted, geophysicist with the Centre for Ice and Climate at the Niels Bohr Institute at the University of Copenhagen.

In order to determine how much sea ice there has been, the researchers needed to turn to data from the logbooks that whalers and fishermen kept of their expeditions to the boundary of the sea ice. The logbooks are very precise, go all the way back to the 16th century, and relate the geographical positions at which the ice was found. Another source of information about the ice is the set of records from harbours in Iceland, where the severity of the winters has been recorded since the end of the 18th century.

By combining the curve of the climate with the actual historical records of the distribution of the ice, researchers have been able to reconstruct the extent of the sea ice all the way back to the 13th century. Even though the 13th century was a warm period, the calculations show that there has never been so little sea ice as in the 20th century.

In the middle of the 17th century there was also a sharp decline in sea ice, but it lasted only a very brief period. The greatest cover of sea ice came in a period around 1700-1800, during what is also called the ‘Little Ice Age’.

“There was a sharp change in the ice cover at the start of the 20th century,” explains Aslak Grinsted. The ice shrank by 300,000 km2 in the space of ten years, from 1910 to 1920. There have been sudden changes throughout time, he notes, but the last few years have brought record lows in ice extent.

“We see that the sea ice is shrinking to a level which has not been seen in more than 800 years”, concludes Aslak Grinsted.

Geologists to help communicate the dangers of Colombian volcano

During the past decade, residents of Pasto, Colombia, and neighboring villages near Galeras, Colombia’s most dangerous volcano, have been threatened with evacuation, but compliance varies. With each new eruption — the most recent explosion occurred June 7-9 — Colombian officials have grown increasingly concerned about the safety of the residents who live within striking distance of Galeras, located 700 km from Bogota.

Now, geologists from the University at Buffalo and the Universidad de Nariño have organized a special workshop in Colombia designed to tackle the communication issue, with support from the National Science Foundation and the Universidad de Nariño.

The purpose is to develop a consensus as to how best to raise awareness and protect these communities from dangerous eruptions at Galeras.

Unlike most scientific workshops, which are exclusively attended by scientists, this program will include the active participation of local residents and government officials working together with the scientists in all of the workshop sessions.

From July 6-11, Michael F. Sheridan, Ph.D., an internationally renowned volcanologist and director of UB’s Center for Geohazards Studies, and Gustavo Cordoba, Ph.D., a post-doctoral researcher in the UB center, will run the workshop on “Knowledge Sharing and Collaboration in Volcanic Risk Mitigation at Galeras Volcano, Colombia.” Complete information is at

The first half of the workshop, which will feature professors from the UB Department of Geology and the Universidad de Nariño in Colombia, officials from the local and federal government and the Red Cross, among others, will cover the history of volcanic eruptions at Galeras, volcanic crisis management, the physics and modeling of explosive volcanism, and discussions about crisis management at Soufriere Hills Volcano, Chaiten Volcano, Vesuvius and others.

The second half of the workshop will begin July 10 with a session called “The People Speak.”

Sheridan said that this part of the workshop puts a spotlight on the critical connection between local populations affected by an adjacent hazard and the level of scientific understanding and certainty — or the lack of it — about that hazard.

“The villagers feel they are safe,” said Sheridan.

In one example, he said, some of them have said that there is a sacred stone with petroglyphs on it that lies directly in the path where volcanic debris is expected to flow, but it has been there for 500 years and has never been damaged by eruptions at Galeras.

The workshop will use the example of a bridge that connects a village in the region (La Florida) to the capital city, Pasto, a city of 400,000 located only six miles from the crater of Galeras.

“Using our computational tools, we will show that if mudflows from this volcano inundate the bridge, then the evacuation route will be gone,” he said.

At the workshop, scientists, officials and residents will analyze existing hazard maps and safety plans for Galeras in light of the latest research on forecasting volcanic hazards.

“Our hope is that through the presentations by scientists and crisis management experts about what has happened at other volcanoes, and by using some visual tools, like computational modeling of mud and debris flows, we can help people living around the volcano better understand the hazard they live with,” said Sheridan.

With decades of experience around the globe working with scientists, governments and local populations, Sheridan concedes that it will be a challenge to improve the residents’ preparedness by better communicating how vulnerable they may be to eruptions at Galeras.

Still, he says that achieving that goal will ultimately ease the job of volcanologists and others involved in risk mitigation.

“I’d like to see the workshop end with a new approach to hazards that includes the opinions of the people who are actually living in the hazard location,” he said. “It may be too much to hope for, but if it’s possible to get them to buy into the safety plan, that would be the best outcome.”