Oil Spill Symposium to be streamed live online

The University of Georgia/Georgia Sea Grant Oil Spill Symposium, titled “Building Bridges in Crisis,” will be streamed live on www.uga.edu.

The event begins at 2 p.m. on Jan. 25 with a keynote address by Sylvia Earle, National Geographic Explorer-in-Residence and former chief scientist at NOAA, and continues on Jan. 26 with a full day of panel discussions featuring scientists, government officials, Gulf Coast community leaders and journalists from across the nation.

Panelists include Steve Murawski, former NOAA director of scientific programs and chief science advisor; UGA Professor of Marine Sciences Samantha Joye; and New York Times reporter Justin Gillis.

A new research project to focus on climate change in the Arctic

Mean temperatures in the Arctic regions have risen nearly twice as rapidly as the global mean temperature during the past 100 years. Warming in the Arctic is accompanied by an earlier onset of the spring melt, which means a longer melt season and diminishing volumes of long-term summer ice, as well as faster melting of glaciers in Greenland.

A longer melt season accelerates global warming

A longer melt season reduces the Earth’s ability to reflect sunlight back into space, thereby accelerating global warming. Reducing the amount of carbon dioxide in the atmosphere is the only way to slow climate change over the long term. But even if major reductions in carbon dioxide emissions were achieved quickly, they would not necessarily affect Arctic melting in time, since carbon dioxide has a long lifetime in the atmosphere. In contrast, warming of the Arctic environment could probably be slowed by reducing atmospheric concentrations of short-lived climate-affecting compounds, such as black carbon. This would shorten the melt season, more sunlight would be reflected back into space, and the climate would cool. Cutting emissions of these short-lived compounds would buy time for the Arctic regions until measures to cut carbon dioxide emissions lead to reduced concentrations in the atmosphere.
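The contrast in the paragraph above comes down to atmospheric lifetimes: black carbon washes out within days to weeks, while carbon dioxide persists for a century or more. A minimal sketch of that reasoning, using assumed illustrative e-folding lifetimes (roughly one week for black carbon, on the order of a century for a CO2-like tracer; neither figure is from the article):

```python
import math

def fraction_remaining(t_years, lifetime_years):
    """Fraction of an initial pulse still airborne after t_years,
    assuming simple exponential (e-folding) decay."""
    return math.exp(-t_years / lifetime_years)

# Assumed illustrative lifetimes (not from the article):
BLACK_CARBON_LIFETIME = 7 / 365   # about one week, in years
CO2_LIKE_LIFETIME = 100.0         # on the order of a century

# One year after an emissions cut, essentially all black carbon is gone,
# while a CO2-like tracer has barely decayed at all.
bc_left = fraction_remaining(1, BLACK_CARBON_LIFETIME)
co2_left = fraction_remaining(1, CO2_LIKE_LIFETIME)
```

Under these toy numbers, a black carbon cut changes the Arctic energy balance within a single season, whereas a carbon dioxide cut pays off only over decades to centuries.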

Black carbon as the object of study

The new project coordinated by the Finnish Meteorological Institute concentrates on one short-lived compound, black carbon, which has been shown to affect the Earth’s radiation balance markedly. Black carbon is thought to speed up climate change regionally, for instance in the Arctic and in the Himalayas. It is generated by incomplete combustion; its sources are mostly anthropogenic, the only significant natural source being forest fires. Black carbon emissions from Europe are thought to have a major impact on black carbon concentrations and the climate in the Arctic regions. The main sources of black carbon are small-scale wood burning and transport emissions.

The principal objective of the project is to use the best available methods to demonstrate that climate change in the Arctic can be mitigated by reducing black carbon emissions at mid-latitudes, especially in Europe. At the same time, it will be possible to assess how current air-quality and climate legislation affects black carbon emissions and the transport of black carbon into the Arctic. In addition, the impact of black carbon on the Arctic climate and its links with warming caused by carbon dioxide will be studied.

The research findings will help assess the possibilities of reducing black carbon emissions. Another objective is to transmit scientific information, for instance on the impacts of wood burning, to decision-makers and the general public within the EU. Apart from the Finnish Meteorological Institute, the University of Helsinki, the Finnish Environment Institute, and the International Institute for Applied Systems Analysis of Austria are participating in the project.

Rogue storm system caused Pakistan floods that left millions homeless

This photo, taken long after the initial floods hit in late July 2010, shows the significant effect of the monsoon on roads in the Muzaffargarh district near central Pakistan. – World Vision

Last summer’s disastrous Pakistan floods that killed more than 2,000 people and left more than 20 million injured or homeless were caused by a rogue weather system that wandered hundreds of miles farther west than is normal for such systems, new research shows.

Storm systems that bring widespread, long-lasting rain over eastern India and Bangladesh form over the Bay of Bengal, at the east edge of India, said Robert Houze, a University of Washington atmospheric sciences professor. But Pakistan, on the Arabian Sea west of India, is substantially more arid and its storms typically produce only locally heavy rainfall.

The flooding began in July and at one point it was estimated that 20 percent of Pakistan’s total land area was under water. Structural damage was estimated at more than $4 billion, and the World Health Organization estimated that as many as 10 million people had to drink unsafe water.

Houze and colleagues examined radar data from the Tropical Rainfall Measuring Mission satellite and were able to see that the rainfall that caused the Indus River in Pakistan to overflow was triggered over the Himalayas, within a storm system that had formed over the Bay of Bengal in late July and moved unusually far to the west. Because the rain clouds were within the moisture-laden storm from the east, they were able to pour abnormal amounts of rain on the barren mountainsides, which then ran into the Indus.

The progress of the storm system stood out in the satellite radar data, Houze said.

“We looked through 10 years of data from the satellite and we just never saw anything like this,” he said. “The satellite only passes over the area a couple of times a day, but it just happened to see these systems at a time when they were well developed.”

Houze is the lead author of a paper detailing the findings to be published in the Bulletin of the American Meteorological Society. Co-authors are Kristen Rasmussen, Socorro Medina and Stacy Brodzik of the UW and Ulrike Romatschke of the University of Vienna in Austria.

Houze also will discuss the findings during a session Tuesday (Jan. 25) at the American Meteorological Society’s annual meeting in Seattle.

The storms were associated with a wind pattern that could be traced in the satellite data back to its origin over the Bay of Bengal, Houze said. Finding the storm system’s signature in the satellite data makes it possible to incorporate that information into weather forecast models. That could make it possible for meteorologists to forecast when conditions are favorable for such an event to occur again and provide a warning.

“I think this was a rare event, but it is one you want to be thinking about,” Houze said. “Understanding what happened could lead to better predictions of such disasters in the future.”

Scientists find that debris on certain Himalayan glaciers may prevent melting

These are crevasses of a steep glacier in the Sutlej Valley of the Western Himalaya. This glacier has a debris-covered toe. -  Bodo Bookhagen, UCSB

A new scientific study shows that debris coverage — pebbles, rocks, and debris from surrounding mountains — may be a missing link in the understanding of the decline of glaciers. Debris is distinct from soot and dust, according to the scientists.

Melting of glaciers in the Himalayan Mountains affects water supplies for hundreds of millions of people living in South and Central Asia. Experts have stated that global warming is a key element in the melting of glaciers worldwide.

Bodo Bookhagen, assistant professor in the Department of Geography at UC Santa Barbara, co-authored a paper on this topic in Nature Geoscience, published this week. The first author is Dirk Scherler, Bookhagen’s graduate student from Germany, who performed part of this research while studying at UCSB.

“With the aid of new remote-sensing methods and satellite images, we identified debris coverage to be an important contributor to glacial advance and retreat behaviors,” said Bookhagen. “This parameter has been almost completely neglected in previous Himalayan and other mountainous region studies, although its impact has been known for some time.”

The finding is one more element in a worldwide political controversy involving global warming. “Controversy about the current state and future evolution of Himalayan glaciers has been stirred up by erroneous reports by the Intergovernmental Panel on Climate Change (IPCC),” according to the paper.

“There is no ‘stereotypical’ Himalayan glacier,” said Bookhagen. “This is in clear contrast to the IPCC reports, which lump all Himalayan glaciers together.”

Bookhagen noted that glaciers in the Karakoram region of Northwestern Himalaya are mostly stagnating. However, glaciers in the Western, Central, and Eastern Himalaya are retreating, with the highest retreat rates — approximately 8 meters per year — in the Western Himalayan Mountains. The authors found that half of the studied glaciers in the Karakoram region are stable or advancing, whereas about two-thirds are in retreat elsewhere throughout High Asia. This is in contrast to the prevailing notion that all glaciers in the tropics are retreating.

Bookhagen explained the difference between debris coverage and coverage by soot and dust on glaciers: “The debris cover has the opposite effect of soot and dust on glaciers. Debris coverage thicker than 2 centimeters, or about three-quarters of an inch, ‘shields’ the glacier and prevents melting. This is the case for many Himalayan glaciers that are surrounded by towering mountains that almost continuously shed pebbles, debris, and rocks onto the glacier.”

Thus, glaciers in the steep Himalaya are affected not only by temperature and precipitation but also by debris coverage, and their response is neither uniform nor easily predictable, the authors explained. Debris coverage may be one of the missing links in building a more coherent picture of glacial behavior across all mountain ranges. The scientists contrast this Himalayan glacial study with glaciers on the gently dipping, low-relief Tibetan Plateau, which have no debris coverage. Those glaciers behave differently, and their frontal changes can be explained by temperature and precipitation changes.

Bookhagen described results of another of his recent studies on this topic. He said that one of the key findings was that the Western Himalaya, including the Indus catchment and regions in Northern Pakistan and Northwestern India, depend heavily on seasonal snow and glacial melt waters, while Central Himalayan regions — Western India and Nepal — mostly depend on monsoonal rainfall.

The smaller seasonal water storage space in the Central Himalaya, which has only steep glaciers and no large snow fields, makes this region much more vulnerable to shifts in monsoonal strength and to glacial melting, explained Bookhagen. River discharge in these regions is crucial to sustain agriculture, hydropower, and drinking water. If the Indian monsoon season is weaker because of global atmospheric changes such as El NiƱo, then Central Nepal must primarily rely on water coming from the seasonal melting of glaciers and the small amount of snowmelt that is available.

“Retreating glaciers, and thus a reduction of seasonal water storage in this region, have a large impact on hundreds of millions of people living in the downstream section of these rivers,” said Bookhagen. “The mitigation and adaptation strategies in the Himalaya Mountains thus need to take into account the spatial climatic and topographic variability. There is no regional solution, but only different local strategies to the future water shortage. The geographic setting of High Asia poses political difficulties as future water treaties need to be carefully evaluated.”

What impact would sun dimming have on Earth’s weather?

Solar radiation management projects, also known as sun dimming, seek to reduce the amount of sunlight hitting the Earth to counteract the effects of climate change. Global dimming can occur as a side-effect of burning fossil fuels or as a result of volcanic eruptions, but the consequences of deliberate sun dimming as a geoengineering tool are unknown.

A new study by Dr Peter Braesicke, from the Centre for Atmospheric Science at Cambridge University, seeks to answer this question by focusing on the possible impacts of a dimming sun on atmospheric teleconnections.

Teleconnections, which are important for the predictability of weather regimes, are statistical links between climate anomalies at widely separated locations, such as the relationship between sea-level pressure at Tahiti and at Darwin, Australia, which defines the Southern Oscillation.
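The Tahiti–Darwin pressure link can be made concrete with a toy index. This is a minimal sketch in the spirit of the Southern Oscillation Index; the station pressures below are invented for illustration, and the operational index is standardized against long climatological baselines rather than each series' own sample mean:

```python
import statistics

def toy_soi(tahiti_slp, darwin_slp):
    """Standardized difference of sea-level pressure anomalies
    (Tahiti minus Darwin), computed against each series' own mean."""
    t_mean = statistics.mean(tahiti_slp)
    d_mean = statistics.mean(darwin_slp)
    diffs = [(t - t_mean) - (d - d_mean)
             for t, d in zip(tahiti_slp, darwin_slp)]
    scale = statistics.pstdev(diffs)
    return [d / scale for d in diffs]

# Invented monthly pressures (hPa), for illustration only.
tahiti = [1013.0, 1011.0, 1014.0]
darwin = [1009.0, 1012.0, 1008.0]
index = toy_soi(tahiti, darwin)  # positive when Tahiti is anomalously high
```

Sustained negative values of such an index correspond to El NiƱo-like conditions, which is why distant station pairs like these carry predictive information about weather regimes.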

“It is important that we look for unintended consequences of any sun dimming schemes,” said Braesicke. “We have to test our models continuously against observations to make sure that they are ‘fit-for-purpose’, and it’s important that we should not only look at highly averaged ‘global’ quantities.”

Dr Braesicke’s team believes that the link between tropical temperatures and extra-tropical circulation is well captured for the recent past, and that the link changes when the sun is dimmed.

“This could have consequences for prevailing weather regimes,” said Braesicke, “particularly for the El Nino/Southern Oscillation (ENSO) teleconnection. Our research allows us to assess how forced atmospheric variability, exemplified by the northern polar region, might change in a geoengineered world with a dimmed sun.”

A dimmed sun will change the temperature structure of the atmosphere with a cooling throughout the atmosphere. In the troposphere, temperatures drop because less solar radiation reaches the ground and therefore less can be converted into heat. In the stratosphere, less shortwave radiation is available for absorption by ozone and, therefore, heating rates in the stratosphere are lower.

“We have shown that important teleconnections are likely to change in such a geoengineered future, due to chemistry-climate interactions and in particular, due to changing stratospheric ozone,” concluded Braesicke. “In our model, the forced variability of northern high latitude temperatures changes spatially, from a pole-centred pattern to a pattern over the Pacific region when the solar irradiance is reduced. Future geoengineering studies need to consider the full evolution of the stratosphere, including its chemical behaviour.”

The Geoengineering Model Intercomparison Project

In an accompanying paper Ben Kravitz, from Rutgers University, reviews the new project to coordinate and compare experiments in aerosol geoengineering and evaluates the effects of stratospheric geoengineering with sulfate aerosols.

Since the idea of geoengineering was thrust back into the scientific arena, many have wondered whether it could reduce global warming as a mitigation measure. Kravitz’s team argues that one of the most feasible methods is through stratospheric sulfate aerosols. While geoengineering projects are not yet favored by policy makers, this method is inexpensive compared with other such projects and so may prove more attractive.

However, stratospheric geoengineering with sulfate aerosols may have unintended consequences. Research indicates that stratospheric geoengineering could, by compensating for increased greenhouse gas concentrations, reduce summer monsoon rainfall in Asia and Africa, potentially threatening the food supply for billions of people.

“Some unanswered questions include whether a continuous stratospheric aerosol cloud would have the same effect as a transient one, such as that from a volcano, and to what extent regional changes in precipitation would be compensated by regional changes in evapotranspiration,” said Kravitz.

A consensus has yet to be reached on these and other important issues. To answer these questions, the team proposes a suite of standardised climate modeling experiments, as well as a coordinating framework for performing them, known as the Geoengineering Model Intercomparison Project (GeoMIP).

Storms, soccer matches hidden in seismometer noise

In the days of sail, sailors dreaded rounding the Horn, the southernmost tip of South America, because of the violence of the storms in Drake Passage. Geologists at Washington University think that water waves excited by these and other storms in the Southern Atlantic Ocean may be converted to seismic waves off the west coast of Africa and travel through the solid earth to seismometers, which pick them up as 'noise.' -  Dave Munroe/National Science Foundation

If you wander up to a seismograph in a museum, unless you are lucky enough to be there right during an earthquake, all you will see is a small wiggly signal being recorded.

What’s inside the wiggles is called noise by seismologists, because the signal is always there and originates from the normal activity of the earth between the jolts caused by large earthquakes.

Up until recently, few researchers paid any heed to these apparently boring signals – analyzing them, it was thought, would be like critiquing elevator music.

But now a seismologist and his adviser from Washington University in St. Louis, building on a serendipitous, humorous find of three years ago linking seismic noise and soccer, have discovered a source of seismic noise in Africa near the island of Bioko in the Bight of Bonny in the Gulf of Guinea. Improbable as it may seem, the strength of this source varies with the intensity of storm activity in the Southern Atlantic Ocean. During the largest storms, seismic waves from the Bight of Bonny are recorded by broadband seismometers all around the world.

Washington University doctoral candidate Garrett Euler, using a mathematical technique called cross correlation, analyzed four arrays of broadband seismometers in Cameroon, South Africa, Ethiopia and Tanzania and found that seismic noise oscillating at 28- and 26-second periods originates in the Bight of Bonny.

Although the exact mechanism causing seismic noise near Africa is unknown, Euler speculates that long-period ocean waves from storms in the Southern Atlantic Ocean reflect off the coast of Africa and focus near the island of Bioko. The interaction of the waves with the shallow seafloor changes the ocean wave energy into seismic waves that travel through solid earth. The noise source was first discovered by Jack Oliver, PhD, of Columbia University in 1962, but Euler’s work is the first to accurately locate the source.

“It’s said that one researcher’s noise is another’s signal,” says Euler’s co-advisor Douglas A. Wiens, PhD, professor and chair of Washington University’s earth and planetary sciences department in Arts & Sciences. “When we don’t understand it, we call it noise. When we do, we call it a signal. In the past, this kind of data didn’t stick out at all, but just recently people are coming to grips with how to analyze it. There are intriguing possibilities for what noise might reveal.”

Although seismic noise analysis is still developing, some seismologists are modeling noise to look for a signature that could reveal a global warming effect. For instance, as the number of storms increases, perhaps there are corresponding fluctuations in seismic noise. Others are considering using seismic noise to map volcanic magma chambers. There has even been some very preliminary work exploring the possibility that seismic noise might predict earthquakes. The idea is that the wave speed of a region might change as stress builds up before an earthquake.

Euler gave a presentation on his observations at the fall meeting of the American Geophysical Union in San Francisco this December.

“We have some very bizarre observations that we’re still trying to figure out,” says Euler, of his initial data. “One is the signal is at longer periods than we’d expected. It has multiple peaks in frequency – it ‘hums’ at 28 seconds, as well as 26 seconds. It’s really, really strong during some particular times that correlate with storms at sea.

“Another observation is that the signal shifts its location with frequency. The source of the 28-second period band is about 300 kilometers from the 26-second source, which is essentially at Mount Cameroon.”

Cross correlation compares the similarity of two seismic signals, usually recorded at two different locations, as a function of the time lag between them – essentially sliding one signal past the other until they match up. The lag at which the correlation function peaks corresponds to the travel time, and hence the average seismic velocity, between the two locations. To cross correlate seismic noise, though, Euler faces a conundrum, because seismic noise has no well-defined beginning or end.

“This noise is made up of a cacophony of overlapping signals across quite long seismic records that have to be averaged, and that information comprises this noise field,” he says.
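The sliding-and-matching procedure described above can be sketched with two synthetic signals. This is a toy illustration rather than the actual processing chain: a 26-second-period wavelet is buried in noise at two hypothetical "stations", and the cross-correlation peak recovers the delay between them (all signal parameters are invented).

```python
import numpy as np

def delay_of_best_match(a, b, fs):
    """Slide b past a and return the delay (seconds) of b relative to a
    at which their normalized cross-correlation peaks."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    corr = np.correlate(a, b, mode="full")
    lags = np.arange(-(len(b) - 1), len(a))  # lag k compares a[n+k] with b[n]
    return -lags[np.argmax(corr)] / fs       # peak sits at k = -delay

fs = 2.0                                     # samples per second (invented)
t = np.arange(0, 120, 1 / fs)
wavelet = np.sin(2 * np.pi * t / 26) * np.exp(-((t - 40) ** 2) / 200)
rng = np.random.default_rng(1)
a = wavelet + 0.05 * rng.standard_normal(t.size)
b = np.roll(wavelet, 10) + 0.05 * rng.standard_normal(t.size)  # 5 s later
delay = delay_of_best_match(a, b, fs)        # recovers roughly 5 seconds
```

In real noise-correlation work, many long records are correlated and stacked so that the coherent travel-time peak emerges from the incoherent background, which is what makes the averaging Euler mentions essential.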

Euler wandered into the field of seismic noise in 2007 when he found consistent spikes in noise from one of 32 different seismic stations in Cameroon. The spikes turned out to correspond with joyous, celebratory foot-stomping of Cameroon’s avid soccer fans at various cities after goals were scored or key plays made during the African Cup of Nations games in 2006.

This was the first time widespread anthropogenic noise – created by humans – had been found in seismic signals. And it was the first known reporting of “footquakes.”

“When I got that data, I was stumped, because there hadn’t been any earthquakes recorded during that time,” he says. “We finally put two and two together and saw this as the result of thousands of fans spread out over many miles, reacting to things ranging from a goal, to the reaction of a star player, to the ultimate, a win. There were slight fluctuations in all the scenarios. That was the start of my interest in seismic noise. It’s grown a lot since.”

New melt record for Greenland ice sheet

New research shows that 2010 set new records for the melting of the Greenland Ice Sheet, expected to be a major contributor to projected sea level rises in coming decades.

“This past melt season was exceptional, with melting in some areas stretching up to 50 days longer than average,” said Dr. Marco Tedesco, director of the Cryospheric Processes Laboratory at The City College of New York (CCNY – CUNY), who is leading a project studying variables that affect ice sheet melting.

“Melting in 2010 started exceptionally early at the end of April and ended quite late in mid-September.”

The study, with different aspects sponsored by World Wildlife Fund (WWF), the National Science Foundation and NASA, examined surface temperature anomalies over the Greenland ice sheet surface, as well as estimates of surface melting from satellite data, ground observations and models.

In an article published today in “Environmental Research Letters,” Professor Tedesco and co-authors note that in 2010, summer temperatures up to 3 degrees Celsius above the average were combined with reduced snowfall.

The capital of Greenland, Nuuk, had the warmest spring and summer since records began in 1873.

Bare ice was exposed earlier than average and for longer than in previous years, contributing to the extreme record.

“Bare ice is much darker than snow and absorbs more solar radiation,” said Professor Tedesco. “Other ice melting feedback loops that we are examining include the impact of lakes on the glacial surface, of dust and soot deposited over the ice sheet and how surface meltwater affects the flow of the ice toward the ocean.”

WWF climate specialist Dr. Martin Sommerkorn said, “Sea level rise is expected to top 1 meter by 2100, largely due to melting from ice sheets. And it will not stop there – the longer we take to limit greenhouse gas production, the more melting and water level rise will continue.”

Oxygen-free early oceans likely delayed rise of life on planet

This photo shows Clint Scott (left) and Timothy Lyons. -  Lyons lab, UC Riverside.

Geologists at the University of California, Riverside have found chemical evidence in 2.6-billion-year-old rocks that indicates that Earth’s ancient oceans were oxygen-free and, surprisingly, contained abundant hydrogen sulfide in some areas.

“We are the first to show that ample hydrogen sulfide in the ocean was possible this early in Earth’s history,” said Timothy Lyons, a professor of biogeochemistry and the senior investigator in the study, which appears in the February issue of Geology. “This surprising finding adds to growing evidence showing that ancient ocean chemistry was far more complex than previously imagined and likely influenced life’s evolution on Earth in unexpected ways – such as, by delaying the appearance and proliferation of some key groups of organisms.”

Ordinarily, hydrogen sulfide in the ocean is tied to the presence of oxygen in the atmosphere. Even small amounts of oxygen favor continental weathering of rocks, resulting in sulfate, which in turn gets transported to the ocean by rivers. Bacteria then convert this sulfate into hydrogen sulfide.

How then did the ancient oceans contain hydrogen sulfide in the near absence of oxygen, as the 2.6-billion-year-old rocks indicate? The UC Riverside-led team explains that sulfate delivery in an oxygen-free environment can also occur in sufficient amounts via volcanic sources, with bacteria processing the sulfate into hydrogen sulfide.

Specifically, Lyons and colleagues examined rocks rich in pyrite – an iron sulfide mineral commonly known as fool’s gold – that date back to the Archean eon of geologic history (3.9 to 2.5 billion years ago) and typify very low-oxygen environments. Found in Western Australia, these rocks have preserved chemical signatures that constitute some of the best records of the very early evolutionary history of life on the planet.

The rocks formed 200 million years before oxygen amounts spiked during the so-called “Great Oxidation Event” – an event 2.4 billion years ago that helped set the stage for life’s proliferation on Earth.

“Our previous work showed evidence for hydrogen sulfide in the ocean more than 100 million years before the first appreciable accumulation of oxygen in the atmosphere at the Great Oxidation Event,” Lyons said. “The data pointing to this 2.5 billion-year-old hydrogen sulfide are fingerprints of incipient atmospheric oxygenation. Now, in contrast, our evidence for abundant 2.6 billion-year-old hydrogen sulfide in the ocean – that is, another 100 million years earlier – shows that oxygen wasn’t a prerequisite. The important implication is that hydrogen sulfide was potentially common for a billion or more years before the Great Oxidation Event, and that kind of ocean chemistry has key implications for the evolution of early life.”

Clint Scott, the first author of the research paper and a former graduate student in Lyons’s lab, said the team was also surprised to find that the Archean rocks recorded no enrichments of the trace element molybdenum, a key micronutrient for life that serves as a proxy for oceanic and atmospheric oxygen amounts.

The absence of molybdenum, Scott explained, indicates the absence of oxidative weathering of the continental rocks at this time (continents are the primary source of molybdenum in the oceans). Moreover, the development of early life such as cyanobacteria depended on the amount of molybdenum in the ocean; without this essential micronutrient, cyanobacteria could not become abundant enough to produce large quantities of oxygen.

“Molybdenum is enriched in our previously studied 2.5 billion-year-old Archean rocks, which ties to the earliest hints of atmospheric oxygenation as a harbinger of the Great Oxidation Event,” Scott said. “The scarcity of molybdenum in rocks deposited 100 million years earlier, however, reflects its scarcity also in the overlying water column. Such metal deficiencies suggest that cyanobacteria were probably struggling to produce oxygen when these rocks formed.

“Our research has important implications for the evolutionary history of life on Earth,” Scott added, “because biological evolution both initiated and responded to changes in ocean chemistry. We are trying to piece together the cause-and-effect relationships that resulted, billions of years later, in the evolution of animals and, ultimately, humans. This is really the story of how we got here.”

The first animals do not appear in the fossil record until around 600 million years ago – almost two billion years after the rocks studied by Scott and his team formed. The steady build-up of oxygen, which began towards the end of the Archean, played a key role in the evolution of new life forms.

“Future research needs to focus on whether sulfidic and oxygen-free conditions were prevalent throughout the Archean, as our model predicts,” Scott said.

Earth’s hot past: Prologue to future climate?

If carbon dioxide emissions continue on their current trajectory, Earth may someday return to an ancient, hotter climate when the Antarctic ice sheet didn't exist. -  NOAA

The magnitude of climate change during Earth’s deep past suggests that future temperatures may eventually rise far more than projected if society continues its pace of emitting greenhouse gases, a new analysis concludes.

The study, by National Center for Atmospheric Research (NCAR) scientist Jeffrey Kiehl, will appear as a “Perspectives” article in this week’s issue of the journal Science.

The work was funded by the National Science Foundation (NSF), NCAR’s sponsor.

Building on recent research, the study examines the relationship between global temperatures and high levels of carbon dioxide in the atmosphere tens of millions of years ago.

It warns that, if carbon dioxide emissions continue at their current rate through the end of this century, atmospheric concentrations of the greenhouse gas will reach levels that existed about 30 million to 100 million years ago.

Global temperatures then averaged about 29 degrees Fahrenheit (16 degrees Celsius) above pre-industrial levels.

Kiehl said that global temperatures may take centuries or millennia to fully adjust in response to the higher carbon dioxide levels.

According to the study, and based on recent computer model studies of geochemical processes, elevated levels of carbon dioxide may remain in the atmosphere for tens of thousands of years.

The study also indicates that the planet’s climate system, over long periods of time, may be at least twice as sensitive to carbon dioxide as currently projected by computer models, which have generally focused on shorter-term warming trends.

This is largely because even sophisticated computer models have not yet been able to incorporate critical processes, such as the loss of ice sheets, that take place over centuries or millennia and amplify the initial warming effects of carbon dioxide.

“If we don’t start seriously working toward a reduction of carbon emissions, we are putting our planet on a trajectory that the human species has never experienced,” says Kiehl, a climate scientist who specializes in studying global climate in Earth’s geologic past.

“We will have committed human civilization to living in a different world for multiple generations.”

The Perspectives article pulls together several recent studies that look at various aspects of the climate system, while adding a mathematical approach by Kiehl to estimate average global temperatures in the distant past.

Its analysis of the climate system’s response to elevated levels of carbon dioxide is supported by previous studies that Kiehl cites.

“This research shows that squaring the evidence of environmental change in the geologic record with mathematical models of future climate is crucial,” says David Verardo, Director of NSF’s Paleoclimate Program. “Perhaps Shakespeare’s words that ‘what’s past is prologue’ also apply to climate.”

Kiehl focused on a fundamental question: when was the last time Earth’s atmosphere contained as much carbon dioxide as it may contain by the end of this century?

If society continues its current pace of increasing the burning of fossil fuels, atmospheric levels of carbon dioxide are expected to reach about 900 to 1,000 parts per million by the end of this century.

That compares with current levels of about 390 parts per million, and pre-industrial levels of about 280 parts per million.

Since carbon dioxide is a greenhouse gas that traps heat in Earth’s atmosphere, it is critical for regulating Earth’s climate.

Without carbon dioxide, the planet would freeze over.

But as atmospheric levels of the gas rise, which has happened at times in the geologic past, global temperatures increase dramatically and additional greenhouse gases, such as water vapor and methane, enter the atmosphere through processes related to evaporation and thawing.

This leads to further heating.

Kiehl drew on recently published research that, by analyzing molecular structures in fossilized organic materials, showed that carbon dioxide levels likely reached 900 to 1,000 parts per million about 35 million years ago.

At that time, temperatures worldwide were substantially warmer than at present, especially in polar regions, even though the Sun’s energy output was slightly weaker.

The high levels of carbon dioxide in the ancient atmosphere kept the tropics at about 9-18 F (5-10 C) above present-day temperatures.

The polar regions were some 27-36 F (15-20 C) above present-day temperatures.

Kiehl applied mathematical formulas to calculate that Earth’s average annual temperature 30 to 40 million years ago was about 88 F (31 C), substantially higher than the pre-industrial average temperature of about 59 F (15 C).

The study also found that carbon dioxide may have two times or more an effect on global temperatures than currently projected by computer models of global climate.

The world’s leading computer models generally project that a doubling of carbon dioxide in the atmosphere would have a heating impact in the range of 0.5 to 1.0 degrees Celsius per watt per square meter. (The unit is a measure of the sensitivity of Earth’s climate to changes in greenhouse gases: degrees of warming per unit of added radiative forcing.)

However, the published data show that the comparable impact of carbon dioxide 35 million years ago amounted to about 2 degrees Celsius per watt per square meter.
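The warming implied by these sensitivity figures can be illustrated with a short calculation. The sketch below assumes the widely used estimate of roughly 3.7 watts per square meter of radiative forcing for a doubling of carbon dioxide; that figure is not stated in the article.

```python
# Illustrative calculation: equilibrium warming for a CO2 doubling, given a
# climate sensitivity parameter expressed in degrees C per (W/m^2) of forcing.
# The 3.7 W/m^2 forcing value is a standard published estimate, assumed here,
# not a number taken from Kiehl's paper.

FORCING_2XCO2 = 3.7  # W/m^2, approximate radiative forcing of doubled CO2


def warming(sensitivity_c_per_wm2: float) -> float:
    """Equilibrium temperature change (degrees C) for a CO2 doubling."""
    return sensitivity_c_per_wm2 * FORCING_2XCO2


# Short-term model range of 0.5 to 1.0 C per W/m^2:
print(warming(0.5), warming(1.0))  # roughly 1.9 to 3.7 C of warming
# Paleoclimate-based estimate of about 2 C per W/m^2:
print(warming(2.0))                # roughly 7.4 C of warming
```

Under these assumptions, the paleoclimate sensitivity yields roughly twice to four times the warming projected by the shorter-term model range, which is the gap the article describes.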

Computer models successfully capture the short-term effects of increasing carbon dioxide in the atmosphere.

But the record from Earth’s geologic past also encompasses longer-term effects, which accounts for the discrepancy in findings.

The eventual melting of ice sheets, for example, leads to additional heating because exposed dark surfaces of land or water absorb more heat than ice sheets.

“This analysis shows that on longer time scales, our planet may be much more sensitive to greenhouse gases than we thought,” Kiehl says.

Climate scientists are currently adding more sophisticated depictions of ice sheets and other factors to computer models.

As these improvements come online, Kiehl believes that the computer models and the paleoclimate record will be in closer agreement, showing that the impacts of carbon dioxide on climate over time will likely be far more substantial than recent research has indicated.

Because carbon dioxide is being pumped into the atmosphere at a rate that has never been experienced, Kiehl could not estimate how long it would take for the planet to fully heat up.

However, a rapid warm-up would make it especially difficult for societies and ecosystems to adapt, he says.

If emissions continue on their current trajectory, “the human species and global ecosystems will be placed in a climate state never before experienced in human history,” the paper states.

New method for reporting solar data

A straightforward new way to calculate, compile, and graphically present solar radiation measurements in a format that is accessible to decision makers and the general public has been developed by researchers at the University of Texas at Austin.

The method presents solar data in a framework that “can be used by policymakers, businesses, and the public to understand the magnitude of solar resources in a given region, which might aid consumers in selecting solar technologies, or policymakers in designing solar policies,” says David Wogan, a graduate student in mechanical engineering and public affairs at the University of Texas at Austin and the first author of a paper about the work in the American Institute of Physics’ Journal of Renewable and Sustainable Energy.

Wogan’s coauthors on the paper are Michael E. Webber, an assistant professor of mechanical engineering and the associate director of the Center for International Energy and Environmental Policy, and Alexandre K. da Silva, an assistant professor of mechanical engineering.

The method uses calculated estimates of solar insolation (the amount of solar radiation incident on the earth’s surface) and the total energy in each of Texas’s 254 counties, and presents the data in a geographic information system (GIS) format. Included in the model are daily, monthly, and yearly averages. This allows the method to be used, for example, to estimate the potential amount of solar-generated electricity that could be produced at a given location, in a given month.
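The averaging step can be sketched in a few lines. The data layout and function names below are illustrative assumptions, not taken from the published method; real insolation data would come from measured or modeled daily values per county.

```python
# Illustrative: reduce a year of daily insolation values (kWh/m^2/day) for one
# hypothetical county into monthly and yearly averages, as the reporting
# framework described in the article does before mapping the results.
from statistics import mean

DAYS_PER_MONTH = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]


def monthly_averages(daily_kwh_m2, days_per_month=DAYS_PER_MONTH):
    """Average daily insolation for each month from a flat list of daily values."""
    averages, start = [], 0
    for n in days_per_month:
        averages.append(mean(daily_kwh_m2[start:start + n]))
        start += n
    return averages


# Synthetic example data: a constant 5.0 kWh/m^2/day for the whole year.
daily = [5.0] * 365
print(monthly_averages(daily))  # twelve monthly means
print(mean(daily))              # yearly average
```

With real data, the twelve monthly means for each county would feed directly into the GIS layer, giving the month-by-month resource maps the authors describe.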

In the paper, the researchers use Texas to illustrate the new method, “because its geography is very diverse,” Wogan says, “but the framework is not limited to Texas and can be expanded to other states and countries to understand how renewable energy resources are distributed, both geographically and through time.”