To a Fault: The Bottom Line on Earthquakes

Although many people think that California “owns” all the earthquakes, Ohio also has its share of faults. Unlike the earthquake that woke people on another April 18, 102 years ago, this quake was fairly mild.

Two of UC’s earthquake experts have extensive experience with quakes. Attila Kilinc is a professor in the Department of Geology in the McMicken College of Arts & Sciences, and G. A. Rassati is an assistant professor in the Civil and Environmental Engineering Department in UC’s College of Engineering. Rassati just returned from presenting several seminars in Europe on structural engineering.

Rassati was inspired to become a structural engineer specializing in earthquakes after experiencing one as a child in Italy.

“I was four years old when a strong earthquake struck my region in Italy,” says Rassati. “I have a very strong memory of my Dad trying to get me out of my little bed but he couldn’t get to me. Earthquakes have always interested me ever since.” Rassati studies the structural effects of earthquakes on infrastructure, especially buildings.

“Unfortunately, the money is drying up for earthquake research. I’m afraid it’s going to take another big one to draw attention to that,” says Rassati. “And we’re overdue.”

Q&A with Attila Kilinc

Q: How common are earthquakes in the Midwest and was the severity of this tremor a first for this area?

A: Between 1776 and the present, 170 earthquakes of magnitude 2.0 or greater have been charted in Ohio. There have been at least 150 below magnitude 2.0, which averages out to approximately 1½ earthquakes a year. This latest was not a first, severity-wise: Several others measured between 5.3 and 5.4; in 1980, for example, an earthquake in Sharpsburg, Ky., measured 5.2.
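The averaging in that answer can be checked directly; treating “the present” as 2008 is an assumption taken from the article’s context:

```python
# Rough check of the quake-rate arithmetic above.
charted = 170            # magnitude 2.0 or greater, 1776 to present
smaller = 150            # at least this many below magnitude 2.0
years = 2008 - 1776      # "the present" assumed to be 2008
rate = (charted + smaller) / years
print(f"about {rate:.1f} earthquakes per year")
```

That works out to roughly 1.4 per year, consistent with the “approximately 1½” figure quoted.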

Q: Can anyone predict a “big one” ever hitting the Midwest?

A: We haven’t reached that level of sophistication yet. That would require predicting location, timing and magnitude simultaneously, Kilinc says, “and that’s virtually impossible.” He adds that seismologists in San Francisco may, however, say the probability of a magnitude 6 earthquake within the next 30 years is 50 percent.

Q: What is Cincinnati’s proximity to the nearest fault line?

A: Cincinnati is not on or close to a fault line, Kilinc says. The nearest active one is the New Madrid Fault Line, about 350 miles west of Cincinnati. The last major (magnitude 7.5 or higher) New Madrid earthquakes struck in December 1811 and January 1812. The fault line actually closest to Cincinnati, Kilinc adds, is just south of Lexington, Ky., but it’s not currently active.

Q: People have commented that their dog or cat woke them up during this Midwest-based earthquake. Others say they’ve heard all their lives that animal behavior – and even illnesses of people – can predict an earthquake. How far back in history does such thinking go and is there any validity in it?

A: The U.S. Geological Survey says that references to unusual animal behavior before a significant earthquake date to 373 BC in Greece. Kilinc says that for many years, Chinese scientists in particular watched what they called precursors, such as animal behavior and radon in water, in terms of earthquake prediction. None of the signs they were watching for, he adds, showed up in Tangshan, China, on July 28, 1976. That’s the day an estimated 247,000 people died in China’s deadliest earthquake of the 20th century. Its magnitude was 7.8.

Q: For those who have been through a major earthquake in California, this has to seem like barely a rumble. Yet, for many Midwesterners, an earthquake can literally rattle the nerves! How seriously should we take such an occurrence and is there any preparation one can make for an earthquake?

A: Residents of any area should always be prepared for earthquakes, tornados, hurricanes and so forth, making logical preparations that are similar no matter the disaster. For example, if a strong earthquake or tornado knocks out the electricity, many people’s first reaction is to strike a match so they can see, which can cause an explosion if a gas line has ruptured. People also tend to rush out of an area affected by an earthquake. Kilinc, a former California resident who’s been through many temblors in the Bay Area, says that most people are killed “trying to get in or out,” so staying put is important. Little things matter, too, such as not storing dangerous chemicals on upper shelves in a laundry room.

Q: Finally: Another widely spread urban legend claims that California will someday fall into the ocean. While that’s not going to happen, how long could it take, as the Pacific Plate moves, before Los Angeles is close to San Francisco?

A: California, Kilinc says, “will never fall into the ocean” because of a boundary called a transform fault. San Francisco will shift south and Los Angeles, north – but it will take a “long, long time” for them to meet. The USGS says that tectonic forces “in this part of the world are driving the Pacific Plate in a north-northwesterly direction with respect to the North American plate at approximately 46 millimeters per year in the San Francisco Bay Area.”
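Taking the USGS rate at face value, the “long, long time” can be estimated on the back of an envelope. The roughly 560-kilometer separation between the two cities is an assumed round figure, not from the article:

```python
# Back-of-envelope: how long until Los Angeles draws level with San Francisco?
distance_km = 560              # assumed round figure for the cities' separation
rate_mm_per_year = 46          # USGS rate quoted above
years = distance_km * 1e6 / rate_mm_per_year   # 1 km = 1e6 mm
print(f"roughly {years / 1e6:.0f} million years")
```

At 46 millimeters a year, closing that gap takes on the order of 12 million years.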

Lakes of Meltwater Can Crack Greenland’s Ice and Contribute to Faster Ice Sheet Flow

Scientists walk along the edge of a large canyon formed by many years of meltwater stream flow across the surface of the Greenland ice sheet. The lines along the wall of the canyon show the stratigraphic layers of ice and snow laid down over the years. (Photo by Sarah Das, Woods Hole Oceanographic Institution)

Researchers Make First Observations of Surface Meltwater Cutting through the Ice Sheet to Lubricate the Bottom

Researchers from the Woods Hole Oceanographic Institution (WHOI) and the University of Washington (UW) have for the first time documented the sudden and complete drainage of a lake of meltwater from the top of the Greenland ice sheet to its base.

From those observations, scientists have uncovered a plumbing system for the ice sheet, where meltwater can penetrate thick, cold ice and accelerate some of the large-scale summer movements of the ice sheet.

According to research by glaciologists Sarah Das of WHOI and Ian Joughin of UW, the lubricating effect of the meltwater can accelerate ice flow by 50 to 100 percent in some of the broad, slow-moving areas of the ice sheet.

“We found clear evidence that supraglacial lakes (the pools of meltwater that form on the surface in summer) can actually drive a crack through the ice sheet in a process called hydrofracture,” said Das, an assistant scientist in the WHOI Department of Geology and Geophysics. “If there is a crack or defect in the surface that is large enough, and a sufficient reservoir of water to keep that crack filled, it can create a conduit all the way down to the bed of the ice sheet.”

But the results from Das and Joughin also show that while surface melt plays a significant role in overall ice-sheet dynamics, it has a more subdued influence on the fast-moving outlet glaciers (which discharge ice to the ocean) than has often been hypothesized.

The research by Das and Joughin was compiled into two complementary papers and published on April 17 in the online journal Science Express. The papers will be printed in Science magazine on May 9.

Co-authors of the work include Mark Behn, Dan Lizarralde, and Maya Bhatia of WHOI; Ian Howat, Twila Moon, and Ben Smith of UW; and Matt King of Newcastle University.

Thousands of lakes form on top of Greenland’s glaciers every summer, as sunlight and warm air melt ice on the surface. Past satellite observations have shown that these supraglacial lakes can disappear in as little as a day, but scientists did not know where the water was going or how quickly, nor the impact on ice flow.

Researchers have hypothesized that meltwater from the surface of Greenland’s ice sheet might be lubricating the base. But until now, there were only theoretical predictions of how the meltwater could reach the base through a kilometer of subfreezing ice.

“We set out to examine whether the melting at the surface, which is sensitive to climate change, could influence how fast the ice can flow,” Das said. “To influence flow, you have to change the conditions underneath the ice sheet, because what’s going on beneath the ice dictates how quickly the ice is flowing.”

“If the ice sheet is frozen to the bedrock or has very little water available,” Das added, “then it will flow much more slowly than if it has a lubricating and pressurized layer of water underneath to reduce friction.”

In the summers of 2006 and 2007, Das, Joughin, and colleagues used seismic instruments, water-level monitors, and Global Positioning System sensors to closely monitor the evolution of two lakes and the motion of the surrounding ice sheet. They also used helicopter and airplane surveys and satellite imagery to monitor the lakes and to track the progress of glaciers moving toward the coast.

The most spectacular observations occurred in July 2006 when their instruments captured the sudden, complete draining of a lake that had once covered 5.6 square kilometers (2.2 square miles) of the surface and held 0.044 cubic kilometers (11.6 billion gallons) of water.

Like a draining bathtub, the entire lake emptied from the bottom in 24 hours, with the majority of the water flowing out in a 90-minute span. The maximum drainage rate was faster than the average flow rate over Niagara Falls.
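The Niagara comparison can be checked from the figures above. Niagara Falls’ average flow of roughly 2,400 cubic meters per second is a commonly cited figure and an assumption here, not a number from the article:

```python
# Sanity check on the "faster than Niagara Falls" comparison.
volume_m3 = 0.044e9        # 0.044 cubic kilometers, from the observations above
seconds = 90 * 60          # the 90-minute span that carried most of the water
mean_discharge = volume_m3 / seconds
print(f"{mean_discharge:,.0f} m^3/s")
```

Even averaged over the full 90 minutes, the discharge comes to roughly 8,000 cubic meters per second, several times Niagara’s average flow.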

Closer inspection of the data revealed that the pressure of the water from the lake split open the ice sheet from top to bottom, through 980 meters (3,200 feet) of ice. This water-driven fracture delivered meltwater directly to the base, raising the surface of the ice sheet by 1.2 meters in one location.

In the middle of the lake bottom, a 750-meter (2,400-foot) wide block of ice was raised by 6 meters (20 feet). The horizontal speed of the ice sheet, which is constantly in motion even under normal circumstances, became twice the average daily rate for that location.

“It’s hard to envision how a trickle or a pool of meltwater from the surface could cut through thick, cold ice all the way to the bed,” said Das. “For that reason, there has been a debate in the scientific community as to whether such processes could exist, even though some theoretical work has hypothesized this for decades.”

The seismic signature of the fractures, the rapid drainage, and the uplift and movement of the ice all showed that water had flowed all the way down to the bed. As cracks and crevasses form and become filled with water, the greater weight and density of the water forces the ice to crack open.
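A back-of-envelope pressure balance shows why a water-filled crack keeps propagating: water is denser than ice, so at any depth the pressure of water in a full crack exceeds the ice overburden squeezing the crack shut. The densities below are textbook values, not figures from the study:

```python
# Excess pressure of a water-filled crack over the ice overburden at depth.
# Densities are standard textbook values, not measurements from the study.
rho_water = 1000.0   # kg/m^3, fresh meltwater
rho_ice = 917.0      # kg/m^3, glacier ice
g = 9.8              # m/s^2
depth = 980.0        # m, the ice thickness reported above

excess_pa = (rho_water - rho_ice) * g * depth
print(f"excess pressure at the bed: {excess_pa / 1e6:.1f} MPa")
```

At the 980-meter bed, that excess approaches a megapascal, which is why a crack that stays filled can fracture its way to the bottom.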

As water pours down through these cracks, it forms moulins (cylindrical, vertical conduits) through the ice sheet that allow rapid drainage and likely remain open for the rest of the melt season.

Das, Joughin, and their field research team will be featured this summer during an online- and museum-based outreach project known as Polar Discovery. Their return research expedition to Greenland will be chronicled daily through photo essays, and the researchers will conduct several live conversations with students, educators, and museum visitors via satellite phone.

Funding for the research was provided by the National Science Foundation, the National Aeronautics and Space Administration, the WHOI Clark Arctic Research Initiative, and the WHOI Oceans and Climate Change Institute.

While stability far from assured, Greenland perhaps not headed down too slippery a slope

Surface lakes of meltwater, called supraglacial lakes, dot the Greenland Ice Sheet. New research reveals that some are able to drain through half a mile or more of ice to bedrock, where they lubricate the movement of the ice sheet. The largest lake in the image is about 2¾ miles at its widest. (Credit: I. Joughin/UW Polar Science Center)

Lubricating meltwater that makes its way from the surface down to where a glacier meets bedrock turns out to be only a minor reason why Greenland’s outlet glaciers accelerated their race to the sea by 50 to 100 percent in the 1990s and early 2000s, according to the University of Washington’s Ian Joughin and the Woods Hole Oceanographic Institution’s Sarah Das. The two are lead co-authors of two papers posted this week on Science magazine’s Science Express.

The report also shows that surface meltwater is reaching bedrock farther inland under the Greenland Ice Sheet, something scientists had speculated was happening but had little evidence for.

“Considered together, the new findings indicate that while surface melt plays a substantial role in ice sheet dynamics, it may not produce large instabilities leading to sea level rise,” says Joughin, a glaciologist with the UW’s Applied Physics Laboratory. Joughin goes on to stress that “there are still other mechanisms that are contributing to the current ice loss and likely will increase this loss as climate warms.”

Outlet glaciers are rapid flows of ice that start in the Greenland Ice Sheet and extend all the way to the ocean, where their fronts break apart in the water as icebergs, a process called calving. While most of the ice sheet moves less than one-tenth of a mile a year, some outlet glaciers gallop along at 7.5 miles a year, making them a concern because of their more immediate potential to contribute to sea level rise.

If surface meltwater lubrication at the intersection of ice and bedrock was playing a major role in speeding up the outlet glaciers, one could imagine how global warming, which would create ever more meltwater at the surface, could cause Greenland’s ice to shrink much more rapidly than expected — even catastrophically. Glacial ice is second only to the oceans as the largest reservoir of water on the planet and 10 percent of the Earth’s glacial ice is found in Greenland.

It turns out, however, that when considered over an entire year, surface meltwater was responsible for only a few percent of the movement of the six outlet glaciers monitored, says Joughin, lead author of “Seasonal Speedup along the Western Flank of the Greenland Ice Sheet.” Even in the summer it appears to contribute at most 15 percent, and often considerably less, to the total annual movement of these fast-moving outlet glaciers.

Calculations were made both by digitally comparing pairs of images acquired at different times from the Canadian RADARSAT satellite and by ground-based GPS measurements in a project funded by the National Science Foundation and National Aeronautics and Space Administration.

But while surface meltwater plays an inconsequential role in the movement of outlet glaciers, meltwater is responsible for 50 to 100 percent of the summer speed up for the large stretches near the edge of the ice sheet where there are no major outlet glaciers, a finding consistent with, but somewhat larger than, earlier observations.

“What Joughin, Das and their co-authors confirm is that ice-flow speedup with meltwater is a widespread occurrence, not restricted to the one site where previously observed. But they also show that the really fast-moving ice doesn’t speed up very much with this. So we can expect the ice sheet in a warming world to shrink somewhat faster than previously expected, but this mechanism will not cause greatly faster shrinkage,” says Richard Alley, professor of geosciences at Pennsylvania State University, who is not connected with the papers.

So what’s behind the speed up of Greenland’s outlet glaciers? Joughin says he thinks what’s considerably more significant is when outlet glaciers lose large areas of ice at their seaward ends through increased calving, which may be affected by warmer temperatures. He’s studied glaciers such as Jakobshavn Isbrae, one of Greenland’s fastest-moving glaciers, and says that as ice calves and icebergs float away it is like removing a dam, allowing ice farther uphill to stream through to the ocean more quickly. At present, iceberg calving accounts for approximately 50 percent of the ice loss of Greenland, much of which is balanced by snowfall each winter. Several other studies recently have shown that the loss from calving is increasing, contributing at present rates to a rise in sea level of 1 to 2 inches per century.

“We don’t yet know what warming temperatures mean for increased calving of icebergs from the fronts of these outlet glaciers,” Joughin says.

Until now, scientists could only speculate whether, and how, surface meltwater might make it to bedrock from high atop the Greenland Ice Sheet, which is half a mile or more thick in places. The paper “Fracture Propagation to the Base of the Greenland Ice Sheet During Supraglacial Lake Drainage,” with Woods Hole Oceanographic Institution glaciologist Das as lead author, presents evidence of how a lake that disappeared from the surface of the inland ice sheet generated so much pressure and cracking that its water made it to bedrock through more than half a mile of ice.

The glacial lake described in the paper was 2 to 2½ miles at its widest point and 40 feet deep. Researchers installed monitoring instruments and, 10 days after they left the area, a large fracture developed, a crack spanning nearly the full length of the lake. The lake drained in 90 minutes with a fury comparable to that of Niagara Falls. (The researchers were ever so glad they hadn’t been on the lake in their 10-foot boat with its 5-horsepower engine and don’t plan future instrument deployments when the lakes are full of water. They’ll get them in place only when the lakes are dry.)

Measurements after the event suggest there’s an efficient drainage system under the ice sheet that dispersed the meltwater widely. The draining of many such lakes could together explain the observed net regional summer speedup of the ice, the authors write.

Along with Das and Joughin other authors on the two papers are Matt King, Newcastle University, UK; Ben Smith, Ian Howat (now at Ohio State) and Twila Moon of the UW’s Applied Physics Laboratory; Mark Behn and Dan Lizarralde of Woods Hole Oceanographic Institution; and Maya Bhatia, Massachusetts Institute of Technology/WHOI Joint Program.

New hazard estimates could downplay quake dangers

The dangers posed by a major earthquake in the New Madrid and Charleston, South Carolina, seismic zones in the Midwestern and Southern United States may be noticeably lower than current estimates suggest if seismologists adjust one of the major assumptions that go into calculating seismic hazard, according to a study presented at a meeting of the Seismological Society of America.

The study revolves around this question: Is it unlikely that one major earthquake will follow directly on the heels of a big quake, or are other major earthquakes equally likely to occur at any time after a major quake? Hazard estimates for a seismic zone depend on which scenario seismologists choose to plug into their hazard calculations.

The present hazard maps for New Madrid and Charleston use the second assumption. However, when seismologist Seth Stein of Northwestern University and Northwestern senior James Hebden chose the first scenario (that a quake is unlikely to occur right after another quake, but that the likelihood of a new quake increases over time), they found that the seismic hazard maps of the New Madrid and Charleston areas looked a lot less dire than current predictions for the regions.

Their “time-dependent” model suggests that the likelihood of another earthquake is relatively low for the first two-thirds of the predicted average interval between earthquakes, after which the likelihood of another quake begins to climb.

The New Madrid and Charleston zones are still in the early years of their earthquake cycle, so the hazard may not be as great as suggested by the prevailing “time-independent” models that assume another quake is equally likely to occur at any moment, according to the researchers.
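The contrast between the two assumptions can be sketched numerically. This is a minimal illustration, not the authors’ model: it compares a memoryless (time-independent) estimate with a lognormal renewal (time-dependent) model, and the 500-year mean recurrence, 0.3 aperiodicity and 200 years elapsed are assumed, illustrative values:

```python
# A minimal numerical sketch (not the authors' model) contrasting the two
# assumptions: probability of a major quake in the next 50 years under a
# memoryless model versus a lognormal renewal model.
import math

T = 500.0        # assumed mean recurrence interval, years
window = 50.0    # forecast window, years
elapsed = 200.0  # assumed time since the last major quake

# Time-independent: memoryless, the same answer at any elapsed time.
p_poisson = 1 - math.exp(-window / T)

# Time-dependent: lognormal renewal model.
sigma = 0.3                        # assumed aperiodicity
mu = math.log(T) - sigma ** 2 / 2  # chosen so the mean recurrence equals T

def lognormal_cdf(t):
    return 0.5 * (1 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2))))

# Probability of failure in (elapsed, elapsed + window], given survival so far.
p_renewal = (lognormal_cdf(elapsed + window) - lognormal_cdf(elapsed)) / (
    1 - lognormal_cdf(elapsed))

print(f"time-independent: {p_poisson:.1%}  time-dependent: {p_renewal:.1%}")
```

Early in the cycle the renewal model yields the lower probability, matching the researchers’ point that zones still in the first two-thirds of their cycle may be less hazardous than time-independent maps imply.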

Stein says the idea behind the study is not to dismiss the risk of a major earthquake in the two regions, but to shed light on the assumptions that go into making hazard maps, which ultimately affect a region’s building codes and other costly preparations.

“We want to know how well we can predict that shaking. If we overpredict, communities could be spending enormous amounts of money [on earthquake preparation] that they could be spending on other things,” Stein said. “We look at it as whether you’re going to spend money putting steel in your schools that might be better spent hiring teachers.”

“What we’re saying is that this may be nowhere as serious a problem as you’ve been told, and you don’t need to prepare in St. Louis the way we do in Los Angeles, because that may be doing more harm than good,” he added.

The desire to prepare is understandable, given the devastation caused by the last major earthquakes in the New Madrid zone in 1811 and 1812, and in Charleston in 1886. The 1811-1812 New Madrid earthquakes uprooted entire forests and changed the course of the Mississippi River. The Charleston earthquake killed more than 60 people and caused damage to nearly every structure in the city, traces of which can still be seen today.

To prepare for the potential dangers of similar severe quakes in the future, seismologists construct hazard maps, which predict the extent of earthquake shaking that has a certain probability of occurring in a geographical area. The hazard maps take into account the possible magnitude of the next earthquake, the likely ground shaking, the time window in which the next quake is likely to occur, and whether earthquakes are time-dependent or time-independent processes.

It’s an admittedly “squishy” calculation, Stein says, even in places like California’s San Andreas zone that have experienced many more earthquakes in recent years and have been monitored by a blanket of instruments.

Stein and his colleagues have tested each of these variables, from magnitude to timing, to explore which factors may have the greatest effect on hazard mapping for the central U.S. But he says that the question of time-dependent or time-independent earthquakes is “the meatiest scientific question” among the mapping variables.

The question goes to the heart of how earthquakes work. For instance, most seismologists think there is a buildup of elastic strain in the earth before a quake occurs, and that the strain is relieved for a time by the quake. Under this scenario, a time-dependent model of earthquakes might make more sense to use in hazard maps. But it’s far from clear that the popular strain buildup model completely describes the physics of earthquakes, Stein says.

“It’s actually kind of embarrassing that we don’t know the answer to this,” Stein jokes. “But when you do this kind of thing, you want to have a healthy humility in the face of the complexities of nature.”

“Time-Dependent Seismic Hazard Maps for the New Madrid Seismic Zone and Charleston, South Carolina, Areas,” Hebden, J.S., and Stein, S., Department of Earth and Planetary Sciences, Northwestern University, Evanston, IL 60208.

Tiny tremors can track extreme storms in a warming planet

Data from faint earth tremors caused by wind-driven ocean waves, often dismissed as “background noise” at seismographic stations around the world, suggest extreme ocean storms have become more frequent over the past three decades, according to research presented at the annual meeting of the Seismological Society of America.

The Intergovernmental Panel on Climate Change (IPCC) and other prominent researchers have predicted that stronger and more frequent storms may occur as a result of global warming trends. The tiny tremors, or microseisms, offer a new way to discover whether these predictions are already coming true, said Richard Aster, a geophysics professor at the New Mexico Institute of Mining and Technology.

Unceasing as the ocean waves that trigger them, the microseisms show up as five- to 30-second oscillations of Earth’s surface at seismographic stations around the world. Even seismic monitoring stations “in the middle of a continent are sensitive to the waves crashing all around the continent,” Aster said.

As storm winds drive ocean waves higher, the microseism signals increase in amplitude as well, offering a unique way to track storm intensities across seasons, over time, and at different geographic locations. For instance, Aster and colleagues Daniel McNamara of the U.S. Geological Survey and Peter Bromirski of the Scripps Institution of Oceanography recently published an analysis in the Seismological Society of America journal Seismological Research Letters showing that microseism data collected around the Pacific Basin and throughout the world can be used to detect and quantify wave activity from multi-year events such as the El Niño and La Niña ocean disruptions.

The findings spurred them to look for a microseism signal that would reveal whether extreme storms were becoming more common in a warming world. In fact, they saw “a remarkable thing” in the worldwide microseism data collected from 1972 to 2008, Aster recalled: at all 22 of the stations included in the study, the number of extreme storm events had increased over time.
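The counting approach the team describes can be sketched in miniature. This uses synthetic data, not the researchers’ records, and the percentile threshold and upward drift are illustrative assumptions:

```python
# Illustrative sketch (synthetic data, not the study's records) of counting
# extreme storm events from a microseism amplitude series: threshold the
# amplitudes at a high percentile and tally exceedances per period.
import random

random.seed(0)
years = range(1972, 2009)
# Synthetic daily amplitudes with a small assumed upward drift in storminess.
series = {y: [random.gauss(1.0 + 0.005 * (y - 1972), 0.3) for _ in range(365)]
          for y in years}

all_values = sorted(v for vals in series.values() for v in vals)
threshold = all_values[int(0.99 * len(all_values))]   # 99th percentile

counts = {y: sum(v > threshold for v in series[y]) for y in years}
early = sum(counts[y] for y in list(years)[:10])
late = sum(counts[y] for y in list(years)[-10:])
print(f"extreme days 1972-81: {early}, 1999-2008: {late}")
```

With even a modest drift in the underlying amplitudes, the tail exceedances in the later decade clearly outnumber those in the earlier one, which is the kind of signal the study looked for station by station.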

While the work on evaluating changes in extreme storms is “still very much in its early stages,” Aster is “hoping that the study will offer a much more global look” at the effects of climate change on extreme storms and the wind-driven waves they produce. At the moment, most of the evidence linking the two comes from studies of hurricane intensity and of shoreline erosion in specific regions such as the Pacific Northwest and the Gulf of Mexico, he noted.

The researchers are also working on recovering and digitizing older microseism records, potentially creating a data set that stretches back to the 1930s. Aster praised the work of the long-term observatories that have collected the records, calling them a good example of the “Cinderella science” (unloved and overlooked) that often supports significant discoveries.

“It’s absolutely great data on the state of the planet. We took a prosaic time series, and found something very interesting in it,” he said.

Methane sources over the last 30,000 years

Dr. Hubertus Fischer cutting an ice core at Kohnen Station, Antarctica. (Credit: Gerald Traufetter)

New insights into natural changes in atmospheric methane concentrations

Ice cores are essential for climate research because they are the only archive that allows direct measurement of past atmospheric composition and greenhouse gas concentrations. Using novel isotopic studies, scientists from the European Project for Ice Coring in Antarctica (EPICA) have now identified the most important processes responsible for changes in natural methane concentrations over the transition from the last ice age into our warm period. The study, now published in the journal Nature, shows that wetland regions emitted significantly less methane during glacial times. In contrast, methane emissions from forest fires remained surprisingly constant from glacial to interglacial times.

In the current issue of Nature, members of the EPICA team publish new insights into natural changes in the atmospheric concentration of methane (CH4), the second most important greenhouse gas. The scientists present the first glacial/interglacial record of the carbon isotopic composition of methane (δ13CH4), providing essential information on the sources responsible for the observed CH4 concentration changes.

The well-known glacial/interglacial changes in atmospheric methane concentrations are quite drastic. Glacial concentrations averaged 350 ppbv (parts per billion by volume) and rose to approximately 700 ppbv over the last glacial/interglacial transition, with superimposed rapid shifts of about 200 ppbv connected to abrupt climate changes. Over the last centuries, human methane emissions have raised CH4 concentrations to approximately 1750 ppbv.

But what caused these substantial changes in natural atmospheric CH4 concentrations before humans had an impact? To answer this question, the scientists developed a new analytical method that allows them to quantify changes in the ratio of 12CH4 to 13CH4 in ice core samples. This ratio provides insight into the responsible methane sources.
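Such isotope ratios are conventionally reported in delta notation relative to a standard. The sketch below shows that standard formula; the sample ratio is an illustrative value typical of 13C-depleted wetland methane, not a measurement from the study:

```python
# How a carbon-isotope signature is expressed in standard delta notation.
# R is the 13C/12C ratio; VPDB is the conventional reference standard.
R_VPDB = 0.0112372          # commonly cited 13C/12C ratio of the VPDB standard

def delta13C(r_sample):
    """delta-13C in per mil (parts per thousand) relative to VPDB."""
    return (r_sample / R_VPDB - 1) * 1000

# An illustrative methane sample depleted in 13C, as wetland emissions tend to be.
r = 0.01056
print(f"{delta13C(r):.0f} per mil")
```

Different sources carry distinct delta-13C signatures (wetlands strongly depleted, biomass burning much less so), which is what lets the ratio discriminate among them.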

“These studies bring us much closer to a quantitative understanding of what happened with wetlands and methane in the past”, says Dr. Hubertus Fischer from the Alfred-Wegener-Institute for Polar and Marine Research, who is the lead author of the publication and coordinator of the gas studies on the EPICA ice cores. “This is essential to also improve our predictions of how the methane cycle will respond to an increased warming in the future”, he adds.

The study shows that tropical wetlands emitted substantially less CH4 during glacials, most likely because of changes in monsoonal precipitation patterns. Together with a reduced atmospheric lifetime, this explains major parts of the glacial CH4 reduction. In addition, boreal methane sources, wetlands at higher northern latitudes, were essentially switched off during the glacial because of the expansion of the northern ice sheets and the very cold temperatures at high northern latitudes. However, these high-latitude wetlands were quickly reactivated when rapid climate warming events occurred. Forest fires also emit a considerable amount of CH4, but these emissions remained surprisingly constant over time. The isotopic measurements show no signs of CH4 emissions from a destabilization of marine gas hydrate reservoirs as the climate warmed.

The current results were published by a team of scientists from Germany, France and Switzerland. As the German partner within EPICA, the Alfred-Wegener-Institute was responsible for drilling the ice core used in this study. In addition, it specializes in developing new analytical techniques to measure isotopes in greenhouse gases and in interpreting past changes in biogeochemical cycles. Coordinated by the European Science Foundation (ESF), EPICA is funded by the participating countries and the European Union. EPICA is one of the core projects of the AWI research program “Maritime, Coastal and Polar Systems” in the “Earth and Environment” research section of the Helmholtz-Gemeinschaft. For its outstanding effort and large impact on climate research, EPICA recently received the Descartes Prize for Transnational, Collaborative Research, awarded by the European Commission.

Unearthing clues of catastrophic earthquakes

‘An inviting tale of destruction’

The destruction and disappearance of ancient cultures mark the history of human civilization, making for fascinating stories and cautionary tales. The longevity of today’s societies may depend on separating fact from fiction, and archaeologists and seismologists are figuring out how to join forces to do just that with respect to ancient earthquakes, as detailed in new studies presented at the international conference of the Seismological Society of America.

“It’s an idea whose time has come,” said Robert Kovach, professor of geophysics at Stanford University and a leading proponent of including seismology in any framework for understanding what happened to past civilizations. Very large earthquakes may have recurrence intervals exceeding 500 years, making it very difficult to assign potential hazard estimates.

Archaeoseismology, a young scientific discipline that studies past earthquakes in the archaeological record, allows scientists to broaden the time window for detecting these rare seismic catastrophes. But archaeological evidence for past earthquakes raises reservations among seismologists, some of whom strongly question whether man-made structures can be used as earthquake indicators at all.

Controversy stems from what some seismologists see as archaeologists haphazardly blaming earthquakes for inexplicable phenomena at an archaeological site, adding drama to the site’s history. “We need to be wary of circular reasoning,” said Tina Niemi, a geologist at the University of Missouri-Kansas City, who noted the temptation to assign evidence to match a preconceived notion that an earthquake caused the damage.

“We are indeed at a turning point with respect to archaeoseismology — either earthquake evidence in archaeological sites remains in a world of conjecture and drama or a more objective and quantitative approach gets the upper hand,” said Manuel Sintubin, professor of geodynamics at Katholieke Universiteit Leuven in Belgium.

Earlier this month UNESCO awarded a five-year grant to Sintubin and his colleagues Niemi; Iain Stewart, geologist at University of Plymouth in the United Kingdom; and Erhan Altunel, geologist at the Eskisehir Osmangazi University in Turkey, to support archaeoseismology by broadening the field’s primary focus from the Near East to include the Far East.

“The importance of this effort is to create a long-term, worldwide platform for a broad multidisciplinary discussion on archaeoseismology. Our final objective is to assure that archaeoseismology will be considered as a legitimate and complementary source of seismic-hazard information.”

There is still much to be learned about ancient earthquakes. The instrumental record of seismology is short, going back only about 100 years. The historical record is much longer, including written documentation such as news accounts and diaries, but it varies widely by culture and region. The archaeoseismic record serves as the bridge between historical accounts and the paleoseismic record of Earth’s history.

“It’s important to society to understand the risks posed by earthquakes with longer repeat cycles,” said Kovach. “Unless the world was drastically different than it is today, it’s inconceivable that earthquakes did not play a role in the past, affecting the cultures that occupied the land along the faults, some of which we do not even know of yet.”

Seismologists look for evidence that suggests an earthquake’s footprint. Sintubin and Niemi cite three distinct types of evidence: faulted and displaced archaeological relics, or “cultural piercing features”; damage to buildings induced by ground shaking or by secondary phenomena such as tsunamis; and archaeological evidence such as repairs to man-made structures.

Kovach looks at the issue of water, such as the damming of rivers and the changing elevation of coasts. His research has focused on Banbhore, an inland city that was once the ancient coastal city of Debal, the gateway for Islam’s advent in the Indian subcontinent. According to Kovach, the site has witnessed at least four distinct Muslim occupations and three successive reconstructions that correlate with the written record of Arab historians. “There are numerous examples in the Indus Valley that earthquakes did affect the occupying history of these sites,” said Kovach. Today, most of Pakistan and the western states of India occupy the ancient Indus Valley, which experienced the earthquakes that, according to Kovach, altered the course of civilization there over the past millennium.

Sintubin and Stewart are proposing a standardized method for studying an archaeological site with the purpose of identifying ancient earthquakes and for evaluating existing archaeoseismological data. The research is in press at the Bulletin of the Seismological Society of America. Called the Archaeological Quality Factor, or AQF, this proposed evaluative approach documents the degree of certainty of an ancient earthquake recorded at a site. According to Sintubin, the approach reveals the weaknesses in any earthquake hypothesis for a site and constitutes a significant step toward the acknowledgement of archaeoseismology as a scientific discipline. Sintubin applied the method to research conducted at an excavation in Turkey. The resulting AQF (~5%) supports, with some certainty, the hypothesis that the region was struck in the 7th century AD by a previously unknown major earthquake.

While some remain cautious, others are eager to refine the role of earthquakes on past cultures. “A lot can be gleaned from going back to look at old reports,” said Kovach. “Past earthquakes have left an inviting tale of destruction.”

Archaeoseismological Methodologies: Principles and Practices, SSA Annual Convention, 1:30 – 5 PM, Wednesday, 16 April, in the Hilton Hotel, Mesa C

Absence of clouds caused pre-human supergreenhouse periods

In a world without human-produced pollution, biological productivity controls cloud formation and may be the lever that caused supergreenhouse episodes during the Cretaceous and Eocene, according to Penn State paleoclimatologists.

“Our motivation was the inability of climate models to reproduce the climate of the supergreenhouse episodes of the Cretaceous and Eocene adequately,” said Lee R. Kump, professor of geosciences. “People have tried increasing carbon dioxide in the models to explain the warming, but there are limits to the amounts that can be added because the existing proxies for carbon dioxide do not show such large amounts.”

In general, the proxies indicate that the Cretaceous and Eocene atmosphere never exceeded four times the current carbon dioxide level, which is not enough for the models to create supergreenhouse conditions. Some researchers have tried increasing the amount of methane, another greenhouse gas, but there are no proxies for methane. Another approach is to assume that ocean currents changed, but while researchers can insert new current information into the models, they cannot get the models to create these ocean current scenarios.

Kump and David Pollard, senior research associate, Earth and Environmental Systems Institute, looked for another way to create a world where mean annual temperatures in the tropics were above 100 degrees Fahrenheit and polar temperatures were in the 50-degree Fahrenheit range. Changing the Earth’s albedo (the amount of sunlight reflected back into space) by changing cloud cover will produce supergreenhouse events, the researchers report in today’s (April 11) issue of Science.

According to the researchers, changes in the production of cloud condensation nuclei, the tiny particles around which water condenses to form rain drops and cloud droplets, decreased Earth’s cloud cover and increased the sun’s warming effect during supergreenhouse events.

Normal cloud cover reflects about 30 percent of the sun’s energy back into space. Kump and Pollard were looking for a scenario that allowed in 6 to 10 percent more sunlight.
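
To see why letting in a few percent more sunlight matters, consider a zero-dimensional radiative-balance sketch. The 30 percent albedo and the 6 percent figure come from the text; the solar-constant value and the simple blackbody treatment are assumptions of the sketch, not the researchers’ model:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2 (modern value, assumed)

def emission_temperature(albedo):
    """Effective blackbody temperature of a planet in radiative balance."""
    absorbed = S0 * (1.0 - albedo) / 4.0   # globally averaged absorbed flux
    return (absorbed / SIGMA) ** 0.25

t_baseline = emission_temperature(0.30)   # ~30% of sunlight reflected today
t_dimmer = emission_temperature(0.24)     # 6% more sunlight absorbed

print(f"baseline: {t_baseline:.1f} K, dimmer clouds: {t_dimmer:.1f} K")
print(f"warming at the emission level: {t_dimmer - t_baseline:.1f} K")
```

Even this cartoon yields several degrees of warming at the emission level, before water vapor and other feedbacks amplify the change at the surface.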

“In today’s world, human generated aerosols, pollutants, serve as cloud condensation nuclei,” says Kump. “Biologically generated gases are dominant in the prehuman world. The abundance of these gases is correlated with the productivity of the oceans.”

Today, the air contains about 1,000 particles per cubic centimeter (less than a tenth of a cubic inch) that can serve as cloud condensation nuclei (CCN). Pristine ocean areas lacking human-produced aerosols are difficult to find, but in those areas algae produce dimethylsulfide that eventually forms CCN of sulfuric acid or methanesulfonic acid.

Algae’s productivity depends on the amounts of nutrients in the water and these nutrients come to the surface by upwelling driven by the winds. Warming would lead to ocean stratification and less upwelling.

“The Cretaceous was biologically unproductive due to less upwelling in the ocean and thermal stress on land and in the sea,” says Kump. “That means fewer cloud condensation nuclei.”

When there are large numbers of CCN, there are more, smaller cloud droplets, and consequently more cloud cover and brighter clouds. With fewer CCN, there are fewer, larger droplets. The limit to droplet size is 16 to 20 microns, because droplets that large are heavy enough to fall out as rain.
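
The droplet-size argument follows from dividing a fixed mass of liquid water among N nuclei, so droplet radius scales as N to the minus one-third. A small sketch; the liquid water content is an assumed, typical value, not a number from the article:

```python
import math

RHO_WATER = 1000.0   # kg m^-3
LWC = 3e-4           # liquid water content, kg per m^3 of air (~0.3 g/m^3, assumed)

def droplet_radius_um(ccn_per_cc):
    """Mean droplet radius when the available water is split among the CCN."""
    n_per_m3 = ccn_per_cc * 1e6
    volume_per_droplet = LWC / (RHO_WATER * n_per_m3)   # m^3
    radius_m = (3.0 * volume_per_droplet / (4.0 * math.pi)) ** (1.0 / 3.0)
    return radius_m * 1e6   # metres -> microns

print(droplet_radius_um(1000))   # aerosol-rich air: many small droplets
print(droplet_radius_um(10))     # CCN-starved air: droplets near the rain-out size
```

Cutting CCN from 1,000 to 10 per cubic centimeter grows the droplets from roughly 4 microns into the 16-to-20-micron range at which they fall out as rain.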

“We began with the assumption that what would change was not the extent of clouds, but their brightness,” says Kump. “The mechanism would lead to reduced reflection but not cloudiness.”

What they found was that the clouds were less bright and that there were also fewer clouds. If they lowered the production of biogenic CCNs too much, their model created a world with remarkable warming inconsistent with life. However, they could alter the productivity in the model to recreate the temperature regime during supergreenhouse events.

“The model reduces cloud cover from about 64 percent to 55 percent which lets in a large amount of direct sunlight,” Kump says. “The increased breaks in the clouds, fewer clouds and less reflective clouds produced the amount of warming we were looking for.”

Journey to the Center of the Earth: Discovery Sheds Light on Mantle Formation

Geologist unearths ancient rocks from ocean floor dating back two billion years

Uncovering a rare, two-billion-year-old window into the Earth’s mantle, a University of Houston professor and his team have found our planet’s geological history is more complex than previously thought.

Jonathan Snow, assistant professor of geosciences at UH, led a team of researchers in a North Pole expedition, resulting in a discovery that could shed new light on the mantle, the vast layer that lies beneath the planet’s outer crust. These findings are described in a paper titled “Ancient, highly heterogeneous mantle beneath Gakkel Ridge, Arctic Ocean,” appearing recently in Nature, the weekly scientific journal for biological and physical sciences research.

These two-billion-year-old rocks that time forgot were found along the bottom of the Arctic Ocean floor, unearthed during research voyages in 2001 and 2004 to the Gakkel Ridge, an approximately 1,000-mile-long underwater mountain range between Greenland and Siberia. This massive underwater mountain range forms the border between the North American and Eurasian plates beneath the Arctic Ocean, where the two plates diverge.

These were the first major expeditions ever undertaken to the Gakkel Ridge, and these latest published findings are the fruit of several years of research and millions of dollars spent to retrieve and analyze these rocks.

The mantle, the rock layer that comprises about 70 percent of the Earth’s mass, sits several miles below the planet’s surface. Mid-ocean ridges like Gakkel, where mantle rock slowly pushes upward to form new volcanic crust as the tectonic plates move apart, are one place geologists look for clues about the mantle. Gakkel Ridge is unique because it features, at some locations, the least volcanic activity and most mantle exposure ever discovered on a mid-ocean ridge, allowing Snow and his colleagues to recover many mantle samples.

“I just about fell off my chair,” Snow said. “We can’t exaggerate how important these rocks are – they’re a window into that deep part of the Earth.”

Venturing out aboard a 400-foot-long research icebreaker, Snow and his team sifted through thousands of pounds of rocks scooped up from the ocean floor by the ship’s dredging device. The samples were labeled and cataloged and then cut into slices thinner than a human hair to be examined under a microscope. That is when Snow realized he had found something that, for many geologists, is as rare and fascinating as moon rocks: mantle rocks devoid of sea floor alteration. Analysis of the isotopes of osmium, a noble metal rarer than platinum, within the mantle rocks indicated they were two billion years old. The use of osmium isotopes underscores the significance of the results, because using them for this type of analysis is still a new, innovative and difficult technique.
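
One standard way such osmium ages are derived is a rhenium-depletion model age: ancient melting strips rhenium (whose 187Re isotope decays to 187Os) out of the rock, freezing its 187Os/188Os ratio below the chondritic reference, and the size of that deficit dates the melting event. In the sketch below the decay constant and chondritic values are standard reference numbers, but the sample ratio is a hypothetical illustration chosen to give an age near two billion years, not a measurement from this study:

```python
import math

LAMBDA_RE187 = 1.666e-11     # 187Re decay constant, per year
CHONDRITE_OS = 0.1270        # present-day chondritic 187Os/188Os
CHONDRITE_RE_OS = 0.402      # chondritic 187Re/188Os

def re_depletion_age(sample_os):
    """Years since the sample's Os ratio diverged from chondritic evolution."""
    return math.log(1.0 + (CHONDRITE_OS - sample_os) / CHONDRITE_RE_OS) / LAMBDA_RE187

age = re_depletion_age(0.113)   # hypothetical unradiogenic mantle sample
print(f"model age: {age / 1e9:.2f} billion years")
```

The lower the sample’s 187Os/188Os relative to chondrites, the longer ago rhenium was stripped out, which is how an unradiogenic Arctic sample can signal a two-billion-year-old melting event.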

Since the mantle is slowly moving and churning within the Earth, geologists believe the mantle is a layer of well-mixed rock. Fresh mantle rock wells up at mid-ocean ridges to create new crust. As the tectonic plates move, this crust slowly makes its way to a subduction zone, a plate boundary where one plate slides underneath another and the crust is pushed back into the mantle from which it came.

Because this process takes about 200 million years, it was surprising to find rocks that had not been remixed inside the mantle for two billion years. The discovery of the rocks suggests the mantle is not as well-mixed or homogenous as geologists previously believed, revealing that the Earth’s mantle preserves an older and more complex geologic history than previously thought. This opens the possibility of exploring early events on Earth through the study of ancient rocks preserved within the Earth’s mantle.

The rocks were found during two expeditions Snow and his team made to the Arctic, each lasting about two months. The voyages were undertaken while Snow was a research scientist at the Max Planck Institute in Germany, and the laboratory study was done by his research team that now stretches from Hawaii to Houston to Beijing.

Since coming to UH in 2005, Snow’s work stemming from the Gakkel Ridge samples has continued, with more research needed to determine exactly why these rocks remained unmixed for so long. Further study using a laser microprobe technique for osmium analysis available only in Australia is planned for next year.

Geologists Discover New Way of Estimating Size and Frequency of Meteorite Impacts

Geologists have discovered a new way of estimating the size of impacts from meteorites. - Credit: NASA

Meteorite linked to mass extinction 65 million years ago was four to six kilometers in diameter

Scientists have developed a new way of determining the size and frequency of meteorites that have collided with Earth.

Their work shows that the size of the meteorite that likely plummeted to Earth at the time of the Cretaceous-Tertiary (K-T) boundary 65 million years ago was four to six kilometers in diameter. The meteorite was the trigger, scientists believe, for the mass extinction of dinosaurs and other life forms.

François Paquay, a geologist at the University of Hawaii at Manoa (UHM), used variations (isotopes) of the rare element osmium in sediments at the ocean bottom to estimate the size of these meteorites. The results are published in this week’s issue of the journal Science.

When meteorites collide with Earth, they deliver osmium with a different isotope ratio than that normally seen throughout the oceans.

“The vaporization of meteorites carries a pulse of this rare element into the area where they landed,” says Rodey Batiza of the National Science Foundation (NSF)’s Division of Ocean Sciences, which funded the research along with NSF’s Division of Earth Sciences. “The osmium mixes throughout the ocean quickly. Records of these impact-induced changes in ocean chemistry are then preserved in deep-sea sediments.”

Paquay analyzed samples from two sites, Ocean Drilling Program (ODP) site 1219 (located in the Equatorial Pacific), and ODP site 1090 (located off of the tip of South Africa) and measured osmium isotope levels during the late Eocene period, a time during which large meteorite impacts are known to have occurred.

“The record in marine sediments allowed us to discover how osmium changes in the ocean during and after an impact,” says Paquay.

The scientists expect that this new approach to estimating impact size will become an important complement to the better-known method based on iridium.

Paquay, along with co-author Gregory Ravizza of UHM and collaborators Tarun Dalai from the Indian Institute of Technology and Bernhard Peucker-Ehrenbrink from the Woods Hole Oceanographic Institution, also used this method to make estimates of impact size at the K-T boundary.

Even though this method works well for the K-T impact, it would break down for an event larger than that: the meteorite contribution of osmium to the oceans would overwhelm existing levels of the element, researchers believe, making it impossible to sort out the osmium’s origin.

Under the assumption that all the osmium carried by meteorites is dissolved in seawater, the geologists were able to use their method to estimate the size of the K-T meteorite as four to six kilometers in diameter.
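
The logic can be sketched as a two-endmember mass balance: how much chondritic osmium must dissolve into the ocean’s inventory to pull the seawater isotope ratio to the observed post-impact value, and how large a chondrite carries that much osmium. Every number below (the ocean Os inventory, the isotope ratios, the chondritic Os concentration and density) is an illustrative assumption, not a value from the paper:

```python
import math

OCEAN_OS_KG = 1.4e7    # dissolved Os in the ocean, kg (order-of-magnitude assumption)
R_SEAWATER = 0.40      # pre-impact seawater 187Os/188Os (assumed)
R_METEORITE = 0.13     # chondritic 187Os/188Os (assumed)
R_OBSERVED = 0.20      # post-impact mixed ratio in the sediments (assumed)
OS_PPB = 500e-9        # Os concentration in chondrites, kg/kg (assumed)
DENSITY = 2500.0       # bulk chondrite density, kg/m^3 (assumed)

# Fraction of the mixed Os budget that must be meteoritic (simple linear mixing)
f = (R_SEAWATER - R_OBSERVED) / (R_SEAWATER - R_METEORITE)
meteoritic_os = f / (1.0 - f) * OCEAN_OS_KG   # kg of Os delivered by the impactor

impactor_mass = meteoritic_os / OS_PPB        # kg of chondrite needed
volume = impactor_mass / DENSITY              # m^3
diameter_km = (6.0 * volume / math.pi) ** (1.0 / 3.0) / 1000.0
print(f"implied impactor diameter: {diameter_km:.1f} km")
```

Even with these placeholder inputs the implied diameter lands near four kilometers, the low end of the range the researchers report; the actual study rests on measured inventories and isotope records rather than these stand-in values.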

The potential for recognizing previously unknown impacts is an important outcome of this research, the scientists say.

“We know there were two big impacts, and can now give an interpretation of how the oceans behaved during these impacts,” says Paquay. “Now we can look at other impact events, both large and small.”