New research puts focus on earthquake, tsunami hazard for southern California

Scientists will convene in San Diego to present the latest seismological research at the annual conference of the Seismological Society of America (SSA), April 17-19.

This year’s meeting is expected to draw a record number of registrants, with more than 630 scientists in attendance, and will feature 292 oral presentations and 239 poster presentations.

“For over 100 years the Annual Meeting of SSA has been the forum of excellence for presenting and discussing exciting new developments in seismology research and operations in the U.S. and globally,” said Christa von Hillebrandt-Andrade, president of SSA, a scientific society devoted to the advancement of earthquake science. von Hillebrandt-Andrade is manager of the NOAA National Weather Service Caribbean Tsunami Warning Program in Puerto Rico.

A special public town hall meeting is scheduled for the evening of April 17, featuring talks by experts on the seismic hazard to San Diego from future earthquakes and tsunamis.

“We are extremely excited by the range, depth, and quality of science to be presented at this meeting,” said David Oglesby, associate professor of earth sciences at the University of California, Riverside. “The meeting will cover all aspects of seismology and earthquake science, from geology to numerical models, and from seismograms to tsunamis. Our location near the US-Mexican border also helps to illuminate the exciting opportunities in international scientific collaboration,” said Oglesby, who is a co-organizer of the conference program along with Raul Castro, a seismologist at the Centro de Investigación Científica y de Educación Superior de Ensenada, Baja California.

The presentations by the international gathering of seismologists will focus on a broad range of topics, covering the Earth’s surface to its center. Some highlights that focus more closely on the San Diego area include:

Downtown San Diego:

The city of San Diego sits atop a fault system that poses considerable seismic hazard to the region’s millions of residents. In an evaluation by Ivan Wong and colleagues from URS Corporation, an international engineering consulting firm, the potential hazard from both strong ground shaking and surface faulting was quantified for the downtown area. Several rupture scenarios of the Rose Canyon fault system were considered, including rupture of the associated San Diego fault that traverses downtown San Diego. The surface faulting hazard for locations along the San Diego fault is estimated to be low because of its low rate of activity, but the ground shaking hazard is probably high throughout much of San Diego because of the distributed nature of the Rose Canyon fault system.

The behavior of the Rose Canyon fault system as it traverses San Diego is poorly understood. It is unclear what roles individual faults in the system would play in a large magnitude 7+ earthquake in the vicinity of San Diego Bay and the downtown area, and how often such events may occur. “It is clear, however, that the threat to the city from a future large earthquake is considerable and that research is needed to define what that level of hazard is,” said Ivan Wong, principal seismologist and vice president of URS Corporation.

San Jacinto Fault Zone:

Geophysicist Tom Rockwell and colleagues from San Diego State University will describe the latest research findings on the San Jacinto Fault (SJF) Zone, a seismically active, major component of the overall southern San Andreas Fault system that is of particular importance to the San Diego region. They have mapped evidence of past ruptures consistent with very large earthquakes along the Clark Fault, an individual strand of the SJF.

Tom Rockwell and other presenters will discuss their work at a news briefing on April 19, beginning at 12:10 p.m. (local time) in the Terrace Salon 2 room of the Town and Country Resort and Convention Hotel.

Offshore faults:

A new map of active faults off the coast of southern California could clarify some of the earthquake hazard for the region, say Jaime Conrad of the U.S. Geological Survey and colleagues. Although this area is crisscrossed by faults, the seismic hazard posed by their activity isn’t well understood, partly because it’s unclear how much the faults slip and how they interact.

The new map covers a series of faults in the near-shore portion of the region known as the Inner Continental Borderland, located between the coast and the San Clemente fault, about 35-40 miles offshore. The crumpled and uplifted seafloor from Santa Monica Bay to the Mexican border includes several high-angle, north-south-trending faults. Using high-resolution seismic reflection data from a number of sources, including sonar surveys conducted from research ships and unmanned underwater vehicles, the researchers were able to revise the current map in some surprising ways. The data show linkages between faults that were not known previously, for example, and in some cases indicate a fault slip rate of 1-2 millimeters per year.

Researchers call for a new direction in oil spill research

Inadequate knowledge about the effects of deepwater oil well blowouts such as the Deepwater Horizon event of 2010 threatens scientists’ ability to help manage and assess comparable events in the future, according to an article that a multi-author group of specialists will publish in the May issue of BioScience. Even federal “rapid response” grants awarded to study the Deepwater Horizon event were far more focused on near-surface effects than on the deepwater processes that the BioScience authors judge to be most in need of more research.

The article, by a team led by Charles H. Peterson of the University of North Carolina, argues that a fundamentally new approach to the study of deepwater oil spills is needed. Previous research has focused mainly on effects on organisms found near the sea surface and on coasts. The new approach would also stress how oil and associated gas released at depth move through the sea and affect subsurface and bottom-dwelling organisms. The new approach is all the more important because the oil industry is now putting most of its exploration efforts into deep water.

Peterson and his colleagues point out that existing policies and legislation have notably failed to provide for research initiated promptly after a spill has been detected. This has prevented studies that might have guided emergency response procedures two years ago, in particular studies of the effects of chemical dispersants. These were used extensively while the Deepwater Horizon spill was in progress, although there is little consensus on their effectiveness.

There remain “serious gaps” in background information needed for longer-term assessments of comparable spills, according to Peterson and his coauthors. Much more information is needed about deep-sea ecology and the processes by which oil released at depth is degraded by microbes, for example. The gaps impede not only litigation and improvement of government policy, but also attempts to restore damaged ecosystems.

New method to prevent undersea ice clogs

During the massive oil spill from the ruptured Deepwater Horizon well in 2010, it seemed at first like there might be a quick fix: a containment dome lowered onto the broken pipe to capture the flow so it could be pumped to the surface and disposed of properly. But that attempt quickly failed, because the dome almost instantly became clogged with frozen methane hydrate.

Methane hydrates, which can freeze upon contact with cold water in the deep ocean, are a chronic problem for deep-sea oil and gas wells. Sometimes these frozen hydrates form inside the well casing, where they can restrict or even block the flow, at enormous cost to the well operators.

Now researchers at MIT, led by associate professor of mechanical engineering Kripa Varanasi, say they have found a solution, described recently in the journal Physical Chemistry Chemical Physics. The paper’s lead author is J. David Smith, a graduate student in mechanical engineering.

The deep sea is becoming “a key source” of new oil and gas wells, Varanasi says, as the world’s energy demands continue to increase rapidly. But one of the crucial issues in making these deep wells viable is “flow assurance”: finding ways to avoid the buildup of methane hydrates. Presently, this is done primarily through the use of expensive heating systems or chemical additives.

“The oil and gas industries currently spend at least $200 million a year just on chemicals” to prevent such buildups, Varanasi says; industry sources say the total figure for prevention and lost production due to hydrates could be in the billions. His team’s new method would instead use passive coatings on the insides of the pipes that are designed to prevent the hydrates from adhering.

These hydrates form a cage-like crystalline structure, called a clathrate, in which molecules of methane are trapped in a lattice of water molecules. Although they look like ordinary ice, methane hydrates form only under very high pressure: in deep waters or beneath the seafloor, Smith says. By some estimates, the total amount of methane (the main ingredient of natural gas) contained in the world’s seafloor clathrates greatly exceeds the total known reserves of all other fossil fuels combined.

Inside the pipes that carry oil or gas from the depths, methane hydrates can attach to the inner walls – much like plaque building up inside the body’s arteries – and, in some cases, eventually block the flow entirely. Blockages can happen without warning, and in severe cases require the blocked section of pipe to be cut out and replaced, resulting in long shutdowns of production. Present prevention efforts include expensive heating or insulation of the pipes or additives such as methanol dumped into the flow of gas or oil. “Methanol is a good inhibitor,” Varanasi says, but is “very environmentally unfriendly” if it escapes.

Varanasi’s research group began looking into the problem before the Deepwater Horizon spill in the Gulf of Mexico. The group has long focused on ways of preventing the buildup of ordinary ice – such as on airplane wings – and on the creation of superhydrophobic surfaces, which prevent water droplets from adhering to a surface. So Varanasi decided to explore the potential for creating what he calls “hydrate-phobic” surfaces to prevent hydrates from adhering tightly to pipe walls. Because methane hydrates themselves are dangerous, the researchers worked mostly with a model clathrate hydrate system that exhibits similar properties.

The study produced several significant results: First, by using a simple coating, Varanasi and his colleagues were able to reduce hydrate adhesion in the pipe to one-quarter of the amount on untreated surfaces. Second, the test system they devised provides a simple and inexpensive way of searching for even more effective inhibitors. Finally, the researchers also found a strong correlation between the “hydrate-phobic” properties of a surface and its wettability – a measure of how well liquid spreads on the surface.

The basic findings also apply to other adhesive solids, Varanasi says – for example, solder adhering to a circuit board, or calcite deposits inside plumbing lines – so the same testing methods could be used to screen coatings for a wide variety of commercial and industrial processes.

Has the Dead Sea used up its 9 lives?

The rapidly dropping water level of the Dead Sea, the lowest point on the earth’s surface and long heralded for its medicinal properties, has been a source of ecological concern for years. Now a drilling project led by researchers from Tel Aviv University and Hebrew University reveals that water levels have risen and fallen by hundreds of meters over the last 200,000 years.

Directed by Prof. Zvi Ben-Avraham of TAU’s Minerva Dead Sea Research Center and Prof. Mordechai Stein of the Geological Survey of Israel, researchers drilled 460 meters beneath the sea floor and extracted sediments spanning 200,000 years. The material recovered revealed the region’s past climatic conditions and may allow researchers to forecast future changes.

Layers of salt indicated several periods of dryness and very little rainfall, causing water to recede and salt to gather at the center of the lake. During the last interglacial period, approximately 120,000 years ago, the sea came close to drying up entirely, the researchers found, with another period of extreme dryness taking place about 13,000 years ago.

Today, the Dead Sea lies 426 meters below sea level and is receding rapidly. Despite this historical precedent, there is still cause for concern, says Prof. Ben-Avraham. In the past the change was climate-driven, the result of natural conditions; today, the lake is threatened by human activity.

“What we see happening in the Middle East is something that mimics a severe dry period, but this is not climate-enforced, this is a man-made phenomenon,” he warns, caused by increasing amounts of water being taken from rivers for irrigation before it reaches the Dead Sea. Ultimately, this prevents the refilling of the sea by the waters of the Jordan River.

Shifting sands

Sand in an hourglass might seem simple and straightforward, but such granular materials are actually tricky to model. From far away, flowing sand resembles a liquid, streaming down the center of an hourglass like water from a faucet. But up close, one can make out individual grains that slide against each other, forming a mound at the base that holds its shape, much like a solid.

Sand’s curious behavior – part fluid, part solid – has made it difficult for researchers to predict how it and other granular materials flow under various conditions. A precise model for granular flow would be particularly useful in optimizing processes such as pharmaceutical manufacturing and grain production, where tiny pills and grains pour through industrial chutes and silos in mass quantities. When they aren’t well-controlled, such large-scale flows can cause blockages that are costly and sometimes dangerous to clear.

Now Ken Kamrin of MIT’s Department of Mechanical Engineering has come up with a model that predicts the flow of granular materials under a variety of conditions. The model improves on existing models by taking into account one important factor: how the size of a grain affects the entire flow. Kamrin used the new model to predict sand flow in several configurations – including a chute and a circular trough – and found that the model’s predictions were a near-perfect match with actual results. A paper detailing the new model will appear in the journal Physical Review Letters.

“The basic equations governing water flow have been known for over a century,” says Kamrin, the Class of ’56 Career Development Assistant Professor of Mechanical Engineering. “There hasn’t been something similar for sand, where I can give you a cupful of sand, and tell you which equations will be necessary to predict how it will squish around if I squeeze the cup.”

Blurring the lines

Kamrin explains that developing a flow model – also known as a continuum model – essentially means “blurring out” individual grains or molecules. While a computer may be programmed to predict the behavior of every single molecule in, say, a cup of flowing water, Kamrin says this exercise would take years. Instead, researchers have developed continuum models. They imagine dividing the cup into a patchwork of tiny cubes of water, each cube small compared to the size of the entire flow environment, yet large enough to contain many molecules and molecular collisions. Researchers can perform basic lab experiments on a single cube of water, analyzing how the cube deforms under different stresses. To efficiently predict how water flows in the cup, they solve a differential equation that applies the behavior of a single cube to every cube in the cup’s grid.
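To make the cube picture concrete, here is a minimal sketch of a continuum solver for an ordinary fluid. Everything in it is an illustrative assumption rather than code from Kamrin’s work: a channel of unit width, a Newtonian rule for each cell, and a simple relaxation scheme for the differential equation that ties the cells together.

```python
# Minimal continuum-model sketch (illustrative, not Kamrin's model):
# steady pressure-driven flow of a Newtonian fluid across a channel,
# solved on a grid of small cells. Each cell obeys the same local rule
# (stress = viscosity * velocity gradient); a differential equation
# couples every cell to its neighbors.
import numpy as np

n = 101                # number of cells across the channel
h = 1.0 / (n - 1)      # cell size (channel width = 1)
mu = 1.0               # viscosity: the "single cube" material property
dpdx = -1.0            # imposed pressure gradient driving the flow

# Steady momentum balance: mu * d2u/dy2 = dpdx, with u = 0 at both walls.
# Solve by Jacobi relaxation over the grid of cells.
u = np.zeros(n)
for _ in range(20000):
    u[1:-1] = 0.5 * (u[2:] + u[:-2] - h**2 * dpdx / mu)

# Result: the classic parabolic velocity profile, fastest at mid-channel.
print(f"max velocity ~= {u.max():.4f} (exact: {-dpdx / (8 * mu):.4f})")
```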

Such models work well for fluids like water, which is easily divisible into particles that are almost infinitesimally small. However, grains of sand are much larger than water molecules – and Kamrin found that the size of an individual grain can significantly affect the accuracy of a continuum model.

For example, a model can precisely estimate how water molecules flow in a cup, mainly because the size of a molecule is so much smaller than the cup itself. For the same relative scale in the flow of sand grains, Kamrin says, the sand’s container would have to be the size of San Francisco.
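A back-of-envelope version of that comparison, using rough typical sizes (assumed here, not figures from Kamrin’s paper):

```python
# Scale comparison sketch; all sizes are assumed typical values.
water_molecule = 3e-10   # m, rough diameter of a water molecule
cup = 0.1                # m, height of a drinking cup
sand_grain = 1e-4        # m, diameter of a fine sand grain

ratio = cup / water_molecule      # ~3e8 molecule-widths across the cup
container = sand_grain * ratio    # container giving sand the same ratio
print(f"equivalent container: ~{container / 1000:.0f} km across")
# ~33 km: roughly the scale of the San Francisco area
```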

Neighboring chatter

But why exactly does size matter? Kamrin reasons that when modeling water flow, molecules are so small that their effects stay within their respective cubes. As a result, a model that averages the behavior of every cube in a grid, and assumes each cube is a separate entity, gives a fairly accurate flow estimate. In granular flow, however, Kamrin says much larger grains such as sand can “bleed over” into neighboring cubes, creating cascade effects that are not accounted for in existing models.

“There’s more chatter between neighbors,” Kamrin says. “It’s like the basic mechanical properties of a cube of grains become influenced by the movement of neighboring cubes.”

Kamrin modified equations for an existing continuum model to factor in grain size, and tested his model on several configurations, including sand flowing through a chute and rotating in a circular trough. The new model not only predicted areas of fast-flowing grains, but also where grains would be slow moving, at the very edges of each configuration – areas traditional models assumed would be completely static. The new model’s predictions matched very closely with particle-by-particle simulations in the same configurations.
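The shape of such a grain-size correction can be sketched in a few lines. The relation below is loosely patterned on the nonlocal “fluidity” idea in Kamrin’s work, in which a flow-rate field g satisfies g − ξ²∇²g = g_local with a cooperativity length ξ proportional to the grain diameter; the profile, numbers, and boundary conditions here are illustrative assumptions, not values from the paper.

```python
# Sketch of a nonlocal, grain-size-aware correction to a local flow model.
import numpy as np

n = 201
y = np.linspace(0.0, 1.0, n)   # position across the channel
h = y[1] - y[0]

# Assumed local flow-rate ("fluidity") profile: flowing in the channel
# centre, exactly zero near the walls under a purely local model.
g_local = np.clip(1.0 - 8.0 * (y - 0.5) ** 2, 0.0, None)

d = 0.02        # grain diameter, in channel-width units (assumed)
xi = 2.0 * d    # cooperativity length, assumed proportional to d

# Relax g - xi^2 * g'' = g_local, with g = 0 at the walls.
g = g_local.copy()
w = (xi / h) ** 2
for _ in range(50000):
    g[1:-1] = (g_local[1:-1] + w * (g[2:] + g[:-2])) / (1.0 + 2.0 * w)

# The nonlocal term lets flow "bleed" into zones a local model calls static:
print("static fraction, local model   :", np.mean(g_local == 0.0))
print("static fraction, nonlocal model:", np.mean(g < 1e-6))
```

In this toy setting, the purely local profile declares roughly 30 percent of the channel static, while the grain-size-aware version predicts slow creeping motion almost everywhere except at the walls – the same qualitative behavior the new model captured at the edges of the chute and trough.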

The model, run on a computer, can produce accurate flow fields in minutes, and could benefit engineers designing manufacturing processes for pharmaceuticals and agricultural products. For example, Kamrin says, engineers could test various shapes of chutes and troughs in the model to find a geometry that maximizes flow, or mitigates potentially dangerous wall pressure, before ever actually designing or building equipment to process granular materials.

Kamrin says understanding how granular materials flow could also help predict geological phenomena such as landslides and avalanches and help engineers come up with new ways to generate better traction in sand.

“Granular material is the second-most-handled material in industry, second only to water,” Kamrin says. “I’m convinced there are a million applications.”

New online portal, app provide information on tsunami zones in the Northwest

A new online portal and a suite of free smartphone apps are providing information on tsunami zones in the U.S. Pacific Northwest. The Pacific Northwest Tsunami Evacuation Zones portal and apps provide an at-a-glance view of tsunami hazard zones along the coasts of Oregon and Washington.

“These are potentially life-saving tools now available for free to the people who live, work and play on our ocean and coastal waters in the Northwest,” said Zdenka Willis, U.S. Integrated Ocean Observing System (IOOS®) program director.

The Northwest Association of Networked Ocean Observing Systems (NANOOS), a regional IOOS® member, developed the tool and launched it in partnership with the Oregon Department of Geology and Mineral Industries and the Washington State Department of Natural Resources, the agencies responsible for the original development of the evacuation zones.

“The system integrates maps and allows users to see if they are in an evacuation zone, as well as plan evacuation routes,” said Jan Newton, executive director for NANOOS. “Planning tools like this are essential to safeguarding lives and property.”

View the online portal or search TsunamiEvac-NW in the iTunes App Store and Android Market.

IOOS® is a federal, regional, and private-sector partnership working to enhance our ability to collect, deliver and use ocean information. IOOS® delivers the data and information needed to increase understanding of our oceans and coasts, so that decision-makers can act to improve safety, enhance the economy, and protect our environment.

NOAA’s National Tsunami Hazard Mitigation Program provided funding to develop tsunami inundation models used to map the evacuation zones displayed in the app.

Ice sheet collapse and sea-level rise at the Bølling warming 14,600 years ago

International scientists have shown that a dramatic sea-level rise occurred at the onset of the first warm period of the last deglaciation, known as the Bølling warming, approximately 14,600 years ago. This event, referred to as Melt-Water Pulse 1A (MWP-1A), corresponds to a rapid collapse of massive ice sheets and resulted in a global sea-level rise of ~14 m. These findings are published in the 29 March 2012 issue of the journal Nature (Volume 483, Issue 7391).
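For a sense of scale, ~14 m of global sea-level rise corresponds to an enormous volume of melted ice; a rough estimate using textbook values (not figures from the study):

```python
# Rough scale of Melt-Water Pulse 1A; all inputs are textbook values.
ocean_area = 3.6e14   # m^2, approximate area of the world ocean
rise = 14.0           # m, sea-level rise attributed to MWP-1A

water_volume = ocean_area * rise / 1e9        # km^3 of meltwater
ice_volume = water_volume * 1000.0 / 917.0    # km^3 of ice (density 917 kg/m^3)
print(f"~{water_volume / 1e6:.1f} million km^3 of water, "
      f"from ~{ice_volume / 1e6:.1f} million km^3 of ice")
# For comparison, the Greenland Ice Sheet today holds ~3 million km^3.
```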

A collaboration between CEREGE (UMR Aix-Marseille Univ. – CNRS – IRD – Collège de France) and the Universities of Oxford and Tokyo allowed an international science team to publish research stemming from the Tahiti Sea-Level Expedition 310 of the Integrated Ocean Drilling Program (IODP). The Tahiti Sea-Level Expedition was carried out in 2005 by the European Consortium for Ocean Research Drilling (ECORD) and the ECORD Science Operator (ESO) on behalf of IODP.

Using U-Th dating of coral samples obtained from cores drilled in the Tahiti coral reefs, the researchers were able to reconstruct sea-level rise over the last deglaciation. Coral is extremely sensitive to sea-level changes and fossilized corals are therefore an excellent indicator for sea-level changes over time.

“Corals are outstanding archives to reconstruct past sea-level changes as they can be dated to within plus or minus 30 years stretching back thousands of years. Moreover, Tahitian reefs are ideally located to reconstruct the deglacial sea-level rise and to constrain short-term events that are thought to have punctuated the period between the Last Glacial Maximum and the present day. Tahiti is located at a sufficiently considerable distance from the major former ice sheets to give us close to the average of sea levels across the globe; as a volcanic island it is also subsiding into the ocean at a steady pace that we can easily adjust for,” said Pierre Deschamps, the study’s first author, based at CEREGE.
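The dating principle itself is radioactive ingrowth: living coral takes up uranium from seawater but almost no thorium, so the ²³⁰Th that accumulates in the skeleton records the time since the coral grew. A much-simplified sketch (assuming a closed system, no initial ²³⁰Th, and ignoring the ²³⁴U correction that real coral chronologies apply; the measured ratio is hypothetical):

```python
# Simplified U-Th age calculation: activity(230Th/238U) = 1 - exp(-lam * t).
import math

half_life_230th = 75_000.0            # years, approximate 230Th half-life
lam = math.log(2) / half_life_230th   # decay constant

measured_activity = 0.13              # hypothetical 230Th/238U activity ratio
age = -math.log(1.0 - measured_activity) / lam
print(f"age ~= {age:,.0f} years")     # ~15,000 years: deglacial in scale
```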

The Nature article presents the view that most of the meltwater that contributed to MWP-1A was sourced from the Antarctic ice sheet, highlighting the dynamic behavior of this ice sheet in the past. The authors note that further research is needed using cored fossilized corals to better understand the sequence of events related to ice sheet collapse during the last deglaciation.

Today, half of the world’s population, approximately 3.2 billion people, lives within 200 km of a coastline, and a tenth of the population lives less than 10 meters above sea level. Tahiti itself is, of course, at risk from modern sea-level rise. The government of Tahiti cooperated on IODP Expedition 310 by providing the necessary regulatory clearances to drill on the fossilized coral reefs.

Pierre Deschamps says, “Insights into past sea-level changes may help to better constrain future changes. Our work sheds light onto an extreme event of rise in global sea levels in which ice-sheet collapse coincided with a rapid warming. Whether the freshwater pulse was a result of an already warming world or helped to warm the climate is currently unclear. However, our finding will help scientists currently modelling future climate change scenarios to factor in the dynamic behaviour of major ice sheets and finally to provide more reliable predictions of ice sheet responses to a warming climate.”

Much is yet unknown about the dynamics of sea-level change in response to massive water discharge. However, IODP Expedition 310 has helped international scientists shed light on one of the most important climate events of the last deglaciation, the MWP-1A event, based on fossil corals obtained from off the coast of Tahiti.

Satellite observes rapid ice shelf disintegration in Antarctic

This image shows radar images from the Envisat satellite from 2002 to 2012 of the Larsen B ice shelf in Antarctica. Over the last decade, the ice shelf has lost 1790 sq km of its area. – ESA

One of the satellite’s first observations following its launch on 1 March 2002 was of the break-up of a main section of the Larsen B ice shelf in Antarctica, when 3200 sq km of ice disintegrated within a few days due to mechanical instabilities of the ice masses triggered by climate warming.

Now, with ten years of observations using its Advanced Synthetic Aperture Radar (ASAR), Envisat has mapped an additional loss in Larsen B’s area of 1790 sq km over the past decade.

The Larsen Ice Shelf is a series of three shelves – A (the smallest), B and C (the largest) – that extend from north to south along the eastern side of the Antarctic Peninsula.

Larsen A disintegrated in January 1995. Larsen C so far has been stable in area, but satellite observations have shown thinning and an increasing duration of melt events in summer.

“Ice shelves are sensitive to atmospheric warming and to changes in ocean currents and temperatures,” said Prof. Helmut Rott from the University of Innsbruck.

“The northern Antarctic Peninsula has been subject to atmospheric warming of about 2.5°C over the last 50 years – a much stronger warming trend than on global average, causing retreat and disintegration of ice shelves.”

Larsen B decreased in area from 11512 sq km in early January 1995 to 6664 sq km in February 2002 due to several calving events. The disintegration in March 2002 left behind only 3463 sq km. Today, Envisat shows that only 1670 sq km remain.
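Those figures are mutually consistent, as a quick check of the quoted areas shows:

```python
# Consistency check of the Larsen B areas quoted above (sq km).
jan_1995, feb_2002, post_collapse, today = 11512, 6664, 3463, 1670

print("lost to calving, 1995-2002  :", jan_1995 - feb_2002)       # 4848
print("lost in March 2002 break-up :", feb_2002 - post_collapse)  # 3201 (~3200)
print("lost over the Envisat decade:", post_collapse - today)     # 1793 (~1790)
```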

Envisat has already doubled its planned lifetime, but is scheduled to continue observations of Earth’s ice caps, land, oceans and atmosphere for at least another two years.

This ensures the continuity of crucial Earth-observation data until the next generation of satellites – the Sentinels – begin operations in 2013.

“Long-term systematic observations are of particular importance for understanding and modelling cryospheric processes in order to advance the predictive capabilities on the response of snow and ice to climate change,” said Prof. Rott.

“Climate models are predicting drastic warming for high latitudes. The Envisat observations of the Larsen Ice Shelf confirm the vulnerability of ice shelves to climatic warming and demonstrate the importance of ice shelves for the stability of glaciers upstream.

“These observations are very relevant for estimating the future behaviour of the much larger ice masses of West Antarctica if warming spreads further south.”

Radars on Earth observation satellites, such as Envisat’s ASAR, are particularly useful for monitoring polar regions because they can acquire images through clouds and darkness.

The Sentinel missions – being developed as part of Europe’s Global Monitoring for Environment and Security (GMES) programme – will continue the legacy of radar observations.

Copper chains: Study reveals Earth’s deep-seated hold on copper

Earth is clingy when it comes to copper. A new Rice University study this week in the journal Science finds that nature conspires at scales both large and small — from the realms of tectonic plates down to molecular bonds — to keep most of Earth’s copper buried dozens of miles below ground.

“Everything throughout history shows us that Earth does not want to give up its copper to the continental crust,” said Rice geochemist Cin-Ty Lee, the lead author of the study. “Both the building blocks for continents and the continental crust itself, dating back as much as 3 billion years, are highly depleted in copper.”

Finding copper is more than an academic exercise. With global demand for electronics growing rapidly, some studies have estimated the world’s demand for copper could exceed supply in as little as six years. The new study could help, because it suggests where undiscovered caches of copper might lie.

But the copper clues were just a happy accident.

“We didn’t go into this looking for copper,” Lee said. “We were originally interested in how continents form and more specifically in the oxidation state of volcanoes.”

Earth scientists have long debated whether an oxygen-rich atmosphere might be required for continent formation. The idea stems from the fact that Earth may not have had many continents for at least the first billion years of its existence and that Earth’s continents may have begun forming around the time that oxygen became a significant component of the atmosphere.

In their search for answers, Lee and colleagues set out to examine Earth’s arc magmas — the molten building blocks for continents. Arc magmas get their start deep in the planet in areas called subduction zones, where one of Earth’s tectonic plates slides beneath another. When plates subduct, two things happen. First, they bring oxidized crust and sediments from Earth’s surface into the mantle. Second, the subducting plate drives a return flow of hot mantle upwards from Earth’s deep interior. During this return flow, the hot mantle not only melts itself but may also cause melting of the recycled sediments. Arc magmas are thought to form under these conditions, so if oxygen were required for continental crust formation, it would most likely come from these recycled sediments.

“If oxidized materials are necessary for generating such melts, we should see evidence of it all the way from where the arc magmas form to the point where the new continent-building material is released from arc volcanoes,” Lee said.

Lee and colleagues examined xenoliths, rocks that formed deep inside Earth and were carried up to the surface in volcanic eruptions. Specifically, they studied garnet pyroxenite xenoliths thought to represent the first crystallized products of arc magmas from the deep roots of an arc some 50 kilometers below Earth’s surface. Rather than finding evidence of oxidation, they found sulfides — minerals that contain reduced forms of sulfur bonded to metals like copper, nickel and iron. If conditions were highly oxidizing, Lee said, these sulfide minerals would be destabilized and allow these elements, particularly copper, to bond with oxygen.

Because sulfides are also heavy and dense, they tend to sink and get left behind in the deep parts of arc systems, like a blob of dense material that stays at the bottom of a lava lamp while less dense material rises to the top.

“This explains why copper deposits, in general, are so rare,” Lee said. “The Earth wants to hold it deep and not give it up.”

Lee said deciding where to look for undiscovered copper deposits requires an understanding of the conditions needed to overcome the forces that conspire to keep it deep inside the planet.

“As a continental arc matures, the copper-rich sulfides are trapped deep and accumulate,” he said. “But if the continental arc grows thicker over time, the accumulated copper-bearing sulfides are driven to deeper depths where the higher temperatures can re-melt these copper-rich dregs, releasing them to rejoin arc magmas.”

These conditions were met in the Andes Mountains and in western North America. He said other potential sources of undiscovered copper include Siberia, northern China, Mongolia and parts of Australia.

Lee noted that a high school intern played a role in the research paper. Daphne Jin, now a freshman at the University of Chicago, contributed to the research as an intern from Clements High School in the Houston suburb of Sugar Land.

“The paper really wouldn’t have been as broad without Daphne’s contribution,” Lee said. “I originally struggled with an assignment for her because I didn’t and still don’t have large projects where a student can just fit in. I try to make sure every student has a chance to do something new, but often I just run out of ideas.”

Lee eventually asked Jin to compile information from published studies about the average concentrations of all the first-row transition elements of the periodic table in various samples of continental crust and mantle collected the world over.

“She came back and showed me the results, and we could see that the average continental crust itself, which has been built over 3 billion years of Earth’s history in Africa, Siberia, North America, South America, etc., was all depleted in copper,” Lee said. “Up to that point we’d been looking at the building blocks of continents, but this showed us that the continents themselves followed the same pattern. It was all internally consistent.”

Thawing permafrost may have led to extreme global warming events

Scientists analyzing prehistoric global warming say thawing permafrost released massive amounts of carbon stored in the frozen soil of the polar regions, exacerbating climate change by increasing global temperatures and ocean acidification.

Although the amounts of carbon involved in the ancient soil-thaw scenarios were likely much greater than today’s, the implication of this ground-breaking study is that the carbon deposits locked into the frozen permafrost of the polar regions are vulnerable to the climate warming caused as humans emit carbon dioxide by burning fossil fuels for energy generation.

Researchers at centres across America and Italy, together with the University of Sheffield, analysed a series of sudden and extreme global warming events – called hyperthermals – that occurred about 55 million years ago. Linked to rising greenhouse gas concentrations and changes in Earth’s orbit, the events involved a massive release of carbon into the atmosphere, ocean acidification, and a five degrees Celsius rise in global temperature within just a few thousand years.

It was previously thought that the source of the carbon was in the ocean, in the form of frozen methane gas in ocean-floor sediments, but the experts now believe the carbon released into the atmosphere millions of years ago came from the polar regions.

Professor David Beerling, of the University of Sheffield’s Department of Animal and Plant Sciences, said: “For the first time, we have linked these past global warming events with a climatically sensitive terrestrial carbon reservoir rather than a marine one. It shows that global warming can be amplified by carbon release from thawing permafrost.”

“The research suggests that carbon stored in permafrost stocks today in the Arctic region is vulnerable to warming. Warming causes permafrost thaw and decomposition of organic matter releasing more greenhouse gases back into the atmosphere.

“This feedback loop could accelerate future warming. It means we must arrest carbon dioxide emissions released by the combustion of fossil fuels if humanity wishes to avoid triggering these sorts of feedbacks in our modern world.”

The breakthrough was made through cross-disciplinary collaborations with climate and vegetation modellers, isotope geochemists and permafrost experts led by Rob DeConto at the University of Massachusetts, in collaboration with the University of Sheffield, Yale, the University of Colorado, Penn State, and the University of Urbino, Italy.

Rob DeConto added: “Similar dynamics are at play today. Global warming is degrading permafrost in the north Polar Regions, unlocking once-frozen carbon and methane and releasing it into the atmosphere. This will only exacerbate future warming in a positive feedback loop.”

The temperature of Earth’s atmosphere is a result of energy input from the sun minus what escapes back into space. Carbon dioxide in the atmosphere absorbs and traps heat that would otherwise return to space.
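That balance can be illustrated with the standard zero-dimensional energy-budget calculation (textbook values, not numbers from the study):

```python
# Zero-dimensional Earth energy balance: absorbed sunlight = radiated heat.
S = 1361.0        # W/m^2, solar constant
albedo = 0.30     # fraction of sunlight reflected straight back to space
sigma = 5.67e-8   # W/m^2/K^4, Stefan-Boltzmann constant

# (1 - albedo) * S / 4 = sigma * T^4  ->  solve for T
T = ((1 - albedo) * S / (4 * sigma)) ** 0.25
print(f"effective temperature without greenhouse gases: {T:.0f} K (~ -18 C)")
# Observed mean surface temperature is ~288 K (~15 C); the ~33 C difference
# is heat trapped by greenhouse gases such as carbon dioxide.
```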

The global warming events were accompanied by a massive input of carbon to the atmosphere plus ocean acidification, and were characterized by a global temperature rise of about five degrees Celsius within a few thousand years.

Until now, scientists had been unable to account for the massive amounts of carbon required to cause such dramatic global warming events, and Antarctica, which on today’s Earth is covered by kilometres of ice, had not been appreciated as an important player in such global carbon dynamics.

The research is published in the journal Nature.