Deep-sea volcanoes don’t just produce lava flows; they also explode!

This image shows bands of glowing magma from a submarine volcano. – NOAA/National Science Foundation

Between 75 and 80 per cent of all volcanic activity on Earth takes place at deep-sea, mid-ocean ridges. Most of these volcanoes produce effusive lava flows rather than explosive eruptions, both because levels of magmatic gas (which fuels the explosions and is made up of a variety of components, most importantly CO2) tend to be low, and because the volcanoes are under enormous pressure from the overlying water.

Over roughly the last 10 years, however, geologists have speculated, based on the presence of volcanic ash at certain sites, that explosive eruptions can also occur at deep-sea volcanoes.

But no one has been able to prove it until now.

By using an ion microprobe, Christoph Helo, a PhD student in McGill’s Department of Earth and Planetary Sciences, has now discovered very high concentrations of CO2 in droplets of magma trapped within crystals recovered from volcanic ash deposits on Axial Volcano on the Juan de Fuca Ridge, off the coast of Oregon.

These entrapped droplets represent the state of the magma prior to eruption. As a result, Helo and fellow researchers from McGill, the Monterey Bay Aquarium Research Institute, and the Woods Hole Oceanographic Institution, have been able to prove that explosive eruptions can indeed occur in deep-sea volcanoes. Their work also shows that the release of CO2 from the deeper mantle to the Earth’s atmosphere, at least in certain parts of mid-ocean ridges, is much higher than had previously been imagined.

Given that mid-ocean ridges constitute the largest volcanic system on Earth, this discovery has important implications for the global carbon cycle which have yet to be explored.

Even Canadian rocks are different

Andrew Leier examined zircons from Lower Cretaceous sandstone near the Sulphur River in the Grande Cache area of Alberta. The prominent cliff is the Cretaceous sandstone. – University of Calgary

Canadians have always seen themselves as separate and distinct from their American neighbors to the south, and now they have geological proof.

New research published in the April issue of Geology shows that rock formations on either side of what is now the political boundary between the two North American countries were already distinct when they formed, as early as 120 million years ago.

Dr. Andrew Leier, of the Department of Geoscience at the University of Calgary, set out to prove what he thought was obvious: because the mountains are continuous between the U.S. and Canada, the ancient river systems that flowed from these uplands were likely interconnected. In other words, during the Cretaceous Period, 120 million years ago, rivers should have flowed north and south between the countries, paying no mind to the modern-day political border.

“I thought that I could easily show that in my research,” says Leier, who published a paper in Geology with co-author Dr. George Gehrels of the University of Arizona and, Leier adds, a lot of help from Cassandra Frosini, an undergraduate in geoscience at the University of Calgary.

But Leier was wrong. “I was surprised to learn the opposite, in fact, was true,” he says.

A tiny mineral grain found in sandstone, called zircon, helped the researchers pinpoint where the sediment had originally formed. Knowing where the grains ended up, Leier was able to determine just how far the rivers had moved them and the direction from which they came.

During the Cretaceous Period, mountains were being created all along western North America, in both Canada and the United States.

“I thought the sediment transported by ancient rivers in Montana and Utah would flow out of the mountain ranges and then north into Alberta. This is similar to how the Ganges River runs parallel to the Himalayas. Our research shows this wasn’t the case,” says Leier.

Leier and Gehrels used recently developed laser-based techniques to reconstruct the origin of individual sand grains that were deposited during this period in western North America. This technique has applications in the petroleum industry as well, where it can be used to aid in determining drilling locations.
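The statistical side of this grain-by-grain comparison is not described in the article, but a common way to test whether two detrital zircon samples share the same source regions is to compare their U-Pb age distributions, for instance with a two-sample Kolmogorov-Smirnov test. The sketch below uses invented ages purely for illustration.

    import numpy as np
    from scipy.stats import ks_2samp

    # hypothetical U-Pb crystallization ages (millions of years) for zircon grains
    # from one Canadian and one American Cretaceous sandstone sample
    ages_alberta = np.array([1780, 1820, 1850, 1900, 1950, 2550, 2600, 2700])
    ages_montana = np.array([1080, 1100, 1150, 1400, 1420, 1450, 1700, 1750])

    stat, p_value = ks_2samp(ages_alberta, ages_montana)
    print(f"K-S statistic = {stat:.2f}, p = {p_value:.4f}")
    # a very small p suggests the two age spectra, and hence the source terranes, differ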

The researchers found that slightly different rocks, when eroded, produce slightly different zircons.

“Cretaceous sediments in the United States have a clear American signature, whereas those in the Canadian Rockies have a different and definable Canadian signature,” says Leier.

“The demarcation is pretty much coincidental with the modern day border.”

The data also imply that rivers flowing west to east out of the mountains in the United States stayed in the United States, and those in Canada stayed in Canada.

“In other words, there is no evidence that rivers in western North America were crossing what is today the border,” says Leier.

Algae, bacteria hogged oxygen after ancient mass extinction, slowed marine life recovery

A distant view of the field area in the Nanpanjiang Basin in south China where limestone contained evidence of a slow recovery of marine animal populations after the mass extinction 250 million years ago. – Katja Meyer, Stanford University

A mass extinction is hard enough for Earth’s biosphere to handle, but when you chase it with prolonged oxygen deprivation, the biota ends up with a hangover that can last millions of years.

Such was the situation with the greatest mass extinction in Earth’s history 250 million years ago, when 90 percent of all marine animal species were wiped out, along with a huge proportion of plant, animal and insect species on land.

A massive amount of volcanism in Siberia is widely credited with driving the disaster, but even after the immense outpourings of lava and toxic gases tapered off, oxygen levels in the oceans, which had been depleted, remained low for about 5 million years, slowing life’s recovery there to an unusual degree.

The reason for the lingering low oxygen levels has puzzled scientists, but now Stanford researchers have figured out what probably happened. By analyzing the chemical composition of some then-underwater limestone beds deposited over the course of the recovery in what is now southern China, they have determined that while it took several million years for most ecosystems in the ocean to recover, tiny single-celled algae and bacteria bounced back much more quickly.

In fact, according to biogeochemist Katja Meyer, the tiny organisms rebounded to such an extent that the bigger life forms couldn’t catch a break – much less their breath – because the little ones were enjoying a sustained population explosion.

As the vast hordes of tiny dead organisms rotted, dissolved oxygen in the seawater was consumed by aerobic microbes involved in the decay process, leaving scant oxygen for larger organisms in what became an oxygen-depleted, or anoxic, environment.

The driver of the ongoing population boom appears to have been the massive amounts of carbon dioxide pumped into the atmosphere during the volcanism, Meyer said, which caused the world to warm.

“More warmth means an invigorated hydrological cycle, so you get more rain and this rain is also more acidic because there is more carbon dioxide dissolved in the rain,” Meyer said.

The increased amounts of more acidic rain increased weathering of the land surface, which sent more nutrients into the ocean, which fueled explosions of life such as algae blooms.

“It is kind of counterintuitive that high productivity on the part of algae and bacteria would likely be generating these toxic geochemical conditions that prevent most animal life from recovering from the mass extinction,” Meyer said.

But the process, she said, is basically the same as when excess runoff from fertilizers goes into a body of water, whether it’s a pond on a golf course or the infamous dead zone in the Gulf of Mexico created by farm runoff carried down the Mississippi River.

“You get this giant bloom of algae and then it starts to smell bad as that algae decays, pulling oxygen out of the water and causing fish die-offs,” Meyer said.

In spite of the almost inestimably high numbers of algae and bacteria living and dying during this time, there is little direct evidence of them in the fossil record because such tiny, soft-bodied creatures just don’t preserve well.

So Meyer and her colleagues had to work with indirect evidence of the microorganisms to determine their abundance during the years after the mass extinction. The population proxy they used was the carbon present in the limestone.

Carbon – like all elements – comes in different varieties, called isotopes, distinguished by the number of neutrons each has in its nucleus. The researchers worked with two carbon isotopes, carbon 12, which has six neutrons, and carbon 13, which has seven.

Both isotopes are present in ocean water, but living things on Earth have always shown a preference for incorporating the lighter isotope, carbon 12, into their structures. Thus, where life is abundant, the ratio of carbon 13 to carbon 12 in seawater is higher than it is where there is no life.

Limestone records the composition of the seawater in which it was deposited, including the relative amounts of light and heavy carbon isotopes, so by analyzing the isotope ratio in the rocks, Meyer could infer the abundance of life in the water where the limestone formed.
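The article does not spell out the notation, but isotope ratios like these are conventionally reported as δ13C, the per mil deviation of a sample’s 13C/12C ratio from a reference standard:

\[
\delta^{13}\mathrm{C} \;=\; \left(\frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{sample}}}{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{standard}}} - 1\right)\times 1000\ \text{per mil}
\]

Because organisms preferentially take up the light isotope, water hosting abundant life (and the limestone precipitated from it) ends up with a higher δ13C.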

Comparable modern environments, such as the Bahama Banks in the western Atlantic, where carbonate platforms similar to the limestones are forming, are typically teeming with life at the range of depths in which Meyer’s limestones formed. In these environments, the ratio of carbon 13 to carbon 12 is generally constant from shallow to deep water.

But microorganisms are typically most abundant in shallow waters, so if marine life in the era after the mass extinction had been confined to algae and bacteria, then the shallower depths should show a markedly greater ratio of carbon 13 to carbon 12 than would be found at depth.

Meyer’s analysis showed a difference of about 4 parts per thousand in carbon isotope ratios between shallow and deep waters, roughly twice what it is today.

“We only see this gradient in the interval after the mass extinction prior to the recovery of animal life,” said Meyer.

Meyer is the lead author of a research paper about the study published last month in Earth and Planetary Science Letters. The extinction 250 million years ago is known as the Permian-Triassic mass extinction, as it coincides with the end of the Permian period and the beginning of the Triassic period on the geologic time scale.

“It appears there was a huge amount of biological productivity in the shallow waters that was making the bottom waters uninhabitable for animals,” said Jonathan Payne, assistant professor of geological and environmental sciences, who is a coauthor of the paper and in whose lab Meyer has been working.

“It looks like the whole recovery was slowed by having too much food available, rather than too little,” Payne said. “Most of us think that if the biota isn’t doing well, maybe we should feed it more. This is clearly an example where feeding it less would have been much better.”

Researchers help map tsunami and earthquake damage in Japan

The images show the progression of damage to the Fukushima Dai-ichi Nuclear Power Plant from March 12 to March 17. – Analysis by RIT Digital Imaging and Remote Sensing Laboratory within the Chester F. Carlson Center for Imaging Science.

Japan needs maps. Not just any kind, but detailed informational maps georegistered with latitude and longitude and annotated with simple, self-evident details: this bridge is out, this port is damaged, this farm field is scoured; this one is verdant.

Researchers at Rochester Institute of Technology are processing satellite imagery of regions in Japan affected by the 9.0 magnitude earthquake and tsunami that devastated sections of the country’s east coast on March 11. The U.S. Geological Survey, a member of the International Charter “Space and Major Disasters,” organized the volunteer effort involving about 10 organizations, including Harvard University, George Mason University, Penn State and the Jet Propulsion Laboratory.

RIT signed on to process images of the Fukushima Nuclear Power Plant and the cities of Hachinohe and Kesennuma. At the request of the Japanese, scientists at RIT created before-and-after images that can be printed on large sheets of paper. The team uploads 30 megabyte PDFs to the U.S. Geological Survey’s website for charter members and Japanese emergency responders to access.

“Once we upload it, it’s out of our hands,” says David Messinger, associate research professor and director of the Digital Imaging Remote Sensing Laboratory in RIT’s Chester F. Carlson Center for Imaging Science. “If you have the electronic version, you can make measurements on it,” he says. “The assumption is they want the big format so they can print it out, roll it up and take it into the field.”

The Japanese relief workers requested high-resolution images of the Fukushima Nuclear Power Plant. The RIT team processed imagery looking down into the reactors and the containment shells on March 12, the day after the earthquake and tsunami hit and prior to the explosions at the plant. High-resolution image-maps from March 18 show extensive damage and a smoldering reactor.

“We were tasked with the nuke plant Friday [March 18] morning and we uploaded it about 6 that night,” says Don McKeown, distinguished researcher in the Carlson Center for Imaging Science.

The 13-hour time difference has made the workflow difficult, Messinger notes. “While we’re doing this here, it’s the middle of the night there, so the feedback loops are slow.

“We were pushing hard,” he adds. “We wanted to get maps to them before their morning work shift started.”

They are mapping the area around the power plant as well, processing imagery that covers a broader view of the surrounding terrain, much of it farmland.

“We have a large image of Fukushima,” McKeown adds. “We’re committed to making a big map of this area. This is a very agricultural region and there are restrictions about food coming out of the area.”

The RIT team, led by McKeown and Messinger, includes graduate students Sanjit Maitra and Weihua “Wayne” Sun in the Center for Imaging Science and staff members Steve Cavilia, Chris DiAngelis, Jason Faulring and Nina Raqueño. They created the maps using imagery from WorldView 1 and WorldView 2 satellites operated by Digital Globe, a member of RIT’s Information Products Laboratory for Emergency Response (IPLER), and GeoEye 1, a high-resolution commercial satellite operated by GeoEye Inc.

“This really fits what IPLER is all about: information products,” McKeown says.

RIT and the University at Buffalo formed IPLER six months before the earthquake struck Haiti in January 2010. Connections with industry partners led RIT to capture and process multispectral and LIDAR images of Port-au-Prince and surrounding towns for the World Bank.

“With Haiti, we learned how, in a disaster, to send an imaging instrument into the field, collect the relevant data, get it back to campus and do the right processing to the imagery,” Messinger says. “In this case, we’re learning how to take imagery that we didn’t collect and produce the actual product that will be delivered to the first responders in the field in a very short time frame. We’ve learned a lot about the second phase of the process now.”

Measurements of winter Arctic sea ice show continuing ice loss

The 2011 Arctic sea ice extent maximum that marks the beginning of the melt season appears to be tied for the lowest ever measured by satellites, say scientists at the University of Colorado Boulder’s National Snow and Ice Data Center.

The CU-Boulder research team believes the lowest annual maximum ice extent of 5,650,000 square miles occurred on March 7. The maximum ice extent was 463,000 square miles below the 1979-2000 average, an area slightly larger than the states of Texas and California combined. The 2011 measurements were tied with those from 2006 as the lowest maximum sea ice extents measured since satellite record keeping began in 1979.
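The 1979-2000 average extent itself is not quoted; the short sketch below simply back-calculates it from the two figures given and expresses the deficit as a percentage.

    max_extent_2011 = 5_650_000      # 2011 maximum extent, square miles (March 7)
    deficit = 463_000                # shortfall relative to the 1979-2000 average, square miles
    avg_1979_2000 = max_extent_2011 + deficit
    print(f"implied 1979-2000 average: {avg_1979_2000:,} square miles")        # ~6,113,000
    print(f"2011 maximum was {deficit / avg_1979_2000:.1%} below that average")  # ~7.6%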

Virtually all climate scientists believe shrinking Arctic sea ice is tied to warming temperatures in the region caused by an increase in human-produced greenhouse gases being pumped into Earth’s atmosphere. Because of the spiraling downward trend of Arctic sea ice extent in the last decade, some CU scientists are predicting the Arctic Ocean may be ice free in the summers within the next several decades.

The seven lowest maximum Arctic sea ice extents measured by satellites have all occurred in the last seven years, said CU-Boulder Research Scientist Walt Meier of the National Snow and Ice Data Center, who participated in the latest study. “I’m not surprised by the new data because we’ve seen a downward trend in winter sea ice extent for some time now.”

Scientists believe Arctic sea ice functions like an air conditioner for the global climate system by naturally cooling air and water masses, playing a key role in ocean circulation and reflecting solar radiation back into space, said Meier. In the Arctic summer months, sunlight is absorbed by the growing amounts of open water, raising surface temperatures and causing more ice to melt.

“I think one of the reasons the Arctic sea ice maximum extent is declining is that the autumn ice growth is delayed by warmer temperatures and the ice extent is not able to ‘catch up’ through the winter,” said Meier. “In addition, the clock runs out on the annual ice growth season as temperatures start to rise along with the sun during the spring months.”

Since satellite record keeping began in 1979, the maximum Arctic sea ice extent has occurred as early as Feb. 18 and as late as March 31, with an average date of March 6. Since the CU-Boulder researchers determine the maximum sea ice extent using a five-day running average, there is a small chance the data could change.

In early April CU-Boulder’s National Snow and Ice Data Center will issue a formal announcement on the 2011 maximum sea ice extent with a full analysis of the winter ice growth season, including graphics comparing 2011 to the long-term record.

Ancient trash heaps gave rise to Everglades tree islands

Garbage mounds left by prehistoric humans might have driven the formation of many of the Florida Everglades’ tree islands, distinctive havens of exceptional ecological richness in the sprawling marsh that are today threatened by human development.

Tree islands are patches of relatively high and dry ground that dot the marshes of the Everglades. Typically a meter (3.3 feet) or so high, many of them are elevated enough to allow trees to grow. They provide a nesting site for alligators and a refuge for birds, panthers, and other wildlife.

Scientists have thought for many years that the so-called fixed tree islands (a larger type of tree island frequently found in the Everglades’ main channel, Shark River Slough) developed on protrusions from the rocky layer of a mineral called carbonate that sits beneath the marsh. Now, new research indicates that the real trigger for island development might have been middens, or trash piles left behind from human settlements that date to about 5,000 years ago.

These middens, a mixture of bones, food discards, charcoal, and human artifacts (such as clay pots and shell tools), would have provided an elevated area, drier than the surrounding marsh, allowing trees and other vegetation to grow. Bones also leaked phosphorus, a nutrient for plants that is otherwise scarce in the Everglades.

“This goes to show that human disturbance in the environment doesn’t always have a negative consequence,” says Gail Chmura, a paleoecologist at McGill University in Montreal, Canada, and one of the authors of the study.

Chmura will be presenting her research tomorrow, Tuesday 22 March, at the American Geophysical Union’s Chapman Conference on Climates, Past Landscapes, and Civilizations. About 95 scientists have converged on Santa Fe this week to discuss the latest research findings from archeology, paleoclimatology, paleoecology, and other fields that reveal how changes in regional and global climate have impacted the development and fates of societies.

In a previous scientific investigation of tree islands, Margo Schwadron, an archeologist with the National Park Service, cut through the elevated bedrock at the base of two islands and discovered that it was actually a so-called “perched carbonate layer,” because there was more soil and a midden below. Later, a team including Chmura’s graduate student Maria-Theresia Graf performed additional excavations in South Florida and found more of the perched carbonate layers.

Chemical analysis of samples of these curious perched layers revealed that they are made up partially of carbonates that had dissolved from the bedrock below, Chmura says. The layer also contains phosphorus from dissolved bones, she adds. Her team concluded that trees are key to the formation of this layer: During South Florida’s dry season, their roots draw in large quantities of ground water but allow the phosphates and carbonates dissolved in it to seep out and coalesce into the stone-like layer.
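The cementation step Chmura describes amounts to carbonate reprecipitation. The article gives no reactions, but in outline, as roots draw off water the dissolved load is concentrated until calcium carbonate drops back out of solution:

\[
\mathrm{Ca^{2+} + 2\,HCO_3^- \;\rightarrow\; CaCO_3\!\downarrow + CO_2 + H_2O}
\]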

The perched carbonate plays a key role in letting tree islands rebound after fires: because it does not burn, it protects the underlying soil, and it maintains the islands’ elevation, allowing vegetation to regrow after the fire. Humans are now threatening the existence of tree islands, by cutting down trees (whose roots keep the perched layer in place) and artificially maintaining high water levels year-round in some water control systems, which could cause the layer to dissolve.

Chmura’s team now wants to explore exactly when trees started growing on the tree islands.

Fault-finding coral reefs can predict the site of coming earthquakes

This is a 3-D illustration of the Gulf of Aqaba sea floor and surrounding mountains. – AFTAU

In the wake of the devastating loss of life in Japan, the urgent question is where the next big earthquake will hit. To answer it, geologist Prof. Zvi Ben-Avraham and his doctoral student Gal Hartman of Tel Aviv University’s Department of Physics and Planetary Sciences in the Raymond and Beverly Sackler Faculty of Exact Sciences are examining coral reefs and submarine canyons to detect earthquake fault zones.

Working with an international team of Israelis, Americans and Jordanians, Prof. Ben-Avraham and his team are developing a new method to determine what areas in a fault zone region are most at risk. Using a marine vessel, he and his colleagues are surveying a unique geological phenomenon of the Red Sea, near the coastal cities of Eilat and Aqaba – but their research could be applied anywhere, including Japan and the west coast of the U.S.

Recently published in the journal Geo-Marine Letters, the research details a “mass wasting” of large detached blocks and collapsed walls of submarine canyons along the gulf region of the Red Sea. They believe the geological changes were triggered by earthquake activity.

What’s next for San Andreas?


The team has created the first underwater map of the Red Sea floor at the head of the Gulf of Aqaba, and more importantly, identified deformations on the sea floor indicating fault-line activity. They not only pinpointed the known fault lines along the Syrian-African rift, but located new ones that city engineers in Israel and Jordan should be alert to.

“Studying fossil coral reefs and how they’ve split apart over time, we’ve developed a new way to survey active faults offshore by looking at the movement of sediment and fossil structures across them,” says Hartman. “What we can’t say is exactly when the next major earthquake will hit. But we can tell city engineers where the most likely epicenter will be.” According to Hartman, the tourist area in the city of Eilat is particularly vulnerable.

While geologists have been tracking underwater faults for decades, the new research uniquely tracks lateral movements across a fault line (a “transform fault”) and how they impact the sediment around them. This is a significant predictive tool for studying the San Andreas Fault in California as well, says Hartman.

The research is supported by a USAID grant through the Middle East Regional Cooperation (MERC) program.

Marching orders for city engineers

Aboard a marine vessel that traversed the waters of Israel and Jordan, and peering to depths as great as 700 meters, the researchers analyzed the structure of the seabed and discovered active submarine canyons, mass wasting, landslides, and sediment slumps related to tectonic processes and earthquake activity.

“There are several indicators of seismic activity. The most significant is the location of the fault. Looking at and beneath the seafloor, we saw that the faults deform the upper sediments. The faults of the Red Sea are active. We managed to find some other faults too and now know just how many active faults are in the region. This should help make authorities aware of where the next big earthquake will strike,” says Hartman.

What makes their study particularly unique is that they used the offset of linear structures in fossil coral fringing reefs to measure what they call “lateral slip across active faults.” With this knowledge, the researchers were able to calculate total slip and slip rates, and how active the fault has become.

“We can now identify high-risk locations with more certainty, and this is a boon to city planners. It’s just a matter of time before we’ll need to test how well cities will withstand the force of the next earthquake. It’s a matter of proper planning,” concludes Hartman.

Ancient ‘hyperthermals’ a guide to anticipated climate changes

Sediment samples in the lab of Richard Norris obtained by the Ocean Drilling Program reveal the mark of ‘hyperthermals,’ warming events lasting thousands of years that changed the composition of the sediment and its color. The packaged sediment sample on the left contains sediment formed in the wake of a 55-million-year-old warming event and the sample on the right is sediment from a later era after global temperatures stabilized. – Scripps Institution of Oceanography, UC San Diego

Bursts of intense global warming that lasted tens of thousands of years have taken place more frequently throughout Earth’s history than previously believed, according to evidence gathered by a team led by Scripps Institution of Oceanography, UC San Diego researchers.

Richard Norris, a professor of geology at Scripps who co-authored the report, said that releases of carbon dioxide sequestered in the deep oceans were the most likely trigger of these ancient “hyperthermal” events. Most of the events raised average global temperatures between 2° and 3° Celsius (3.6 and 5.4° F), an amount comparable to current conservative estimates of how much temperatures are expected to rise in coming decades as a consequence of anthropogenic global warming. Most hyperthermals lasted about 40,000 years before temperatures returned to normal.

The study appears in the March 17 issue of the journal Nature.

“These hyperthermals seem not to have been rare events,” Norris said, “hence there are lots of ancient examples of global warming on a scale broadly like the expected future warming. We can use these events to examine the impact of global change on marine ecosystems, climate and ocean circulation.”

The hyperthermals took place roughly every 400,000 years during a warm period of Earth history that prevailed some 50 million years ago. The strongest of them coincided with an event known as the Paleocene-Eocene Thermal Maximum, the transition between two geologic epochs in which global temperatures rose between 4° and 7° C (7.2° and 12.6° F) and needed 200,000 years to return to historical norms. The events stopped taking place around 40 million years ago, when the planet entered a cooling phase. No warming events of the magnitude of these hyperthermals have been detected in the geological record since then.

Phil Sexton, a former student of Norris’ now at the Open University in the United Kingdom, led the analysis of sediment cores collected off the South American coast. In the cores, evidence of the warm periods presented itself in bands of gray sediment layered within otherwise pale greenish mud. The gray sediment contained increased amounts of clay left after the calcareous shells of microscopic organisms were dissolved on the sea floor. These clay-rich intervals are consistent with ocean acidification episodes that would have been triggered by large-scale releases of carbon dioxide. Large influxes of carbon dioxide change the chemistry of seawater by producing greater amounts of carbonic acid in the oceans.
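The acidification chemistry invoked here is standard seawater carbonate chemistry, not spelled out in the article: dissolved CO2 forms carbonic acid, which dissociates and attacks the calcareous shells,

\[
\mathrm{CO_2 + H_2O \;\rightleftharpoons\; H_2CO_3 \;\rightleftharpoons\; H^+ + HCO_3^-},
\qquad
\mathrm{CaCO_3 + H^+ \;\rightarrow\; Ca^{2+} + HCO_3^-}.
\]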

The authors concluded that a release of carbon dioxide from the deep oceans was a more likely cause of the hyperthermals than other triggering events that have been hypothesized. The regularity of the hyperthermals and the relatively warm ocean temperatures of the period make them less likely to have been caused by events such as large melt-offs of methane hydrates, terrestrial burning of peat or even proposed cometary impacts. The hyperthermals could have been set in motion by a build-up of carbon dioxide in the deep oceans caused by slowing or stopping of circulation in ocean basins that prevented carbon dioxide release.

Norris noted that the hyperthermals provide historical perspective on what Earth will experience as it continues to warm from widespread use of fossil fuels, which has increased carbon dioxide concentrations in the atmosphere nearly 50 percent since the beginning of the Industrial Revolution. Hyperthermals can help scientists produce a range of estimates for how long it will take for temperatures to fully revert to historical norms depending on how much warming human activities cause.

“In 100 to 300 years, we could produce a signal on Earth that takes tens of thousands of years to equilibrate, judging from the geologic record,” he said.

The scientists hope to better understand how fast the conditions that set off hyperthermals developed. Norris said that 50-million-year-old sediments in the North Sea are finely layered enough for scientists to distinguish decade-to-decade or even year-to-year changes.

New findings on the development of the earthquake disaster

The earthquake disaster of 11 March 2011 was an event of the century, and not only for Japan. With a magnitude of Mw = 8.9, it was one of the strongest earthquakes ever recorded worldwide. Particularly interesting is that a strong foreshock with a magnitude of Mw = 7.2 had struck two days earlier, almost exactly at the point where the tsunami-generating earthquake later ruptured. The geophysicist Joachim Saul from the GFZ German Research Centre for Geosciences (Helmholtz Association) has created an animation showing the sequence of quakes since March 9.

The animated image is available at www.gfz-potsdam.de. It shows the earthquake activity in the region of Honshu, Japan, as measured at the GFZ since 8 March 2011. After a seismically quiet March 8, the morning (coordinated universal time, UTC) of March 9 began with an earthquake of magnitude 7.2 off the Japanese east coast, followed by a series of smaller aftershocks. The morning of March 11 brought the earthquake disaster that triggered the devastating tsunami. This earthquake was followed by numerous severe aftershocks, two of which almost reached magnitude 8. In the following period the activity slowly subsided and is now (March 16) dominated by relatively small magnitude-5 quakes, though several earthquakes of magnitude 6 are still being registered daily. The aftershock activity is concentrated mainly in the area of the March 11 earthquake. Based on the distribution of the aftershocks, the rupture length of the main quake can be estimated at about 400 km. In all, 428 earthquakes in the region of Honshu have been registered at the GFZ since March 9.

By analyzing data from over 500 GPS stations, the GFZ scientists Rongjiang Wang and Thomas Walter have found that horizontal displacements of up to five meters in an eastward direction occurred along the east coast of Japan. The cause lies in the earthquake zone, i.e. at the contact interface between the Pacific plate and Japan. Computer simulations of this surface show that an offset of up to 25 meters occurred during the earthquake. Calculations by the GFZ modeling group headed by Stephan Sobolev even yielded a displacement of up to 27 meters and a vertical movement of seven meters. This caused an abrupt uplift of the deep-sea floor, and thus triggered the tsunami. The images of the GPS displacement vectors and the computer simulations can also be found among the online material provided by the GFZ.
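The GFZ processing chain is not described in detail; the sketch below only illustrates, with invented coordinates, how a horizontal coseismic displacement vector follows from a station’s position before and after the quake.

    import math

    # hypothetical local east/north coordinates of one GPS station, in meters
    east_before, north_before = 0.0, 0.0
    east_after, north_after = 4.8, -1.1      # illustrative values only, not GFZ data

    d_east = east_after - east_before
    d_north = north_after - north_before
    magnitude = math.hypot(d_east, d_north)                    # horizontal displacement, m
    azimuth = math.degrees(math.atan2(d_east, d_north)) % 360  # bearing, degrees from north
    print(f"displacement {magnitude:.2f} m toward azimuth {azimuth:.0f} degrees")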

Shortly after the quake, Andrey Babeyko and Stephan Sobolev of the GFZ modeled the propagation and wave heights of the tsunami across the Pacific over the first 16 hours. The tremendous force of the earthquake is highlighted here, too: in the open Pacific, relatively large wave heights of over one meter were calculated, which agrees very well with the observations. How high the tsunami piles up at the coast is largely determined by water depth and the shape of the coastline. The GFZ material also contains an image and an animation regarding this work.
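The dependence on water depth follows from the shallow-water approximation, in which a tsunami travels at roughly the square root of (gravitational acceleration times depth); a minimal sketch with illustrative depths, not the GFZ model setup:

    import math

    g = 9.81                                  # gravitational acceleration, m/s^2
    for depth_m in (4000, 1000, 50):          # open ocean, shelf, near shore
        speed_ms = math.sqrt(g * depth_m)     # shallow-water phase speed, m/s
        print(f"depth {depth_m:4d} m -> about {speed_ms * 3.6:4.0f} km/h")
    # as the water shallows the wave slows, shortens, and grows in height near the coast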

Viscous cycle: Quartz is key to plate tectonics

Quartz may play a major role in the movements of continents, known as plate tectonics. – USGS

More than 40 years ago, pioneering tectonic geophysicist J. Tuzo Wilson published a paper in the journal Nature describing how ocean basins opened and closed along North America’s eastern seaboard.

His observations, dubbed “The Wilson Tectonic Cycle,” suggested the process occurred many times during Earth’s long history, most recently causing the giant supercontinent Pangaea to split into today’s seven continents.

Wilson’s ideas were central to the so-called Plate Tectonic Revolution, the foundation of contemporary theories for processes underlying mountain-building and earthquakes.

Since his 1967 paper, additional studies have confirmed that large-scale deformation of continents repeatedly occurs in some regions but not others, though the reasons why remain poorly understood.

Now, new findings by Utah State University geophysicist Tony Lowry and colleague Marta Pérez-Gussinyé of Royal Holloway, University of London, shed surprising light on these restless rock cycles.

“It all begins with quartz,” says Lowry, who published results of the team’s recent study in the March 17 issue of Nature.

The scientists describe a new approach to measuring properties of the deep crust.

It reveals quartz’s key role in initiating the churning chain of events that cause Earth’s surface to crack, wrinkle, fold and stretch into mountains, plains and valleys.

“If you’ve ever traveled westward from the Midwest’s Great Plains toward the Rocky Mountains, you may have wondered why the flat plains suddenly rise into steep peaks at a particular spot,” Lowry says.

“It turns out that the crust beneath the plains has almost no quartz in it, whereas the Rockies are very quartz-rich.”

He thinks that those belts of quartz could be the catalyst that sets the mountain-building rock cycle in motion.

“Earthquakes, mountain-building and other expressions of continental tectonics depend on how rocks flow in response to stress,” says Lowry.

“We know that tectonics is a response to the effects of gravity, but we know less about rock flow properties and how they change from one location to another.”

Wilson’s theories provide an important clue, Lowry says, as scientists have long observed that mountain belts and rift zones have formed again and again at the same locations over long periods of time.

But why?

“Over the last few decades, we’ve learned that high temperatures, water and abundant quartz are all critical factors in making rocks flow more easily,” Lowry says. “Until now, we haven’t had the tools to measure these factors and answer long-standing questions.”

Since 2002, the National Science Foundation (NSF)-funded EarthScope Transportable Array of seismic stations across the western United States has provided remote sensing data about the continent’s rock properties.

“We’ve combined Earthscope data with other geophysical measurements of gravity and surface heat flow in an entirely new way, one that allows us to separate the effects of temperature, water and quartz in the crust,” Lowry says.

EarthScope measurements enabled the team to estimate the thickness, along with the seismic velocity ratio, of continental crust in the American West.

“This intriguing study provides new insights into the processes driving large-scale continental deformation and dynamics,” says Greg Anderson, NSF program director for EarthScope. “These are key to understanding the assembly and evolution of continents.”

Seismic velocity describes how quickly sound waves and shear waves travel through rock, offering clues to its temperature and composition.

“Seismic velocities are sensitive to both temperature and rock type,” Lowry says.

“But if the velocities are combined as a ratio, the temperature dependence drops out. We found that the velocity ratio was especially sensitive to quartz abundance.”
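The ratio in question is Vp/Vs, compressional to shear velocity. A minimal illustration of why it flags quartz, using representative textbook values rather than data from the study:

    # typical crustal velocities in km/s; representative values only, not the paper's data
    rocks = {
        "quartz-rich (granitic) crust": (6.0, 3.6),   # (Vp, Vs)
        "quartz-poor (mafic) crust":    (6.8, 3.7),
    }
    for name, (vp, vs) in rocks.items():
        print(f"{name}: Vp/Vs = {vp / vs:.2f}")
    # roughly 1.67 versus 1.84: temperature shifts Vp and Vs together, so it largely
    # cancels in the ratio, leaving composition (quartz abundance) as the main signal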

Even after separating out the effects of temperature, the scientists found that a low seismic velocity ratio, indicating weak, quartz-rich crust, systematically occurred in the same place as high lower-crustal temperatures modeled independently from surface heat flow.

“That was a surprise,” he says. “We think this indicates a feedback cycle, where quartz starts the ball rolling.”

If temperature and water are the same, Lowry says, rock flow will focus where the quartz is located because that’s the only weak link.

Once the flow starts, the movement of rock carries heat with it and that efficient movement of heat raises temperature, resulting in weakening of crust.

“Rock, when it warms up, is forced to release water that’s otherwise chemically bound in crystals,” he says.

Water further weakens the crust, which increasingly focuses the deformation in a specific area.