Mysterious Midcontinent Rift is a geological hybrid

The volcanic rocks of the 1.1 billion-year-old Midcontinent Rift play a prominent role in the natural beauty of Isle Royale National Park in Lake Superior. – Seth Stein, Northwestern University

An international team of geologists has a new explanation for how the Midwest’s biggest geological feature — an ancient, 2,000-mile-long underground crack that starts in Lake Superior and runs south in two arms, one reaching Oklahoma and the other Alabama — evolved.

Scientists from Northwestern University, the University of Illinois at Chicago (UIC), the University of Göttingen in Germany and the University of Oklahoma report that the 1.1 billion-year-old Midcontinent Rift is a geological hybrid, having formed in three stages: it started as an enormous narrow crack in the Earth’s crust; that space then filled with an unusually large amount of volcanic rock; and, finally, the igneous rocks were forced to the surface, forming the beautiful scenery seen today in the Lake Superior area of the Upper Midwest.

The rift produced some of the Midwest’s most interesting geology and scenery, but there has never been a good explanation for what caused it. Inspired by vacations to Lake Superior, Seth and Carol A. Stein, a husband-and-wife team from Northwestern and UIC, have been determined to learn more in recent years.

Their study, which utilized cutting-edge geologic software and seismic images of rock located below the Earth’s surface in areas of the rift, will be presented Oct. 20 at the Geological Society of America annual meeting in Vancouver.

“The Midcontinent Rift is a very strange beast,” said the study’s lead author, Carol Stein, professor of Earth and Environmental Sciences at UIC. “Rifts are long, narrow cracks splitting the Earth’s crust, with some volcanic rocks in them that rise to fill the cracks. Large igneous provinces, or LIPs, are huge pools of volcanic rocks poured out at the Earth’s surface. The Midcontinent Rift is both of these — like a hybrid animal.”

“Geologists call it a rift because it’s long and narrow,” explained Seth Stein, a co-author of the study, “but it’s got much more volcanic rock inside it than any other rift on a continent, so it’s also a LIP. We’ve been wondering for a long time how this could have happened.” He is the William Deering Professor of Geological Sciences at the Weinberg College of Arts and Sciences.

This question is one of those that EarthScope, a major National Science Foundation program involving geologists from across the U.S., seeks to answer. In this case, the team used images of the Earth at depth from seismic experiments across Lake Superior and EarthScope surveys of other parts of the Midcontinent Rift. The images show the rock layers at depth, much as X-ray photos show the bones in people’s bodies.

In reviewing the images, the researchers found the Midcontinent Rift appeared to evolve in three stages.

“First, the rocks were pulled apart, forming a rift valley,” Carol Stein said. “As the rift was pulling apart, magma flowed into the developing crack. After about 10 million years, the crack stopped growing, but more magma kept pouring out on top. Older magma layers sank under the weight of new magma, so the hole kept deepening. Eventually the magma ran out, leaving a large igneous province — a 20-mile-thick pile of volcanic rocks. Millions of years later, the rift got squeezed as a new supercontinent reassembled, which made the Earth’s crust under the rift thicker.”

To test this idea, the Steins turned to Jonas Kley, professor of geology at Germany’s Göttingen University, their host during a research year in Germany sponsored by the Alexander von Humboldt Foundation.

Kley used software that allows geologic time to run backwards. “We start with the rocks as they are today,” Kley explained, “and then undo movement on faults and vertical movements. It’s like reconstructing a car crash. When we’re done we have a picture of what happened and when. This lets us test ideas and see if they work.”

Kley’s analysis showed that the three-stage history made sense — the Midcontinent Rift started as a rift and then evolved into a large igneous province. The last stage brought rocks in the Lake Superior area to the surface.

Other parts of the picture fit together nicely, the Steins said. David Hindle, also from Göttingen University, used a computer model to show that the rift’s shape seen in the seismic images results from the crust bending under the weight of magma.

Randy Keller, a professor and director of the Oklahoma Geological Survey, found that the weight of the dense magma filling the rift explains the stronger pull of gravity measured above the rift. He points out that these variations in the gravity field are the major evidence used to map the extent of the rift.

“It’s funny,” Seth Stein mused. “Carol and I have been living in Chicago for more than 30 years. We often have gone up to Lake Superior for vacations but didn’t think much about the geology. It’s only in the past few years that we realized there’s a great story there and started working on it. There are many studies going on today, which will give more results in the next few years.”

The Steins now are working with other geologists to help park rangers and teachers tell this story to the public. For example, a good way to think about how rifts work is to observe what happens if you pull both ends of a Mars candy bar: the top chocolate layer breaks, and the inside stretches.

“Sometimes people think that exciting geology only happens in places like California,” Seth Stein said. “We hope results like this will encourage young Midwesterners to study geology and make even further advances.”

Hydraulic fracturing linked to earthquakes in Ohio

Hydraulic fracturing triggered a series of small earthquakes in 2013 on a previously unmapped fault in Harrison County, Ohio, according to a study published in the journal Seismological Research Letters (SRL).

Nearly 400 small earthquakes occurred between Oct. 1 and Dec. 13, 2013, including 10 “positive” magnitude earthquakes, none of which were reported felt by the public. The 10 positive-magnitude earthquakes, which ranged from magnitude 1.7 to 2.2, occurred between Oct. 2 and 19, coinciding with hydraulic fracturing operations at nearby wells.

This series of earthquakes is the first known instance of seismicity in the area.

Hydraulic fracturing, or fracking, is a method for extracting gas and oil from shale rock by injecting water, sand and chemicals into the rock under high pressure to create cracks that release the gas inside. Cracking the rock in this way produces micro-earthquakes, usually very small ones with magnitudes in the range of negative 3 (−3) to negative 1 (−1).

“Hydraulic fracturing has the potential to trigger earthquakes, and in this case, small ones that could not be felt; however, the earthquakes were three orders of magnitude larger than normally expected,” said Paul Friberg, a seismologist with Instrumental Software Technologies, Inc. (ISTI) and a co-author of the study.
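The phrase “three orders of magnitude larger” is easier to appreciate in energy terms. A minimal sketch using the standard Gutenberg-Richter magnitude-energy relation (not part of the study itself) compares a typical fracking micro-quake with the largest Harrison County event:

```python
def quake_energy_joules(magnitude):
    """Radiated seismic energy from the Gutenberg-Richter relation:
    log10(E) = 1.5 * M + 4.8, with E in joules."""
    return 10 ** (1.5 * magnitude + 4.8)

# Upper end of the normal fracking range vs. the largest observed event
e_typical = quake_energy_joules(-1.0)
e_observed = quake_energy_joules(2.2)

# Each whole magnitude unit corresponds to ~31.6x more radiated energy,
# so a 3.2-unit jump is roughly a factor of 63,000 in energy
ratio = e_observed / e_typical
print(f"energy ratio: {ratio:.0f}")
```

The key point is that magnitude is a logarithmic scale: even quakes too small to feel span an enormous range of released energy.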

The earthquakes revealed an east-west trending fault that lies in the basement formation at approximately two miles deep and directly below the three horizontal gas wells. The EarthScope Transportable Array Network Facility identified the first earthquakes on Oct. 2, 2013, locating them south of Clendening Lake near the town of Uhrichsville, Ohio. A subsequent analysis identified 190 earthquakes during a 39-hour period on Oct. 1 and 2, just hours after hydraulic fracturing began on one of the wells.

The micro-seismicity varied, corresponding with the fracturing activity at the wells. The timing of the earthquakes, along with their tight linear clustering and similar waveform signals, suggest a unique source for the cause of the earthquakes — the hydraulic fracturing operation. The fracturing likely triggered slip on a pre-existing fault, though one that is located below the formation expected to confine the fracturing, according to the authors.

“As hydraulic fracturing operations explore new regions, more seismic monitoring will be needed since many faults remain unmapped,” Friberg said. He co-authored the paper with Ilya Dricker, also with ISTI, and Glenda Besana-Ostman, originally with the Ohio Department of Natural Resources and now with the Bureau of Reclamation at the U.S. Department of the Interior.

Warm US West, cold East: A 4,000-year pattern

University of Utah geochemist Gabe Bowen led a new study, published in Nature Communications, showing that the curvy jet stream pattern that brought mild weather to western North America and intense cold to the eastern states this past winter has become more dominant during the past 4,000 years than it was from 8,000 to 4,000 years ago. The study suggests global warming may aggravate the pattern, meaning such severe winter weather extremes may be worse in the future. – Lee J. Siegel, University of Utah.

Last winter’s curvy jet stream pattern brought mild temperatures to western North America and harsh cold to the East. A University of Utah-led study shows that pattern became more pronounced 4,000 years ago, and suggests it may worsen as Earth’s climate warms.

“If this trend continues, it could contribute to more extreme winter weather events in North America, as experienced this year with warm conditions in California and Alaska and intrusion of cold Arctic air across the eastern USA,” says geochemist Gabe Bowen, senior author of the study.

The study was published online April 16 by the journal Nature Communications.

“A sinuous or curvy winter jet stream means unusual warmth in the West, drought conditions in part of the West, and abnormally cold winters in the East and Southeast,” adds Bowen, an associate professor of geology and geophysics at the University of Utah. “We saw a good example of extreme wintertime climate that largely fit that pattern this past winter,” although in the typical pattern California often is wetter.

It is not new for scientists to forecast that the current warming of Earth’s climate, due to carbon dioxide, methane and other “greenhouse” gases, has already led to increased weather extremes and will continue to do so.

The new study shows the jet stream pattern that brings North American wintertime weather extremes is millennia old – “a longstanding and persistent pattern of climate variability,” Bowen says. Yet it also suggests global warming may enhance the pattern so there will be more frequent or more severe winter weather extremes or both.

“This is one more reason why we may have more winter extremes in North America, as well as something of a model for what those extremes may look like,” Bowen says. Human-caused climate change is reducing equator-to-pole temperature differences; the atmosphere is warming more at the poles than at the equator. Based on what happened in past millennia, that could make a curvy jet stream even more frequent, more intense, or both, he says.

Bowen and his co-authors analyzed previously published data on oxygen isotope ratios in lake sediment cores and cave deposits from sites in the eastern and western United States and Canada. Those isotopes were deposited in ancient rainfall and incorporated into calcium carbonate. They reveal jet stream directions during the past 8,000 years, a geological time known as middle and late stages of the Holocene Epoch.

Next, the researchers did computer modeling or simulations of jet stream patterns – both curvy and more direct west to east – to show how changes in those patterns can explain changes in the isotope ratios left by rainfall in the old lake and cave deposits.

They found that the jet stream pattern – known technically as the Pacific North American teleconnection – shifted to a generally more “positive phase” – meaning a curvy jet stream – over a 500-year period starting about 4,000 years ago. In addition to this millennial-scale change in jet stream patterns, they also noted a cycle in which increases in the sun’s intensity every 200 years make the jet stream flatter.

Bowen conducted the study with Zhongfang Liu of Tianjin Normal University in China, Kei Yoshimura of the University of Tokyo, Nikolaus Buenning of the University of Southern California, Camille Risi of the French National Center for Scientific Research, Jeffrey Welker of the University of Alaska at Anchorage, and Fasong Yuan of Cleveland State University.

The study was funded by the National Science Foundation, National Natural Science Foundation of China, Japan Society for the Promotion of Science and a joint program by the society and Japan’s Ministry of Education, Culture, Sports, Science and Technology: the Program for Risk Information on Climate Change.

Sinuous Jet Stream Brings Winter Weather Extremes

The Pacific North American teleconnection, or PNA, “is a pattern of climate variability” with positive and negative phases, Bowen says.

“In periods of positive PNA, the jet stream is very sinuous. As it comes in from Hawaii and the Pacific, it tends to rocket up past British Columbia to the Yukon and Alaska, and then it plunges down over the Canadian plains and into the eastern United States. The main effect in terms of weather is that we tend to have cold winter weather throughout most of the eastern U.S. You have a freight car of arctic air that pushes down there.”

Bowen says that when the jet stream is curvy, “the West tends to have mild, relatively warm winters, and Pacific storms tend to occur farther north. So in Northern California, the Pacific Northwest and parts of western interior, it tends to be relatively dry, but tends to be quite wet and unusually warm in northwest Canada and Alaska.”

This past winter, there were times of a strongly curving jet stream, and times when the Pacific North American teleconnection was in its negative phase, which means “the jet stream is flat, mostly west-to-east oriented,” and sometimes split, Bowen says. In years when the jet stream pattern is more flat than curvy, “we tend to have strong storms in Northern California and Oregon. That moisture makes it into the western interior. The eastern U.S. is not affected by arctic air, so it tends to have milder winter temperatures.”

The jet stream pattern – whether curvy or flat – has its greatest effects in winter and less impact on summer weather, Bowen says. The curvy pattern is enhanced by another climate phenomenon, the El Niño-Southern Oscillation, which sends a pool of warm water eastward to the eastern Pacific and affects climate worldwide.

Traces of Ancient Rains Reveal Which Way the Wind Blew

Over the millennia, oxygen in ancient rain water was incorporated into calcium carbonate deposited in cave and lake sediments. The ratio of rare, heavy oxygen-18 to the common isotope oxygen-16 in the calcium carbonate tells geochemists whether clouds that carried the rain were moving generally north or south during a given time.
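Geochemists usually express these ratios in “delta” notation: the per-mil deviation of a sample’s oxygen-18-to-16 ratio from a reference standard. A small illustrative sketch (the standard ratio and the sample values here are assumptions for illustration, not data from the study):

```python
# Approximate 18O/16O ratio of the VSMOW reference standard
R_VSMOW = 0.0020052

def delta18O_permil(r_sample):
    """delta-18O: per-mil deviation of a sample's 18O/16O ratio from VSMOW."""
    return (r_sample / R_VSMOW - 1.0) * 1000.0

# A sample identical to the standard has delta-18O of exactly zero
print(delta18O_permil(0.0020052))       # 0.0

# Rain that has already lost heavy 18O during transport (e.g. falling in
# the East under a curvy jet stream) gives a negative delta value
print(delta18O_permil(0.0019852) < 0)   # True
```

This is why a bigger West-East contrast in delta-18O points to a curvier jet stream: more of the heavy isotope rains out early along a longer, more northerly path.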

Previous research determined the dates and oxygen isotope ratios for sediments in the new study, allowing Bowen and colleagues to use the ratios to tell if the jet stream was curvy or flat at various times during the past 8,000 years.

Bowen says air flowing over the Pacific picks up water from the ocean. As a curvy jet stream carries clouds north toward Alaska, the air cools and some of the water falls out as rain, with greater proportions of heavier oxygen-18 falling, thus raising the oxygen-18-to-16 ratio in rain and certain sediments in western North America. Then the jet stream curves south over the middle of the continent, and the water vapor, already depleted in oxygen-18, falls in the East as rain with lower oxygen-18-to-16 ratios.

When the jet stream is flat and moving west-to-east, oxygen-18 in rain is still elevated in the West and depleted in the East, but the difference is much less than when the jet stream is curvy.

By examining oxygen isotope ratios in lake and cave sediments in the West and East, Bowen and colleagues showed that a flatter jet stream pattern prevailed from about 8,000 to 4,000 years ago in North America, but then, over only 500 years, the pattern shifted so that curvy jet streams became more frequent or severe or both. The method can’t distinguish frequency from severity.

The new study is based mainly on isotope ratios at Buckeye Creek Cave, W. Va.; Lake Grinell, N.J.; Oregon Caves National Monument; and Lake Jellybean, Yukon.

Additional data supporting increasing curviness of the jet stream over recent millennia came from seven other sites: Crawford Lake, Ontario; Castor Lake, Wash.; Little Salt Spring, Fla.; Estancia Lake, N.M.; Crevice Lake, Mont.; and Dog and Felker lakes, British Columbia. Some sites provided oxygen isotope data; others showed changes in weather patterns based on tree ring growth or spring deposits.

Simulating the Jet Stream

As a test of what the cave and lake sediments revealed, Bowen’s team did computer simulations of climate using software that takes isotopes into account.

Simulations of climate and oxygen isotope changes in the Middle Holocene and today resemble, respectively, today’s flat and curvy jet stream patterns, supporting the switch toward increasing jet stream sinuosity 4,000 years ago.

Why did the trend start then?

“It was a time when seasonality became weaker,” Bowen says. The Northern Hemisphere was closer to the sun during the summer 8,000 years ago than it was 4,000 years ago or is now, due to a 20,000-year cycle in Earth’s orbit. He envisions a tipping point 4,000 years ago when weakening summer sunlight reduced the equator-to-pole temperature difference and, along with an intensifying El Niño climate pattern, pushed the jet stream toward greater curviness.

Earthquake simulation tops 1 quadrillion flops

This shows a visualization of vibrations inside the Merapi volcano (island of Java) computed with the earthquake simulation software SeisSol. – Alex Breuer (TUM) / Christian Pelties (LMU)

Geophysicists use the SeisSol earthquake simulation software to investigate rupture processes and seismic waves beneath the Earth’s surface. Their goal is to simulate earthquakes as accurately as possible to be better prepared for future events and to better understand the fundamental underlying mechanisms. However, the calculations involved in this kind of simulation are so complex that they push even supercomputers to their limits.

In a collaborative effort, the workgroups led by Dr. Christian Pelties at the Department of Geo and Environmental Sciences at LMU and Professor Michael Bader at the Department of Informatics at TUM have optimized the SeisSol program for the parallel architecture of the Garching supercomputer “SuperMUC”, thereby speeding up calculations by a factor of five.

Using a virtual experiment, they achieved a new record on the SuperMUC: to simulate the vibrations inside the geometrically complex Merapi volcano on the island of Java, the supercomputer executed 1.09 quadrillion floating point operations per second. SeisSol maintained this unusually high performance level throughout the entire three-hour simulation run using all of SuperMUC’s 147,456 processor cores.

Complete parallelization

This was possible only following extensive optimization and the complete parallelization of the 70,000 lines of SeisSol code, allowing a peak performance of up to 1.42 petaflops. This corresponds to 44.5 percent of SuperMUC’s theoretically available capacity, making SeisSol one of the most efficient simulation programs of its kind worldwide.
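The reported figures can be sanity-checked with simple arithmetic (numbers taken from the article; this is only a back-of-envelope consistency check, not a calculation from the original paper):

```python
sustained_pflops = 1.42      # SeisSol's peak sustained performance
efficiency = 0.445           # 44.5 percent of theoretical machine peak
cores = 147_456              # all of SuperMUC's processor cores

# Implied theoretical peak of the whole machine
machine_peak_pflops = sustained_pflops / efficiency

# Sustained performance per core (1 PFLOPS = 1e6 GFLOPS)
per_core_gflops = sustained_pflops * 1e6 / cores

print(f"implied machine peak: {machine_peak_pflops:.2f} PFLOPS")  # ~3.19
print(f"sustained per core: {per_core_gflops:.1f} GFLOPS")        # ~9.6
```

The implied machine peak of roughly 3.2 petaflops matches SuperMUC’s published specification era, which is reassuring for the internal consistency of the numbers.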

“Thanks to the extreme performance now achievable, we can run five times as many models or models that are five times as large to achieve significantly more accurate results. Our simulations are thus inching ever closer to reality,” says the geophysicist Dr. Christian Pelties. “This will allow us to better understand many fundamental mechanisms of earthquakes and hopefully be better prepared for future events.”

The next steps are earthquake simulations that include rupture processes on the meter scale as well as the resultant destructive seismic waves that propagate across hundreds of kilometers. The results will improve the understanding of earthquakes and allow a better assessment of potential future events.

“Speeding up the simulation software by a factor of five is not only an important step for geophysical research,” says Professor Michael Bader of the Department of Informatics at TUM. “We are, at the same time, preparing the applied methodologies and software packages for the next generation of supercomputers that will routinely host the respective simulations for diverse geoscience applications.”

Volcano discovered smoldering under a kilometer of ice in West Antarctica

Mount Sidley, at the leading edge of the Executive Committee Range in Marie Byrd Land, is the last volcano in the chain that rises above the surface of the ice. But a group of seismologists has detected new volcanic activity under the ice about 30 miles ahead of Mount Sidley in the direction of the range’s migration. The new finding suggests that the source of magma is moving beyond the chain beneath the crust and the Antarctic Ice Sheet. – Doug Wiens

It wasn’t what they were looking for, but that only made the discovery all the more exciting.

In January 2010, a team of scientists set up two crossing lines of seismographs across Marie Byrd Land in West Antarctica. It was the first time so many instruments capable of operating year-round, even in the coldest parts of Antarctica, had been deployed in the interior of the continent.

Like a giant CT machine, the seismograph array used disturbances created by distant earthquakes to make images of the ice and rock deep within West Antarctica.

There were big questions to be asked and answered. The goal, says Doug Wiens, professor of earth and planetary science at Washington University in St. Louis and one of the project’s principal investigators, was essentially to weigh the ice sheet to help reconstruct Antarctica’s climate history. But to do this accurately the scientists had to know how the earth’s mantle would respond to an ice burden, and that depended on whether it was hot and fluid or cool and viscous. The seismic data would allow them to map the mantle’s properties.

In the meantime, automated-event-detection software was put to work to comb the data for anything unusual.

When it found two bursts of seismic events between January 2010 and March 2011, Wiens’ PhD student Amanda Lough looked more closely to see what was rattling the continent’s bones.

Was it rock grinding on rock, ice groaning over ice, or, perhaps, hot gases and liquid rock forcing their way through cracks in a volcanic complex?

They were uncertain at first, but the more Lough and her colleagues looked, the more convinced they became that a new volcano was forming a kilometer beneath the ice.

The discovery of the new, as yet unnamed, volcano is announced in the Nov. 17 advance online issue of Nature Geoscience.

Following the trail of clues

The teams that install seismographs in Antarctica are given first crack at the data. Lough had done her bit as part of the WUSTL team, traveling to East Antarctica three times to install or remove stations.

In 2010 many of the instruments were moved to West Antarctica and Wiens asked Lough to look at the seismic data coming in, the first large-scale dataset from this part of the continent.

“I started seeing events that kept occurring at the same location, which was odd,” Lough said. “Then I realized they were close to some mountains – but not right on top of them.”

“My first thought was, ‘Okay, maybe it’s just coincidence.’ But then I looked more closely and realized that the mountains were actually volcanoes and there was an age progression to the range. The volcanoes closest to the seismic events were the youngest ones.”

The events were weak and very low frequency, which strongly suggested they weren’t tectonic in origin. While low-magnitude seismic events of tectonic origin typically have frequencies of 10 to 20 cycles per second, this shaking was dominated by frequencies of 2 to 4 cycles per second.
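That frequency distinction is straightforward to automate. A hypothetical sketch of how detection software might separate the two classes by dominant frequency (illustrative only; this is not the actual software the team used):

```python
import numpy as np

def dominant_frequency(trace, sample_rate):
    """Return the frequency (Hz) with the largest spectral amplitude."""
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

fs = 100.0                      # 100 samples/s, a common seismometer rate
t = np.arange(0, 10, 1 / fs)    # 10 seconds of synthetic data

# Pure tones standing in for the two event classes described in the text
lp_like = np.sin(2 * np.pi * 3.0 * t)         # 3 Hz: long-period character
tectonic_like = np.sin(2 * np.pi * 15.0 * t)  # 15 Hz: small tectonic event

print(dominant_frequency(lp_like, fs))        # 3.0
print(dominant_frequency(tectonic_like, fs))  # 15.0
```

Real traces are of course noisy and broadband, so practical classifiers look at the whole spectral shape rather than a single peak, but the 2-4 Hz versus 10-20 Hz separation is wide enough that even this crude test would sort the two populations.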

Ruling out ice

But glacial processes can generate low-frequency events. If the events weren’t tectonic, could they be glacial?

To probe further, Lough used a global computer model of seismic velocities to “relocate” the hypocenters of the events, accounting for the known seismic velocities along different paths through the Earth. This procedure collapsed the swarm clusters to a third of their original size.

It also showed that almost all of the events had occurred at depths of 25 to 40 kilometers (15 to 25 miles) below the surface. This is extraordinarily deep – deep enough to be near the boundary between the earth’s crust and mantle, called the Moho, and it more or less rules out a glacial origin.

It also casts doubt on a tectonic one. “A tectonic event might have a hypocenter 10 to 15 kilometers (6 to 9 miles) deep, but at 25 to 40 kilometers, these were way too deep,” Lough says.

A colleague suggested that the event waveforms looked like deep long-period earthquakes, or DLPs, which occur in volcanic areas, have the same frequency characteristics and are just as deep. “Everything matches up,” Lough says.

An ash layer encased in ice

The seismologists also talked to Duncan Young and Don Blankenship of the University of Texas who fly airborne radar over Antarctica to produce topographic maps of the bedrock. “In these maps, you can see that there’s elevation in the bed topography at the same location as the seismic events,” Lough says.

The radar images also showed a layer of ash buried under the ice. “They see this layer all around our group of earthquakes and only in this area,” Lough says.

“Their best guess is that it came from Mount Waesche, an existing volcano near Mount Sidley. But that is also interesting because scientists had no idea when Mount Waesche was last active, and the ash layer sets the age of the eruption at 8,000 years ago.”

What’s up down there?

The case for volcanic origin has been made. But what exactly is causing the seismic activity?

“Most mountains in Antarctica are not volcanic,” Wiens says, “but most in this area are. Is it because East and West Antarctica are slowly rifting apart? We don’t know exactly. But we think there is probably a hot spot in the mantle here producing magma far beneath the surface.”

“People aren’t really sure what causes DLPs,” Lough says. “It seems to vary by volcanic complex, but most people think it’s the movement of magma and other fluids that leads to pressure-induced vibrations in cracks within volcanic and hydrothermal systems.”

Will the new volcano erupt?

“Definitely,” Lough says. “In fact, because the radar shows a mountain beneath the ice, I think it has erupted in the past, before the rumblings we recorded.”

Will the eruptions punch through a kilometer or more of ice above it?

The scientists calculated that an enormous eruption, one that released a thousand times more energy than the typical eruption, would be necessary to breach the ice above the volcano.

On the other hand, a subglacial eruption and the accompanying heat flow will melt a lot of ice. “The volcano will create millions of gallons of water beneath the ice – many lakes full,” says Wiens. This water will rush beneath the ice toward the sea and feed into the hydrological catchment of the MacAyeal Ice Stream, one of several major ice streams draining ice from Marie Byrd Land into the Ross Ice Shelf.

By lubricating the bedrock, it will speed the flow of the overlying ice, perhaps increasing the rate of ice-mass loss in West Antarctica.

“We weren’t expecting to find anything like this,” Wiens says.

Google street view — tool for recording earthquake damage

These images use Google street view scans to identify quake-related damage. – K-G Hinzen/SRL

A scientist from Cologne University has used Google’s online street view scans to document the damage caused by the 2009 L’Aquila earthquake and suggests that the database would be a useful tool for surveying damage caused by future earthquakes. The findings are published in the November issue of Seismological Research Letters.

The magnitude-6.3 L’Aquila earthquake of 2009, in the Italian Abruzzi Mountains, caused widespread damage in the city and surrounding villages.

In 2011, Klaus-G. Hinzen, a seismologist with Cologne University in Germany, and colleagues from Italy conducted a field survey, taking 3D laser scans to document objects rotated by the earthquake. Later, Hinzen used Google Earth software to map the exact locations of numerous photos of damaged structures. When he consulted Google street views, he discovered the scans had been taken less than one year before the earthquake, providing an unexpected opportunity to compare the locations captured in the 2011 photos with the pre-earthquake street view scans.

Google Earth’s aerial views have helped capture an overview of damage to L’Aquila and specific collapsed structures. But the Google street views show the details – fractures, plaster breaks and collapsed walls. The scans help identify the damage caused by the quake rather than a lack of building maintenance or disrepair.

Hinzen suggests that any planned systematic survey of earthquake damage could benefit from the use of Google street view, if available for the area under investigation.

3D model reveals new information about iconic volcano

The volcano on the Scottish peninsula of Ardnamurchan is a popular place for the study of rocks and structures in the core of a volcano. Geology students read about it in textbooks, and geologists have long been certain that the Ardnamurchan volcano had three successive magma chambers. However, an international group of researchers, led from Uppsala University, Sweden, has now shown that the volcano had only a single magma chamber.

The new study is published in Scientific Reports, the new open access journal of the Nature Publishing Group.

The 58-million-year-old Ardnamurchan volcano is an iconic site for the study of rocks and structures in the core of a volcano, which is why thousands of geology students from all over the world visit Ardnamurchan every year. Since the early days of modern geology, the Ardnamurchan volcano has been believed to have had three successive magma chambers (or centres) that fed hundreds of thin arcuate basalt intrusions, so-called cone sheets, that are exposed all over the peninsula.

The researchers, from the universities of Uppsala (Sweden), Quebec (Canada), Durham and St. Andrews (UK), challenge the three-centre concept using a 3D model of the subsurface beneath today’s land surface. According to this model, the Ardnamurchan volcano was underlain by a single but elongate magma chamber.

Studying extinct volcanoes is a way for geologists to understand the interior of volcanic edifices and to gain knowledge of the processes that occur within active volcanoes today. This is why the volcanic centres of western Scotland and northeastern Ireland were studied intensely by British geologists in the late 19th and early 20th centuries. It was in these eroded volcanoes that the foundation of modern volcanology was laid. Ardnamurchan in particular has an iconic status among geologists everywhere in the world. Geology students read about it in textbooks and visit it during field excursions.

“It came as a bit of a surprise to us that there is still so much to learn from a place that has received so much attention by geologists, in particular since we used the original data collected in 1930 by Richey and Thomas,” said Dr Steffi Burchardt, senior lecturer at Uppsala University.

“Modern software allows visualizing field measurements in 3D and opens up a range of new perspectives. After projecting hundreds of cone sheets in the computer model, we were unable to identify three separate centres. The cone sheets instead appear to originate from a single, large, and elongate magma chamber about 1.5 km below today’s land surface.”
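The single-chamber result rests on projecting cone-sheet orientations down-dip and asking where they converge. The sketch below illustrates that geometric principle in a 2D cross-section with made-up numbers; it is not the authors' actual 3D workflow. Each sheet is idealised as a straight line dipping toward the source, and the best-fitting focal point is found by linear least squares.

```python
import numpy as np

def fit_sheet_focus(x_outcrop, slopes):
    """Least-squares focal point (x_c, z_c) of a set of cone sheets.

    Each sheet i is idealised as a straight line in a vertical
    cross-section: it crops out at (x_outcrop[i], 0) and follows
    z = slopes[i] * (x - x_outcrop[i]) down-dip.  Requiring every
    line to pass through one point (x_c, z_c) is linear in those
    two unknowns, so we solve it with least squares.
    """
    m = np.asarray(slopes, dtype=float)
    x0 = np.asarray(x_outcrop, dtype=float)
    # m_i * x_c - z_c = m_i * x_i   ->   A @ [x_c, z_c] = b
    A = np.column_stack([m, -np.ones_like(m)])
    b = m * x0
    (xc, zc), *_ = np.linalg.lstsq(A, b, rcond=None)
    return xc, zc

# Synthetic sheets fed by one source at x = 5 km, depth 1.5 km
rng = np.random.default_rng(0)
true_xc, true_zc = 5.0, 1.5
x_out = np.array([0.5, 1.5, 2.5, 7.0, 8.0, 9.5])       # outcrop positions (km)
slopes = true_zc / (true_xc - x_out)                   # exact down-dip slopes
slopes = slopes + rng.normal(0.0, 0.02, slopes.size)   # measurement noise

xc, zc = fit_sheet_focus(x_out, slopes)
print(f"recovered source: x = {xc:.2f} km, depth = {zc:.2f} km")
```

Running this recovers a source near x = 5 km at roughly 1.5 km depth, showing how a swarm of cone sheets can point back to a single chamber rather than three.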

This magma chamber beneath Ardnamurchan was up to 6 km long and had the shape of an elongate saucer.

“These types of magma chambers are known to exist, for example, within volcanoes in Iceland and have been detected in the North Sea bedrock. Ardnamurchan’s new magma chamber is hence much more realistic considering everything we have learned about Ardnamurchan and other extinct and active volcanoes since the time of Richey and Thomas,” said Prof. Valentin Troll, chair in petrology at Uppsala University.

Birth of Earth’s continents

New research led by a University of Calgary geophysicist provides strong evidence against continent formation above a hot mantle plume, similar to an environment that presently exists beneath the Hawaiian Islands.

The analysis, published this month in Nature Geoscience, indicates that the nuclei of Earth’s continents formed as a byproduct of mountain-building processes, by stacking up slabs of relatively cold oceanic crust. This process created thick, strong ‘keels’ in the Earth’s mantle that supported the overlying crust and enabled continents to form.

The scientific clues leading to this conclusion derived from computer simulations of the slow cooling process of continents, combined with analysis of the distribution of diamonds in the deep Earth.

The Department of Geoscience’s Professor David Eaton developed computer software to enable numerical simulation of the slow diffusive cooling of Earth’s mantle over a time span of billions of years.
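Eaton's simulations are far more elaborate, but the core ingredient, diffusive cooling with a radiogenic heat source that dwindles over geologic time, can be sketched in a few lines. All parameter values below are illustrative assumptions, not numbers from the study.

```python
import numpy as np

def cool_column(nz=61, depth_m=300e3, t_total_s=2.0e9 * 3.15e7):
    """Explicit finite-difference solution of
        dT/dt = kappa * d2T/dz2 + H0 * exp(-t / tau)
    i.e. conductive cooling of a mantle column with decaying
    radiogenic heating.  All parameter values are illustrative."""
    kappa = 1e-6                    # thermal diffusivity (m^2/s)
    H0 = 3e-15                      # radiogenic heating / (rho * c)  (K/s)
    tau = 2.5e9 * 3.15e7            # effective decay time (~2.5 Gyr, in s)

    dz = depth_m / (nz - 1)
    dt = 0.4 * dz**2 / kappa        # below the explicit stability limit of 0.5
    z = np.linspace(0.0, depth_m, nz)
    T = np.full(nz, 1350.0)         # start uniformly hot (deg C)
    T[0] = 0.0                      # surface held cold

    t = 0.0
    while t < t_total_s:
        lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2
        T[1:-1] += dt * (kappa * lap + H0 * np.exp(-t / tau))
        T[-1] = T[-2]               # insulated base
        t += dt
    return z, T

z, T = cool_column()
print(f"temperature at 150 km depth after 2 Gyr: {T[len(T) // 2]:.0f} deg C")
```

Even this toy version shows the key behaviour: a cold, strong lid thickens downward over billions of years while the radiogenic contribution to the heat budget fades.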

Working in collaboration with former graduate student, Assistant Professor Claire Perry from the Université du Québec à Montréal, Eaton relied on the geological record of diamonds found in Africa to validate his innovative computer simulations.

“For the first time, we are able to quantify the thermal evolution of a realistic 3D Earth model spanning billions of years from the time continents were formed,” states Perry.

Mantle plumes consist of an upwelling of hot material within Earth’s mantle. Plumes are thought to be the cause of some volcanic centres, especially those that form a linear volcanic chain like Hawaii. Diamonds, which are generally limited to the deepest and oldest parts of the continental mantle, provide a wealth of information on how the host mantle region may have formed.

“Ancient mantle keels are relatively strong, cold and sometimes diamond-bearing material. They are known to extend to depths of 200 kilometres or more beneath the ancient core regions of continents,” explains Professor David Eaton. “These mantle keels resisted tectonic recycling into the deep mantle, allowing the preservation of continents over geological time and providing suitable environments for the development of the terrestrial biosphere.”

His method takes into account important factors such as dwindling contribution of natural radioactivity to the heat budget, and allows for the calculation of other properties that strongly influence mantle evolution, such as bulk density and rheology (mechanical strength).

“Our computer model emerged from a multi-disciplinary approach combining classical physics, mathematics and computer science,” explains Eaton. “By combining those disciplines, we were able to tackle a fundamental geoscientific problem, which may open new doors for future research.”

This work provides significant new scientific insights into the formation and evolution of continents on Earth.

This computer simulation, spanning 2.5 billion years of Earth history, shows the density difference of the mantle compared with an oceanic reference, starting from a cooler initial state. Density is controlled by mantle composition as well as by slowly cooling temperature; a keel of low-density material extending to about 260 km depth on the left side (x < 600 km) provides buoyancy that prevents continents from being subducted (‘recycled’ into the deep Earth). The graph at the top shows a computed elevation model. – David Eaton, University of Calgary.

3-D Earth model developed at Sandia Labs more accurately pinpoints source of earthquakes, explosions

Sandia National Laboratories researcher Sandy Ballard and colleagues from Sandia and Los Alamos National Laboratory have developed SALSA3D, a 3-D model of the Earth's mantle and crust designed to help pinpoint the location of all types of explosions. -  Photo by Randy Montoya, Sandia National Laboratories

During the Cold War, U.S. and international monitoring agencies could spot nuclear tests and focused on measuring their sizes. Today, they’re looking around the globe to pinpoint much smaller explosives tests.

Under the sponsorship of the National Nuclear Security Administration’s Office of Defense Nuclear Nonproliferation R&D, Sandia National Laboratories and Los Alamos National Laboratory have partnered to develop a 3-D model of the Earth’s mantle and crust called SALSA3D, or Sandia-Los Alamos 3D. The purpose of this model is to help the U.S. Air Force and the international Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna, Austria, locate all types of explosions more accurately.

The model uses a scalable triangular tessellation and seismic tomography to map the Earth’s “compressional wave seismic velocity,” a property of the rocks and other materials inside the Earth that indicates how quickly compressional waves travel through them and is one way to accurately locate seismic events, Sandia geophysicist Sandy Ballard said. Compressional waves – measured first after seismic events – move the particles in rocks and other materials minute distances backward and forward between the location of the event and the station detecting it.

SALSA3D also reduces the uncertainty in the model’s predictions, an important feature for decision-makers who must take action when suspicious activity is detected, he added.

“When you have an earthquake or nuclear explosion, not only do you need to know where it happened, but also how well you know that. That’s a difficult problem for these big 3-D models. It’s mainly a computational problem,” Ballard said. “The math is not so tough, just getting it done is hard, and we’ve accomplished that.”

A Sandia team has been writing and refining code for the model since 2007 and is now demonstrating that SALSA3D is more accurate than current models.

In recent tests, SALSA3D located the sources of seismic events within a geographic area 26 percent smaller than the traditional one-dimensional model could, and 9 percent smaller than a recently developed Regional Seismic Travel Time (RSTT) model used with the one-dimensional model.

GeoTess software release

Sandia recently released SALSA3D’s framework – the triangular tessellated grid on which the model is built – to other Earth scientists, seismologists and the public. By standardizing the framework, the seismological research community can more easily share models of the Earth’s structure and global monitoring agencies can better test different models. Both activities are hampered by the plethora of models available today, Ballard said. (See box.)

“GeoTess makes models compatible and standardizes everything,” he said. “This would really facilitate sharing of different models, if everyone agreed on it.”

Seismologists and researchers worldwide can now download GeoTess, which provides a common model parameterization for multidimensional Earth models and a software support system that addresses the construction, population, storage and interrogation of data stored in the model. GeoTess is not specific to any particular data, so users have considerable flexibility in how they store information in the model. The free package, including source code, is being released under the very liberal BSD Open Source License. The code is available in Java and C++, with interfaces to the C++ version written in C and Fortran90. GeoTess has been tested on multiple platforms, including Linux, SunOS, MacOSX and Windows. GeoTess is available here.

When an explosion goes off, the energy travels through the Earth as waves that are picked up by seismometers at U.S. and international ground monitoring stations associated with nuclear explosion monitoring organizations worldwide. Scientists use these signals to determine the location.

They first predict the time taken for the waves to travel from their source through the Earth to each station. To calculate that, they have to know the seismic velocity of the Earth’s materials from the crust to the inner core, Ballard said.

“If you have material that has very high seismic velocity, the waves travel very quickly, but the energy travels less quickly through other kinds of materials, so it takes the signals longer to travel from the source to the receiver,” he said.

For the past 100 years, seismologists have predicted the travel time of seismic energy from source to receiver using one-dimensional models. These models, which are still widely used today, account only for radial variations in seismic velocity and ignore variations in geographic directions. They yield seismic event locations that are reasonably accurate, but not nearly as precise as locations calculated with high fidelity 3-D models.
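At its simplest, a travel-time prediction is just path length divided by velocity, summed over the materials the ray crosses. Here is a deliberately crude sketch for a vertically travelling wave through a three-layer Earth; the layer values are rough illustrations, and real 1-D models such as ak135 trace refracted, curved ray paths through many more layers.

```python
def travel_time(layers):
    """One-way time for a vertically propagating wave through a stack
    of (thickness_km, velocity_km_s) layers: time = sum of h / v."""
    return sum(h / v for h, v in layers)

# Rough, illustrative values; not taken from any published model.
model = [(35.0, 6.5),     # crust
         (375.0, 8.5),    # upper mantle
         (2480.0, 12.0)]  # lower mantle
print(f"one-way vertical travel time: {travel_time(model):.1f} s")
```

The thin, slow crust contributes disproportionately to the total time, which is exactly why errors concentrate near the surface, as Ballard notes below.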

Modern 3-D models of the Earth, like SALSA3D, account for distortions of the seismic wavefronts caused by minor lateral differences in the properties of rocks and other materials.

For example, waves are distorted when they move through a geological feature called a subduction zone, such as the one beneath the west coast of South America where one tectonic plate under the Pacific Ocean is diving underneath the Andes Mountains. This happens at about the rate at which fingernails grow, but, geologically speaking, that’s fast, Ballard said.

One-dimensional models, like the widely used ak135 developed in the 1990s, are good at predicting the travel time of waves when the distance from the source to the receiver is large because these waves spend most of their time traveling through the deepest, most homogeneous parts of the Earth. They don’t do so well at predicting travel time to nearby events where the waves spend most of their time in the Earth’s crust or the shallowest parts of the mantle, both of which contain a larger variety of materials than the lower mantle and the Earth’s core.

RSTT, a previous model developed jointly by Sandia, Los Alamos and Lawrence Livermore national laboratories, tried to solve that problem and works best at ranges of about 60-1,200 miles (100-2,000 kilometers).

Still, “the biggest errors we get are close to the surface of the Earth. That’s where the most variability in materials is,” Ballard said.

Seismic tomography gives SALSA3D accuracy

Today, Earth scientists are mapping three dimensions: the radius, latitude and longitude.

Anyone who’s studied a globe or world atlas knows that a traditional grid of longitude and latitude lines works all right near the equator, but at the poles the lines crowd too close together. For nuclear explosion monitoring, Earth models must accurately characterize the polar regions even though they are remote, because seismic waves travel under them, Ballard said.
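The pole problem is easy to quantify: the east-west distance between adjacent nodes of a latitude/longitude grid shrinks with the cosine of latitude. A quick illustration (Earth's radius rounded to 6371 km):

```python
import math

R_EARTH_KM = 6371.0

def ew_spacing_km(lat_deg, dlon_deg=1.0):
    """East-west distance between adjacent nodes of a latitude/longitude
    grid; it shrinks with the cosine of latitude."""
    return R_EARTH_KM * math.cos(math.radians(lat_deg)) * math.radians(dlon_deg)

for lat in (0, 45, 89):
    print(f"lat {lat:2d} deg: {ew_spacing_km(lat):7.2f} km between 1-degree nodes")
```

Nodes roughly 111 km apart at the equator end up under 2 km apart near the poles, wasting resolution exactly where no extra detail is needed.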

Triangular tessellation solves that with nodes, or intersections of the triangles, that can be accurately modeled even at the poles. The triangles can be smaller where more detail is needed and larger in areas that require less detail, like the oceans. Plus the model extends into the Earth like columns of stacked pieces of pie without the rounded crust edges.

The way Sandia calculates the seismic velocities uses the same math that is used to detect a tumor in an MRI, except on a global, rather than a human, scale.

Sandia uses historical data from 118,000 earthquakes recorded at 13,000 current and former monitoring stations worldwide, collected in Los Alamos National Laboratory’s Ground Truth catalog.

“We apply a process called seismic tomography where we take millions of observed travel times and invert them for the seismic velocities that would create that data set. It’s mathematically similar to doing linear regression, but on steroids,” Ballard said. Linear regression is a simple mathematical way to model the relationship between a known variable and one or more unknown variables. Because the Sandia team models hundreds of thousands of unknown variables, they apply a mathematical method called least squares to minimize the discrepancies between the data from previous seismic events and the predictions.

With 10 million data points, Sandia uses a distributed computer network with about 400 core processors to characterize the seismic velocity at every node.

Monitoring agencies could use SALSA3D to precompute the travel time from each station in their network to every point on Earth. When the location of a new seismic event must be computed in real time, the source-to-receiver travel times can then be looked up in about a millisecond and the energy’s source pinpointed in about a second, he said.

Uncertainty modeling a SALSA3D feature

But no model is perfect, so Sandia has developed a way to measure the uncertainty in each prediction SALSA3D makes, based on uncertainty in the velocity at each node and how that uncertainty affects the travel time prediction of each wave from a seismic event to each monitoring station.
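In the simplest possible picture, if the velocity errors at each node were independent, the travel-time variance would just be the path-weighted sum of the slowness variances. The sketch below uses that naive assumption with invented 2 percent uncertainties; SALSA3D's actual treatment of how node uncertainty propagates into each prediction is far more sophisticated.

```python
import numpy as np

# Naive error propagation: if node slowness errors were independent,
# var(t) = sum over the path of (length_i * sigma_s_i)^2.  Values invented.
path_km = np.array([35.0, 375.0, 2480.0])       # ray length per region
slowness = np.array([1/6.5, 1/8.5, 1/12.0])     # s/km
slow_sigma = 0.02 * slowness                    # assumed 2% uncertainty

t = float(path_km @ slowness)
t_sigma = float(np.sqrt(np.sum((path_km * slow_sigma) ** 2)))
print(f"predicted travel time: {t:.1f} +/- {t_sigma:.1f} s")
```

A travel-time uncertainty at every station is what ultimately translates into the size of the region within which the event could have occurred.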

SALSA3D estimates for the users at monitoring stations the most likely location of a seismic event and the amount of uncertainty in the answer to help inform their decisions.

International test ban treaties limit on-site inspections to a 1,000-square-kilometer (385-square-mile) area surrounding a suspected nuclear test site. Today, 3-D Earth models like SALSA3D are helping to meet, and sometimes significantly beat, this threshold in most parts of the world.

“It’s extremely difficult to do because the problem is so large,” Ballard said. “But we’ve got to know it within 1,000 square kilometers or they might search in the wrong place.”

Online tools accelerating earthquake-engineering progress

Santiago Pujol, at far left, a Purdue associate professor of civil engineering, surveys a private residence damaged in a Haiti earthquake. The building was among 170 surveyed by civil engineers studying the effects of the January 2010 earthquake. Such photos and research-related information regarding earthquakes are part of a database maintained and serviced by the National Science Foundation's George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), based at Purdue. (Purdue University photo/Kari T. Nasi)

A new study has found that online tools, access to experimental data and other services provided through “cyberinfrastructure” are helping to accelerate progress in earthquake engineering and science.

The research is affiliated with the National Science Foundation’s George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), based at Purdue University. NEES includes 14 laboratories for earthquake engineering and tsunami research, tied together with cyberinfrastructure to provide information technology for the network.

The cyberinfrastructure includes a centrally maintained, Web-based science gateway called NEEShub, which houses experimental results and makes them available for reuse by researchers, practitioners and educational communities.

“It’s a one-stop shopping site for the earthquake-engineering community to access really valuable intellectual contributions as well as experimental data generated from projects at the NEES sites,” said Thomas Hacker, an associate professor in the Department of Computer and Information Technology at Purdue and co-leader of information technology for NEES. “The NEES cyberinfrastructure provides critical information technology services in support of earthquake engineering research and helps to accelerate science and engineering progress in a substantial way.”

Findings from a recent study about cyberinfrastructure’s impact on the field were detailed in a paper published in a special issue of the Journal of Structural Engineering, which coincides with a NEES Quake Summit 2013 on Aug. 7-8 in Reno. The paper was authored by Hacker; Rudolf Eigenmann, a professor in Purdue’s School of Electrical and Computer Engineering; and Ellen Rathje, a professor in the Department of Civil, Architectural, and Environmental Engineering at the University of Texas, Austin.

A major element of the NEES cyberinfrastructure is a “project warehouse” that provides a place for researchers to upload project data, documents, papers and dissertations containing important experimental knowledge for the NEES community to access.

“A key factor in our efforts is the very strong involvement of experts in earthquake engineering and civil engineering in every aspect of our IT,” Hacker said. “The software we develop and services we provide are driven by user requirements prioritized by the community. This is an example of a large-scale cyberinfrastructure project that is really working to address big-data needs and developing technologies and solutions that work today. It’s a good example of how cyberinfrastructure can help knit together distributed communities of researchers into something greater than the sum of its parts.”

The effort requires two key aspects: technological elements and sociological elements.

“The technological elements include high-speed networks, laptops, servers and software,” he said. “The sociology includes the software-development process, the way we gather and prioritize user requirements and needs and our work with user communities. To be successful, a cyberinfrastructure effort needs to address both the technology and social elements, which has been our approach.”

The project warehouse and NEEShub collect “metadata,” or descriptive information about research needed to ensure that the information can be accessed in the future.

“Say you have an experiment with sensors over a structure to collect data like voltages over time or force displacements over time,” Eigenmann said. “What’s important for context is not only the data collected, but from which sensor, when the experiment was conducted, where the sensor was placed on the structure. When someone comes along later to reuse the information they need the metadata.”
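A record of the kind Eigenmann describes might look like the following. This is a hypothetical example assembled for illustration, not NEEShub's actual metadata schema, and all names in it are invented.

```python
import json

# Hypothetical sensor-metadata record (not NEEShub's actual schema):
# the raw time series is hard to reuse without this context.
record = {
    "experiment": "shake-table test, 3-story reinforced-concrete frame",
    "date": "2010-06-15",
    "sensor": {
        "id": "ACC-07",
        "type": "accelerometer",
        "units": "g",
        "location": "3rd-floor slab, northeast corner",
        "sample_rate_hz": 200,
    },
    "data_file": "acc07_run3.csv",
}

print(json.dumps(record, indent=2))
```

Storing the sensor's placement, units and sampling rate alongside the data file is what lets a researcher years later reuse the measurements with confidence.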

The resources are curated, meaning the data are organized in a fashion that ensures they haven’t been modified and are valid for reference in the future.
“We take extra steps to ensure the long-term integrity of the data,” Hacker said.

NEEShub contains more than 1.6 million project files stored in more than 398,000 project directories and has served at least 65,000 users over the past year. Other metrics information is available at

“We are seeing continued growth in the number of users,” Rathje said. “We are helping to facilitate and enable the discovery process. We have earthquake engineering experts and civil engineering experts closely involved with every aspect of our IT and cyberinfrastructure, and we are constantly getting feedback and prototyping.”

To help quantify the impact on research, projects are ranked by how many times they are downloaded. One project alone has had 3.3 million files downloaded.

“We have a curation dashboard for each project, which gives the curation status of the information so that users know whether it’s ready to be cited and used,” Hacker said.

The site also has a DOI, or digital object identifier, for each project.

“It’s like a permanent identifier that goes with the data set,” he said. “It gives you a permanent link to the data.”
NEES researchers will continue to study the impact of cyberinfrastructure on engineering and scientific progress.

“The use and adoption of cyberinfrastructure by a community is a process,” Hacker said. “At the beginning of the process we can measure the number of visitors and people accessing information. The ultimate impact of the cyberinfrastructure will be reflected in outcomes such as the number of publications that have benefited from using the cyberinfrastructure. It takes several years to follow that process and we are in the middle of that right now, but evidence points to a significant impact.”