Birth of Earth’s continents

New research led by a University of Calgary geophysicist provides strong evidence against continent formation above a hot mantle plume, an environment similar to the one that presently exists beneath the Hawaiian Islands.

The analysis, published this month in Nature Geoscience, indicates that the nuclei of Earth’s continents formed as a byproduct of mountain-building processes, by stacking up slabs of relatively cold oceanic crust. This process created thick, strong ‘keels’ in the Earth’s mantle that supported the overlying crust and enabled continents to form.

The scientific clues leading to this conclusion were derived from computer simulations of the slow cooling of continents, combined with analysis of the distribution of diamonds in the deep Earth.

The Department of Geoscience’s Professor David Eaton developed computer software to enable numerical simulation of the slow diffusive cooling of Earth’s mantle over a time span of billions of years.

Working in collaboration with former graduate student Claire Perry, now an assistant professor at the Université du Québec à Montréal, Eaton relied on the geological record of diamonds found in Africa to validate his innovative computer simulations.

“For the first time, we are able to quantify the thermal evolution of a realistic 3D Earth model spanning billions of years from the time continents were formed,” states Perry.

Mantle plumes consist of an upwelling of hot material within Earth’s mantle. Plumes are thought to be the cause of some volcanic centres, especially those that form a linear volcanic chain like Hawaii. Diamonds, which are generally limited to the deepest and oldest parts of the continental mantle, provide a wealth of information on how the host mantle region may have formed.

“Ancient mantle keels are relatively strong, cold and sometimes diamond-bearing material. They are known to extend to depths of 200 kilometres or more beneath the ancient core regions of continents,” explains Eaton. “These mantle keels resisted tectonic recycling into the deep mantle, allowing the preservation of continents over geological time and providing suitable environments for the development of the terrestrial biosphere.”

His method takes into account important factors such as the dwindling contribution of natural radioactivity to the heat budget, and allows for the calculation of other properties that strongly influence mantle evolution, such as bulk density and rheology (mechanical strength).
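To make the idea concrete, here is a minimal sketch of the kind of calculation involved: one-dimensional diffusive cooling of a mantle column with an exponentially dwindling radiogenic heat source. The diffusivity, heating rate, and effective half-life below are illustrative assumptions of this sketch, not values from Eaton's software, which is three-dimensional and also tracks density and rheology.

```python
import numpy as np

# Minimal 1-D sketch of diffusive mantle cooling with decaying radiogenic
# heating: dT/dt = kappa * d2T/dz2 + H(t) / (rho * cp)
kappa = 1e-6                    # thermal diffusivity, m^2/s
rho, cp = 3300.0, 1200.0        # density (kg/m^3) and heat capacity (J/kg/K)
H0 = 5e-8                       # assumed initial radiogenic heating, W/m^3
half_life = 2.5e9 * 3.15e7      # assumed effective half-life, s (~2.5 Gyr)

nz, dz = 200, 2000.0            # 400 km column at 2 km spacing
T = np.full(nz, 1350.0)         # initial mantle temperature, deg C
T[0] = 0.0                      # cold surface boundary condition

dt = 0.2 * dz**2 / kappa        # explicit stability limit (~25,000 yr)
t, t_end = 0.0, 2.5e9 * 3.15e7  # integrate over 2.5 billion years
while t < t_end:
    H = H0 * 0.5 ** (t / half_life)   # dwindling radiogenic heat production
    T[1:-1] += dt * (kappa * (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
                     + H / (rho * cp))
    T[-1] = T[-2]               # insulated base (simplification)
    t += dt

print(f"temperature at 200 km depth after 2.5 Gyr: {T[nz // 2]:.0f} C")
```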

“Our computer model emerged from a multi-disciplinary approach combining classical physics, mathematics and computer science,” explains Eaton. “By combining those disciplines, we were able to tackle a fundamental geoscientific problem, which may open new doors for future research.”

This work provides significant new scientific insights into the formation and evolution of continents on Earth.




Video
This computer simulation, spanning 2.5 billion years of Earth history, shows the density difference of the mantle compared with an oceanic reference, starting from a cooler initial state. Density is controlled by mantle composition as well as by slowly cooling temperature; a keel of low-density material extending to about 260 km depth on the left side (x < 600 km) provides buoyancy that prevents continents from being subducted ('recycled' into the deep Earth). The graph at the top shows a computed elevation model. – David Eaton, University of Calgary.


West Antarctica ice sheet existed 20 million years earlier than previously thought

Adelie penguins walk in file on sea ice in front of US research icebreaker Nathaniel B. Palmer in McMurdo Sound. – John Diebold

The results of research conducted by professors at UC Santa Barbara and colleagues mark the beginning of a new paradigm for our understanding of the history of Earth’s great global ice sheets. The research shows that, contrary to the widely held scientific view, an ice sheet on West Antarctica existed 20 million years earlier than previously thought.

The findings indicate that ice sheets first grew on the West Antarctic subcontinent at the start of a global transition from warm greenhouse conditions to a cool icehouse climate 34 million years ago. Previous computer simulations were unable to produce the amount of ice that geological records suggest existed at that time because neighboring East Antarctica alone could not support it. The findings were published today in Geophysical Research Letters, a journal of the American Geophysical Union.

Given that more ice grew than could be hosted on East Antarctica alone, some researchers proposed that the missing ice formed in the northern hemisphere, many millions of years before the documented ice growth in that hemisphere, which started about 3 million years ago. But the new research shows that it is not necessary to have ice hosted in the northern polar regions at the start of the greenhouse-icehouse transition.

Earlier research published in 2009 and 2012 by the same team showed that West Antarctic bedrock was much higher in elevation at the time of the global climate transition than it is today, with much of its land above sea level. The belief that West Antarctic elevations had always been low-lying (as they are today) had led researchers to ignore the region in past studies. The new research presents compelling evidence that this higher land mass enabled a large ice sheet to be hosted earlier than previously realized, despite a warmer ocean in the past.

“Our new model identifies West Antarctica as the site needed for the accumulation of the extra ice on Earth at that time,” said lead author Douglas S. Wilson, a research geophysicist in UCSB’s Department of Earth Science and Marine Science Institute. “We find that the West Antarctic Ice Sheet first appeared earlier than the previously accepted timing of its initiation sometime in the Miocene, about 14 million years ago. In fact, our model shows it appeared at the same time as the massive East Antarctic Ice Sheet some 20 million years earlier.”

Wilson and his team used a sophisticated numerical ice sheet model to support this view. Using their new bedrock elevation map for the Antarctic continent, the researchers created a computer simulation of the initiation of the Antarctic ice sheets. Unlike previous computer simulations of Antarctic glaciation, this research found the nascent Antarctic ice sheet included substantial ice on the subcontinent of West Antarctica. The modern West Antarctic Ice Sheet contains about 10 percent of the total ice on Antarctica and is similar in scale to the Greenland Ice Sheet.
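As a rough illustration of why bedrock elevation matters in such a simulation, the toy model below grows ice with an elevation-dependent mass balance on two assumed bedrock profiles, one reconstructed-style high bed and one modern-style low bed. It is a sketch of the general principle only; the profiles, equilibrium-line altitude, and mass-balance coefficients are invented for illustration and bear no relation to the numbers in Wilson's model.

```python
import numpy as np

def grow_ice(bed, ela, years=5000, beta=0.005):
    """Crude ice growth: accumulate above the equilibrium-line altitude
    (ELA), ablate below it, and diffuse thickness to mimic ice flow."""
    thk = np.zeros_like(bed)
    for _ in range(years):
        surface = bed + thk
        # mass balance in m/yr, capped at a plausible accumulation rate
        thk += np.clip(beta * (surface - ela), -2.0, 0.5)
        thk = np.maximum(thk, 0.0)
        # crude 'flow': smooth thickness so slopes stay plausible
        thk[1:-1] += 0.25 * (thk[2:] - 2 * thk[1:-1] + thk[:-2])
    return thk

x = np.linspace(0, 1000e3, 101)                         # 1,000 km transect
high_bed = 500.0 * np.exp(-((x - 500e3) / 250e3) ** 2)  # high bed (34 Ma style)
low_bed = high_bed - 600.0                              # low bed (modern style)

ela = 300.0  # assumed warm-climate equilibrium-line altitude, m
print(f"mean ice thickness, high bed: {grow_ice(high_bed, ela).mean():.0f} m")
print(f"mean ice thickness, low bed:  {grow_ice(low_bed, ela).mean():.0f} m")
```

Under these assumptions the high bed pokes above the equilibrium line and nucleates a growing ice sheet (the height-mass-balance feedback), while the low bed supports none, which is the qualitative point of the reconstructed topography.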

West Antarctica and Greenland are both major players in scenarios of sea level rise due to global warming because of the sensitivity of the ice sheets on these subcontinents. Recent scientific estimates conclude that global sea level would rise an average of 11 feet should the West Antarctic Ice Sheet melt. This amount would add to sea level rise from the melting of the Greenland ice sheet (about 24 feet).
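The quoted figures can be sanity-checked with a back-of-envelope conversion from ice volume to sea-level equivalent (this check is mine, not the authors'). For West Antarctica only ice above flotation counts, because much of its bed lies below sea level; the volumes below are rough literature values, not figures from this study.

```python
# Sea-level equivalent = ice volume * (ice density / seawater density) / ocean area
OCEAN_AREA = 3.62e14                  # m^2
RHO_ICE, RHO_SEAWATER = 917.0, 1027.0

def sea_level_equivalent_ft(ice_volume_km3):
    rise_m = ice_volume_km3 * 1e9 * (RHO_ICE / RHO_SEAWATER) / OCEAN_AREA
    return rise_m * 3.281             # metres to feet

# Rough volumes (assumptions): WAIS ice above flotation ~1.35e6 km^3,
# Greenland ~2.9e6 km^3.
print(f"WAIS:      {sea_level_equivalent_ft(1.35e6):.0f} ft")  # article quotes ~11 ft
print(f"Greenland: {sea_level_equivalent_ft(2.9e6):.0f} ft")   # article quotes ~24 ft
```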

The UCSB researchers computed a range of ice-sheet reconstructions that account for the uncertainty in the topographic reconstructions, all of which show ice growth on East and West Antarctica 34 million years ago. A surprising result is that the total volume of ice on East and West Antarctica at that time could be more than 1.4 times greater than previously realized, and was likely larger than the ice sheet on Antarctica today.

“We feel it is important for the public to know that the origins of the West Antarctic Ice Sheet are under increased scrutiny and that scientists are paying close attention to its role in Earth’s climate now and in the past,” concluded co-author Bruce Luyendyk, UCSB professor emeritus in the Department of Earth Science and research professor at the campus’s Earth Research Institute.


Devastating long-distance impact of earthquakes

In 2006, the island of Java, Indonesia, was struck by a devastating earthquake, followed by the onset of a mud eruption to the east that flooded villages over several square kilometers and continues to erupt today. Until now, researchers believed the earthquake was too far from the mud volcano to have triggered the eruption. Geophysicists at the University of Bonn, Germany, and ETH Zurich, Switzerland, have used computer-based simulations to show that such triggering is possible over long distances. The results have been published in “Nature Geoscience.”

On May 27, 2006, the Indonesian island of Java was shaken by a magnitude 6.3 earthquake. The epicenter was located 25 km southwest of the city of Yogyakarta, and the rupture initiated at a depth of 12 km. The earthquake took thousands of lives, injured ten thousand people, and destroyed buildings and homes. Forty-seven hours later, about 250 km from the earthquake hypocenter, a mud volcano formed that came to be known as “Lusi,” short for “Lumpur Sidoarjo.” Hot mud erupted in the vicinity of an oil drilling well, shooting mud up to 50 m into the sky and flooding the area. Scientists expect the mud volcano to be active for many more years.

Eruption of mud volcano has natural cause

Was the eruption of the mud triggered by natural events, or was it man-made by the nearby exploration well? Geophysicists at the University of Bonn, Germany, and at ETH Zürich, Switzerland, investigated this question with numerical wave-propagation experiments. “Many researchers believed that the earthquake epicenter was too far from Lusi to have activated the mud volcano,” says Prof. Dr. Stephen A. Miller from the department of Geodynamics at the University of Bonn. However, using computer simulations that include the geological features of the Lusi subsurface, Miller's team concluded that the earthquake was the trigger, despite the long distance.

The overpressured, solid mud layer was trapped between layers with different acoustic properties, and this system was shaken by the earthquake and its aftershocks like a bottle of champagne. The key, however, is the reflections provided by the dome-shaped geology underneath Lusi, which focused the seismic waves of the earthquake like an echo inside a cave. Miller explains: “Our simulations show that the dome-shaped structure with different properties focused seismic energy into the mud layer and could very well have liquefied the mud, which then injected into nearby faults.”

Previous studies had underestimated the energy of the seismic waves, because ground motion was only considered at the surface; the geophysicists at the University of Bonn suspect that surface motions were much less intense than those at depth. The dome-like structure “kept” the seismic waves at depth and damped those that reached the surface. “This was actually a lower estimate of the focusing effect, because only one wave cycle was input. This effect increases with each wave cycle because of the reducing acoustic impedance of the pressurizing mud layer.” In response to claims that the highest-velocity layer used in the modeling is a measurement artifact, Miller says, “That does not change our conclusions, because this effect will occur whenever a layer of low acoustic impedance is sandwiched between high-impedance layers, irrespective of the exact values of the impedances. And the source of the Lusi mud was the inside of the sandwich.”
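The sandwich argument can be illustrated with the normal-incidence reflection coefficient, R = (Z2 - Z1) / (Z2 + Z1), where Z is acoustic impedance (density times wave speed). The sketch below uses invented property values, not those from the study, to show that a large impedance contrast reflects much of the wave energy back into the low-impedance mud layer at each bounce.

```python
# Why a low-impedance mud layer traps seismic energy: reflection at a
# normal-incidence interface, R = (Z2 - Z1) / (Z2 + Z1).
def reflection_coefficient(z1, z2):
    """Amplitude reflected at an interface going from medium 1 into medium 2."""
    return (z2 - z1) / (z2 + z1)

# acoustic impedance Z = density (kg/m^3) * wave speed (m/s); values assumed
z_mud  = 1800.0 * 1000.0   # slow, overpressured mud (illustrative)
z_rock = 2500.0 * 3500.0   # stiffer surrounding layers (illustrative)

r = reflection_coefficient(z_mud, z_rock)
print(f"amplitude reflection at the mud/rock interface: R = {r:.2f}")
print(f"fraction of energy sent back into the mud per bounce: {r**2:.2f}")
```

The exact impedances only change the numbers, not the behavior: as long as the mud layer is acoustically slower than its neighbors, energy reverberates inside it, which is Miller's point.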

It has already been proposed that a tectonic fault connects Lusi to a volcanic system 15 km away. “This connection probably supplies the mud volcano with heat and fluids that keep Lusi erupting actively up to today,” explains Miller.

With their publication, the scientists from Bonn and Zürich point out that earthquakes can trigger processes over long distances, and that this focusing effect may apply to other hydrothermal and volcanic systems. Miller concludes: “Being a geological rarity, the mud volcano may contribute to a better understanding of triggering processes and relationships between seismic and volcanic activity.” He adds, “Maybe this work will settle the long-standing controversy and focus instead on helping those affected.” The island of Java is part of the so-called Pacific Ring of Fire, a volcanic belt that surrounds the entire Pacific Ocean. Here, oceanic crust is subducted underneath oceanic and continental tectonic plates, leading to melting of crustal material at depth. The resulting magma rises and feeds numerous volcanoes.


Earthquake acoustics can indicate if a massive tsunami is imminent, Stanford researchers find

On March 11, 2011, a magnitude 9.0 undersea earthquake occurred 43 miles off the shore of Japan. The earthquake generated an unexpectedly massive tsunami that washed over eastern Japan roughly 30 minutes later, killing more than 15,800 people and injuring more than 6,100. More than 2,600 people are still unaccounted for.

Now, computer simulations by Stanford scientists reveal that sound waves in the ocean produced by the earthquake probably reached land tens of minutes before the tsunami. If correctly interpreted, they could have offered a warning that a large tsunami was on the way.

Although various systems can detect undersea earthquakes, they can’t reliably tell which will form a tsunami, or predict the size of the wave. There are ocean-based devices that can sense an oncoming tsunami, but they typically provide only a few minutes of advance warning.

Because the sound from a seismic event will reach land well before the water itself, the researchers suggest that identifying the specific acoustic signature of tsunami-generating earthquakes could lead to a faster-acting warning system for massive tsunamis.

Discovering the signal


The finding was something of a surprise. The earthquake’s epicenter had been traced to the underwater Japan Trench, a subduction zone about 40 miles east of Tohoku, the northeastern region of Japan’s largest island. Based on existing knowledge of earthquakes in this area, seismologists puzzled over why the earthquake rupture propagated from the underground fault all the way up to the seafloor, creating the massive upward thrust that resulted in the tsunami.

Direct observations of the fault were scarce, so Eric Dunham, an assistant professor of geophysics in the School of Earth Sciences, and Jeremy Kozdon, a postdoctoral researcher working with Dunham, began using the cluster of supercomputers at Stanford’s Center for Computational Earth and Environmental Science (CEES) to simulate how the tremors moved through the crust and ocean.

The researchers built a high-resolution model that incorporated the known geologic features of the Japan Trench and used CEES simulations to identify possible earthquake rupture histories compatible with the available data.

In retrospect, the models accurately predicted the seafloor uplift seen in the earthquake, which is directly related to tsunami wave heights, and also simulated the sound waves that propagated within the ocean.

In addition to providing valuable insight into the seismic events as they likely occurred during the 2011 earthquake, the researchers identified the specific fault conditions necessary for ruptures to reach the seafloor and create large tsunamis.

The model also generated acoustic data; an interesting revelation of the simulation was that tsunamigenic surface-breaking ruptures, like the 2011 earthquake, produce higher amplitude ocean acoustic waves than those that do not.

The model showed how those sound waves would have traveled through the water and indicated that they reached shore 15 to 20 minutes before the tsunami.

“We’ve found that there’s a strong correlation between the amplitude of the sound waves and the tsunami wave heights,” Dunham said. “Sound waves propagate through water 10 times faster than the tsunami waves, so we can have knowledge of what’s happening a hundred miles offshore within minutes of an earthquake occurring. We could know whether a tsunami is coming, how large it will be and when it will arrive.”
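Dunham's factor can be roughly checked from first principles (the depth and distance below are my assumptions, not the study's): in the shallow-water limit a tsunami travels at the square root of g times the ocean depth, while sound in seawater travels at about 1,500 m/s.

```python
import math

g, depth = 9.81, 4000.0           # assumed open-ocean depth, m
c_sound = 1500.0                  # speed of sound in seawater, m/s
c_tsunami = math.sqrt(g * depth)  # shallow-water wave speed, ~200 m/s

distance = 160e3                  # "a hundred miles" offshore, in metres
t_sound = distance / c_sound
t_tsunami = distance / c_tsunami

print(f"tsunami speed: {c_tsunami:.0f} m/s "
      f"(sound is ~{c_sound / c_tsunami:.0f}x faster)")
print(f"warning margin: {(t_tsunami - t_sound) / 60:.0f} minutes")
```

With a 4 km deep ocean the speed ratio comes out closer to 8; over shallower shelves the tsunami slows further, pushing the ratio toward the factor of 10 Dunham cites, and the resulting margin of roughly ten to twenty minutes matches the simulation's 15-to-20-minute head start.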

Worldwide application


The team’s model could apply to tsunami-forming fault zones around the world, though the characteristics of the telltale acoustic signature might vary depending on the geology of the local environment. The crustal composition and orientation of faults off the coasts of Japan, Alaska, the Pacific Northwest and Chile differ greatly.

“The ideal situation would be to analyze lots of measurements from major events and eventually be able to say, ‘this is the signal,’” said Kozdon, who is now an assistant professor of applied mathematics at the Naval Postgraduate School. “Fortunately, these catastrophic earthquakes don’t happen frequently, but we can input these site-specific characteristics into computer models – such as those made possible with the CEES cluster – in the hopes of identifying acoustic signatures that indicate whether or not an earthquake has generated a large tsunami.”

Dunham and Kozdon pointed out that identifying a tsunami signature doesn’t complete the warning system. Underwater microphones called hydrophones would need to be deployed on the seafloor or on buoys to detect the signal, which would then need to be analyzed to confirm a threat, both of which could be costly. Policymakers would also need to work with scientists to settle on the degree of certainty needed before pulling the alarm.

If these points can be worked out, though, the technique could help provide precious minutes for an evacuation.

The study is detailed in the current issue of the Bulletin of the Seismological Society of America.


Supercomputer Unleashes Virtual 9.0 Megaquake in Pacific Northwest





Scientists used a supercomputer-driven ‘virtual earthquake’ to explore likely ground shaking in a magnitude 9.0 megathrust earthquake in the Pacific Northwest. Peak ground velocities are displayed in yellow and red. The legend represents speed in meters per second (m/s), with red equaling 2.3 m/s. Although the largest ground motions occur offshore near the fault and decrease eastward, sedimentary basins lying beneath some cities amplify the shaking in Seattle, Tacoma, Olympia, and Vancouver, increasing the risk of damage. – Credit: Kim Olsen, SDSU

On January 26, 1700, at about 9 p.m. local time, the Juan de Fuca plate beneath the ocean in the Pacific Northwest suddenly moved, slipping some 60 feet eastward beneath the North American plate in a monster quake of approximately magnitude 9, setting in motion large tsunamis that struck the coast of North America and traveled to the shores of Japan.



Since then, the earth beneath the region, which includes the cities of Vancouver, Seattle and Portland, has been relatively quiet. But scientists believe that earthquakes with magnitudes greater than 8, so-called “megathrust events,” occur along this fault on average every 400 to 500 years.



To help prepare for the next megathrust earthquake, a team of researchers led by seismologist Kim Olsen of San Diego State University (SDSU) used a supercomputer-powered “virtual earthquake” program to calculate for the first time realistic three-dimensional simulations that describe the possible impacts of megathrust quakes on the Pacific Northwest region. Also participating in the study were researchers from the San Diego Supercomputer Center at UC San Diego and the U.S. Geological Survey.



What the scientists learned from this simulation is not reassuring, as reported in the Journal of Seismology, particularly for residents of downtown Seattle.



With a rupture scenario beginning in the north and propagating toward the south along the 600-mile-long Cascadia Subduction Zone, the ground moved about 1 ½ feet per second in Seattle; nearly 6 inches per second in Tacoma, Olympia and Vancouver; and 3 inches per second in Portland, Oregon. Additional simulations, especially of earthquakes that begin in the southern part of the rupture zone, suggest that the ground motion under some conditions can be up to twice as large.



“We also found that these high ground velocities were accompanied by significant low-frequency shaking, like what you feel in a roller coaster, that lasted as long as five minutes – and that’s a long time,” said Olsen.



The long-duration shaking, combined with high ground velocities, raises the possibility that such an earthquake could inflict major damage on metropolitan areas, especially on high-rise buildings in downtown Seattle. Compounding the risks, Seattle, Tacoma, and Olympia, like Los Angeles to the south, sit on top of sediment-filled geological basins that are prone to greatly amplifying the waves generated by major earthquakes.



“One thing these studies will hopefully do is to raise awareness of the possibility of megathrust earthquakes happening at any given time in the Pacific Northwest,” said Olsen. “Because these events will tend to occur several hundred kilometers from major cities, the study also implies that the region could benefit from an early warning system that can allow time for protective actions before the brunt of the shaking starts.” Depending on how far the earthquake is from a city, early warning systems could give from a few seconds to a few tens of seconds to implement measures, such as automatically stopping trains and elevators.



Added Olsen, “The information from these simulations can also play a role in research into the hazards posed by large tsunamis, which can originate from such megathrust earthquakes like the ones generated in the 2004 Sumatra-Andaman earthquake in Indonesia.” One of the largest earthquakes ever recorded, the magnitude 9.2 Sumatra-Andaman event was felt as far away as Bangladesh, India, and Malaysia, and triggered devastating tsunamis that killed more than 200,000 people.



In addition to increasing scientific understanding of these massive earthquakes, the results of the simulations can also be used to guide emergency planners, to improve building codes, and help engineers design safer structures — potentially saving lives and property in this region of some 9 million people.


Even with the large supercomputing and data resources at SDSC, creating “virtual earthquakes” is a daunting task. The computations to prepare initial conditions were carried out on SDSC’s DataStar supercomputer, and the resulting information was then transferred for the main simulations to the center’s Blue Gene Data supercomputer via SDSC’s advanced virtual file system, GPFS-WAN, which makes data seamlessly available on different, sometimes distant, supercomputers.



Coordinating the simulations required a complex choreography of moving information into and out of the supercomputer as Olsen’s sophisticated “Anelastic Wave Model” simulation code was running. Completing just one of several simulations, running on 2,000 supercomputer processors, required some 80,000 processor-hours, equal to running one program continuously on a single PC for more than nine years.
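Those figures are easy to verify (a quick check of mine, not from the article):

```python
# Quick check of the quoted compute figures.
processor_hours = 80_000
processors = 2_000
print(f"wall-clock time per run: {processor_hours / processors:.0f} hours")  # 40 h
print(f"single-core equivalent:  {processor_hours / (24 * 365):.1f} years")  # ~9.1 y
```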



“To solve the new challenges that arise when researchers need to run their codes at the largest scales, and data sets grow to great size, we worked closely with the earthquake scientists through several years of code optimization and modifications,” said SDSC computational scientist Yifeng Cui, who contributed numerous refinements to allow the computer model to “scale up” to capture a magnitude 9 earthquake over such a vast area.



In order to run the simulations, the scientists must recreate in their model the components that encompass all the important aspects of the earthquake. One component is an accurate representation of the earth’s subsurface layering, and how its structure will bend, reflect, and change the size and direction of the traveling earthquake waves. Co-author William Stephenson of the USGS worked with Olsen and Andreas Geisselmeyer, from Ulm University in Germany, to create the first unified “velocity model” of the layering for this entire region, extending from British Columbia to Northern California.



Another component is a model of the earthquake source from the slipping of the Juan de Fuca plate underneath the North American plate. Making use of the extensive measurements of the massive 2004 Sumatra-Andaman earthquake in Indonesia, the scientists developed a model of the earthquake source for similar megathrust earthquakes in the Pacific Northwest.



The sheer physical size of the region in the study was also challenging. The scientists included in their virtual model an immense slab of the earth more than 650 miles long, 340 miles wide, and 30 miles deep (nearly 7 million cubic miles), and used a computer mesh spacing of 250 meters to divide the volume into some 2 billion cubes. This mesh size allows the simulations to model frequencies up to 0.5 Hertz, which especially affect tall buildings.
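The mesh figures hang together arithmetically, as the sketch below shows. The cell count follows directly from the quoted dimensions; the frequency limit uses the standard finite-difference rule f_max = v_min / (points per wavelength × spacing), where the minimum wave speed and the five-points-per-wavelength rule are typical assumptions of mine, not values stated in the article.

```python
# Sanity check of the quoted mesh figures.
MILE = 1609.34                     # metres per mile

lx, ly, lz = 650 * MILE, 340 * MILE, 30 * MILE
dx = 250.0                         # mesh spacing, m

cells = (lx / dx) * (ly / dx) * (lz / dx)
print(f"mesh cells: {cells:.2e}")  # ~1.8e9, i.e. 'some 2 billion'

# f_max = v_min / (points_per_wavelength * dx); both values are assumptions
v_min = 625.0                      # slowest shear-wave speed resolved, m/s
points_per_wavelength = 5
f_max = v_min / (points_per_wavelength * dx)
print(f"max resolvable frequency: {f_max:.2f} Hz")  # matches the quoted 0.5 Hz
```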



“One of the strengths of an earthquake simulation model is that it lets us run scenarios of different earthquakes to explore how they may affect ground motion,” said Olsen. Because the accumulated stresses or “slip deficit” can be released in either one large event or several smaller events, the scientists ran scenarios for earthquakes of different sizes.



“We found that the magnitude 9 scenarios generate peak ground velocities five to 10 times larger than those from the smaller magnitude 8.5 quakes.”



The researchers are planning to conduct additional simulations to explore the range of impacts that depend on where the earthquake starts, the direction of travel of the rupture along the fault, and other factors that can vary.



This research was supported by the National Science Foundation, the U.S. Geological Survey, the Southern California Earthquake Center, and computing time on an NSF supercomputer at SDSC.