Scientists discover evidence of super-fast deep earthquake

As scientists learn more about earthquakes that rupture at fault zones near the planet’s surface, and about the mechanisms that trigger them, an even more intriguing earthquake mystery lies deeper in the planet.

Scientists at Scripps Institution of Oceanography at UC San Diego have discovered the first evidence that deep earthquakes, those breaking at more than 400 kilometers (250 miles) below Earth’s surface, can rupture much faster than ordinary earthquakes. The finding gives seismologists new clues about the forces behind deep earthquakes as well as fast-breaking earthquakes that strike near the surface.

Seismologists have documented a handful of these events, in which an earthquake’s rupture travels faster than the shear waves of seismic energy that it radiates. These “supershear” earthquakes have rupture speeds of four kilometers per second (an astonishing 9,000 miles per hour) or more.
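As a back-of-envelope check on the units quoted above, a rupture front moving at four kilometers per second does indeed cover roughly nine thousand miles in an hour:

```python
# Unit check for the supershear rupture speeds quoted in the text.
KM_PER_MILE = 1.609344

def km_s_to_mph(v_km_s):
    """Convert a speed in km/s to miles per hour."""
    return v_km_s * 3600.0 / KM_PER_MILE

print(round(km_s_to_mph(4.0)))  # ~8948 mph, i.e. roughly 9,000 mph
print(round(km_s_to_mph(8.0)))  # the deep aftershock's ~8 km/s rupture
```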

In a National Science Foundation-funded study reported in the June 11, 2014, issue of the journal Science, Scripps geophysicists Zhongwen Zhan and Peter Shearer, along with their colleagues at Caltech, discovered the first deep supershear earthquake while examining the aftershocks of a magnitude 8.3 earthquake on May 24, 2013, in the Sea of Okhotsk off the Russian mainland.

Details of a magnitude 6.7 aftershock of the event captured Zhan’s attention. Analyzing data from the IRIS (Incorporated Research Institutions for Seismology) consortium, which coordinates a global network of seismological instruments, Zhan noted that most seismometers around the world yielded similar records, all suggesting an anomalously short duration for a magnitude 6.7 earthquake.

Data from one seismometer, however, stationed closest to the event in Russia’s Kamchatka Peninsula, told a different story with intriguing details.

After closely analyzing the data, Zhan found not only that the aftershock ruptured extremely deep, at 640 kilometers (400 miles) below the earth’s surface, but also that its rupture velocity was extraordinary: about eight kilometers per second (five miles per second), nearly 50 percent faster than the shear wave velocity at that depth.

“For a 6.7 earthquake you would expect a duration of seven to eight seconds, but this one lasted just two seconds,” said Shearer, a geophysics professor in the Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics (IGPP) at Scripps. “This is the first definitive example of supershear rupture for a deep earthquake since previously supershear ruptures have been documented only for shallow earthquakes.”

“This finding will help us understand why deep earthquakes happen,” said Zhan. “One quarter of earthquakes occur at large depths, and some of these can be pretty big, but we still don’t understand why they happen. So this earthquake provides a new observation for deep earthquakes and high-rupture speeds.”

Zhan also believes the new information will be useful in examining ultra-fast earthquakes and their potential to affect fault zones near the earth’s surface. Although not of supershear caliber, California’s destructive 1994 Northridge earthquake had a size and geometry comparable to those of the magnitude 6.7 Sea of Okhotsk aftershock.

“If a shallow earthquake such as Northridge goes supershear, it could cause even more shaking and possibly more damage,” said Zhan.

Great earthquakes, water under pressure, high risk

The largest earthquakes occur where oceanic plates move beneath continents. Evidently, water trapped at the boundary between the two plates has a dominant influence on the earthquake rupture process. Analyzing the great Chile earthquake of February 27, 2010, a group of scientists from the GFZ German Research Centre for Geosciences and from Liverpool University found that the water pressure in the pores of the rocks making up the plate boundary zone plays the key role (Nature Geoscience, March 28, 2014).

The stress build-up before an earthquake and the magnitude of subsequent seismic energy release are substantially controlled by the mechanical coupling between both plates. Studies of recent great earthquakes have revealed that the lateral extent of the rupture and magnitude of these events are fundamentally controlled by the stress build-up along the subduction plate interface. Stress build-up and its lateral distribution in turn are dependent on the distribution and pressure of fluids along the plate interface.

“We combined observations of several geoscience disciplines – geodesy, seismology, petrology. In addition, we have a unique opportunity in Chile: our natural observatory there provides us with long time series of data,” says Onno Oncken, director of the GFZ Department “Geodynamics and Geomaterials”. Earth observation (geodesy) using GPS technology and radar interferometry today allows detailed mapping of mechanical coupling at the plate boundary from the Earth’s surface. A complementary image of the rock properties at depth is provided by seismology. Earthquake data yield a high-resolution three-dimensional image of seismic wave speeds and their variations in the plate interface region. Data on fluid pressure and rock properties, on the other hand, are available from laboratory measurements. All these data had been acquired shortly before the great Chile earthquake of February 2010 struck with a magnitude of 8.8.

“For the first time, our results allow us to map the spatial distribution of the fluid pressure with unprecedented resolution, showing how it controls mechanical locking and subsequent seismic energy release,” explains Professor Oncken. “Zones of changed seismic wave speeds reflect zones of reduced mechanical coupling between plates.” This state supports creep along the plate interface. High mechanical locking, in turn, is promoted in domains of lower pore fluid pressure. It is these locked domains that subsequently ruptured during the Chile earthquake, releasing most of the seismic energy and causing destruction at the Earth’s surface and tsunami waves. The authors suggest that the spatial variations in pore fluid pressure are related to oceanic water accumulated in an altered oceanic fracture zone within the Pacific oceanic plate. Upon subduction of the latter beneath South America, the fluid volumes are released and trapped along the overlying plate interface, leading to increasing pore fluid pressures. This study provides a powerful tool to monitor the physical state of a plate interface and to forecast its seismic potential.

Longmenshan fault zone still hazardous, suggest new reports

The 60-kilometer segment of the fault northeast of the 2013 Lushan rupture is the place in the region to watch for the next major earthquake, according to research published in Seismological Research Letters (SRL). Research papers published in this special section of SRL suggest the 2008 Wenchuan earthquake triggered the magnitude 6.6 Lushan quake.

Guest edited by Huajian Yao, professor of geophysics at the University of Science and Technology of China, the special section includes eight articles that present current data, description and preliminary analysis of the Lushan event and discuss the potential of future earthquakes in the region.

More than 87,000 people were killed or went missing as a result of the 2008 magnitude 7.9 Wenchuan earthquake in China’s Sichuan province, the largest quake to hit China since 1950. In 2013, the Lushan quake occurred ~90 km to the south and caused 203 deaths, injured 11,492 and affected more than 1.5 million people.

“After the 2008 magnitude 7.9 Wenchuan earthquake along the Longmenshan fault zone in western Sichuan of China, researchers in China and elsewhere have paid particular attention to this region, seeking to understand how the seismic hazard potential changed in the southern segment of the fault and nearby faults,” said Yao. “Yet the occurrence of this magnitude 6.6 Lushan event surprised many. The challenge of understanding where and when the next big quake will occur after a devastating seismic event continues after this Lushan event, although we now have gained much more information about this area.”

Preliminary rupture details

The southern part of the Longmenshan fault zone is complex and still only moderately understood. Like the central segment, where the 2008 Wenchuan event occurred, the southern segment, which generated the Lushan rupture, includes the Wenchuan-Maoxian, Beichuan-Yingxiu, Pengxian-Guanxian and Dayi faults, a series of sub-parallel secondary faults.

Although the Lushan earthquake’s mainshock did not break to the surface, the strong shaking still caused significant damage and casualties in the epicentral region. Three papers detail the rupture process of the Lushan quake.

Libo Han from the China Earthquake Administration and colleagues provide a preliminary analysis of the Lushan mainshock and two large aftershocks, which appear to have occurred in the upper crust and terminated at a depth of approximately 8 km. While the Lushan earthquake cannot be associated with any identified surface faults, Han and colleagues suggest the quake may have occurred on a blind thrust fault subparallel to the Dayi fault, which lies at and partly defines the edge of the Chengdu basin.

Based on observations from extensive trenching and mapping of fault activity after both the Wenchuan and Lushan earthquakes, Chen Lichun and colleagues from the China Earthquake Administration suggest the Lushan quake spread in a “piggyback fashion” toward the Sichuan basin, but with weaker activity and lower seismogenic potential than the Wenchuan quake.

Finally, Junju Xie, from the China Earthquake Administration and Beijing University of Technology, and colleagues examined the vertical and horizontal near-source strong motion from the Mw 6.8 Lushan earthquake. The vertical ground motion is relatively weak for this event, likely because the seismic energy dissipated at a depth of 12-25 km and the rupture did not break through the ground surface.

Possible link between Lushan and Wenchuan earthquakes

Were the Lushan and Wenchuan earthquakes related? And if so, what is the relationship? Some researchers consider the Lushan quake to be a strong aftershock of the Wenchuan quake, while others see them as independent events. In this special section, researchers tackled the question from various perspectives.

To discover whether the Lushan earthquake was truly independent from the Wenchuan quake, researchers need to have an accurate picture of where the Lushan quake originated. Yong Zhang from the GFZ German Research Centre for Geosciences and the China Earthquake Administration and colleagues begin this process by confirming a new hypocenter for Lushan. To find this place where the fault first began to rupture, the researchers analyze near-fault strong-motion data (movements recorded up to a few tens of kilometers from the fault) as well as long-distance (thousands of kilometers) teleseismic data.

Using their newly calculated location for the hypocenter, Zhang and colleagues now agree with earlier studies suggesting the initial Lushan rupture was a circular rupture with no predominant direction. But they note that their calculations place the major slip area of the Lushan quake about 40 to 50 kilometers from the southwest end of the Wenchuan quake fault. This “gap” between the two faults may hold increased seismic hazards, caution Zhang and colleagues.

Ke Jia of Beijing University and colleagues explore the relationship of the two quakes with a statistical analysis of aftershocks in the region as well as the evolution of shear stress in the lower crust and upper mantle in the broader quake region. Their analyses suggest that the Wenchuan quake did affect the Lushan quake in an immediate sense by changing the overall background seismicity in the region. If these changes in background seismicity are taken into account, the researchers calculate a 62 percent probability that Lushan is a strong aftershock of Wenchuan.

Similarly, Yanzhao Wang from the China Earthquake Administration and colleagues quantified the stress loading of area faults due to the Wenchuan quake and suggest the change in stress may have caused the Lushan quake to rupture approximately 28.4 to 59.3 years earlier than expected. They conclude that the Lushan earthquake is at least 85 percent a delayed aftershock of the Wenchuan earthquake, rather than solely the result of long-term tectonic loading.

After the Wenchuan quake, researchers immediately began calculating stress changes on the major faults surrounding the rupture zone, in part to identify where dangerous aftershocks might occur and to test how well these stress change calculations might work to predict new earthquakes. As part of these analyses, Tom Parsons of the U.S. Geological Survey and Margarita Segou of GeoAzur compared data collected from the Wenchuan and Lushan quakes with data on aftershocks and stress change in four other major earthquakes, including the M 7.4 Landers and Izmit quakes in California and Turkey, respectively, and the M 7.9 Denali quake in Alaska and the M 7.1 Canterbury quake in New Zealand.

Their comparisons reveal that strong aftershocks similar to Lushan are likely to occur where there is highest overall aftershock activity, where stress change is the greatest and on well-developed fault zones. But they also note that by these criteria, the Lushan quake would only have been predicted by stress changes, and not the clustering of aftershocks following the 2008 Wenchuan event.

Future earthquakes in this region

After Wenchuan and Lushan, where should seismologists and others look for the next big quake in the region? After the 2008 Wenchuan quake, seismologists were primed with data to help predict where and when the next rupture might be in the region. The data suggested that the Wenchuan event would increase seismic stress in the southern Longmenshan fault that was the site of the 2013 Lushan quake. But that information alone could not predict that the southern Longmenshan fault would be the next to rupture after Wenchuan, say Mian Liu of the University of Missouri and colleagues, because the Wenchuan earthquake also increased the stress on numerous other faults in the region.

Additional insights can be gained from seismic moment studies, according to Liu and colleagues. Moment balancing compares how much seismic strain energy is accumulated along a fault over a certain period with the amount of strain energy released over the same period. In the case of the Longmenshan fault, there had been a slow accumulation of strain energy without release by a major seismic event for more than a millennium. After the Wenchuan quake, the southern part of the Longmenshan fault became the fault with the greatest potential for a quake. And now, after Lushan, Liu and colleagues say that the 60-kilometer-long segment of the fault northeast of the Lushan rupture is the place in the region to watch for the next major earthquake.

Quake-triggered landslides pose significant hazard for Seattle, new study details potential damage

Locations of each zoom-in are shown on the map of Seattle at right. A) Coastal bluffs in the northern part of Seattle are most affected when soils are saturated. B) There are several areas along the I-5 corridor that are highly susceptible to landsliding for all soil saturation levels, such as the area shown here near the access point to the West Seattle bridge. C) The hillsides in West Seattle along the Duwamish valley are at risk of seismically induced landsliding, such as the area shown here. There are industrial as well as 59 residential buildings that could be affected by runout from landsliding in these areas. D) The coastal bluffs along Puget Sound in West Seattle on the hanging wall of the fault, such as the area shown here, are the most highly susceptible areas to landsliding in the city; numerous residential structures are at risk from both potential landslide source areas and runout. -  Allstadt/BSSA

A new study suggests the next big quake on the Seattle fault may cause devastating damage from landslides, greater than previously thought and beyond the areas currently defined as prone to landslides. Published online Oct. 22 by the Bulletin of the Seismological Society of America (BSSA), the research offers a framework for simulating hundreds of earthquake scenarios for the Seattle area.

“A major quake along the Seattle fault is among the worst-case scenarios for the area since the fault runs just south of downtown. Our study shows the need for dedicated studies on seismically induced landsliding,” said co-author Kate Allstadt, doctoral student at the University of Washington.

Seattle is prone to strong shaking as it sits atop the Seattle Basin – a deep sedimentary basin that amplifies ground motion and generates strong seismic waves that tend to increase the duration of the shaking. The broader region is vulnerable to earthquakes from multiple sources, including deep earthquakes within the subducted Juan de Fuca plate, offshore megathrust earthquakes on the Cascadia subduction zone and shallow crustal earthquakes within the North American Plate.

For Seattle, a shallow crustal earthquake close to the city would be most damaging. The last major quake along the Seattle fault was in 900 AD, long before the city was established, though native people lived in the area. The earthquake triggered giant landslides along Lake Washington, causing entire blocks of forest to slide into the lake.

“There’s a kind of haunting precedent that tells us we should pay attention to a large earthquake on this fault, because it happened in the past,” said Allstadt, who also serves as duty seismologist for the Pacific Northwest Seismic Network. John Vidale of the University of Washington and Art Frankel of the U.S. Geological Survey (USGS) are co-authors of the study, which was funded by the USGS.

While landslides triggered by earthquakes have caused damage and casualties worldwide, they have not often been the subject of extensive quantitative study or fully incorporated into seismic hazard assessments, say the authors of this study, which looks at just one scenario among potentially hundreds for a major earthquake in the Seattle area.

Dividing the area into a grid of 210-meter cells, they simulated ground motion for a magnitude 7 Seattle fault earthquake and then further subdivided into 5-meter cells, applying anticipated amplification of shaking due to the shallow soil layers. This refined framework yielded some surprises.
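The two-scale scheme described above can be pictured as a coarse ground-motion grid refined onto a finer soil-amplification grid. A minimal sketch, assuming invented ground-motion and amplification values (only the 210 m and 5 m cell sizes come from the study):

```python
import numpy as np

# Coarse cells: simulated peak ground motion on a 210 m grid (toy values, in g).
coarse_pga = np.array([[0.30, 0.35],
                       [0.28, 0.40]])

# Each 210 m cell contains 42 x 42 cells of 5 m (210 / 5 = 42).
REFINE = 210 // 5
fine_pga = np.kron(coarse_pga, np.ones((REFINE, REFINE)))

# Site amplification from shallow soil layers, defined on the 5 m grid.
# Hypothetical random factors; the real ones come from soil maps.
rng = np.random.default_rng(0)
amplification = rng.uniform(0.8, 2.0, size=fine_pga.shape)

shaking = fine_pga * amplification
print(shaking.shape)  # 5 m resolution over the whole coarse grid
```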

“One-third of the landslides triggered by our simulation were outside of the areas designated by the city as prone to landsliding,” said Allstadt. “A lot of people assume that all landslides occur in the same areas, but those triggered by rainfall or human behavior have a different triggering mechanism than landslides caused by earthquakes so we need dedicated studies.”

While soil saturation – whether the soil is dry or saturated with water – is the most important factor when analyzing the potential impact of landslides, the details of ground motion rank second. The amplification of ground shaking, the directivity of seismic energy and geological features that may affect ground motion are very important to the outcome of ground failure, say the authors.

The authors stress that this is just one randomized scenario study of many potential earthquake scenarios that could strike the city. While the results do not delineate the exact areas that will be affected in a future earthquake, they do illustrate the extent of landsliding to expect for a similar event.

The study suggests the southern half of the city and the coastal bluffs, many of which are developed, would be hardest hit. Depending upon the water saturation level of the soil at the time of the earthquake, several hundred to thousands of buildings could be affected citywide. For dry soil conditions, more than 1,000 buildings lie within all hazard zones, 400 of those in the two highest hazard designation zones. The analysis suggests landslides could also affect some inland slopes, threatening key transit routes and impeding recovery efforts. For saturated soil conditions, it is an order of magnitude worse, with 8,000 buildings within all hazard zones, 5,000 of those within the two highest hazard zones. These numbers reflect only the number of buildings in high-risk areas, not the number of buildings that would necessarily suffer damage.

“The extra time we took to include the refined ground motion detail was worth it. It made a significant difference to our understanding of the potential damage to Seattle from seismically triggered landslides,” said Allstadt, who would like to use the new framework to run many more scenarios to prepare for future earthquakes in Seattle.

3-D Earth model developed at Sandia Labs more accurately pinpoints source of earthquakes, explosions

Sandia National Laboratories researcher Sandy Ballard and colleagues from Sandia and Los Alamos National Laboratory have developed SALSA3D, a 3-D model of the Earth's mantle and crust designed to help pinpoint the location of all types of explosions. -  Photo by Randy Montoya, Sandia National Laboratories

During the Cold War, U.S. and international monitoring agencies could spot nuclear tests and focused on measuring their size. Today, they’re looking around the globe to pinpoint much smaller explosive tests.

Under the sponsorship of the National Nuclear Security Administration’s Office of Defense Nuclear Nonproliferation R&D, Sandia National Laboratories and Los Alamos National Laboratory have partnered to develop a 3-D model of the Earth’s mantle and crust called SALSA3D, or Sandia-Los Alamos 3D. The purpose of this model is to help the US Air Force and the international Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna, Austria, more accurately locate all types of explosions.

The model uses a scalable triangular tessellation and seismic tomography to map the Earth’s “compressional wave seismic velocity,” a property of the rocks and other materials inside the Earth that indicates how quickly compressional waves travel through them and is one way to accurately locate seismic events, Sandia geophysicist Sandy Ballard said. Compressional waves – measured first after seismic events – move the particles in rocks and other materials minute distances backward and forward between the location of the event and the station detecting it.

SALSA3D also reduces the uncertainty in the model’s predictions, an important feature for decision-makers who must take action when suspicious activity is detected, he added.

“When you have an earthquake or nuclear explosion, not only do you need to know where it happened, but also how well you know that. That’s a difficult problem for these big 3-D models. It’s mainly a computational problem,” Ballard said. “The math is not so tough, just getting it done is hard, and we’ve accomplished that.”

A Sandia team has been writing and refining code for the model since 2007 and is now demonstrating that SALSA3D is more accurate than current models.

In recent tests, SALSA3D was able to predict the source of seismic events over a geographical area that was 26 percent smaller than the traditional one-dimensional model and 9 percent smaller than a recently developed Regional Seismic Travel Time (RSTT) model used with the one-dimensional model.

GeoTess software release

Sandia recently released SALSA3D’s framework – the triangular tessellated grid on which the model is built – to other Earth scientists, seismologists and the public. By standardizing the framework, the seismological research community can more easily share models of the Earth’s structure, and global monitoring agencies can better test different models. Both activities are hampered by the plethora of models available today, Ballard said.

“GeoTess makes models compatible and standardizes everything,” he said. “This would really facilitate sharing of different models, if everyone agreed on it.”

Seismologists and researchers worldwide can now download GeoTess, which provides a common model parameterization for multidimensional Earth models and a software support system that addresses the construction, population, storage and interrogation of model data. GeoTess is not specific to any particular data set, so users have considerable flexibility in how they store information in the model. The free package, including source code, is released under the permissive BSD open-source license. The code is available in Java and C++, with interfaces to the C++ version written in C and Fortran90. GeoTess has been tested on multiple platforms, including Linux, SunOS, Mac OS X and Windows, and is available for download.

When an explosion goes off, the energy travels through the Earth as waves that are picked up by seismometers at U.S. and international ground monitoring stations associated with nuclear explosion monitoring organizations worldwide. Scientists use these signals to determine the location.

They first predict the time taken for the waves to travel from their source through the Earth to each station. To calculate that, they have to know the seismic velocity of the Earth’s materials from the crust to the inner core, Ballard said.

“If you have material that has very high seismic velocity, the waves travel very quickly, but the energy travels less quickly through other kinds of materials, so it takes the signals longer to travel from the source to the receiver,” he said.

For the past 100 years, seismologists have predicted the travel time of seismic energy from source to receiver using one-dimensional models. These models, which are still widely used today, account only for radial variations in seismic velocity and ignore variations in geographic directions. They yield seismic event locations that are reasonably accurate, but not nearly as precise as locations calculated with high fidelity 3-D models.
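In a one-dimensional model, the predicted travel time of a vertically propagating wave is simply the sum over layers of thickness divided by velocity. A deliberately simplified sketch with invented layer values (real predictions trace curved ray paths through a standard radial model such as ak135):

```python
# Travel time through a stack of 1-D layers for a vertically
# propagating wave: t = sum(thickness_i / velocity_i).
# Layer values are illustrative only, not a real Earth model.
layers = [
    # (thickness in km, P-wave velocity in km/s)
    (35.0, 6.0),     # crust
    (375.0, 8.5),    # upper mantle
    (2480.0, 12.0),  # lower mantle
]

travel_time = sum(thickness / velocity for thickness, velocity in layers)
print(f"{travel_time:.1f} s")
```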

Modern 3-D models of the Earth, like SALSA3D, account for distortions of the seismic wavefronts caused by minor lateral differences in the properties of rocks and other materials.

For example, waves are distorted when they move through a geological feature called a subduction zone, such as the one beneath the west coast of South America where one tectonic plate under the Pacific Ocean is diving underneath the Andes Mountains. This happens at about the rate at which fingernails grow, but, geologically speaking, that’s fast, Ballard said.

One-dimensional models, like the widely used ak135 developed in the 1990s, are good at predicting the travel time of waves when the distance from the source to the receiver is large because these waves spend most of their time traveling through the deepest, most homogeneous parts of the Earth. They don’t do so well at predicting travel time to nearby events where the waves spend most of their time in the Earth’s crust or the shallowest parts of the mantle, both of which contain a larger variety of materials than the lower mantle and the Earth’s core.

RSTT, a previous model developed jointly by Sandia, Los Alamos and Lawrence Livermore national laboratories, tried to solve that problem and works best at ranges of about 60-1,200 miles (100-2,000 kilometers).

Still, “the biggest errors we get are close to the surface of the Earth. That’s where the most variability in materials is,” Ballard said.

Seismic tomography gives SALSA3D accuracy

Today, Earth scientists are mapping three dimensions: the radius, latitude and longitude.

Anyone who’s studied a globe or world atlas knows that the traditional grid of longitude and latitude lines works all right close to the equator, but at the poles the lines crowd too close together. For nuclear explosion monitoring, Earth models must accurately characterize the polar regions even though they are remote, because seismic waves travel under them, Ballard said.
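The crowding of longitude lines toward the poles is easy to quantify: the east-west width of a one-degree grid cell shrinks with the cosine of latitude, a quick illustration of why a uniform triangular grid is preferable:

```python
import math

R_EARTH_KM = 6371.0  # mean Earth radius

def cell_width_km(lat_deg, dlon_deg=1.0):
    """East-west width of a grid cell dlon_deg degrees wide at a given latitude."""
    return math.radians(dlon_deg) * R_EARTH_KM * math.cos(math.radians(lat_deg))

for lat in (0, 45, 80, 89):
    # A 1-degree cell spans ~111 km at the equator but under 2 km near the pole.
    print(f"{lat:2d} deg latitude: {cell_width_km(lat):6.1f} km")
```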

Triangular tessellation solves that with nodes, or intersections of the triangles, that can be accurately modeled even at the poles. The triangles can be smaller where more detail is needed and larger in areas that require less detail, like the oceans. Plus the model extends into the Earth like columns of stacked pieces of pie without the rounded crust edges.

The way Sandia calculates the seismic velocities uses the same math that is used to detect a tumor in an MRI, except on a global, rather than a human, scale.

Sandia uses historical data from 118,000 earthquakes and 13,000 current and former monitoring stations worldwide, collected in Los Alamos National Laboratory’s Ground Truth catalog.

“We apply a process called seismic tomography where we take millions of observed travel times and invert them for the seismic velocities that would create that data set. It’s mathematically similar to doing linear regression, but on steroids,” Ballard said. Linear regression is a simple mathematical way to model the relationship between a dependent variable and one or more explanatory variables. Because the Sandia team models hundreds of thousands of unknown variables, they apply a mathematical method called least squares to minimize the discrepancies between the data from previous seismic events and the model’s predictions.
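In miniature, the tomographic inversion is a linear least-squares problem: each observed travel time is the dot product of the ray’s path length in each cell with the unknown cell slownesses (1/velocity). A toy sketch with three cells and four rays, all numbers invented (real problems have hundreds of thousands of unknowns and require regularization):

```python
import numpy as np

# Rows: rays. Columns: path length (km) each ray spends in each cell.
G = np.array([
    [10.0, 0.0, 0.0],
    [ 5.0, 5.0, 0.0],
    [ 0.0, 8.0, 2.0],
    [ 3.0, 3.0, 4.0],
])

# "True" slownesses (s/km) used to synthesize the observed travel times.
true_slowness = np.array([1 / 6.0, 1 / 8.0, 1 / 10.0])
t_obs = G @ true_slowness

# Invert the travel times for slowness by least squares, as in tomography.
slowness_est, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
velocity_est = 1.0 / slowness_est
print(np.round(velocity_est, 2))  # recovers ~[6, 8, 10] km/s
```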

With 10 million data points, Sandia uses a distributed computer network with about 400 core processors to characterize the seismic velocity at every node.

Monitoring agencies could use SALSA3D to precompute the travel time from each station in their network to every point on Earth. When a new seismic event must be located in real time, source-to-receiver travel times can then be retrieved in about a millisecond, allowing the energy’s source to be pinpointed in about a second, he said.
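The precomputation idea can be sketched as lookup plus grid search: store travel times from every station to every candidate grid point, then locate a new event by finding the grid point whose predicted times best match the observations. A toy one-dimensional version with a uniform velocity and invented geometry:

```python
import numpy as np

VELOCITY = 6.0                             # km/s, uniform toy medium
stations = np.array([0.0, 40.0, 100.0])    # station positions (km)
grid = np.linspace(0.0, 120.0, 1201)       # candidate source points, 0.1 km apart

# Precompute: travel time from every grid point to every station.
tt_table = np.abs(grid[:, None] - stations[None, :]) / VELOCITY

# A new event at 62.5 km: observed arrival times (origin time zero).
true_x = 62.5
t_obs = np.abs(true_x - stations) / VELOCITY

# Locate: pick the grid point minimizing the misfit to the observed times.
misfit = np.sum((tt_table - t_obs) ** 2, axis=1)
best = grid[np.argmin(misfit)]
print(round(best, 3))  # recovers the 62.5 km source
```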

Uncertainty modeling a SALSA3D feature

But no model is perfect, so Sandia has developed a way to measure the uncertainty in each prediction SALSA3D makes, based on uncertainty in the velocity at each node and how that uncertainty affects the travel time prediction of each wave from a seismic event to each monitoring station.
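A standard way to propagate such uncertainty is linear error propagation: if the node velocity errors are independent, the travel-time variance is the sum of squared sensitivities times the node variances. A hedged sketch with invented numbers (the actual SALSA3D treatment is more sophisticated and also handles correlated errors):

```python
import numpy as np

# Per-node values along one ray path (all numbers invented).
path_len = np.array([12.0, 30.0, 55.0])   # km spent near each node
velocity = np.array([6.0, 8.0, 11.0])     # km/s at each node
sigma_v  = np.array([0.10, 0.15, 0.20])   # 1-sigma velocity uncertainty (km/s)

# Travel time t = sum(L_i / v_i); sensitivity dt/dv_i = -L_i / v_i**2.
t_pred = np.sum(path_len / velocity)
sens = -path_len / velocity**2

# Independent-error propagation: var(t) = sum((dt/dv_i * sigma_v_i)**2).
sigma_t = np.sqrt(np.sum((sens * sigma_v) ** 2))
print(f"t = {t_pred:.2f} s, sigma_t = {sigma_t:.3f} s")
```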

SALSA3D estimates for the users at monitoring stations the most likely location of a seismic event and the amount of uncertainty in the answer to help inform their decisions.

International test ban treaties require that on-site inspections can only occur within a 1,000-square-kilometer (385-square-mile) area surrounding a suspected nuclear test site. Today, 3-D Earth models like SALSA3D are helping to meet and sometimes significantly exceed this threshold in most parts of the world.

“It’s extremely difficult to do because the problem is so large,” Ballard said. “But we’ve got to know it within 1,000 square kilometers or they might search in the wrong place.”

Devastating long-distance impact of earthquakes

In 2006 the island of Java, Indonesia, was struck by a devastating earthquake, followed by the onset of a mud eruption to the east that flooded villages over several square kilometers and continues to erupt today. Until now, researchers believed the earthquake was too far from the mud volcano to have triggered the eruption. Geophysicists at the University of Bonn, Germany, and ETH Zurich, Switzerland, have used computer-based simulations to show that such triggering is possible over long distances. The results have been published in “Nature Geoscience.”

On May 27, 2006, the ground of the Indonesian island of Java was shaken by a magnitude 6.3 earthquake. The epicenter lay 25 km southwest of the city of Yogyakarta, and the rupture initiated at a depth of 12 km. The earthquake took thousands of lives, injured ten thousand people, and destroyed buildings and homes. 47 hours later, about 250 km from the earthquake hypocenter, a mud volcano formed that came to be known as “Lusi”, short for “Lumpur Sidoarjo”. Hot mud erupted in the vicinity of an oil drilling well, shooting mud up to 50 m into the sky and flooding the area. Scientists expect the mud volcano to remain active for many more years.

Eruption of mud volcano has natural cause

Was the eruption of the mud triggered by natural events, or was it man-made, caused by the nearby exploration well? Geophysicists at the University of Bonn, Germany, and at ETH Zürich, Switzerland, investigated this question with numerical wave-propagation experiments. “Many researchers believed that the earthquake epicenter was too far from Lusi to have activated the mud volcano,” says Prof. Dr. Stephen A. Miller from the department of Geodynamics at the University of Bonn. However, using computer simulations that include the geological features of the Lusi subsurface, Miller’s team concluded that the earthquake was the trigger, despite the long distance.

The overpressured solid mud layer was trapped between layers with different acoustic properties, and this system was shaken by the earthquake and its aftershocks like a bottle of champagne. The key, however, is the reflections provided by the dome-shaped geology underneath Lusi, which focused the seismic waves of the earthquakes like an echo inside a cave. Prof. Stephen Miller explains: “Our simulations show that the dome-shaped structure with different properties focused seismic energy into the mud layer and could very well have liquefied the mud that then injected into nearby faults.”

Previous studies underestimated the energy of the seismic waves because they considered ground motion only at the surface; the geophysicists at the University of Bonn suspect that the motions at the surface were much less intense than those at depth. The dome-like structure “kept” the seismic waves at depth and damped those that reached the surface. “This was actually a lower estimate of the focusing effect, because only one wave cycle was input. This effect increases with each wave cycle because of the reducing acoustic impedance of the pressurizing mud layer.” In response to claims that the reported highest-velocity layer used in the modeling is a measurement artifact, Miller says “that does not change our conclusions, because this effect will occur whenever a layer of low acoustic impedance is sandwiched between high-impedance layers, irrespective of the exact values of the impedances. And the source of the Lusi mud was the inside of the sandwich.”
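The impedance-sandwich argument can be illustrated with the standard normal-incidence reflection coefficient; the impedance values below are invented for illustration and are not the measured Lusi values:

```python
# Why a low-impedance layer between high-impedance layers traps energy:
# at normal incidence, the amplitude reflection coefficient at an interface
# between media of acoustic impedance Z1 (incident side) and Z2 is
#     R = (Z2 - Z1) / (Z2 + Z1).
def reflection_coeff(z1, z2):
    return (z2 - z1) / (z2 + z1)

Z_hard = 12.0e6  # kg/(m^2*s), stiff bounding layers (illustrative values)
Z_mud = 3.0e6    # overpressured mud layer (illustrative values)

# A wave inside the mud layer hitting either boundary is strongly reflected
# back in, so energy bounces between the boundaries instead of escaping --
# and the trapping holds for any Z_mud well below Z_hard, whatever the
# exact impedances, as Miller argues.
R = reflection_coeff(Z_mud, Z_hard)
print(R)  # 0.6
```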

It has already been proposed that a tectonic fault connects Lusi to a volcanic system 15 km away. “This connection probably supplies the mud volcano with heat and fluids that keep Lusi erupting actively up to today,” explains Miller.

With their publication, the scientists from Bonn and Zürich point out that earthquakes can trigger processes over long distances, and that this focusing effect may apply to other hydrothermal and volcanic systems. Stephen Miller concludes: “Being a geological rarity, the mud volcano may contribute to a better understanding of triggering processes and relationships between seismic and volcanic activity.” Miller adds: “maybe this work will settle the long-standing controversy and focus instead on helping those affected.” The island of Java is part of the so-called Pacific Ring of Fire, a volcanic belt that surrounds the entire Pacific Ocean. Here, oceanic crust is subducted beneath oceanic and continental tectonic plates, leading to melting of crustal material at depth. The resulting magma rises and feeds numerous volcanoes.

Everything you always wanted to know about exploring Earth with seismology

A new addition to The Geological Society of America’s Memoir series, this comprehensive volume presents the worldwide history (1850 to 2005) of seismological studies of Earth’s crust. Authors Claus Prodehl of Universität Karlsruhe, Germany, and Walter D. Mooney of the U.S. Geological Survey have achieved the Herculean task of compiling into this one volume the results of all major field projects, on land and at sea, that have used man-made seismic energy sources to explore Earth’s crust.

The volume opens with a general synthesis of all major seismic projects on land, as well as the most important oceanic projects, for the period 1850 to 1939, with more detailed coverage from 1940 onward. After this initial overview, the history and results are subdivided into a separate chapter for each decade, with the material ordered by geographical region. Each chapter highlights the major advances achieved during that decade in data acquisition, processing technology, and interpretation methods.

For all major seismic projects, Prodehl and Mooney provide specific details regarding the field observations, the interpreted crustal cross section, and key references. They conclude the memoir with global- and continental-scale maps of all field measurements and interpreted Moho contours. An accompanying DVD contains important out-of-print publications and an extensive collection of controlled-source data, location maps, and crustal cross sections.