Volcanoes, including Mt. Hood, can go from dormant to active quickly

Mount Hood, in the Oregon Cascades, doesn’t have a highly explosive history. – Photo courtesy Alison M Koleszar

A new study suggests that the magma sitting 4-5 kilometers beneath the surface of Oregon’s Mount Hood has been stored in near-solid conditions for thousands of years, but that the time it takes to liquefy and potentially erupt is surprisingly short – perhaps as little as a couple of months.

The key, scientists say, is to elevate the temperature of the rock to more than 750 degrees Celsius, which can happen when hot magma from deep within the Earth’s crust rises to the surface. It is the mixing of the two types of magma that triggered Mount Hood’s last two eruptions – about 220 and 1,500 years ago, said Adam Kent, an Oregon State University geologist and co-author of the study.

Results of the research, which was funded by the National Science Foundation, were published this week in the journal Nature.

“If the temperature of the rock is too cold, the magma is like peanut butter in a refrigerator,” Kent said. “It just isn’t very mobile. For Mount Hood, the threshold seems to be about 750 degrees (C) – if it warms up just 50 to 75 degrees above that, it greatly decreases the viscosity of the magma and makes it easier to mobilize.”
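
That threshold amounts to a simple decision rule. Here is a minimal sketch: the ~750 C cutoff and the 50-to-75-degree remobilization window come from the article, while the three-state classification is a simplification for illustration, not a model from the study.

```python
# Illustrative sketch only: classify stored magma by temperature using the
# approximate threshold Kent describes for Mount Hood. The cutoffs come from
# the article; the three labels are a simplification for illustration.

MOBILITY_THRESHOLD_C = 750.0  # approximate mobility threshold cited for Mount Hood

def magma_state(temperature_c: float) -> str:
    """Classify magma storage state from temperature, per the article's analogy."""
    if temperature_c < MOBILITY_THRESHOLD_C:
        return "cold storage (near-solid, like refrigerated peanut butter)"
    if temperature_c < MOBILITY_THRESHOLD_C + 75.0:
        return "remobilizing (viscosity dropping, eruption becomes possible)"
    return "mobile (fluid enough to mix and ascend)"

for t in (700, 760, 850):
    print(f"{t} C: {magma_state(t)}")
```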

The scientists are therefore interested in the temperature at which magma is stored in the crust, since it likely has an important influence on the timing and types of eruptions that could occur. The hotter magma from down deep warms the cooler magma stored at 4-5 kilometers, making it possible for the two magmas to mix and be transported to the surface to eventually produce an eruption.

The good news, Kent said, is that Mount Hood’s eruptions are not particularly violent. Instead of exploding, the magma tends to ooze out the top of the peak. A previous study by Kent and OSU postdoctoral researcher Alison Koleszar found that the mixing of the two magma sources – which have different compositions – is both a trigger for an eruption and a constraint on how violent it can be.

“What happens when they mix is what happens when you squeeze a tube of toothpaste in the middle,” said Kent, a professor in OSU’s College of Earth, Ocean, and Atmospheric Sciences. “A big glob kind of plops out the top, but in the case of Mount Hood – it doesn’t blow the mountain to pieces.”

The collaborative study between Oregon State and the University of California, Davis is important because little was known about the physical conditions of magma storage and what it takes to mobilize that magma. Kent and UC Davis colleague Kari Cooper, also a co-author on the Nature article, set out to determine how long Mount Hood’s magma chamber has existed, and in what condition.

When Mount Hood’s magma first rose up through the crust into its present-day chamber, it cooled and formed crystals. The researchers were able to document the age of the crystals by the rate of decay of naturally occurring radioactive elements. However, the growth of the crystals is also dictated by temperature – if the rock is too cold, they don’t grow as fast.

Thus the combination of the crystals’ age and apparent growth rate provides a geologic fingerprint for determining the approximate threshold for making the near-solid rock mobile enough to cause an eruption. The diffusion rate of the element strontium, which is also sensitive to temperature, helped validate the findings.
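
Both clocks lend themselves to a back-of-the-envelope sketch. The snippet below shows the generic forms, exponential radioactive decay for crystal ages and an Arrhenius-type, temperature-sensitive diffusivity for strontium, with placeholder numbers; the study's actual isotope systems and diffusion parameters are not reproduced here.

```python
import math

# Hedged sketch of the two "clocks" described above. All numeric values are
# illustrative placeholders, not data or parameters from the study.

def decay_age_years(parent_fraction_remaining: float, half_life_years: float) -> float:
    """Age from remaining parent isotope: N/N0 = exp(-lambda * t)."""
    lam = math.log(2) / half_life_years
    return -math.log(parent_fraction_remaining) / lam

def arrhenius_diffusivity(d0_m2_s: float, ea_j_mol: float, temp_c: float) -> float:
    """Diffusion coefficient D = D0 * exp(-Ea / (R * T)): strongly temperature sensitive."""
    R = 8.314  # gas constant, J/(mol*K)
    return d0_m2_s * math.exp(-ea_j_mol / (R * (temp_c + 273.15)))

def diffusion_timescale_years(length_m: float, d_m2_s: float) -> float:
    """Characteristic time t ~ x^2 / D to relax a zoning profile of width x."""
    return length_m ** 2 / d_m2_s / (3600 * 24 * 365.25)

# A crystal retaining 99% of a parent isotope with a 75,000-year half-life
# works out to roughly 1,100 years old:
print(round(decay_age_years(0.99, 75_000)))

# The same strontium zoning profile relaxes far faster at 800 C than at 700 C,
# which is why diffusion can record how long a crystal stayed warm:
for t_c in (700, 800):
    d = arrhenius_diffusivity(d0_m2_s=1e-12, ea_j_mol=3.0e5, temp_c=t_c)
    print(t_c, f"{diffusion_timescale_years(10e-6, d):.2e} years")
```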

“What we found was that the magma has been stored beneath Mount Hood for at least 20,000 years – and probably more like 100,000 years,” Kent said. “And during the time it’s been there, it’s been in cold storage – like the peanut butter in the fridge – a minimum of 88 percent of the time, and likely more than 99 percent of the time.”

In other words – even though hot magma from below can quickly mobilize the magma chamber at 4-5 kilometers below the surface, most of the time magma is held under conditions that make it difficult for it to erupt.

“What is encouraging from another standpoint is that modern technology should be able to detect when magma is beginning to liquefy, or mobilize,” Kent said, “and that may give us warning of a potential eruption. Monitoring gases, utilizing seismic waves and studying ground deformation through GPS are a few of the techniques that could tell us that things are warming.”

The researchers hope to apply these techniques to other, larger volcanoes to see if they can determine their potential for shifting from cold storage to potential eruption, a development that might bring scientists a step closer to being able to forecast volcanic activity.

Soil production breaks geologic speed record

The researcher hikes down the ridge at Rapid Creek to collect soil samples. The dense bush and heavy 10-kilogram soil samples slowed uphill progress to less than 200 meters per hour. – Andre Eger

Geologic time is shorthand for slow-paced. But new measurements from steep mountaintops in New Zealand show that rock can transform into soil more than twice as fast as previously believed possible.

The findings were published Jan. 16 in the early online edition of Science.

“Some previous work had argued that there were limits to soil production,” said first author Isaac Larsen, who did the work as part of his doctoral research in Earth sciences at the University of Washington. “But no one had made the measurements.”

The finding is more than just a new speed record. Rapidly eroding mountain ranges account for at least half of the total amount of the planet’s weathering and sediment production, although they occupy just a few percent of the Earth’s surface, researchers said.

So the record-breaking production at the mountaintops has implications for the entire carbon cycle by which the Earth’s crust pushes up to form mountains, crumbles, washes with rivers and rainwater to the sea, and eventually settles to the bottom to form new rock.

“This work takes the trend between soil production rates and chemical weathering rates and extends it to much higher values than had ever been previously observed,” said Larsen, now a postdoctoral researcher at the California Institute of Technology in Pasadena.

The study site in New Zealand’s Southern Alps is “an extremely rugged mountain range,” Larsen said, with rainfall of 10 meters (33 feet) per year and slopes of about 35 degrees.

To collect samples Larsen and co-author André Eger, then a graduate student at Lincoln University in New Zealand, were dropped from a helicopter onto remote mountaintops above the tree line. They would hike down to an appropriate test site and collect 20 pounds of dirt apiece, and then trek the samples back up to their base camp. The pair stayed at each of the mountaintop sites for about three days.

“I’ve worked in a lot of places,” Larsen said. “This was the most challenging fieldwork I’ve done.”

Researchers then brought the soil samples back to the UW and measured the amount of beryllium-10, an isotope that forms only at the Earth’s surface by exposure to cosmic rays. Those measurements showed soil production rates on the ridge tops ranging from 0.1 to 2.5 millimeters (up to a tenth of an inch) per year, decreasing exponentially with increasing soil thickness.
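
That exponential falloff is commonly written as a soil production function, P(h) = P0 * exp(-h / h0). A hedged sketch follows, with P0 pinned to the ~2.5 mm/yr peak reported above and an e-folding depth h0 chosen purely for illustration, not fitted to the study's data.

```python
import math

# Hedged sketch of the classic exponential soil production function,
# P(h) = P0 * exp(-h / h0). P0 matches the article's peak rate; the
# e-folding depth H0 is an illustrative placeholder, not a fitted value.

P0_MM_PER_YR = 2.5  # production rate for vanishingly thin soil (the reported peak)
H0_MM = 300.0       # e-folding soil thickness; illustrative only

def soil_production_rate(soil_thickness_mm: float) -> float:
    """Soil production rate (mm/yr) that decays exponentially with soil thickness."""
    return P0_MM_PER_YR * math.exp(-soil_thickness_mm / H0_MM)

for h_mm in (0, 150, 300, 600):
    print(f"{h_mm:>4} mm of soil -> {soil_production_rate(h_mm):.2f} mm/yr")
```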

The peak rate is more than twice the proposed speed limit for soil production – a hypothesized ceiling reflecting geologists’ suspicion that in places where soil is lost very quickly, production simply can’t keep up. In earlier work Larsen had noticed vegetation on very steep slopes, so he proposed this project to measure soil production rates at some of the steepest, wettest locations on the planet.

The new results show that soil production and weathering rates continue to increase as the landscape gets steeper and erodes faster, and suggest that other very steep locations such as the Himalayas and the mountains in Taiwan may also have very fast soil formation.

“A couple millimeters a year sounds pretty slow to anybody but a geologist,” said co-author David Montgomery, a UW professor of Earth and space sciences. “Isaac measured two millimeters of soil production a year, so it would take just a dozen years to make an inch of soil. That’s shockingly fast for a geologist, because the conventional wisdom is it takes centuries.”

The researchers believe plant roots may be responsible here. The mountain landscape was covered with low, dense vegetation. The roots of those plants reach into cracks in the rocks, helping break them apart and expose them to rainwater and chemical weathering.

“This opens up new questions about how soil production might happen in other locations, climates and environments,” Larsen said.

Innovative handheld mineral analyzer — ‘the first of its kind’

Dr. Graeme Hansford of the University of Leicester Space Research Centre. – University of Leicester

Dr Graeme Hansford from the University of Leicester’s Space Research Centre (SRC) has recently started a collaborative project with Bruker Elemental to develop a handheld mineral analyser for mining applications – the first of its kind.

The analyser will allow rapid mineral identification and quantification in the field through a combination of X-ray diffraction (XRD) and X-ray fluorescence (XRF). The novel X-ray diffraction method was invented at the University of Leicester and has been developed at the Space Research Centre. The addition of XRD capability represents an evolution of current handheld XRF instruments, which sell thousands of units each year globally.

The handheld instrument is expected to weigh just 1.5 kg, will be capable of analysing mining samples for mineral content within one to two minutes, and requires no sample preparation. This would be a world first. The analyser is unique due to the insensitivity of the technique to the shape of the sample, which enables the direct analysis of samples without any form of preparation – something currently inconceivable with conventional XRD equipment.
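
For context, conventional XRD identifies minerals by converting diffraction peaks into lattice d-spacings through Bragg's law, n * lambda = 2d * sin(theta), and matching those spacings against reference patterns. The sketch below shows that general principle only; it is not a description of the Leicester method, whose whole point is to relax the strict sample-geometry requirements this classical setup imposes.

```python
import math

# General-principle illustration of XRD phase identification via Bragg's law,
# n * wavelength = 2 * d * sin(theta). This is textbook XRD, not the
# sample-shape-insensitive Leicester/Bruker technique described above.

def d_spacing_angstrom(wavelength_angstrom: float, two_theta_deg: float, order: int = 1) -> float:
    """Lattice spacing d for a diffraction peak observed at angle 2-theta."""
    theta_rad = math.radians(two_theta_deg / 2.0)
    return order * wavelength_angstrom / (2.0 * math.sin(theta_rad))

# Cu K-alpha radiation (1.5406 A); quartz's strongest peak sits near 2-theta = 26.64 deg
d = d_spacing_angstrom(1.5406, 26.64)
print(f"d = {d:.3f} A")  # ~3.34 A, matching the quartz (101) spacing
```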

Dr Hansford said: “It’s very fulfilling for me to see the development of this novel XRD technique from initial conception through theoretical calculations and modelling to experimental demonstration.

“The next step is to develop the commercial potential and I’m very excited to be working with Bruker Elemental on the development of a handheld instrument.”

Bruker Elemental is a global leader in handheld XRF instrumentation, with the mining sector a key customer. Bruker therefore brings essential commercial expertise to the project. The two partners have complementary expertise, and are uniquely placed to successfully deliver this knowledge exchange project.

Alexander Seyfarth, senior product manager at Bruker Elemental, said: “Bruker is excited to be involved in this project as it will bring new measurement capabilities to our handheld equipment. In many cases this system will provide information on the crystallography of the sample in addition to the elemental analysis.”

Dr Hansford originally conceived of the XRD technique in early 2010, when trying to work out how to apply XRD for space applications – for example on the surface of Mars or an asteroid – without the need for any sample preparation.

The next stage of the project will focus on developing and testing the methodology using samples which are representative of real-world problems encountered in mining, such as determining the relative amounts of iron oxide minerals in ore samples. In the second part of the project, a prototype handheld device will be developed at the SRC in conjunction with Bruker to demonstrate efficacy of the technology in the field. A key advantage is that the hardware requirements of the technique are very similar to existing handheld XRF devices, facilitating both rapid development and customer acceptance.

Longmenshan fault zone still hazardous, suggest new reports

The 60-kilometer segment of the fault northeast of the 2013 Lushan rupture is the place in the region to watch for the next major earthquake, according to research published in Seismological Research Letters (SRL). Research papers published in this special section of SRL suggest the 2008 Wenchuan earthquake triggered the magnitude 6.6 Lushan quake.

Guest edited by Huajian Yao, professor of geophysics at the University of Science and Technology of China, the special section includes eight articles that present current data, description and preliminary analysis of the Lushan event and discuss the potential of future earthquakes in the region.

More than 87,000 people were killed or went missing as a result of the 2008 magnitude 7.9 Wenchuan earthquake in China’s Sichuan province, the largest quake to hit China since 1950. In 2013, the Lushan quake occurred ~90 km to the south and caused 203 deaths, injured 11,492 and affected more than 1.5 million people.

“After the 2008 magnitude 7.9 Wenchuan earthquake along the Longmenshan fault zone in western Sichuan of China, researchers in China and elsewhere have paid particular attention to this region, seeking to understand how the seismic hazard potential changed in the southern segment of the fault and nearby faults,” said Yao. “Yet the occurrence of this magnitude 6.6 Lushan event surprised many. The challenge of understanding where and when the next big quake will occur after a devastating seismic event continues after this Lushan event, although we now have gained much more information about this area.”

Preliminary rupture details

The southern part of the Longmenshan fault zone is complex and still only moderately understood. Similar to the central segment where the 2008 Wenchuan event occurred, the southern segment, which generated the Lushan rupture, includes the Wenchuan-Maoxian, Beichuan-Yingxiu, Pengxian-Guanxian and Dayi faults, a series of sub-parallel secondary faults.

Although the Lushan earthquake’s mainshock did not break to the surface, the strong shaking still caused significant damage and casualties in the epicentral region. Three papers detail the rupture process of the Lushan quake.

Libo Han from the China Earthquake Administration and colleagues provide a preliminary analysis of the Lushan mainshock and two large aftershocks, which appear to have occurred in the upper crust and terminated at a depth of approximately 8 km. While the Lushan earthquake cannot be associated with any identified surface faults, Han and colleagues suggest the quake may have occurred on a blind thrust fault subparallel to the Dayi fault, which lies at and partly defines the edge of the Chengdu basin.

Based on observations from extensive trenching and mapping of fault activity after both the Wenchuan and Lushan earthquakes, Chen Lichun and colleagues from the China Earthquake Administration suggest the Lushan quake spread in a “piggyback fashion” toward the Sichuan basin, but with weaker activity and lower seismogenic potential than the Wenchuan quake.

Finally, Junju Xie, from the China Earthquake Administration and Beijing University of Technology, and colleagues examined the vertical and horizontal near-source strong motion from the Mw 6.8 Lushan earthquake. The vertical ground motion was relatively weak for this event, likely because the seismic energy dissipated at depths of 12-25 km and the rupture did not break through to the ground surface.

Possible link between Lushan and Wenchuan earthquakes

Were the Lushan and Wenchuan earthquakes related? And if so, what is the relationship? Some researchers consider the Lushan quake to be a strong aftershock of the Wenchuan quake, while others see them as independent events. In this special section, researchers tackled the question from various perspectives.

To discover whether the Lushan earthquake was truly independent of the Wenchuan quake, researchers need an accurate picture of where the Lushan quake originated. Yong Zhang from the GFZ German Research Centre for Geosciences and the China Earthquake Administration and colleagues begin this process by confirming a new hypocenter for Lushan. To find this place where the fault first began to rupture, the researchers analyzed near-fault strong-motion data (movements recorded up to a few tens of kilometers from the fault) as well as long-distance (thousands of kilometers) teleseismic data.

Using their newly calculated location for the hypocenter, Zhang and colleagues now agree with earlier studies that suggest the initial Lushan rupture was a circular rupture with no predominant direction. But they note that their calculations place the major slip area in the Lushan quake about 40 to 50 kilometers from the southwest end of the Wenchuan quake fault. This “gap” between the two faults may hold increased seismic hazards, caution Zhang and colleagues.

Ke Jia of Peking University and colleagues explore the relationship between the two quakes with a statistical analysis of aftershocks in the region, as well as the evolution of shear stress in the lower crust and upper mantle across the broader quake region. Their analyses suggest that the Wenchuan quake did affect the Lushan quake in an immediate sense by changing the overall background seismicity in the region. If these changes in background seismicity are taken into account, the researchers calculate a 62 percent probability that Lushan was a strong aftershock of Wenchuan.

Similarly, Yanzhao Wang from the China Earthquake Administration and colleagues quantified the stress loading of area faults due to the Wenchuan quake and suggest the change in stress may have caused the Lushan quake to rupture approximately 28.4 to 59.3 years earlier than it otherwise would have. They conclude that the Lushan earthquake was at least 85 percent a delayed aftershock of the Wenchuan earthquake, rather than due solely to long-term tectonic loading.
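
Stress-transfer arguments of this kind usually rest on the Coulomb failure stress change, dCFS = d_tau + mu' * d_sigma_n: a positive change brings a receiver fault closer to failure, and dividing it by the long-term tectonic stressing rate gives the "clock advance." Here is a sketch with illustrative numbers, not values from the paper:

```python
# Hedged sketch of a Coulomb stress-transfer calculation. The formula is
# standard; every number below is an illustrative placeholder, not a value
# from Wang and colleagues' study.

def coulomb_stress_change(d_shear_mpa: float, d_normal_mpa: float,
                          effective_friction: float = 0.4) -> float:
    """dCFS = d_tau + mu' * d_sigma_n (MPa); unclamping counts as positive d_sigma_n."""
    return d_shear_mpa + effective_friction * d_normal_mpa

def clock_advance_years(d_cfs_mpa: float, stressing_rate_mpa_per_yr: float) -> float:
    """Years by which a stress step hastens failure under steady tectonic loading."""
    return d_cfs_mpa / stressing_rate_mpa_per_yr

d_cfs = coulomb_stress_change(d_shear_mpa=0.04, d_normal_mpa=0.02)  # 0.048 MPa
print(f"clock advance ~ {clock_advance_years(d_cfs, 0.001):.0f} years")  # ~48 years
```

With these placeholder inputs the advance comes out near 48 years, inside the 28.4-to-59.3-year range quoted above, though only because the inputs were chosen that way.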

After the Wenchuan quake, researchers immediately began calculating stress changes on the major faults surrounding the rupture zone, in part to identify where dangerous aftershocks might occur and to test how well these stress change calculations might work to predict new earthquakes. As part of these analyses, Tom Parsons of the U.S. Geological Survey and Margarita Segou of GeoAzur compared data collected from the Wenchuan and Lushan quakes with data on aftershocks and stress change in four other major earthquakes, including the M 7.4 Landers and Izmit quakes in California and Turkey, respectively, and the M 7.9 Denali quake in Alaska and the M 7.1 Canterbury quake in New Zealand.

Their comparisons reveal that strong aftershocks similar to Lushan are likely to occur where there is highest overall aftershock activity, where stress change is the greatest and on well-developed fault zones. But they also note that by these criteria, the Lushan quake would only have been predicted by stress changes, and not the clustering of aftershocks following the 2008 Wenchuan event.

Future earthquakes in this region

After Wenchuan and Lushan, where should seismologists and others look for the next big quake in the region? After the 2008 Wenchuan quake, seismologists were primed with data to help predict where and when the next rupture might occur. The data suggested that the Wenchuan event would increase seismic stress in the southern Longmenshan fault, the site of the 2013 Lushan quake. But that information alone could not predict that the southern Longmenshan fault would be the next to rupture after Wenchuan, say Mian Liu of the University of Missouri and colleagues, because the Wenchuan earthquake also increased the stress on numerous other faults in the region.

Additional insights can be gained from seismic moment studies, according to Liu and colleagues. Moment balancing compares how much seismic strain energy is accumulated along a fault over a certain period with the amount of strain energy released over the same period. In the case of the Longmenshan fault, there had been a slow accumulation of strain energy without release by a major seismic event for more than a millennium. After the Wenchuan quake, the southern part of the Longmenshan fault became the fault with the greatest potential for a quake. And now, after Lushan, Liu and colleagues say that the 60-kilometer-long segment of the fault northeast of the Lushan rupture is the place in the region to watch for the next major earthquake.
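
Moment balancing can be made concrete with the standard definition of seismic moment, M0 = mu * A * D (rigidity times fault area times slip), and the moment magnitude relation Mw = (2/3) * (log10 M0 - 9.05). Below is a sketch with illustrative parameters, not Liu and colleagues' numbers:

```python
import math

# Hedged sketch of moment balancing: the moment a locked fault accumulates is
# M0 = mu * A * (slip deficit). All parameter values are illustrative.

MU_PA = 3.0e10  # crustal shear modulus in pascals, a standard assumption

def accumulated_moment_n_m(area_m2: float, slip_rate_m_per_yr: float, years: float) -> float:
    """Seismic moment (N*m) accumulated on a fully locked fault patch over a period."""
    return MU_PA * area_m2 * slip_rate_m_per_yr * years

def moment_magnitude(m0_n_m: float) -> float:
    """Mw = (2/3) * (log10(M0) - 9.05), with M0 in newton-meters."""
    return (2.0 / 3.0) * (math.log10(m0_n_m) - 9.05)

# e.g., a 60 km x 20 km locked segment accumulating 3 mm/yr of slip deficit for 1,000 years:
m0 = accumulated_moment_n_m(area_m2=60e3 * 20e3, slip_rate_m_per_yr=0.003, years=1000)
print(f"stored moment is equivalent to Mw {moment_magnitude(m0):.1f}")  # ~7.3
```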

Scientists anticipated size and location of 2012 Costa Rica earthquake

Andrew Newman, an associate professor in the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology, performs a GPS survey in Costa Rica’s Nicoya Peninsula in 2010. – Lujia Feng

Scientists using GPS to study changes in the Earth’s shape accurately forecasted the size and location of the magnitude 7.6 Nicoya earthquake that occurred in 2012 in Costa Rica.

The Nicoya Peninsula in Costa Rica is one of the few places where land sits atop the portion of a subduction zone where the Earth’s greatest earthquakes take place. Costa Rica’s location therefore makes it the perfect spot for learning how large earthquakes rupture. Because earthquakes greater than about magnitude 7.5 have occurred in this region roughly every 50 years, with the previous event striking in 1950, scientists have been preparing for this earthquake through a number of geophysical studies. The most recent study used GPS to map out the area along the fault storing energy for release in a large earthquake.

“This is the first place where we’ve been able to map out the likely extent of an earthquake rupture along the subduction megathrust beforehand,” said Andrew Newman, an associate professor in the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology.

The study was published online Dec. 22, 2013, in the journal Nature Geoscience. The research was supported by the National Science Foundation and was a collaboration of researchers from Georgia Tech, the Costa Rica Volcanological and Seismological Observatory (OVSICORI) at Universidad Nacional, the University of California, Santa Cruz, and the University of South Florida.

Subduction zones are locations where one tectonic plate is forced under another. The collision of tectonic plates during this process can unleash devastating earthquakes and sometimes devastating tsunamis. The magnitude 9.0 earthquake off the coast of Japan in 2011 was just such a subduction zone earthquake. The Cascadia subduction zone in the Pacific Northwest is capable of unleashing a similarly sized quake. Damage from the Nicoya earthquake, however, was not as bad as might be expected from a magnitude 7.6 quake.

“Fortunately there was very little damage considering the earthquake’s size,” said Marino Protti of OVSICORI and the study’s lead author. “The historical pattern of earthquakes not only allowed us to get our instruments ready, it also allowed Costa Ricans to upgrade their buildings to be earthquake safe.”

Plate tectonics is the driving force for subduction zones. As tectonic plates converge, strain temporarily accumulates across the plate boundary when portions of the interface between the plates, called a megathrust, become locked together. The strain can accumulate to dangerous levels before eventually being released as a massive earthquake.

“The Nicoya Peninsula is an ideal natural lab for studying these events, because the coastline geometry uniquely allows us to get our equipment close to the zone of active strain accumulation,” said Susan Schwartz, professor of earth sciences at the University of California, Santa Cruz, and a co-author of the study.

Through a series of studies starting in the early 1990s using land-based tools, the researchers mapped regions where tectonic plates were completely locked along the subduction interface. Detailed geophysical observations of the region allowed the researchers to create an image of where the faults had locked.

The researchers published a study a few months before the earthquake describing the particular locked patch with the clearest potential for the next large earthquake in the region. The team projected the total amount of energy that could have accumulated across that region and forecast that, if the patch had remained locked since the last major earthquake in 1950, there was enough stored energy for an earthquake on the order of magnitude 7.8.
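
The logic of such a forecast reduces to a moment-deficit calculation: convergence rate times locked fraction times elapsed time gives the slip deficit, and rigidity times patch area times that deficit gives the stored seismic moment. A hedged back-of-the-envelope version follows, with every input an illustrative placeholder rather than the study's actual inversion results:

```python
import math

# Hedged back-of-the-envelope moment-deficit forecast. The structure follows
# standard practice; all inputs are illustrative placeholders, not the
# Nicoya study's actual locking model.

MU_PA = 3.0e10  # crustal shear modulus in pascals, a standard assumption

def slip_deficit_m(convergence_m_per_yr: float, locked_fraction: float, years: float) -> float:
    """Slip the locked patch failed to accommodate since the last rupture."""
    return convergence_m_per_yr * locked_fraction * years

def moment_magnitude(m0_n_m: float) -> float:
    """Mw = (2/3) * (log10(M0) - 9.05), with M0 in newton-meters."""
    return (2.0 / 3.0) * (math.log10(m0_n_m) - 9.05)

# e.g., ~8 cm/yr plate convergence, a fully locked 80 km x 40 km patch, 62 years since 1950:
deficit = slip_deficit_m(0.08, 1.0, 62.0)   # ~5 m of stored slip
m0 = MU_PA * (80e3 * 40e3) * deficit
print(f"stored slip ~{deficit:.1f} m -> Mw {moment_magnitude(m0):.1f}")  # ~7.8
```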

Because of limits in technology and scientific understanding about processes controlling fault locking and release, scientists cannot say much about precisely where or when earthquakes will occur. However, earthquakes in Nicoya have occurred about every 50 years, so seismologists had been anticipating another one around 2000, give or take 20 years, Newman said. The earthquake occurred in September of 2012 as a magnitude 7.6 quake.

“It occurred right in the area we determined to be locked and it had almost the size we expected,” Newman said.

The researchers hope to apply what they’ve learned in Costa Rica to other environments. Virtually every damaging subduction zone earthquake occurs far offshore.

“Nicoya is the only place on Earth where we’ve actually been able to get a very accurate image of the locked patch because it occurs directly under land,” Newman said. “If we really want to understand the seismic potential for most of the world, we have to go offshore.”

Scientists have been able to reasonably map portions of these locked areas offshore using data on land, but the resolution is poor, particularly in the regions that are most responsible for generating tsunamis, Newman said. He hopes that his group’s work in Nicoya will be a driver for geodetic studies on the seafloor to observe such Earth deformation. These seafloor geodetic studies are rare and expensive today.

“If we want to understand the potential for large earthquakes, then we really need to start doing more seafloor observations,” Newman said. “It’s a growing push in our community and this study highlights the type of results that one might be able to obtain for most other dangerous environments, including offshore the Pacific Northwest.”
