Re-thinking Southern California earthquake scenarios in Coachella Valley, San Andreas Fault

The Coachella Valley segment of the southernmost section of the San Andreas Fault in California has a high likelihood for a large rupture in the near future, since it has a recurrence interval of about 180 years but has not ruptured in over 300 years. – UMass Amherst and Google Earth

New three-dimensional (3D) numerical modeling that captures far more geometric complexity of an active fault segment in southern California than any previous model suggests that the overall earthquake hazard for towns on the west side of the Coachella Valley, such as Palm Springs and Palm Desert, may be slightly lower than previously believed.

New simulations of deformation on three alternative fault configurations for the Coachella Valley segment of the San Andreas Fault conducted by geoscientists Michele Cooke and Laura Fattaruso of the University of Massachusetts Amherst, with Rebecca Dorsey of the University of Oregon, appear in the December issue of Geosphere.

The Coachella Valley segment is the southernmost section of the San Andreas Fault in California. It has a high likelihood for a large rupture in the near future, since it has a recurrence interval of about 180 years but has not ruptured in over 300 years, the authors point out.

The researchers acknowledge that their new modeling offers “a pretty controversial interpretation” of the data. Many geoscientists do not accept a dipping geometry for the active San Andreas Fault in the Coachella Valley, they say. Some argue that the data do not confirm the dipping structure. “Our contribution to this debate is that we add an uplift pattern to the data that support a dipping active fault and it rejects the other models,” say Cooke and colleagues.

Their new model yields an estimated 10 percent increase in shaking overall for the Coachella segment. But for the towns to the west of the fault where most people live, it yields decreased shaking due to the dipping geometry. It yields a doubling of shaking in mostly unpopulated areas east of the fault. “This isn’t a direct outcome of our work but an implication,” they add.

Cooke says, “Others have used a dipping San Andreas in their models but they didn’t include the degree of complexity that we did. By including the secondary faults within the Mecca Hills we more accurately capture the uplift pattern of the region.”

Fattaruso adds, “Others were comparing to different data sets, such as geodesy, and since we were comparing to uplift it is important that we have this complexity.” In this case, geodesy is the science of measuring and representing the Earth and its crustal motion, accounting for the interplay of geological processes in 3D over time.

Most other models of deformation, stress, rupture and ground shaking have assumed that the southern San Andreas Fault is vertical, say Cooke and colleagues. However, seismic imaging, aerial magnetometric surveys and GPS-based strain observations suggest that the fault dips 60 to 70 degrees toward the northeast, a hypothesis they set out to investigate.

Specifically, they explored three alternative geometric models of the fault’s Coachella Valley segment with added complexity such as including smaller faults in the nearby Indio and Mecca Hills. “We use localized uplift patterns in the Mecca Hills to assess the most plausible geometry for the San Andreas Fault in the Coachella Valley and better understand the interplay of fault geometry and deformation,” they write.

Cooke and colleagues say the fault structures in their favored model agree with distributions of local seismicity, and are consistent with geodetic observations of recent strain. “Crustal deformation models that neglect the northeast dip of the San Andreas Fault in the Coachella Valley will not replicate the ground shaking in the region and therefore inaccurately estimate seismic hazard,” they note.

This work was supported by the National Science Foundation.
More: http://geosphere.gsapubs.org/content/10/6/1235.abstract

Severe drought is causing the western US to rise

The severe drought gripping the western United States in recent years is changing the landscape well beyond localized effects of water restrictions and browning lawns. Scientists at Scripps Institution of Oceanography at UC San Diego have now discovered that the growing, broad-scale loss of water is causing the entire western U.S. to rise up like an uncoiled spring.

Investigating ground positioning data from GPS stations throughout the west, Scripps researchers Adrian Borsa, Duncan Agnew, and Dan Cayan found that the water shortage is causing an “uplift” effect of up to 15 millimeters (more than half an inch) in California’s mountains and, on average, four millimeters (about 0.16 inches) across the west. From the GPS data, they estimate the water deficit at nearly 240 gigatons (62 trillion gallons of water), equivalent to a six-inch layer of water spread out over the entire western U.S.
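The three quoted figures are mutually consistent, which a short unit-conversion check makes clear. The “western U.S.” area below is simply the value implied by the article’s own numbers, not a surveyed figure.

```python
# Unit-conversion check of the water-deficit figures quoted above.
GT_TO_KG = 1e12          # 1 gigaton = 1e12 kg
WATER_DENSITY = 1000.0   # kg per cubic meter
M3_TO_GALLONS = 264.172  # US gallons per cubic meter
INCH_TO_M = 0.0254

deficit_kg = 240 * GT_TO_KG
deficit_m3 = deficit_kg / WATER_DENSITY        # 2.4e11 m^3 of water
deficit_gal = deficit_m3 * M3_TO_GALLONS       # ~63 trillion gallons

layer_depth_m = 6 * INCH_TO_M                  # the six-inch layer
implied_area_km2 = deficit_m3 / layer_depth_m / 1e6

print(f"{deficit_gal / 1e12:.0f} trillion gallons")
print(f"implied area: {implied_area_km2 / 1e6:.2f} million km^2")
```

The conversion lands within rounding of the quoted 62 trillion gallons, and the implied area (roughly 1.6 million square kilometers) is of the right order for the drought-affected western states.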

Results of the study, which was supported by the U.S. Geological Survey (USGS), appear in the August 21 online edition of the journal Science.

While poring over ground-position data from highly precise GPS stations within the National Science Foundation’s Plate Boundary Observatory and other networks, Borsa, a Scripps assistant research geophysicist, kept noticing the same pattern over the 2003-2014 period: all of the stations moved upward in the most recent years, coinciding with the timing of the current drought.

Agnew, a Scripps Oceanography geophysics professor who specializes in studying earthquakes and their impact on shaping the earth’s crust, says the GPS data can only be explained by rapid uplift of the tectonic plate upon which the western U.S. rests (Agnew cautions that the uplift has virtually no effect on the San Andreas fault and therefore does not increase the risk of earthquakes).

For Cayan, a research meteorologist with Scripps and USGS, the results paint a new picture of the dire hydrological state of the west.

“These results quantify the amount of water mass lost in the past few years,” said Cayan. “It also represents a powerful new way to track water resources over a very large landscape. We can home in on the Sierra Nevada mountains and critical California snowpack. These results demonstrate that this technique can be used to study changes in fresh water stocks in other regions around the world, if they have a network of GPS sensors.”

New Oso report, rockfall in Yosemite, and earthquake models

From AGU’s blogs: Oso disaster had its roots in earlier landslides

A research team tasked with being among the first scientists and engineers to evaluate extreme events has issued its findings on the disastrous Oso, Washington, landslide. The report examines the conditions and causes of the March 22 mudslide that killed 43 people and destroyed the Steelhead Haven neighborhood in Oso, Washington. The team from the Geotechnical Extreme Events Reconnaissance (GEER) Association, funded by the National Science Foundation, determined that intense rainfall in the three weeks before the slide was likely a major factor, but that altered groundwater migration, soil weakened by previous landslides, and changes in hillside stresses also played key roles.

From this week’s Eos: Reducing Rockfall Risk in Yosemite National Park

The glacially sculpted granite walls of Yosemite Valley attract 4 million visitors a year, but rockfalls from these cliffs pose substantial hazards. Responding to new studies, the National Park Service recently took actions to reduce the human risk posed by rockfalls in Yosemite National Park.

From AGU’s journals: A new earthquake model may explain discrepancies in San Andreas fault slip

Investigating the earthquake hazards of the San Andreas Fault System requires an accurate understanding of accumulating stresses and the history of past earthquakes. Faults tend to go through an “earthquake cycle”: locking and accumulating stress, rupturing in an earthquake, and locking again, in a well-accepted process known as “elastic rebound.” One of the key factors in preparing for California’s next “Big One” is estimating the fault slip rate, the speed at which one side of the San Andreas Fault is moving past the other.

Broadly speaking, there are two ways geoscientists study fault slip. Geologists study geologic features at key locations to estimate slip rates through time. Geodesists, scientists who measure the size and shape of the planet, use technologies like GPS and satellite radar interferometry to estimate the slip rate, producing estimates that often differ from the geologists’.

In a recent study by Tong et al., the authors develop a new three-dimensional viscoelastic earthquake cycle model that represents 41 major fault segments of the San Andreas Fault System. While previous research has suggested that there are discrepancies between the fault slip rates along the San Andreas as measured by geologic and geodetic means, the authors find that there are no significant differences between the two measures if the thickness of the tectonic plate and viscoelasticity are taken into account. The authors find that the geodetic slip rate depends on the plate thickness over the San Andreas, a variable lacking in previous research.
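The dependence of a geodetic slip-rate estimate on assumed depth structure can be seen even in a far simpler model than the study’s 3D viscoelastic one. The sketch below uses the classic Savage-Burford arctangent profile for interseismic surface velocity; the 35 mm/yr rate and the 15 km and 25 km locking depths are arbitrary illustration values, not numbers from Tong et al.

```python
import numpy as np

# Minimal sketch: surface velocity across a locked strike-slip fault follows
# v(x) = (s / pi) * arctan(x / D) in the Savage-Burford screw-dislocation
# model. Fitting data with the wrong depth D biases the inferred slip rate s.
def surface_velocity(x_km, slip_rate_mm_yr, locking_depth_km):
    return (slip_rate_mm_yr / np.pi) * np.arctan(x_km / locking_depth_km)

x = np.linspace(-100, 100, 201)            # distance from the fault, km
v_true = surface_velocity(x, 35.0, 15.0)   # "true" rate 35 mm/yr, D = 15 km

# Least-squares slip rate when the model assumes a too-deep locking depth.
basis = np.arctan(x / 25.0) / np.pi        # assumed D = 25 km
apparent_rate = float(basis @ v_true / (basis @ basis))
print(f"apparent slip rate with wrong depth: {apparent_rate:.1f} mm/yr")
```

The recovered rate comes out several mm/yr too high, which is the same flavor of bias the study attributes to neglecting plate thickness and viscoelasticity.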

The team notes that of the 41 studied faults within the San Andreas Fault system, a small number do in fact have disagreements between the geologic and geodetic slip rates. These differences could be attributed to inadequate data coverage or to incomplete knowledge of the fault structures or the chronological sequence of past events.

The next ‘Big One’ for the Bay Area may be a cluster of major quakes

A cluster of closely timed earthquakes over 100 years in the 17th and 18th centuries released as much accumulated stress on the San Francisco Bay Area’s major faults as the great 1906 San Francisco earthquake did, suggesting two possible scenarios for the next “Big One” for the region, according to new research published by the Bulletin of the Seismological Society of America (BSSA).

“The plates are moving,” said David Schwartz, a geologist with the U.S. Geological Survey and co-author of the study. “The stress is re-accumulating, and all of these faults have to catch up. How are they going to catch up?”

The San Francisco Bay Region (SFBR) lies within the boundary zone between the Pacific and North American plates. Energy released during its earthquake cycle occurs along the region’s principal faults: the San Andreas, San Gregorio, Calaveras, Hayward-Rodgers Creek, Greenville, and Concord-Green Valley faults.

“The 1906 quake happened when there were fewer people, and the area was much less developed,” said Schwartz. “The earthquake had the beneficial effect of releasing the plate boundary stress and relaxing the crust, ushering in a period of low level earthquake activity.”

The earthquake cycle reflects the accumulation of stress, its release as slip on a fault or a set of faults, and its re-accumulation and re-release. The San Francisco Bay Area has not experienced a full earthquake cycle in the time people have been recording earthquake activity there, whether through written records or instrumentation. Founded in 1776, the Mission Dolores and the Presidio in San Francisco kept records of felt earthquakes and earthquake damage, marking the starting point of the historic earthquake record for the region.

“We are looking back at the past to get a more reasonable view of what’s going to happen decades down the road,” said Schwartz. “The only way to get a long history is to do these paleoseismic studies, which can help construct the rupture histories of the faults and the region. We are trying to see what went on and understand the uncertainties for the Bay Area.”

Schwartz and colleagues excavated trenches across faults, observing past surface ruptures from the most recent earthquakes on the major faults in the area. Radiocarbon dating of detrital charcoal and the presence of non-native pollen established the dates of paleoearthquakes, expanding the span of information of large events back to 1600.
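Radiocarbon dating of this kind reduces to a simple decay-clock calculation before calibration. The sketch below shows only that arithmetic; a real paleoseismic date would then be calibrated against curves such as IntCal, and the fraction-modern value used here is a made-up example.

```python
import math

# Conventional radiocarbon age from a measured 14C fraction, using the
# Libby mean life of 8033 years (the standard convention for reporting
# uncalibrated ages in years before present, i.e. before AD 1950).
LIBBY_MEAN_LIFE = 8033.0  # years

def conventional_c14_age(fraction_modern):
    """Uncalibrated radiocarbon age (years BP) from the 14C/12C ratio."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Charcoal that formed ~250 years before AD 1950 (around AD 1700, inside the
# study's 1690-1776 cluster window) retains exp(-250/8033) of its 14C.
fm = math.exp(-250.0 / LIBBY_MEAN_LIFE)
print(f"age: {conventional_c14_age(fm):.0f} yr BP")
```

For events only a few centuries old the 14C loss is small (a few percent), which is why the authors lean on high-resolution dating plus independent markers like non-native pollen.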

The trenching studies suggest that between 1690 and the founding of the Mission Dolores and Presidio in 1776, a cluster of earthquakes ranging from magnitude 6.6 to 7.8 occurred on the Hayward fault (north and south segments), San Andreas fault (North Coast and San Juan Bautista segments), northern Calaveras fault, Rodgers Creek fault, and San Gregorio fault. There are no paleoearthquake data for the Greenville fault or northern extension of the Concord-Green Valley fault during this time interval.

“What the cluster of earthquakes did in our calculations was to release an amount of energy somewhat comparable to the amount released in the crust by the 1906 quake,” said Schwartz.

As stress on the region accumulates, the authors see at least two modes of energy release – one is a great earthquake, and the other is a cluster of large earthquakes. The probability for how the system will rupture is spread out over all faults in the region, making a cluster of large earthquakes more likely than a single great earthquake.

“Everybody is still thinking about a repeat of the 1906 quake,” said Schwartz. “It’s one thing to have a 1906-like earthquake where seismic activity is shut off, and we slide through the next 110 years in relative quiet. But what happens if every five years we get a magnitude 6.8 or 7.2? That’s not outside the realm of possibility.”

San Francisco’s big 1906 quake was third of a series on San Andreas Fault

University of Oregon doctoral student Ashley Streig shows a tree stump on which tree-ring dating indicates the tree was cut prior to the earthquake of 1838 on the San Andreas Fault in the Santa Cruz Mountains. – University of Oregon

Research led by a University of Oregon doctoral student in California’s Santa Cruz Mountains has uncovered geologic evidence that supports historical narratives for two earthquakes in the 68 years prior to San Francisco’s devastating 1906 disaster.

The evidence places the two earthquakes, in 1838 and 1890, on the San Andreas Fault, as theorized by many researchers based on written accounts about damage to Spanish-built missions in the Monterey and San Francisco bay areas. These two quakes, as in 1906, were surface-rupturing events, the researchers concluded.

Continuing work, says San Francisco Bay-area native Ashley R. Streig, will dig deeper into the region’s geological record — layers of sediment along the fault — to determine if the ensuing seismically quiet years make up a normal pattern — or not — of quake frequency along the fault.

Streig is lead author of the study, published in this month’s issue of the Bulletin of the Seismological Society of America. She collaborated on the project with her doctoral adviser Ray Weldon, professor of the UO’s Department of Geological Sciences, and Timothy E. Dawson of the Menlo Park office of the California Geological Survey.

The study was the first to fully map the active fault trace in the Santa Cruz Mountains using a combination of on-the-ground observations and airborne Light Detection and Ranging (LiDAR), a remote sensing technology. The Santa Cruz Mountains run for about 39 miles from south of San Francisco to near San Juan Bautista. Hazel Dell is east of Santa Cruz and north of Watsonville.

“We found the first geologic evidence of surface rupture by what looks like the 1838 and 1890 earthquakes, as well as 1906,” said Streig, whose introduction to major earthquakes came at age 11 during the 1989 Loma Prieta Earthquake on a deep sub-fault of the San Andreas Fault zone. That quake, which disrupted baseball’s World Series, forced her family to camp outside their home.

Unlike the 1906 quake that ruptured 470 km (292 mi) of the fault, the 1838 and 1890 quakes ruptured shorter portions of the fault, possibly limited to the Santa Cruz Mountains. “This is the first time we have had good, clear geologic evidence of these historic 19th-century earthquakes,” she said. “It’s important because it tells us that we had three surface ruptures, really closely spaced in time, that all had fairly large displacements of at least half a meter and probably larger.”

The team identified ax-cut wood chips, tree stumps and charcoal fragments from early logging efforts in unexpectedly deep layers of sediment, 1.5 meters (five feet) below the ground, and documented evidence of three earthquakes since logging occurred at the site. The logging story emerged from 16 trenches dug in 2008, 2010 and 2011 along the fault at the Hazel Dell site in the mountain range.

High-resolution radiocarbon dating of tree rings from the wood chips and charcoal confirms that these are post-European deposits, and the geologic earthquake evidence coincides with written accounts describing local earthquake damage, including damage to Spanish missions in 1838 and, in a USGS publication, earthquakes in 1890 catalogued by an astronomer from Lick Observatory.

Additionally, in 1906, individuals living near the Hazel Dell site reported to geologists that cracks from the 1906 earthquake had appeared just where cracks had opened 16 years earlier, in 1890; that quake, Streig and colleagues say, was probably centered in the Hazel Dell region. Another displacement of sediment at the Hazel Dell site matched the timeline of the 1906 quake.

The project also allowed the team to conclude that another historically reported quake, in 1865, was not surface rupturing, but was probably deep and, like the 1989 event, occurred on a sub-fault of the San Andreas Fault zone. Conventional thinking, Streig said, has suggested that the San Andreas Fault always ruptures in a long-reaching fashion similar to the 1906 earthquake. This study, however, points to more regionally confined ruptures as well.

“This all tells us that there are more frequent surface-rupturing earthquakes on this section of the fault than have been previously identified, certainly in the historic period,” Streig said. “This becomes important to earthquake models because it is saying something about the connectivity of all these fault sections — and how they might link up.”

The frequency of the quakes in the Santa Cruz Mountains, she added, must have been a terrifying experience for settlers during the 68-year period.

“This study is the first to show three historic ruptures on the San Andreas Fault outside the special case of Parkfield,” Weldon said, referring to a region in mountains to the south of the Santa Cruz range where six magnitude 6-plus earthquakes occurred between 1857 and 1966. “The earthquakes of 1838 and 1890 were known to be somewhere nearby from shaking, but now we know the San Andreas Fault ruptured three times on the same piece of the fault in less than 100 years.”

More broadly, Weldon said, having multiple paleoseismic sites close together on a major fault, geologists now realize that interpretations gleaned from single-site evidence probably aren’t reliable. “We need to spend more time reproducing or confirming results rather than rushing to the next fault if we are going to get it right,” he said. “Ashley’s combination of historical research, C-14 dating, tree rings, pollen and stratigraphic correlation between sites has allowed us to credibly argue for precision that allows identification of the 1838 and 1890 earthquakes.”

“Researchers at the University of Oregon are using tools and technologies to further our understanding of the dynamic forces that continue to shape our planet and impact its people,” said Kimberly Andrews Espy, vice president for research and innovation and dean of the UO Graduate School. “This research furthers our understanding of the connectivity of the various sections of California’s San Andreas Fault and has the potential to save lives by leading to more accurate earthquake modeling.”

Scientists use ‘virtual earthquakes’ to forecast Los Angeles quake risk

Stanford scientists are using weak vibrations generated by the Earth’s oceans to produce “virtual earthquakes” that can be used to predict the ground movement and shaking hazard to buildings from real quakes.

The new technique, detailed in the Jan. 24 issue of the journal Science, was used to confirm a prediction that Los Angeles will experience stronger-than-expected ground movement if a major quake occurs south of the city.

“We used our virtual earthquake approach to reconstruct large earthquakes on the southern San Andreas Fault and studied the responses of the urban environment of Los Angeles to such earthquakes,” said lead author Marine Denolle, who recently received her PhD in geophysics from Stanford and is now at the Scripps Institution of Oceanography in San Diego.

The new technique capitalizes on the fact that earthquakes aren’t the only sources of seismic waves. “If you put a seismometer in the ground and there’s no earthquake, what do you record? It turns out that you record something,” said study leader Greg Beroza, a geophysics professor at Stanford.

What the instruments will pick up is a weak, continuous signal known as the ambient seismic field. This omnipresent field is generated by ocean waves interacting with the solid Earth. When the waves collide with each other, they generate a pressure pulse that travels through the ocean to the sea floor and into the Earth’s crust. “These waves are billions of times weaker than the seismic waves generated by earthquakes,” Beroza said.

Scientists have known about the ambient seismic field for about 100 years, but it was largely considered a nuisance because it interferes with their ability to study earthquakes. The tenuous seismic waves that make up this field propagate every which way through the crust. But in the past decade, seismologists developed signal-processing techniques that allow them to isolate certain waves; in particular, those traveling through one seismometer and then another one downstream.

Denolle built upon these techniques and devised a way to make these ambient seismic waves function as proxies for seismic waves generated by real earthquakes. By studying how the ambient waves moved underground, the researchers were able to predict the actions of much stronger waves from powerful earthquakes.
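The core trick described above can be demonstrated with a toy numerical experiment: record the same random “ocean” noise at two stations, the second with a propagation delay, and the cross-correlation of the two records recovers the inter-station travel time, effectively turning one station into a virtual source. This is a deliberately simplified stand-in for the study’s actual processing, and the 0.8-second delay is an arbitrary example value.

```python
import numpy as np

# Toy ambient-noise interferometry: one random wavefield, two stations,
# the second receiving the signal 0.8 s later. The cross-correlation peak
# sits at the inter-station travel time.
rng = np.random.default_rng(0)
fs = 100                                  # samples per second
delay_n = int(0.8 * fs)                   # 0.8 s propagation delay

noise = rng.standard_normal(fs * 60)      # one minute of "ocean" noise
station_a = noise[delay_n:]               # the wave passes station A first
station_b = noise[:-delay_n]              # ...and reaches station B 0.8 s later

xcorr = np.correlate(station_b, station_a, mode="full")
lag = int(np.argmax(xcorr)) - (len(station_a) - 1)
print(f"recovered travel time: {lag / fs:.2f} s")
```

Real noise correlation functions require months of data, spectral whitening, and stacking to suppress uneven source distributions, but the underlying idea is exactly this correlation peak.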

She began by installing several seismometers along the San Andreas Fault to specifically measure ambient seismic waves.

Employing data from the seismometers, the group then used mathematical techniques they developed to make the waves appear as if they originated deep within the Earth. This was done to correct for the fact that the seismometers Denolle installed were located at the Earth’s surface, whereas real earthquakes occur at depth.

In the study, the team used their virtual earthquake approach to confirm the accuracy of a prediction, made in 2006 by supercomputer simulations, that if the southern San Andreas Fault section of California were to rupture and spawn an earthquake, some of the seismic waves traveling northward would be funneled toward Los Angeles along a 60-mile-long (100-kilometer-long) natural conduit that connects the city with the San Bernardino Valley. This passageway is composed mostly of sediments, and acts to amplify and direct waves toward the Los Angeles region.

Until now, there was no way to test whether this funneling action, known as the waveguide-to-basin effect, actually takes place because a major quake has not occurred along that particular section of the San Andreas Fault in more than 150 years.

The virtual earthquake approach also predicts that seismic waves will become further amplified when they reach Los Angeles because the city sits atop a large sedimentary basin. To understand why this occurs, study coauthor Eric Dunham, an assistant professor of geophysics at Stanford, said to imagine taking a block of plastic foam, cutting out a bowl-shaped hole in the middle, and filling the cavity with gelatin. In this analogy, the plastic foam is a stand-in for rocks, while the gelatin is like sediments, or dirt. “The gelatin is floppier and a lot more compliant. If you shake the whole thing, you’re going to get some motion in the Styrofoam, but most of what you’re going to see is the basin oscillating,” Dunham said.
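The foam-and-gelatin analogy has a standard quantitative counterpart: for vertically incident waves, energy conservation gives a surface amplification of roughly the square root of the rock-to-sediment impedance ratio. The densities and shear velocities below are generic textbook values, not measurements from this study.

```python
import math

# Back-of-the-envelope basin amplification from the impedance contrast
# between stiff bedrock and soft basin sediments (vertical incidence,
# no attenuation or resonance effects).
def impedance_amplification(rho_rock, vs_rock, rho_sed, vs_sed):
    """Approximate amplitude amplification sqrt(Z_rock / Z_sediment)."""
    return math.sqrt((rho_rock * vs_rock) / (rho_sed * vs_sed))

# Generic values: rock 2700 kg/m^3 at 3000 m/s; sediments 2000 kg/m^3 at 400 m/s.
amp = impedance_amplification(2700, 3000, 2000, 400)
print(f"amplification ~{amp:.1f}x")
```

Even this crude estimate lands at a factor of about three, the same order of magnitude as basin amplifications reported in full 3D simulations.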

As a result, the scientists say, Los Angeles could be at risk for stronger, and more variable, ground motion if a large earthquake – magnitude 7.0 or greater – were to occur along the southern San Andreas Fault, near the Salton Sea.

“The seismic waves are essentially guided into the sedimentary basin that underlies Los Angeles,” Beroza said. “Once there, the waves reverberate and are amplified, causing stronger shaking than would otherwise occur.”

Beroza’s group is planning to test the virtual earthquake approach in other cities around the world that are built atop sedimentary basins, such as Tokyo, Mexico City, Seattle and parts of the San Francisco Bay area. “All of these cities are earthquake threatened, and all of them have an extra threat because of the basin amplification effect,” Beroza said.

Because the technique is relatively inexpensive, it could also be useful for forecasting ground motion in developing countries. “You don’t need large supercomputers to run the simulations,” Denolle said.

In addition to studying earthquakes that have yet to occur, the technique could also be used as a kind of “seismological time machine” to recreate the seismic signatures of temblors that shook the Earth long ago, according to Beroza.

“For an earthquake that occurred 200 years ago, if you know where the fault was, you could deploy instruments, go through this procedure, and generate seismograms for earthquakes that occurred before seismographs were invented,” he said.

Subterranean ‘sedimentary bathtub’ amplifies earthquakes

These images show multiple scenarios for shallow earthquakes within the Georgia Basin. Numbers in the upper right-hand corner represent how much motion is amplified within the basin and on the surface in Vancouver, respectively. – Sheri Molnar and Kim Olsen.

Like an amphitheater amplifies sound, the stiff, sturdy soil beneath the Greater Vancouver metropolitan area could greatly amplify the effects of an earthquake, pushing the potential devastation past what building codes in the region are prepared for. That’s the conclusion of a pair of studies recently coauthored by San Diego State University seismologist Kim Olsen.

Greater Vancouver sits above the subducting Juan de Fuca Plate, which extends south beneath Washington and Oregon. Beneath Vancouver, the crust contains a bowl-shaped mass of rigid soil called the Georgia Basin. Earthquakes can and do occur in the Georgia Basin, originating deep within the earth, between 50 and 70 kilometers down, or as shallow as a couple of kilometers.

While earthquake researchers have long known that the region is tectonically active and policymakers have enforced building codes designed to protect against earthquakes, those codes aren’t quite strict enough because seismologists have failed to account for how the Georgia Basin affects a quake’s severity, Olsen said. In large part, that’s because until recently the problem has been too computationally complex, he said.

“People have neglected the effects of stiffer soil,” Olsen said. “They haven’t been able to look at the basin as a three-dimensional object.”

The idea to investigate the basin’s effect on earthquakes originated with Sheri Molnar, a postdoctoral researcher at the University of British Columbia. She reached out to Olsen, an expert in earthquake simulation, for help modeling the problem. Using supercomputer technology, Olsen has previously simulated the potential effects of a supermassive magnitude 8.0 quake in Southern California.

Using the same technology, Molnar and Olsen coded an algorithm to take into account the stiff-soil geography of the Georgia Basin to see how it would influence the surface effects of a magnitude 6.8 earthquake. They then ran the simulation for both a shallow and a deep quake.

In both simulations they found that the basin had an amplifying effect on motion on the surface, but the amplification was especially pronounced in shallow earthquakes. In the latter scenario, their model predicts that the sedimentary basin would cause the surface to shake for approximately 22 seconds longer than normal.

“The deep structure of the Georgia Basin can amplify the ground motion of an earthquake by a factor of three or more,” Olsen said. “It’s an irregularly shaped bathtub of sediments that can trap and amplify the waves.”
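The “bathtub” image also explains the prolonged shaking: a soft layer over rigid rock rings at a fundamental period of roughly 4H/Vs, so trapped waves reverberate long after the direct arrival. The layer thickness and shear velocity below are generic assumptions for illustration, not the Georgia Basin velocity model used in the study.

```python
# Fundamental resonance period of a soft sedimentary layer over rigid
# bedrock: T = 4 * H / Vs (the quarter-wavelength rule).
def fundamental_period(thickness_m, vs_m_s):
    """Fundamental site period in seconds for a single soft layer."""
    return 4.0 * thickness_m / vs_m_s

# Assumed example: 1 km of sediments with a 800 m/s shear-wave velocity.
T = fundamental_period(1000.0, 800.0)
print(f"fundamental period: {T:.0f} s")
```

A multi-second resonance period is exactly the band that threatens mid- and high-rise buildings, which is why basin effects matter for building codes.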

The deep and shallow studies were published today in the Bulletin of the Seismological Society of America.

Current building codes in Vancouver don’t take into account this amplification, Olsen added, meaning many buildings in the region would be in danger if a large earthquake were to hit.

Vancouver isn’t the only large metropolis built atop sedimentary basins. Los Angeles and San Francisco, too, sit on basins similar to the Georgia Basin. Olsen is currently investigating how major earthquakes along the San Andreas Fault would be affected by these basins.

He hopes that city planners can use this knowledge to update their building codes to reflect the amplifying geography beneath their feet.

“That’s always going to be the goal, to make structures safer and to mitigate the damage in the future,” Olsen said.

Study to enhance earthquake prediction and mitigation in Pakistani region

This is a sketch map of southeast Asia showing major faults and tectonic blocks, including the Chaman Fault. – Courtesy of Shuhab Khan

A three-year, $451,000 grant from the United States Agency for International Development to study the Chaman Fault in Western Pakistan will help earthquake prediction and mitigation in this heavily populated region.

The research, part of the Pakistan-U.S. Science and Technology Cooperation Program, will also increase the strength and breadth of cooperation and linkages between Pakistani scientists and institutions and their counterparts in the U.S. The National Academy of Sciences implements the U.S. side of the program.

Shuhab Khan, associate professor of geology at the University of Houston, will lead the project in the U.S. His counterpart in Pakistan is Abdul Salam Khan of the University of Balochistan.

“The Chaman Fault is a large, active fault around 1,000 kilometers, or 620 miles, long. It crosses back and forth between Afghanistan and Pakistan, ultimately merging with some other faults and going to the Arabian Sea,” Khan said.

The study area is located close to megacities in both countries.

“Seismic activity across this region has caused hundreds of thousands of deaths and catastrophic economic losses,” Khan said. “However, the Chaman Fault is one of the least studied fault systems. Through this research, we will determine how fast the fault is moving and which part is more active.”

The Chaman Fault is the largest strike-slip fault system in Central Asia. It is a little more than half the size of the San Andreas Fault in California.

“In strike-slip faults, the Earth’s crust moves laterally. Earthquakes along these types of faults are shallow and more damaging,” he said. “Rivers can also be displaced and change course with activity related to this type of fault.”

The study team will gather data using remote sensing satellite technology and field measurements made at various sites along the fault.

“In addition to current movement, the techniques will allow us to go back tens of thousands of years to determine which areas have moved and how much,” Khan said.

Field measurement techniques include sampling and analysis of rocks and sand along the fault system.

“Through cosmogenic age dating, we can determine how much time rocks along the fault have been exposed at the surface by measuring isotopes produced by cosmic-ray radiation. Those measurements help us determine when the rocks in the area moved,” Khan said.

Sand buried below the surface will be sampled without being exposed to light. In the lab, measurements using optically stimulated luminescence will reveal how long the sand has been buried.
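The dating-to-slip-rate chain of reasoning described above can be reduced to back-of-the-envelope arithmetic. The sketch below is illustrative only: the production rate, the half-life, and the offset value are assumed numbers rather than project data, and real analyses include corrections (topographic shielding, erosion, inheritance) that are omitted here.

```python
import math

# Cosmogenic exposure dating, stripped to its core arithmetic (illustrative
# values only). For a radioactive nuclide such as 10Be, the concentration in
# a surface rock grows as
#   N(t) = (P / lam) * (1 - exp(-lam * t))
# with production rate P (atoms/g/yr) and decay constant lam (1/yr).

def exposure_age(N, P, lam):
    """Invert N(t) to recover the exposure time t, in years."""
    return -math.log(1.0 - N * lam / P) / lam

P = 5.0                      # assumed 10Be production rate, atoms/g/yr
lam = math.log(2) / 1.387e6  # decay constant from a ~1.387 Myr half-life

# Forward-model a 20,000-year exposure, then recover the age from N.
t_true = 20_000.0
N = (P / lam) * (1.0 - math.exp(-lam * t_true))
age = exposure_age(N, P, lam)

# A dated offset landform then yields a slip rate: offset / age.
offset_m = 60.0              # hypothetical stream-channel offset, meters
slip_rate_mm_per_yr = offset_m * 1000.0 / age
```

With these invented numbers the recovered age is about 20,000 years and the slip rate about 3 mm/yr; the point is the chain of reasoning (concentration, then age, then slip rate), not the particular values.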

Three students from the University of Balochistan will come to the U.S. to learn the field techniques. “We will take them to the San Andreas Fault for training because the locations and faults are similar,” Khan said. “They will return to Pakistan and gather samples from designated areas along the fault.”

The samples will be analyzed at the University of Cincinnati lab of geology professor Lewis Owen, co-investigator on the grant. The research team also includes University of Houston geosciences students. Two undergraduate students will help process the rock samples, and a Ph.D. student will work with the remote sensing data.

“Through the data collection, we will learn more about the movement along this fault in the recent past. That information will help with earthquake prediction and mitigation,” Khan said.

Revised location of 1906 rupture of San Andreas Fault in Portola Valley

New evidence suggests the 1906 earthquake ruptured the San Andreas Fault along a single trace through Portola Village, the present-day Town of Portola Valley, and indicates a revised location for the fault trace.

Portola Valley, south of San Francisco, has been extensively studied and was the subject of the first geological map published in California. Yet studies have offered conflicting conclusions, caused in part by a misprinted photograph and unpublished data, about the location and nature of the 1906 surface rupture through the area.

“It is critical for the residents and leaders of Portola Valley to know the exact location of the fault – an active fault near public buildings and structures,” said co-author Chester T. Wrucke, a retired geologist with the U.S. Geological Survey and resident of Portola Valley. Independent researcher Robert T. Wrucke and engineering geologist Ted Sayre, with Cotton Shires and Associates, are co-authors of the study, published by the Bulletin of the Seismological Society of America (BSSA).

Using a new high-resolution imaging technology, known as bare-earth airborne LiDAR (Light Detection And Ranging), combined with field observations and an extensive review of archival photography, researchers reinterpreted previous documentation to locate the 1906 fault trace.

“People back then were hampered by thick vegetation and could not see a critical area,” said Chester Wrucke. “Modern technology – LiDAR – and modern techniques made it possible for us to see the bare ground, interpret correctly where old photographs were taken and identify the fault trace.”

The 1906 earthquake changed the landscape of Portola Valley, breaking rock formations, cracking roads, creating landslides and forcing other changes masked by vegetation. With easy access to the area, local professors and photographers from Stanford created a rich trove of field observations, photos and drawings.

J.C. Branner, then a geology professor at Stanford, was among the scientists who, along with his students, submitted their observations of the 1906 fault rupture to the California Earthquake Commission for inclusion in an official compilation on the causes and effects of the earthquake. While the compilation, published in 1908, concluded that the earthquake ruptured along a single fault trace in Portola Valley, a key map of that trace, Map 22, included unintentional errors in the fault location.

Studies of the area resumed 50 years later, and those studies relied on the earlier literature, including Map 22. Subsequent studies published variations of Map 22, further altering the assumed location of the fault and suggesting the earthquake ruptured along multiple traces of the fault.

The authors sought to answer a seemingly simple question – where did the fault cross Alpine Road? “With variations in the literature and interpretation of the data, we decided to pay close attention to the original work,” said Robert Wrucke.

The authors relied on Branner’s description, together with 1906 photographs, a hand-drawn map, a student notebook and an analysis of changes to Alpine Road for clues to confirm the location of where the fault crossed Alpine Road.

Scanning archives to study all available photos from 1906 and notes from observers, the researchers compared geological features to LiDAR images. Their forensic analysis suggests the primary rupture in 1906 in Portola Valley was along the western of two main traces of the San Andreas Fault. Their analysis shows that there was no step-over within the town to the other trace.

“The biggest practical benefit of knowing the correct fault position is the ability to keep proposed new buildings off the critical rupture zone,” said Sayre.

“We had the luxury of LiDAR and were able to meld LiDAR data with old photos and made a breakthrough,” said Robert Wrucke. “Modern technology helps with geological interpretation. Our experience may be useful for others in situations where there’s confusion.”

Geothermal power facility induces earthquakes, study finds

An analysis of earthquakes in the area around the Salton Sea Geothermal Field in southern California has found a strong correlation between seismic activity and operations for production of geothermal power, which involve pumping water into and out of an underground reservoir.

“We show that the earthquake rate in the Salton Sea tracks a combination of the volume of fluid removed from the ground for power generation and the volume of wastewater injected,” said Emily Brodsky, a geophysicist at the University of California, Santa Cruz, and lead author of the study, published online in Science on July 11.

“The findings show that we might be able to predict the earthquakes generated by human activities. To do this, we need to take a large view of the system and consider both the water coming in and out of the ground,” said Brodsky, a professor of Earth and planetary sciences at UCSC.

Brodsky and coauthor Lia Lajoie, who worked on the project as a UCSC graduate student, studied earthquake records for the region from 1981 through 2012. They compared earthquake activity with production data for the geothermal power plant, including records of fluid injection and extraction. The power plant is a “flash-steam facility,” which pulls hot water out of the ground, flashes it to steam to run turbines, and recaptures as much water as possible for injection back into the ground. Due to evaporative losses, less water is pumped back in than is pulled out, so the net effect is fluid extraction.

During the period of relatively low-level geothermal operations before 1986, the rate of earthquakes in the region was also low. Seismicity increased as the operations expanded. After 2001, both geothermal operations and seismicity climbed steadily.

The researchers tracked the variation in net extraction over time and compared it to seismic activity. The relationship is complicated because earthquakes are naturally clustered due to local aftershocks, and it can be difficult to separate secondary triggering (aftershocks) from the direct influence of human activities. The researchers developed a statistical method to separate out the aftershocks, allowing them to measure the “background rate” of primary earthquakes over time.

“We found a good correlation between seismicity and net extraction,” Brodsky said. “The correlation was even better when we used a combination of all the information we had on fluid injection and net extraction. The seismicity is clearly tracking the changes in fluid volume in the ground.”
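The two steps described above (removing aftershocks to estimate a background rate, then comparing that rate with the fluid budget) can be sketched as follows. This is not the authors’ actual statistical method: the fixed magnitude-and-time-window declustering rule, the Pearson correlation, and all of the numbers are illustrative assumptions.

```python
# Illustrative sketch (not the study's method): crude window declustering,
# then correlation of the background rate with net fluid extraction.
# All data below are invented for the demonstration.

def decluster(events, window_days=30.0):
    """Keep events that do not follow a larger event within window_days."""
    kept = []
    for i, (t, m) in enumerate(events):
        aftershock = any(
            0 < t - t2 <= window_days and m2 > m for t2, m2 in events[:i]
        )
        if not aftershock:
            kept.append((t, m))
    return kept

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Tiny synthetic catalog: (time in days, magnitude). The two small events
# shortly after larger ones are treated as aftershocks and removed.
catalog = [(0, 5.0), (2, 3.0), (100, 4.0), (101, 2.5), (200, 3.5)]
background = decluster(catalog)

# Yearly net extraction (extracted minus injected) vs. background counts.
extracted = [3.0, 3.4, 3.9, 4.6, 5.4, 6.1]   # e.g. units of 1e7 m^3/yr
injected = [2.0, 2.2, 2.4, 2.6, 2.8, 3.0]
net = [e - i for e, i in zip(extracted, injected)]
counts = [4, 5, 6, 8, 10, 12]                # declustered quakes/yr
r = pearson(net, counts)
```

Real declustering schemes (for example Gardner-Knopoff windows or ETAS-based methods) scale the window with magnitude and distance; the fixed 30-day window here is only a placeholder for the idea of separating triggered events from the background rate.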

The vast majority of the induced earthquakes are small, and the same is true of earthquakes in general. The key question is what is the biggest earthquake that could occur in the area, Brodsky said. The largest earthquake in the region of the Salton Sea Geothermal Field during the 30-year study period was a magnitude 5.1 earthquake.

The nearby San Andreas fault, however, is capable of unleashing extremely destructive earthquakes of at least magnitude 8, Brodsky said. The location of the geothermal field at the southern end of the San Andreas fault is cause for concern due to the possibility of inducing a damaging earthquake.

“It’s hard to draw a direct line from the geothermal field to effects on the San Andreas fault, but it seems plausible that they could interact,” Brodsky said.

At its southern end, the San Andreas fault runs into the Salton Sea, and it’s not clear what faults there might be beneath the water. A seismically active region known as the Brawley Seismic Zone extends from the southern end of the San Andreas fault to the northern end of the Imperial fault. The Salton Sea Geothermal Field, located on the southeastern edge of the Salton Sea, is one of four operating geothermal fields in the area.