Improving earthquake early warning systems for California and Taiwan

This is a map of the blind-zone radius for California. Yellow and orange colors correspond to regions with small blind zones, and red and dark-red colors correspond to regions with large blind zones. – SRL

Earthquake early warning systems may provide the public with crucial seconds to prepare for severe shaking. For California, a new study suggests upgrading current technology and relocating some seismic stations would improve the warning time, particularly in areas poorly served by the existing network: from south of the San Francisco Bay Area to north of Los Angeles, and north of the Bay Area.

A separate case study focuses on the utility of low-cost sensors to create a high-density, effective network that can be used for issuing early warnings in Taiwan. Both studies appear in the November issue of the journal Seismological Research Letters (SRL).

“We know where most active faults are in California, and we can smartly place seismic stations to optimize the network,” said Serdar Kuyuk, assistant professor of civil engineering at Sakarya University in Turkey, who conducted the California study while he was a post-doctoral fellow at the University of California (UC), Berkeley. Richard Allen, director of the Seismological Laboratory at UC Berkeley, is the co-author of the study.

Japan started to build its EEW system after the 1995 Kobe earthquake, and the system performed well during the 2011 magnitude 9 Tohoku-Oki earthquake. While the U.S. Geological Survey (USGS)/Caltech Southern California Seismic and TriNet Network in Southern California was upgraded in response to the 1994 Northridge quake, the U.S. is lagging behind Japan and other countries in developing a fully functional warning system.

“We should not wait until another major quake before improving the early warning system,” said Kuyuk.

Noting California’s recent law that calls for the creation of a statewide earthquake early warning (EEW) system, Kuyuk says “the study is timely and highlights for policymakers where to deploy stations for optimal coverage.” The approach maximizes the warning time and reduces the size of “blind zones” where no warning is possible, while also taking into account budgetary constraints.

Earthquake early warning systems detect the initiation of an earthquake and issue warning alerts of possible forthcoming ground shaking. Seismic stations detect the energy from the compressional P-wave first, followed by the shear and surface waves, which cause the intense shaking and most damage.

The warning time that any system generates depends on many factors, with the most important being the proximity of seismic stations to the earthquake epicenter. Once an alert is sent, the amount of warning time is a function of distance from the epicenter, where more distant locations receive more time.

Areas in “blind zones” do not receive any warning prior to the arrival of the more damaging S-wave. The goal, write Kuyuk and Allen, is to minimize the number of people and the amount of key infrastructure within the blind zone. For more remote earthquakes, such as those offshore or in unpopulated regions, larger blind zones can be tolerated.
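The timing logic described above can be sketched in a few lines. This is an illustration only, not the authors' method: the wave speeds, processing delay, and distances below are assumed round numbers, and real systems use many stations and more sophisticated source estimation.

```python
# Illustrative early-warning timing sketch (assumed values, not from the study).
# P waves outrun the damaging S waves, so the warning time at a site is roughly
# the S-wave arrival time minus the total time needed to detect and alert.

P_VEL_KM_S = 6.0   # assumed average P-wave speed
S_VEL_KM_S = 3.5   # assumed average S-wave speed

def warning_time(site_dist_km, station_dist_km, processing_s=4.0):
    """Seconds of warning at a site, given its distance from the epicenter and
    the distance from the epicenter to the nearest seismic station."""
    alert_time = station_dist_km / P_VEL_KM_S + processing_s  # detect + process
    s_arrival = site_dist_km / S_VEL_KM_S                     # damaging waves arrive
    return s_arrival - alert_time  # negative => site is inside the blind zone

# A site 100 km from the epicenter, nearest station 10 km away:
print(round(warning_time(100, 10), 1))  # 22.9 seconds of warning
# A site 15 km from the epicenter with the same station:
print(round(warning_time(15, 10), 1))   # -1.4 => inside the blind zone
```

The sketch makes the paper's point concrete: moving a station closer to the epicenter shrinks `alert_time`, which both shortens the blind-zone radius and adds warning time everywhere else.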

“There are large blind zones between the Bay Area and Los Angeles where there are active faults,” said Kuyuk. “Why? There are only 10 stations along the 150-mile section of the San Andreas Fault. Adding more stations would improve warning for people in these areas, as well as for people in LA and the Bay Area should an earthquake start somewhere in between.”

Adding stations may not be so simple, according to Allen. “While there is increasing enthusiasm from state and federal legislators to build the earthquake early warning system that the public wants,” said Allen, “the reality of the USGS budget for the earthquake program means that it is becoming impossible to maintain the functionality of the existing network operated by the USGS and the universities.

“The USGS was recently forced to downgrade the telemetry of 58 of the stations in the San Francisco Bay Area in order to reduce costs,” said Allen. “While our SRL paper talks about where additional stations are needed in California to build a warning system, we are unfortunately losing stations.”

In California, the California Integrated Seismic Network (CISN) consists of multiple networks, with some 2,900 seismic stations at varying distances from each other, ranging from 2 to 100 km. Of those stations, 377 are equipped to contribute to an EEW system.

Kuyuk and Allen estimate 10 km is the ideal distance between seismic stations in areas along major faults or near major cities. For other areas, an interstation distance of 20 km would provide sufficient warning. The authors suggest greater density of stations and coverage could be achieved by upgrading technology used by the existing stations, integrating Nevada stations into the current network, relocating some existing stations and adding new ones to the network.

The U.S. Geological Survey (USGS) and the Gordon and Betty Moore Foundation funded this study.

A Low-Cost Solution in Taiwan


In a separate study, Yih-Min Wu of National Taiwan University reports on a successful experiment using low-cost MEMS sensors to build a high-density seismic network to support an early warning system for Taiwan.

MEMS accelerometers are tiny sensors used in common devices such as smartphones and laptops. These sensors are relatively cheap and have proven to be sensitive detectors of ground motion, particularly from large earthquakes.

The current EEW system in Taiwan consists of 109 seismic stations that can provide alerts within 20 seconds following the initial detection of an earthquake. Wu sought to reduce the time between earthquake and initial alert, thereby increasing the potential warning time.

The EEW research group at National Taiwan University developed a P-wave alert device named “Palert” that uses MEMS accelerometers for onsite earthquake early warning, at one-tenth the cost of traditional strong motion instruments.

From June 2012 to May 2013, Wu and his colleagues tested a network of 400 Palert devices deployed throughout Taiwan, primarily at elementary schools, both to take advantage of existing power and Internet connections and so the devices could be used to educate students about earthquake hazard mitigation.

During the testing period, the Palert system performed comparably to the existing EEW system, which consists of conventional strong-motion instruments. With four times as many stations, the Palert network can also provide a detailed shaking map for damage assessments, as it did for the March 2013 magnitude 6.1 Nantou quake.

Wu suggests the relatively low-cost Palert device may have commercial potential and can be readily integrated into existing seismic networks to increase the coverage density of EEW systems. In addition to deployments in China, Indonesia and Mexico, plans call for Palert devices to be installed near New Delhi, India, to test the feasibility of an EEW system there.

Researchers quantify toxic ocean conditions during major extinction 93.9 million years ago

Oxygen in the atmosphere and ocean rose dramatically about 600 million years ago, coinciding with the first proliferation of animal life. Since then, numerous short-lived biotic events – typically marked by significant climatic perturbations – took place when oxygen concentrations in the ocean dipped episodically.

The most studied and extensive of these events occurred 93.9 million years ago. By looking at the chemistry of rocks deposited during that time period, specifically coupled carbon and sulfur isotope data, a research team led by University of California, Riverside biogeochemists reports that oxygen-free and hydrogen sulfide-rich waters extended across roughly five percent of the global ocean during this major climatic perturbation – far more than the modern ocean’s 0.1 percent but much less than previous estimates for this event.

The research suggests that previous estimates of oxygen-free and hydrogen sulfide-rich conditions, or “euxinia,” were too high. Nevertheless, the limited and localized euxinia was still sufficiently widespread to have a dramatic effect on the entire ocean’s chemistry and thus on biological activity.

“These conditions must have impacted nutrient availability in the ocean and ultimately the spatial and temporal distribution of marine life,” said team member Jeremy D. Owens, a former UC Riverside graduate student, who is now a postdoctoral scientist at the Woods Hole Oceanographic Institution. “Under low-oxygen environments, many biologically important metals and other nutrients are removed from seawater and deposited in the sediments on the seafloor, making them less available for life to flourish.”

“What makes this discovery particularly noteworthy is that we mapped out a landscape of bioessential elements in the ocean that was far more perturbed than we expected, and the impacts on life were big,” said Timothy W. Lyons, a professor of biogeochemistry at UCR, Owens’s former advisor and the principal investigator on the research project.

Study results appear online this week in the Proceedings of the National Academy of Sciences.

A major biological extinction in the marine realm has already been documented across the event 93.9 million years ago. Also associated with the event are high levels of carbon dioxide in the atmosphere, which are linked to elevated ocean and atmospheric temperatures. Likely consequences include enhanced global rainfall and weathering of the continents, which further shifted the chemistry of the ocean.

“Our work shows that even though only a small portion of the ocean contained toxic and metal-scavenging hydrogen sulfide, it was sufficiently large so that changes to the ocean’s chemistry and biology were likely profound,” Owens said. “What this says is that only portions of the ocean need to contain sulfide to greatly impact biota.”

For their analysis, the researchers collected seafloor mud samples, now rock, from multiple localities in England and Italy. They then performed chemical extraction on the samples to analyze the sulfur isotope compositions in order to estimate the chemistry of the global ocean.

According to the researchers, the importance of their study is elevated by the large amount of previous work on the same interval and thus the extensive availability of supporting data and samples. Yet despite all this past research, the team was able to make a fundamental discovery about the global conditions in the ancient ocean and their impacts on life.

“Today, we are facing rising carbon dioxide contents in the atmosphere through human activities, and the amount of oxygen in the ocean may drop correspondingly in the face of rising seawater temperatures,” Lyons said. “Oxygen is less soluble in warmer water, and there are already suggestions of such decreases. In the face of these concerns, our findings from the warm, oxygen-poor ancient ocean may be a warning shot about yet another possible perturbation to marine ecology in the future.”

Gold mining ravages Peru

The Carnegie Airborne Observatory flies over the Madre De Dios region of Peru, where vast deforested and polluted areas result from gold mining. – Image courtesy Carnegie Airborne Observatory

For the first time, researchers have been able to map the true extent of gold mining in the biologically diverse region of Madre De Dios in the Peruvian Amazon. The team combined field surveys with airborne mapping and high-resolution satellite monitoring to show that the geographic extent of mining increased 400% from 1999 to 2012 and that the average annual rate of forest loss has tripled since the Great Recession of 2008. Until this study, thousands of small, clandestine mines that have boomed since the economic crisis had gone unmonitored. The research is published in the online early edition of the Proceedings of the National Academy of Sciences the week of October 28, 2013.

The team, led by Carnegie’s Greg Asner in close collaboration with officials from the Peruvian Ministry of Environment, used the Carnegie Landsat Analysis System-lite (CLASlite) to detect and map both large and small mining operations. CLASlite differs from other satellite mapping methods. It uses algorithms to detect changes to the forest in areas as small as 10 square meters, about 100 square feet, allowing scientists to find small-scale disturbances that cannot be detected by traditional satellite methods.

The team corroborated the satellite results with on-ground field surveys and Carnegie Airborne Observatory (CAO) data. The CAO uses Light Detection and Ranging (LiDAR), a technology that sweeps laser light across the vegetation canopy to image it in 3-D. It can determine the location of single standing trees at 3.5 feet (1.1 meter) resolution. This level of detail was used to assess how well CLASlite determined forest conditions in the mining areas. The CAO data were also used to evaluate the accuracy of the CLASlite maps along the edges of large mines, as well as the inaccessible small mines that are set back from roads and rivers to avoid detection. The field and CAO data confirmed up to 94% of the CLASlite mine detections.

Lead author Asner commented: “Our results reveal far more rainforest damage than previously reported by the government, NGOs, or other researchers. In all, we found that the rate of forest loss from gold mining accelerated from 5,350 acres (2,166 hectares) per year before 2008 to 15,180 acres (6,145 hectares) each year after the 2008 global financial crisis that rocketed gold prices.”

In addition to wreaking direct havoc on tropical forests, gold mining releases sediment into rivers, with severe effects on aquatic life. Other recent work has shown that Peru’s gold mining has contributed to widespread mercury pollution affecting the entire food chain, including the food ingested by people throughout the region. Miners also hunt wild game, depleting the rainforest fauna around mining areas and disrupting the ecological balance for centuries to come.

Co-author Ernesto Raez Luna, Senior Advisor to the Minister, Peruvian Ministry of the Environment, remarked: “Obtaining good information on illegal gold mining, to guide sound policy and enforcement decisions, has been particularly difficult so far. Finally, we have very detailed and accurate data that we can turn into government action. We are using this study to warn Peruvians on the terrible impact of illegal mining in one of the most important enclaves of biodiversity in the world, a place that we have vowed, as a nation, to protect for all humanity. Nobody should buy one gram of this jungle gold. The mining must be stopped.”

As of 2012, small illicit mines accounted for more than half of all mining operations in the region. The large mines that were the focus of previous work are heavy polluters, but they are taking on a subordinate role to the thousands of small mines degrading tropical forest throughout the region. This trend highlights the importance of the newer, high-resolution monitoring system for keeping tabs on this growing cause of forest loss.

Asner emphasized: “The gold rush in Madre de Dios, Peru, exceeds the combined effects of all other causes of forest loss in the region, including from logging, ranching and agriculture. This is really important because we’re talking about a global biodiversity hotspot. The region’s incredible flora and fauna is being lost to gold fever.”

Researchers turn to technology to discover a novel way of mapping landscapes

University of Cincinnati researchers are blending technology with tradition as they discover new and improved methods for mapping landscapes. The research is newly published in the journal Applied Geography (Vol. 45, December 2013) by UC authors Jacek Niesterowicz, a doctoral student in the geography department, and Professor Tomasz Stepinski, the Thomas Jefferson Chair of Space Exploration in the McMicken College of Arts and Sciences (A&S).

The researchers say the analysis is the first to use technology from the field of machine vision to build a new map of landscape types – a generalization of the popular land cover/land use map. Whereas land cover/land use pertains to the physical material at, or the utilization of, a local piece of Earth’s surface, a landscape type pertains to a pattern, or mosaic, of different land covers over a larger neighborhood.

Machine vision is a subfield of computer science devoted to analyzing and understanding the content of images. A role of a machine vision algorithm is to “see” and interpret images as close to human vision interpretation as possible. Previous uses of the technology have focused on medicine, industry and government, ranging from robotics to face detection.

The UC research focused on a very large map of land cover/land use, called the National Land Cover Database 2006, developed by the U.S. Geological Survey.

Niesterowicz says he developed and applied machine vision-based algorithms to map landscape types in an area of northern Georgia that he selected because of its diverse patterns of land cover. The result allowed the researchers to discover and differentiate 15 distinctive landscape types, including distinguishing forests dominated by different plant species.
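The article does not give implementation details, but the general idea of mapping landscape types from a land-cover grid can be sketched as follows. This is a hedged illustration, not the UC authors' algorithm: the toy data, tile size, histogram features, and k-means step are all assumptions standing in for their machine-vision approach.

```python
# Hedged sketch: tile a land-cover map, summarize each tile by its mix of
# land-cover classes, and cluster the tiles into "landscape types".
import numpy as np

rng = np.random.default_rng(0)

# Toy land-cover grid with class labels 0..3 (e.g. forest, crop, urban, water).
N_CLASSES, TILE = 4, 8
grid = rng.integers(0, N_CLASSES, size=(64, 64))

# Cut the grid into 8x8 tiles; one row per tile, one column per pixel.
tiles = grid.reshape(8, TILE, 8, TILE).transpose(0, 2, 1, 3).reshape(-1, TILE * TILE)

# Feature per tile: the fraction of each land-cover class (a crude pattern summary).
feats = np.stack([(tiles == c).mean(axis=1) for c in range(N_CLASSES)], axis=1)

def kmeans(X, k, iters=20):
    """Minimal k-means: assign each tile to the nearest of k centers."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

types = kmeans(feats, k=3)       # one landscape-type label per tile
print(types.shape)               # prints (64,)
```

A real system would use richer pattern descriptors than per-tile class fractions (for example, co-occurrence statistics that capture how classes are arranged), which is what lets it separate, say, two forest types with the same composition but different texture.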

“Before now, people would do this mapping by hand, but if you had 10 maps drawn by 10 people, they would all be different,” says Stepinski.

Niesterowicz says the information uncovered by auto-mapping of landscape types would be useful for a number of fields, ranging from geographic research to land management, urban planning and conservation.

“The good thing about this method is that it doesn’t need to be restricted to land cover or other physical variables – it can be applied as well to socio-economic data, such as U.S. Census data, for example,” says Niesterowicz.

“It’s an entirely new way to conduct geographic research,” says Stepinski.

“By leveraging technology developed in the field of computer science, it’s possible to make geography searchable by content. Using this technique, for example, we can quickly discover (using Web-based applications on our website) that farms in Minnesota are on average larger than farms in Ohio, and ask why that is.”

The researchers say future research will involve using the method to identify characteristic landscape types (from waterways to forests to regions influenced by human habitation) over the entire United States.

Stepinski adds that longer-term applications could involve comparisons of landscape types of other countries with those of the United States and to identify characteristic patterns of different geographical entities, such as terrain, or human patterns including socioeconomics and race.

Neutrons, electrons and theory reveal secrets of natural gas reserves

Gas and oil deposits in shale have no place to hide from an Oak Ridge National Laboratory technique that provides an inside look at pores and reveals structural information potentially vital to the nation’s energy needs.

The research by scientists at the Department of Energy laboratory could clear the path to the more efficient extraction of gas and oil from shale, environmentally benign and efficient energy production from coal and perhaps viable carbon dioxide sequestration technologies, according to Yuri Melnichenko, an instrument scientist at ORNL’s High Flux Isotope Reactor.

Melnichenko’s broader work was bolstered by a collaboration with James Morris and Nidia Gallego, lead authors of a paper recently published in the Journal of Materials Chemistry A and members of ORNL’s Materials Science and Technology Division.

In the paper, the researchers describe a small-angle neutron scattering technique that, combined with electron microscopy and theory, can be used to examine how pores of different sizes function.

Using their technique at the General Purpose SANS instrument at the High Flux Isotope Reactor, scientists showed there is significantly higher local structural order than previously believed in nanoporous carbons. This is important because it allows scientists to develop modeling methods based on local structure of carbon atoms. Researchers also probed distribution of adsorbed gas molecules at unprecedented smaller length scales, allowing them to devise models of the pores.

“We have recently developed efficient approaches to predict the effect of pore size on adsorption,” Morris said. “However, these predictions need verification – and the recent small-angle neutron experiments are ideal for this. The experiments also beg for further calculations, so there is much to be done.”

While traditional methods provide general information about adsorption averaged over an entire sample, they do not provide insight into how pores of different sizes contribute to the total adsorption capacity of a material. Unlike absorption, a process involving the uptake of a gas or liquid in some bulk porous material, adsorption involves the adhesion of atoms, ions or molecules to a surface.

This research, in conjunction with previous work, allows scientists to analyze two-dimensional images to understand how local structures can affect the accessibility of shale pores to natural gas.

“Combined with atomic-level calculations, we demonstrated that local defects in the porous structure observed by microscopy provide stronger gas binding and facilitate its condensation into liquid in pores of optimal sub-nanometer size,” Melnichenko said. “Our method provides a reliable tool for probing properties of sub- and super-critical fluids in natural and engineered porous materials with different structural properties.

“This is a crucial step toward predicting and designing materials with enhanced gas adsorption properties.”

Together, the application of neutron scattering, electron microscopy and theory can lead to new design concepts for building novel nanoporous materials with properties tailored for the environment and energy storage-related technologies. These include capture and sequestration of man-made greenhouse gases, hydrogen storage, membrane gas separation, environmental remediation and catalysis.