Study shows tectonic plates not rigid, deform horizontally in cooling process

Corné Kreemer, associate professor in the College of Science at the University of Nevada, Reno, conducts research on plate tectonics and geodetics. His latest research shows that oceanic tectonic plates deform due to cooling, causing shortening of the plates and mid-plate seismicity. – Photo by Mike Wolterbeek, University of Nevada, Reno.

The puzzle pieces of tectonic plates that make up the outer layer of the earth are not rigid and don’t fit together as nicely as we were taught in high school.

A study published in the journal Geology by Corné Kreemer, an associate professor at the University of Nevada, Reno, and his colleague Richard Gordon of Rice University, quantifies deformation of the Pacific plate and challenges the central approximation of the plate tectonic paradigm that plates are rigid.

Using large-scale numerical modeling as well as GPS velocities from the largest GPS data-processing center in the world – the Nevada Geodetic Laboratory at the University of Nevada, Reno – Kreemer and Gordon have shown that cooling of the lithosphere, the outermost layer of Earth, makes some sections of the Pacific plate contract horizontally at faster rates than other sections. This causes the plate to deform.

Gordon’s idea was that the cooling of the plate, which makes the ocean deeper, also affects horizontal movement, causing the plates to shorten and deform. Partnering with Kreemer, the two combined their ideas and expertise to show that this deformation could explain why some parts of the plate tectonic puzzle didn’t fall neatly into place in recent plate motion models, which are based on spreading rates along mid-oceanic ridges. Kreemer and Gordon also showed that there is a positive correlation between where the plate is predicted to deform and where intraplate earthquakes occur. Their work was supported by the National Science Foundation.

Results of the study suggest that plate-scale horizontal thermal contraction is significant, and that it may be partly released seismically. The two researchers are, as the saying goes, rewriting the textbooks.

“This is plate tectonics 2.0; it revolutionizes the concepts of plate rigidity,” Kreemer, who teaches in the University’s College of Science, said. “We have shown that the Pacific plate deforms, that it is pliable. We are refining the plate tectonic theory and have come up with an explanation for mid-plate seismicity.”

The oceanic plates are shortening due to cooling, which causes relative motion inside the plate, Kreemer said. The oceanic crust of the Pacific plate offshore California is moving 2 mm to the south every year relative to the Pacific/Antarctic plate boundary.

“It may not sound like much, but it is significant considering that we can measure crustal motion with GPS within a fraction of a millimeter per year,” he said. “Unfortunately, all existing GPS stations on Pacific islands are in the old part of the plate that is neither expected nor shown to deform. New measurements will be needed within the young parts of the plate to confirm this study’s predictions, either on very remote islands or through sensors on the ocean floor.”
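As a back-of-envelope illustration of what such a signal looks like in the data, the minimal sketch below (Python, with invented station velocities, not actual Nevada Geodetic Laboratory output) differences the horizontal velocities of two stations on the same plate; after rigid-plate motion is removed, any residual is internal deformation:

```python
import numpy as np

# Hypothetical horizontal velocities (east, north) in mm/yr for two GPS
# stations on the Pacific plate; the numbers are invented for illustration.
v_station_a = np.array([-38.2, 27.5])   # station in the old part of the plate
v_station_b = np.array([-38.2, 25.5])   # station in the young, contracting part

# After removing rigid-plate rotation, any residual relative velocity between
# stations on the same plate is a signature of internal deformation.
v_rel = v_station_b - v_station_a
print(f"relative motion: {np.linalg.norm(v_rel):.2f} mm/yr")  # ~2.00 mm/yr
```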

This work is complementary to Kreemer’s ongoing effort to quantify the deformation in all of the Earth’s plate boundary zones with GPS velocities – data that are in large part processed in the Nevada Geodetic Laboratory. The main goal of the global modeling is to convert the strain rates into earthquake forecast maps.

“Because we don’t have GPS data in the right places of the Pacific plate, our prediction of how that plate deforms can supplement the strain rates I’ve estimated in parts of the world where we can quantify them with GPS data,” Kreemer said. “Ultimately, we hope to have a good estimate of strain rates everywhere so that the models not only forecast earthquakes for places like Reno and San Francisco, but also for places where you may expect them the least.”
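A common bridge from strain-rate maps to earthquake forecasts is a Kostrov-style conversion of strain rate to seismic moment rate. The sketch below shows that relation with assumed values for shear modulus, seismogenic thickness and cell area; the article does not specify the formulation Kreemer actually uses:

```python
# Kostrov-style conversion of a geodetic strain rate to a seismic moment rate.
# All values are assumptions chosen for illustration.
mu = 3.0e10           # shear modulus, Pa (typical crustal value)
H = 15e3              # assumed seismogenic thickness, m
A = 100e3 * 100e3     # area of one model cell, m^2 (100 km x 100 km)
strain_rate = 1e-9    # principal strain rate, 1/yr (illustrative intraplate value)

moment_rate = 2 * mu * H * A * strain_rate   # N*m per year if all strain
print(f"moment rate: {moment_rate:.2e} N*m/yr")  # is released seismically
```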

Felling pine trees to study their wind resistance

Forestry experts from the French Institute for Agricultural Research (INRA), together with technicians from NEIKER-Tecnalia and the Chartered Provincial Council of Bizkaia, felled radiata pine specimens of different ages in order to determine their resistance to gales and observe the force the wind needs to exert to blow down these trees under the particular conditions of the Basque Country.

This experiment is of great interest to forest managers and will help them manage their woodlands better, incorporating the wind variable into decisions such as the distribution of plantations or the best moment for felling the trees.

Forestry-sector professionals, including timber growers, foresters, forestry technicians and researchers, gathered to witness the simulation at close quarters. The trees were felled with steel cables standing in for the wind force, fitted with sensors to measure the force needed to bring the trees down. Each radiata pine had been fitted with three tiltmeters that recorded the degree of tilt as a function of the force exerted on the tree. That made it possible to determine the resistance of the roots and the strength of the trunk, two parameters essential for establishing the capacity of a tree to withstand the thrust of the wind.
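As a rough illustration of how such a pulling test is reduced to numbers, the sketch below computes the overturning moment at the stem base from an assumed cable tension, attachment height and cable angle; the values and the simplifications (stem lean and cable stretch ignored) are ours, not INRA’s or NEIKER-Tecnalia’s:

```python
import math

# Simplified overturning moment at the stem base during a winching test.
# Assumed setup: load cell on a cable attached at height h, pulled at angle
# theta below horizontal. Values are illustrative.
F = 25_000.0                # measured cable tension, N
h = 8.0                     # attachment height on the stem, m
theta = math.radians(20.0)  # cable angle from horizontal

# Only the horizontal component of the pull rotates the tree about its base
# (stem lean and cable geometry corrections are ignored here).
moment = F * math.cos(theta) * h
print(f"base overturning moment: {moment / 1000:.1f} kN*m")  # ~187.9 kN*m
```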

The experiment, carried out this morning, was part of the seminar ‘FORRISK: Wind damage risk in forests’, which took place in the Bizkaia Aretoa in Bilbao and was organised by NEIKER-Tecnalia in collaboration with the Chartered Provincial Council of Bizkaia, HAZI and the Atlantic Regional Office of EFI (European Forest Institute). The seminar is part of the European project “FORRISK: Network for innovation in silviculture and integrated systems for forest risk management”. This initiative has been co-funded by the ERDF and by the Sub-Ministry for Agriculture, Fisheries and Food Policy of the Government of the Basque Autonomous Community (region). The seminar took place in Bilbao because of the city’s status as European Forest City 2014.

The seminar was used to present the detailed map of the characteristics of the wind in the Basque Country, which timber growers and forestry managers can now avail themselves of. The map has been produced by researchers at INRA, the French Institute for Agricultural Research, who have used information from the 57 meteorological stations equipped with anemometers in the network of the Basque Meteorological Authority, Euskalmet.

A tool for estimating wind damage

Those attending the seminar also had the chance to get to know the ForestGALES computing tool that allows managers to estimate the probability of wind damage in forests. ForestGALES was originally created for Britain and has been adapted to the characteristics of the Basque geography by INRA, NEIKER-Tecnalia and HAZI technicians. This innovative application is of great use in specifying concrete actions (for example: spacing, silvicultural interventions like clearing or thinning) bearing in mind the probability of wind damage on each plot.

To get the most out of this tool, it is necessary to know the resistance of the roots and the strength of the trunks of the relevant species, as well as the characteristics of the wind where the trees are growing. So today’s simulation and the Basque wind map are two fundamental components for developing the ForestGALES model.

Increase in extreme winds owing to climate change

Cyclones like Klaus (2009) and Xynthia (2010) brought down over 200,000 cubic metres of timber as they passed through the Basque Country, owing to gusts of wind in excess of 228 kilometres per hour. Predictions indicate that the frequency of extreme phenomena like these is set to increase owing to climate change, so the forestry sector needs information and tools that will enable it to tackle the risks posed by the wind.

The breathing sand

An Eddy Correlation Lander analyzes the strength of the oxygen fluxes at the bottom of the North Sea. – Photo: ROV-Team, GEOMAR

A desert at the bottom of the sea? Although the waters of the North Sea are exchanged about every two to three years, there is evidence of decreasing oxygen content. If lower amounts of this gas are dissolved in seawater, organisms on and in the seabed produce less energy – with implications for larger creatures and for biogeochemical cycling in the marine ecosystem. Since nutrients, carbon and oxygen circulate very well and are processed quickly in the permeable, sandy sediments that make up two-thirds of the North Sea, measurements of metabolic rates are especially difficult here. Using the new Aquatic Eddy Correlation technique, scientists from GEOMAR Helmholtz Centre for Ocean Research Kiel, the Leibniz Institute of Freshwater Ecology and Inland Fisheries, the University of Southern Denmark, the University of Koblenz-Landau, the Scottish Marine Institute and Aarhus University were able to demonstrate how oxygen flows at the bottom of the North Sea. Their methods and results are presented in the Journal of Geophysical Research: Oceans.

“The so-called ‘Eddy Correlation’ technique detects the flow of oxygen through these small turbulences over an area of several square meters. It considers both the mixing of sediments by organisms living in them and the hydrodynamics of the water above the rough sea floor,” Dr. Peter Linke, a marine biologist at GEOMAR, explains. “Previous methods covered only short periods or disregarded important parameters. Now we can create a more realistic picture.” The new method also takes into account the fact that even small objects such as shells or ripples shaped by wave action or currents can affect the oxygen exchange in permeable sediments.
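The core of the eddy correlation calculation is a covariance: the vertical oxygen flux is the time-averaged product of the fluctuations of vertical velocity and oxygen concentration. Here is a minimal sketch with synthetic numbers; real processing adds despiking, coordinate rotation and time-shift corrections:

```python
import numpy as np

# Minimal eddy-correlation flux estimate from synthetic time series: the flux
# is the time-averaged product of the fluctuations of vertical velocity (w')
# and oxygen concentration (C').
rng = np.random.default_rng(0)
n = 64 * 60 * 15                          # 15 minutes sampled at 64 Hz
w = rng.normal(0.0, 0.01, n)              # vertical velocity, m/s
o2 = 250.0 + 50.0 * w + rng.normal(0.0, 0.05, n)  # O2, mmol/m^3, tied to w

w_prime = w - w.mean()                    # fluctuations about the means
c_prime = o2 - o2.mean()
flux = np.mean(w_prime * c_prime)         # mmol O2 m^-2 s^-1
print(f"O2 flux: {flux * 86400:.0f} mmol m^-2 d^-1")  # positive = upward
```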

On the expedition CE0913 with the Irish research vessel CELTIC EXPLORER, scientists used the underwater robot ROV KIEL 6000 to place three different instruments in the Norwegian “Tommeliten” area: two “Eddy Correlation Landers” recorded the strength of oxygen fluxes over three tidal cycles. Information about the distribution of oxygen in the sediment was collected with a “Profiler Lander”, a seafloor observatory with oxygen sensors and flow meters. A “benthic chamber” isolated 314 square centimetres of sediment and took samples from the overlying water over a period of 24 hours to determine the oxygen consumption of the sediment.

“The combination of traditional tools with the ‘Eddy Correlation’ technique has given us new insights into the dynamics of the exchange of substances between the sea water and the underlying sediment. A variety of factors determine the timing and amount of oxygen available. Currents that provide the sandy sediment with oxygen, but also the small-scale morphology of the seafloor, ensure that small benthic organisms are able to process carbon or other nutrients. The dependencies are so complex that they can be deciphered only by using special methods”, Dr. Linke summarizes. Therefore, detailed measurements in the water column and at the boundary to the seafloor, as well as model calculations, are absolutely necessary to understand basic functions and better estimate future changes in the cycle of materials. “With conventional methods, for example, we would never have been able to find that the loose sandy sediment stores oxygen brought in by the currents for periods of less water movement and less oxygen introduction.”

Original publication:
McGinnis, D. F., S. Sommer, A. Lorke, R. N. Glud, P. Linke (2014): Quantifying tidally driven benthic oxygen exchange across permeable sediments: An aquatic eddy correlation study. Journal of Geophysical Research: Oceans, doi:10.1002/2014JC010303.


Researcher receives $1.2 million to create real-time seismic imaging system

Dr. WenZhan Song. – Georgia State University

Dr. WenZhan Song, a professor in the Department of Computer Science at Georgia State University, has received a four-year, $1.2 million grant from the National Science Foundation to create a real-time seismic imaging system using ambient noise.

This imaging system for shallow earth structures could be used to study and monitor the sustainability of the subsurface, or area below the surface, and potential hazards of geological structures. Song and his collaborators, Yao Xie of the Georgia Institute of Technology and Fan-Chi Lin of the University of Utah, will use ambient noise to image the subsurface of geysers in Yellowstone National Park.

“This project is basically imaging what’s underground in a situation where there’s no active source, like an earthquake. We’re using background noise,” Song said. “At Yellowstone, for instance, people visit there and cars drive by. All that could generate signals that are penetrating through the ground. We essentially use that type of information to tap into a very weak signal to infer the image of underground. This is very frontier technology today.”

The system will be made up of a large network of wireless sensors that can perform in-network computing of 3-D images of the shallow earth structure based solely on ambient noise.
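The principle underlying ambient-noise imaging can be shown in a few lines: cross-correlating long noise records from two sensors yields a peak at the inter-station travel time, from which wave speeds, and ultimately structure, are estimated. The sketch below uses synthetic data and is a textbook illustration, not Song’s in-network algorithm:

```python
import numpy as np

# Synthetic demonstration: two sensors record the same ambient noise, one with
# a 0.8 s propagation delay; their cross-correlation peaks at that delay.
fs = 100.0                               # sampling rate, Hz
n = int(fs * 600)                        # ten minutes of noise
rng = np.random.default_rng(1)
src = rng.normal(size=n)                 # shared ambient-noise wavefield
delay = int(0.8 * fs)
sta1 = src + 0.1 * rng.normal(size=n)
sta2 = np.roll(src, delay) + 0.1 * rng.normal(size=n)

# FFT-based cross-correlation (fast enough for long records).
spec = np.fft.rfft(sta2, 2 * n) * np.conj(np.fft.rfft(sta1, 2 * n))
xcorr = np.fft.irfft(spec)
lags = np.arange(2 * n)
lags[n:] -= 2 * n                        # map the upper half to negative lags
print(f"estimated travel time: {lags[np.argmax(xcorr)] / fs:.2f} s")  # ~0.80
```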

Real-time ambient noise seismic imaging technology could also inform homeowners if the subsurface below their home, which can change over time, is stable or will sink beneath them.

This technology can also be used in circumstances that don’t need to rely on ambient noise but have an active source that produces signals that can be detected by wireless sensors. It could be used for real-time monitoring and developing early warning systems for natural hazards, such as volcanoes, by determining how close magma is to the surface. It could also benefit oil exploration, which uses methods such as hydrofracturing, in which high-pressure water breaks rocks and allows natural gas to flow more freely from underground.

“As they do that, it’s critical to monitor that in real time so you can know what’s going on under the ground and not cause damage,” Song said. “It’s a very promising technology, and we’re helping this industry reduce costs significantly because previously they only knew what was going on under the subsurface many days and even months later. We could reduce this to seconds.”

Until now, data from oil exploration instruments had to be manually retrieved and uploaded into a centralized database, and it could take days or months to process and analyze the data.

The research team plans to have a field demonstration of the system in Yellowstone and image the subsurface of some of the park’s geysers. The results will be shared with Yellowstone management, rangers and staff. Yellowstone, a popular tourist attraction, is a big volcano that has been dormant for a long time, but scientists are concerned it could one day pose potential hazards.

In the past several years, Song has been developing a Real-time In-situ Seismic Imaging (RISI) system using active sources, with the support of another $1.8 million NSF grant. His lab has built a RISI system prototype that is ready for deployment. The RISI system can serve as a general field instrumentation platform for various geophysical imaging applications and can incorporate new geophysical data processing and imaging algorithms.

The RISI system can be applied to a wide range of geophysical exploration topics, such as hydrothermal circulation, oil exploration, mining safety and mining resource monitoring, to monitor the uncertainty inherent in the exploration and production process, reduce operating costs and mitigate environmental risks. The business and social impact is broad and significant. Song is seeking business investors and partners to commercialize this technology.


For more information about the project, visit http://sensorweb.cs.gsu.edu/?q=ANSI.

Predicting landslides with light

Optical fiber sensors are used around the world to monitor the condition of difficult-to-access segments of infrastructure – such as the underbellies of bridges, the exterior walls of tunnels, the feet of dams, and long pipelines and railways in remote rural areas.

Now, a team of researchers in Italy is expanding the reach of optical fiber sensors “to the hills” by embedding them in shallow trenches within slopes to detect and monitor both large landslides and slow slope movements. The team will present its research at The Optical Society’s (OSA) 98th Annual Meeting, Frontiers in Optics, being held Oct. 19-23 in Tucson, Arizona, USA.

As major disasters around the world this year have shown, landslides can be stark examples of nature at her most unforgiving. Within seconds, a major landslide can completely erase houses and structures that have stood for years, and the catastrophic toll it inflicts on communities is felt not just in the destructive loss of property but in the devastating loss of life. The 1999 Vargas tragedy in Venezuela, for instance, killed tens of thousands of people and erased whole towns from the map without warning.

The motivation for an early warning technology, like the one the Italian team has devised, is to find a way to mitigate such losses – just as hurricane tracking can prompt coastal evacuations and save lives.

Predicting Landslides by Detecting Land Strains


Landslides are failures of a rock or soil mass, and they are always preceded by various types of “pre-failure” strains – known technically as elastic, plastic and viscous volumetric and shear strains. While the magnitude of these pre-failure strains depends on the rock or soil involved – ranging from fractured rock debris and pyroclastic flows to fine-grained soils – they are measurable. This new technology can detect small shifts in soil slopes, and thus the onset of landslides. Traditionally, electrical sensors have been used for monitoring landslides, but these sensors are easily damaged. Optical fiber sensors are more robust, economical and sensitive, and this is where the new technology could make a difference.
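As a rough sketch of how a distributed strain profile might be screened for pre-failure deformation, the following flags fiber segments whose tensile strain exceeds an alert level; the sampling, noise level and threshold are hypothetical, not the team’s published criteria:

```python
import numpy as np

# Screening a distributed strain profile for localized soil movement.
# Sampling interval, noise level and the alert threshold are all assumptions.
rng = np.random.default_rng(2)
positions = np.arange(0.0, 2000.0, 1.0)          # metre spacing along the fiber
strain = rng.normal(20.0, 5.0, positions.size)   # background, microstrain
strain[750:780] += 120.0                         # synthetic slope movement

THRESHOLD = 100.0                                # alert level, microstrain
hits = positions[strain > THRESHOLD]
if hits.size:
    print(f"possible movement between {hits.min():.0f} m and {hits.max():.0f} m")
```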

“Distributed optical fiber sensors can act as a ‘nervous system’ of slopes by measuring the tensile strain of the soil they’re embedded within,” explained Professor Luigi Zeni, who is in the Department of Industrial & Information Engineering at the Second University of Naples.

Taking it a step further, Zeni and his colleagues worked out a way of combining several types of optical fiber sensors into a plastic tube that twists and moves under the forces of pre-failure strains. Researchers are then able to monitor the movement and bending of the optical fiber remotely to determine if a landslide is imminent.

The use of novel fiber optic sensors “allows us to overcome some limitations of traditional inclinometers, because fiber-based ones have no moving parts and can withstand larger soil deformations,” Zeni said. “These sensors can be used to cover very large areas – several square kilometers – and interrogated in a time-continuous way to pinpoint any critical zones.”

The findings clearly demonstrate the potential of distributed optical fiber sensors as an entirely new tool to monitor areas subject to landslide risk, Zeni said, and to develop early warning systems based on geo-indicators – early deformations – of slope failures.

New view of Rainier’s volcanic plumbing

This image was made by measuring how the ground conducts or resists electricity in a study co-authored by geophysicist Phil Wannamaker of the University of Utah Energy & Geoscience Institute. It shows the underground plumbing system that provides molten and partly molten rock to the magma chamber beneath the Mount Rainier volcano in Washington state. The scale at left is depth in miles. The scale at bottom is miles from the Pacific Coast. The Juan de Fuca plate of Earth’s Pacific seafloor crust and upper mantle is shown in blue on the left half of the image as it dives or ‘subducts’ eastward beneath Washington state. The reddish orange and yellow colors represent molten and partly molten rock forming atop the Juan de Fuca plate or ‘slab.’ The image shows the rock begins to melt about 50 miles beneath Mount Rainier (the red triangle at top). Some is pulled downward and eastward as the slab keeps diving, but other melts move upward to the orange magma chamber shown under but west of Mount Rainier. The line of sensors used to make this image was placed north of the 14,410-foot peak, so the image may be showing a lobe of the magma chamber that extends northwest of the mountain. Red ovals on the left half of the page are the hypocenters of earthquakes. – R Shane McGary, Woods Hole Oceanographic Institution.

By measuring how fast Earth conducts electricity and seismic waves, a University of Utah researcher and colleagues made a detailed picture of Mount Rainier’s deep volcanic plumbing and partly molten rock that will erupt again someday.

“This is the most direct image yet capturing the melting process that feeds magma into a crustal reservoir that eventually is tapped for eruptions,” says geophysicist Phil Wannamaker, of the university’s Energy & Geoscience Institute and Department of Civil and Environmental Engineering. “But it does not provide any information on the timing of future eruptions from Mount Rainier or other Cascade Range volcanoes.”

The study was published today in the journal Nature by Wannamaker and geophysicists from the Woods Hole Oceanographic Institution in Massachusetts, the College of New Jersey and the University of Bergen, Norway.

In an odd twist, the image appears to show that at least part of Mount Rainier’s partly molten magma reservoir is located about 6 to 10 miles northwest of the 14,410-foot volcano, which is 30 to 45 miles southeast of the Seattle-Tacoma area.

But that could be because the 80 electrical sensors used for the experiment were placed in a 190-mile-long, west-to-east line about 12 miles north of Rainier. So the main part of the magma chamber could be directly under the peak, but with a lobe extending northwest under the line of detectors, Wannamaker says.

The top of the magma reservoir in the image is 5 miles underground and “appears to be 5 to 10 miles thick, and 5 to 10 miles wide in east-west extent,” he says. “We can’t really describe the north-south extent because it’s a slice view.”

Wannamaker estimates the reservoir is roughly 30 percent molten. Magma chambers are like a sponge of hot, soft rock containing pockets of molten rock.

The new image doesn’t reveal the plumbing tying Mount Rainier to the magma chamber 5 miles below it. Instead, it shows where water and molten and partly molten rock are generated 50 miles underground, where one of Earth’s seafloor crustal plates or slabs is “subducting”, or diving, eastward and downward beneath the North American plate, and how and where those melts rise to Rainier’s magma chamber.

The study was funded largely by the National Science Foundation’s EarthScope program, which also has made underground images of the United States using seismic or sound-wave tomography, much as CT scans show the body’s interior using X-rays.

The new study used both seismic imaging and magnetotelluric measurements, which make images by showing how electrical and magnetic fields in the ground vary due to differences in how much underground rock and fluids conduct or resist electricity.
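At the heart of the magnetotelluric method is the textbook relation between the measured impedance Z = E/H and apparent resistivity, ρa = |Z|²/(ωμ0); low apparent resistivity flags conductive zones such as fluids and melt. A sketch with an illustrative impedance (the study’s actual 2-D inversion is far more involved):

```python
import numpy as np

# Apparent resistivity and phase from a magnetotelluric impedance Z = E/H.
# The impedance value and period are illustrative.
mu0 = 4e-7 * np.pi                # vacuum permeability, H/m
period = 100.0                    # s; longer periods sense greater depths
omega = 2.0 * np.pi / period
Z = (5.0 + 5.0j) * 1e-3           # impedance, ohm (i.e., (V/m)/(A/m))

rho_a = abs(Z) ** 2 / (omega * mu0)       # apparent resistivity, ohm-m
phase = np.degrees(np.angle(Z))           # impedance phase, degrees
print(f"rho_a = {rho_a:.0f} ohm-m, phase = {phase:.0f} deg")  # ~633 ohm-m, 45 deg
```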

Wannamaker says it is the most detailed cross-section view yet under a Cascades volcanic system using electrical and seismic imaging. Earlier seismic images indicated water and partly molten rock atop the diving slab. The new image shows melting “from the surface of the slab to the upper crust, where partly molten magma accumulates before erupting,” he adds.

Wannamaker and Rob L. Evans, of the Woods Hole Oceanographic Institution, conceived the study. First author R Shane McGary – then at Woods Hole and now at the College of New Jersey – did the data analysis. Other co-authors were Jimmy Elsenbeck of Woods Hole and Stéphane Rondenay of the University of Bergen.

Mount Rainier: Hazardous Backdrop to Metropolitan Seattle-Tacoma

Mount Rainier, the tallest peak in the Cascades, “is an active volcano that will erupt again,” says the U.S. Geological Survey. Rainier sits atop volcanic flows up to 36 million years old. An ancestral Rainier existed 2 million to 1 million years ago. Frequent eruptions built the mountain’s modern edifice during the past 500,000 years. During the past 11,000 years, Rainier erupted explosively dozens of times, spewing ash and pumice.

Rainier once was taller until it collapsed during an eruption 5,600 years ago to form a large crater open to the northeast, much like the crater formed by Mount St. Helens’ 1980 eruption. The 5,600-year-old eruption sent a huge mudflow west to Puget Sound, covering parts or all of the present sites of the Port of Tacoma, Seattle suburbs Kent and Auburn, and the towns Puyallup, Orting, Buckley, Sumner and Enumclaw.

Rainier’s last lava flows were 2,200 years ago, the last flows of hot rock and ash were 1,100 years ago, and the last big mudflow was 500 years ago. There are disputed reports of steam eruptions in the 1800s.

Subduction Made Simple – and a Peek beneath a Peak

The “ring of fire” is a zone of active volcanoes and frequent earthquake activity surrounding the Pacific Ocean. It exists where Earth’s tectonic plates collide – specifically, plates that make up the seafloor converge with plates that carry continents.

From Cape Mendocino in northern California and north past Oregon, Washington state and into British Columbia, an oceanic plate is being pushed eastward and downward – a process called subduction – beneath the North American plate. This relatively small Juan de Fuca plate is located between the huge Pacific plate and the Pacific Northwest.

New seafloor rock – rich with water in cracks and minerals – emerges from an undersea volcanic ridge some 250 miles off the coast, from northern California into British Columbia. That seafloor adds to the western edge of the Juan de Fuca plate and pushes it east-northeast under the Pacific Northwest, as far as Idaho.

The part of the plate diving eastward and downward is called the slab, which ranges from 30 to 60 miles thick as it is jammed under the North American plate. The part of the North American plate above the diving slab is shaped like a wedge.

When the leading, eastern edge of the diving slab descends deep enough, where pressures and temperatures are high, water-bearing minerals such as chlorite and amphibole release water from the slab, and the slab and surrounding mantle rock begin to melt. That is why the Cascade Range of active volcanoes extends north-to-south – above the slab and parallel but about 120 miles inland from the coast – from British Columbia south to Mount Shasta and Lassen Peak in northern California.

In the new image, yellow-orange-red areas correspond to higher electrical conductivity (or lower resistivity) in places where fluids and melts are located.

The underground image produced by the new study shows where water and molten rock accumulate atop the descending slab, and the route they take to the magma chamber that feeds eruptions of Mount Rainier:

– The rock begins to melt atop the slab about 50 miles beneath Mount Rainier. Wannamaker says it is best described as partly molten rock that contains about 2 percent water and “is a mush of crystals within an interlacing network of molten rock.”

– Some water and partly molten rock actually gets dragged downward atop the descending slab, to depths of 70 miles or more.

– Other partly molten rock rises up through the upper mantle wedge, crosses into the crust at a depth of about 25 miles, and then rises into Rainier’s magma chamber – or at least the lobe of the chamber that crosses under the line of sensors used in the study. Evidence suggests the magma moves upward at least 0.4 inches per year.

– The new magnetotelluric image also shows a shallower zone of fluid perhaps 60 miles west of Rainier and 25 miles deep at the crust-mantle boundary. Wannamaker says it is largely water released from minerals as the slab is squeezed and heated as it dives.

The seismic data were collected during 2008-2009 for other studies. The magnetotelluric data were gathered during 2009-2010 by authors of the new study.

Wannamaker and colleagues placed an east-west line of magnetotelluric sensors: 60 that made one-day measurements and looked as deep as 30 miles into the Earth, and 20 that made measurements for a month and looked at even greater depths.

Ground-improvement methods might protect against earthquakes

Researchers from the University of Texas at Austin’s Cockrell School of Engineering are developing ground-improvement methods to help increase the resilience of homes and low-rise structures built on top of soils prone to liquefaction during strong earthquakes.

Findings will help improve the safety of structures in Christchurch and the Canterbury region in New Zealand, which were devastated in 2010 and 2011 by a series of powerful earthquakes. Parts of Christchurch were severely affected by liquefaction, in which water-saturated soil temporarily becomes liquid-like and often flows to the surface creating sand boils.

“The 2010-2011 Canterbury earthquakes in New Zealand have caused significant damage to many residential houses due to varying degrees of soil liquefaction over a wide extent of urban areas unseen in past destructive earthquakes,” said Kenneth Stokoe, a professor in the Department of Civil, Architectural and Environmental Engineering. “One critical problem facing the rebuilding effort is that the land remains at risk of liquefaction in future earthquakes. Therefore, effective engineering solutions must be developed to increase the resilience of homes and low-rise structures.”

Researchers have conducted a series of field trials to test shallow-ground-improvement methods.

“The purpose of the field trials was to determine whether, and which, improvement methods achieve the objective of inhibiting liquefaction triggering in the improved ground and are cost-effective,” said Stokoe, working with Brady Cox, an assistant professor of civil engineering. “This knowledge is needed to develop foundation design solutions.”

Findings were detailed in a research paper presented in December at the New Zealand-Japan Workshop on Soil Liquefaction during Recent Large-Scale Earthquakes. The paper was authored by Stokoe; graduate students Julia Roberts and Sungmoon Hwang; Cox and operations manager Farn-Yuh Menq of the University of Texas at Austin; and Sjoerd Van Ballegooy of Tonkin & Taylor Ltd, an international environmental and engineering consulting firm in Auckland, New Zealand.

The researchers collected data from test sections of improved and unimproved soils that were subjected to earthquake stresses using a large mobile shaker, called T-Rex, and with explosive charges planted underground. The test sections were equipped with sensors to monitor key factors including ground motion and water pressure generated in soil pores during the induced shaking, providing preliminary data to determine the most effective ground-improvement method.
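A standard index derived from such pore-pressure records is the excess pore-pressure ratio, r_u, which approaches 1 as the soil liquefies. A minimal sketch with hypothetical readings:

```python
# Excess pore-pressure ratio r_u, a standard liquefaction-triggering index:
# r_u near 1 means pore water carries the load and the soil behaves as a
# liquid. All readings below are hypothetical.
u_initial = 40.0        # kPa, pore pressure before shaking
u_peak = 95.0           # kPa, peak pore pressure during induced shaking
sigma_v0 = 60.0         # kPa, initial vertical effective stress

r_u = (u_peak - u_initial) / sigma_v0
flag = "liquefaction likely triggered" if r_u >= 0.9 else "no triggering"
print(f"r_u = {r_u:.2f}: {flag}")   # r_u = 0.92
```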

Four ground-improvement methods were initially selected for the testing: rapid impact compaction (RIC); rammed aggregate piers (RAP), which consist of gravel columns; low-mobility grouting (LMG); and construction of a single row of horizontal beams (SRB) or a double row of horizontal beams (DRB) beneath existing residential structures via soil-cement mixing.

“The results are being analyzed, but good and poor performance can already be differentiated,” Stokoe said. “The ground-improvement methods that inhibited liquefaction triggering the most were RIC, RAP, and DRB. However, additional analyses are still underway.”

The test site is located along the Avon River in the Christchurch suburb of Bexley. The work is part of a larger testing program that began in early 2013 with a preliminary evaluation by Brady Cox of seven potential test sites along the Avon River in the Christchurch area.

Enormous aquifer discovered under Greenland ice sheet

Glaciologist Lora Koenig (left) operates a video recorder that has been lowered into the bore hole to observe the ice structure of the aquifer in April 2013. – University of Utah/Clément Miège

Buried underneath compacted snow and ice in Greenland lies a large liquid water reservoir that has now been mapped by researchers using data from NASA’s Operation IceBridge airborne campaign.

A team of glaciologists serendipitously found the aquifer while drilling in southeast Greenland in 2011 to study snow accumulation. Two of their ice cores were dripping water when the scientists lifted them to the surface, despite air temperatures of minus 4 F (minus 20 C). The researchers later used NASA’s Operation IceBridge radar data to define the limits of the water reservoir, which spreads over 27,000 square miles (69,930 square km) – an area larger than the state of West Virginia. The water in the aquifer has the potential to raise global sea level by 0.016 inches (0.4 mm).

“When I heard about the aquifer, I had almost the same reaction as when we discovered Lake Vostok [in Antarctica]: it blew my mind that something like that is possible,” said Michael Studinger, project scientist for Operation IceBridge, a NASA airborne campaign studying changes in ice at the poles. “It turned my view of the Greenland ice sheet upside down – I don’t think anyone had expected that this layer of liquid water could survive the cold winter temperatures without being refrozen.”

Southeast Greenland is a region of high snow accumulation. Researchers now believe that the thick snow cover insulates the aquifer from cold winter surface temperatures, allowing it to remain liquid throughout the year. The aquifer is fed by meltwater that percolates from the surface during the summer.

The new research is being presented in two papers: one led by the University of Utah’s Rick Forster, published on Dec. 22 in the journal Nature Geoscience, and one led by NASA’s Lora Koenig, accepted for publication in the journal Geophysical Research Letters. The findings will significantly advance understanding of how meltwater flows through the ice sheet and contributes to sea level rise.

When a team led by Forster accidentally drilled into water in 2011, they weren’t able to continue studying the aquifer because their tools were not suited to work in an aquatic environment. Afterward, Forster’s team determined the extent of the aquifer by studying radar data from Operation IceBridge together with ground-based radar data. The top of the water layer showed up clearly in the radar data as a return signal brighter than the ice layers.

Koenig, a glaciologist with NASA’s Goddard Space Flight Center in Greenbelt, Md., co-led another expedition to southeast Greenland with Forster in April 2013 specifically designed to study the physical characteristics of the newly discovered water reservoir. Koenig’s team extracted two cores of firn (aged snow) that were saturated with water. They used a water-resistant thermoelectric drill to study the density of the ice and lowered strings packed with temperature sensors down the holes, and found that the temperature of the aquifer hovers around 32 F (zero C), warmer than they had expected it to be.

Koenig and her team measured the top of the aquifer at around 39 feet (12 meters) below the surface. This was the depth at which the boreholes filled with water after the ice cores were extracted. They then determined the amount of water in the water-saturated firn cores by comparing them to dry cores extracted nearby. The researchers also determined the depth at which the pores in the firn close, trapping the water inside bubbles – at this point, there is a change in the density of the ice that the scientists can measure. This depth, about 121 feet (37 meters), corresponds to the bottom of the aquifer. Once Koenig’s team had the density, depth and spatial extent of the aquifer, they were able to come up with an estimated water volume of about 154 billion tons (140 metric gigatons). If this water were to discharge suddenly to the ocean, it would correspond to 0.016 inches (0.4 mm) of sea level rise.
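The quoted sea-level figure can be checked on the back of an envelope by spreading the aquifer’s estimated mass over the global ocean surface:

```python
# Spreading the aquifer's estimated mass over the global ocean surface
# reproduces the quoted 0.016 in (0.4 mm) sea-level equivalent.
mass = 140e12            # kg (140 metric gigatons, from the study)
rho_water = 1000.0       # kg/m^3
ocean_area = 3.61e14     # m^2, global ocean surface area

rise = mass / rho_water / ocean_area      # metres
print(f"sea-level equivalent: {rise * 1000:.2f} mm")   # ~0.39 mm
```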

Researchers think that the perennial aquifer is a heat reservoir for the ice sheet in two ways: meltwater carries heat down through the ice when it percolates from the surface to the aquifer, and if the trapped water were to refreeze, it would release latent heat. Altogether, this makes the ice in the vicinity of the aquifer warmer, and warmer ice flows faster toward the sea.

“Our next big task is to understand how this aquifer is filling and how it’s discharging,” said Koenig. “The aquifer could offset some sea level rise if it’s storing water for long periods of time. For example after the 2012 extreme surface melt across Greenland, it appears that the aquifer filled a little bit. The question now is how does that water leave the aquifer on its way to the ocean and whether it will leave this year or a hundred years from now.”

East Antarctica is sliding sideways

It’s official: East Antarctica is pushing West Antarctica around.

Now that West Antarctica is losing weight – that is, billions of tons of ice per year – its softer mantle rock is being nudged westward by the harder mantle beneath East Antarctica.

The discovery comes from researchers led by The Ohio State University, who have recorded GPS measurements showing that West Antarctic bedrock is being pushed sideways at rates of up to about twelve millimeters – about half an inch – per year. This movement is important for understanding current ice loss on the continent and for predicting future ice loss.

They reported the results on Thursday, Dec. 12 at the American Geophysical Union meeting in San Francisco.

Half an inch doesn’t sound like a lot, but it’s actually quite dramatic compared to other areas of the planet, explained Terry Wilson, professor of earth sciences at Ohio State. Wilson leads POLENET, an international collaboration that has planted GPS and seismic sensors all over the West Antarctic Ice Sheet.

She and her team weren’t surprised to detect the horizontal motion. After all, they have been using GPS to observe vertical motion on the continent since the 1990s.

They were surprised, she said, to find the bedrock moving towards regions of greatest ice loss.

“From computer models, we knew that the bedrock should rebound as the weight of ice on top of it goes away,” Wilson said. “But the rock should spread out from the site where the ice used to be. Instead, we see movement toward places where there was the most ice loss.”

The seismic sensors explained why. By timing how fast seismic waves pass through the earth under Antarctica, the researchers were able to determine that the mantle regions beneath east and west are very different. West Antarctica contains warmer, softer rock, and East Antarctica has colder, harder rock.

Stephanie Konfal, a research associate with POLENET, pointed out that where the transition is most pronounced, the sideways movement runs perpendicular to the boundary between the two types of mantle.

She likened the mantle interface to a pot of honey.

“If you imagine that you have warm spots and cold spots in the honey, so that some of it is soft and some is hard,” Konfal said, “and if you press down on the surface of the honey with a spoon, the honey will move away from the spoon, but the movement won’t be uniform. The hard spots will push into the soft spots. And when you take the spoon away, the soft honey won’t uniformly flow back up to fill the void, because the hard honey is still pushing on it.”

Or, put another way, ice compressed West Antarctica’s soft mantle. Some ice has melted away, but the soft mantle isn’t filling back in uniformly, because East Antarctica’s harder mantle is pushing it sideways. The crust is just along for the ride.

This finding is significant, Konfal said, because we use these crustal motions to understand ice loss.

“We’re witnessing expected movements being reversed, so we know we really need computer models that can take lateral changes in mantle properties into account.”

Wilson said that such extreme differences in mantle properties are not seen elsewhere on the planet where glacial rebound is occurring.

“We figured Antarctica would be different,” she said. “We just didn’t know how different.”

Ohio State’s POLENET academic partners in the United States are Pennsylvania State University, Washington University, New Mexico Tech, Central Washington University, the University of Texas Institute for Geophysics and the University of Memphis. A host of international partners are part of the effort as well. The project is supported by the UNAVCO and IRIS-PASSCAL geodetic and seismic facilities.

Improving earthquake early warning systems for California and Taiwan

This is a map of the blind-zone radius for California. Yellow and orange colors correspond to regions with small blind zones, and red and dark-red colors correspond to regions with large blind zones. – SRL

Earthquake early warning systems may provide the public with crucial seconds to prepare for severe shaking. For California, a new study suggests that upgrading current technology and relocating some seismic stations would improve the warning time, particularly in areas poorly served by the existing network: from south of the San Francisco Bay Area to northern Los Angeles, and north of the San Francisco Bay Area.

A separate case study focuses on the utility of low-cost sensors for creating a high-density, effective network that can be used to issue early warnings in Taiwan. Both studies appear in the November issue of the journal Seismological Research Letters (SRL).

“We know where most active faults are in California, and we can smartly place seismic stations to optimize the network,” said Serdar Kuyuk, assistant professor of civil engineering at Sakarya University in Turkey, who conducted the California study while he was a post-doctoral fellow at the University of California (UC), Berkeley. Richard Allen, director of the Seismological Laboratory at UC Berkeley, is the co-author of this study.

Japan started to build its EEW system after the 1995 Kobe earthquake, and the system performed well during the 2011 magnitude-9 Tohoku-Oki earthquake. While the U.S. Geological Survey (USGS)/Caltech Southern California Seismic and TriNet Network in Southern California was upgraded in response to the 1994 Northridge quake, the U.S. is lagging behind Japan and other countries in developing a fully functional warning system.

“We should not wait until another major quake before improving the early warning system,” said Kuyuk.

Noting California’s recent law that calls for the creation of a statewide earthquake early warning (EEW) system, Kuyuk says “the study is timely and highlights for policymakers where to deploy stations for optimal coverage.” The approach maximizes the warning time and reduces the size of “blind zones” where no warning is possible, while also taking into account budgetary constraints.

Earthquake early warning systems detect the initiation of an earthquake and issue warning alerts of possible forthcoming ground shaking. Seismic stations detect the energy from the compressional P-wave first, followed by the shear and surface waves, which cause the intense shaking and most damage.

The warning time that any system generates depends on many factors, with the most important being the proximity of seismic stations to the earthquake epicenter. Once an alert is sent, the amount of warning time is a function of distance from the epicenter, where more distant locations receive more time.

Areas in “blind zones” do not receive any warning prior to the arrival of the more damaging S-wave. The goal, write Kuyuk and Allen, is to minimize the number of people and the amount of key infrastructure within the blind zone. For more remote earthquakes, such as those offshore or in unpopulated regions, larger blind zones can be tolerated.
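The blind-zone geometry can be sketched in a few lines: the alert is issued once the P wave reaches the nearest station and processing completes, and any site the S wave has already reached by then gets no warning. The velocities, depth and delays below are assumed for illustration, not taken from the SRL paper:

```python
import math

# Rough blind-zone geometry: the alert goes out when the P wave reaches the
# nearest station plus a processing delay; sites the S wave has already
# reached by then receive no warning. All parameters are assumed.
vp, vs = 6.0, 3.5        # km/s, typical crustal P and S wave speeds
depth = 8.0              # km, hypocentral depth
station_dist = 10.0      # km, epicentral distance to the nearest station
t_proc = 4.0             # s, detection + processing + telemetry delay

t_alert = math.hypot(station_dist, depth) / vp + t_proc
# Epicentral radius the S wave has covered when the alert is issued:
r_blind = math.sqrt(max((vs * t_alert) ** 2 - depth ** 2, 0.0))
print(f"blind-zone radius: {r_blind:.1f} km")   # ~19.9 km
```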

“There are large blind zones between the Bay Area and Los Angeles where there are active faults,” said Kuyuk. “Why? There are only 10 stations along the 150-mile section of the San Andreas Fault. Adding more stations would improve warning for people in these areas, as well as people in LA and the Bay Area should an earthquake start somewhere in between,” said Kuyuk.

Adding stations may not be so simple, according to Allen. “While there is increasing enthusiasm from state and federal legislators to build the earthquake early warning system that the public wants,” said Allen, “the reality of the USGS budget for the earthquake program means that it is becoming impossible to maintain the functionality of the existing network operated by the USGS and the universities.

“The USGS was recently forced to downgrade the telemetry of 58 of the stations in the San Francisco Bay Area in order to reduce costs,” said Allen. “While our SRL paper talks about where additional stations are needed in California to build a warning system, we are unfortunately losing stations.”

In California, the California Integrated Seismic Network (CISN) consists of multiple networks with some 2,900 seismic stations, spaced anywhere from 2 to 100 km apart. Of these, 377 are equipped to contribute to an EEW system.

Kuyuk and Allen estimate 10 km is the ideal distance between seismic stations in areas along major faults or near major cities. For other areas, an interstation distance of 20 km would provide sufficient warning. The authors suggest greater density of stations and coverage could be achieved by upgrading technology used by the existing stations, integrating Nevada stations into the current network, relocating some existing stations and adding new ones to the network.

The U.S. Geological Survey (USGS) and the Gordon and Betty Moore Foundation funded this study.

A Low-Cost Solution in Taiwan


In a separate study, Yih-Min Wu of National Taiwan University reports on a successful experiment using low-cost MEMS sensors to build a high-density seismic network to support an early warning system for Taiwan.

MEMS accelerometers are tiny sensors used in common devices, such as smartphones and laptops. These sensors are relatively cheap and have proven to be sensitive detectors of ground motion, particularly from large earthquakes.

The current EEW system in Taiwan consists of 109 seismic stations that can provide alerts within 20 seconds following the initial detection of an earthquake. Wu sought to reduce the time between earthquake and initial alert, thereby increasing the potential warning time.

The EEW research group at National Taiwan University developed a P-wave alert device named “Palert” that uses MEMS accelerometers for onsite earthquake early warning, at one-tenth the cost of traditional strong motion instruments.
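This article does not describe Palert’s detection algorithm, but onsite P-wave triggers of this general kind are often built on a short-term/long-term average (STA/LTA) comparison; the toy version below, on synthetic acceleration data, is only meant to convey the idea:

```python
import numpy as np

# Toy STA/LTA trigger on synthetic acceleration data: declare a P-wave pick
# when short-term average energy exceeds `threshold` times the long-term
# average. Window lengths and threshold are arbitrary choices.
def sta_lta_trigger(accel, fs, sta_win=0.5, lta_win=10.0, threshold=4.0):
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = accel ** 2
    for i in range(lta_n, len(accel) - sta_n):
        lta = energy[i - lta_n:i].mean()
        if lta > 0 and energy[i:i + sta_n].mean() / lta > threshold:
            return i
    return None

fs = 100.0
rng = np.random.default_rng(3)
trace = rng.normal(0.0, 0.001, int(60 * fs))     # quiet background, in g
trace[3000:3200] += rng.normal(0.0, 0.02, 200)   # synthetic P onset at 30 s
idx = sta_lta_trigger(trace, fs)
print(f"trigger at t = {idx / fs:.2f} s" if idx is not None else "no trigger")
```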

From June 2012 to May 2013, Wu and his colleagues tested a network of 400 Palert devices deployed throughout Taiwan, primarily at elementary schools, both to take advantage of existing power and Internet connections and because the devices can be used there to educate students about earthquake hazard mitigation.

During the testing period, the Palert system performed comparably to the existing EEW system, which consists of conventional strong-motion instruments. With four times as many stations, the Palert network can also provide a detailed shaking map for damage assessment, which it did for the March 2013 magnitude-6.1 Nantou quake.

Wu suggests the relatively low-cost Palert device may have commercial potential and can be readily integrated into existing seismic networks to increase the coverage density of EEW systems. In addition to deployments in China, Indonesia and Mexico, plans call for Palert devices to be installed near New Delhi, India, to test the feasibility of an EEW system there.