Study shows tectonic plates not rigid, deform horizontally in cooling process

Corné Kreemer, associate professor in the College of Science at the University of Nevada, Reno, conducts research on plate tectonics and geodetics. His latest research shows that oceanic tectonic plates deform due to cooling, causing shortening of the plates and mid-plate seismicity. – Photo by Mike Wolterbeek, University of Nevada, Reno.

The puzzle pieces of tectonic plates that make up the outer layer of the Earth are not rigid and don’t fit together as neatly as we were taught in high school.

A study published in the journal Geology by Corné Kreemer, an associate professor at the University of Nevada, Reno, and his colleague Richard Gordon of Rice University, quantifies deformation of the Pacific plate and challenges the central approximation of the plate tectonic paradigm that plates are rigid.

Using large-scale numerical modeling as well as GPS velocities from the largest GPS data-processing center in the world – the Nevada Geodetic Laboratory at the University of Nevada, Reno – Kreemer and Gordon have shown that cooling of the lithosphere, the outermost layer of Earth, makes some sections of the Pacific plate contract horizontally at faster rates than others, causing the plate to deform.

Gordon’s insight was that the same cooling that makes the ocean deeper also affects horizontal motion: as the plates cool, they shorten and deform. Partnering with Kreemer, the two combined their ideas and expertise to show that this deformation could explain why some parts of the plate tectonic puzzle didn’t fall neatly into place in recent plate motion models, which are based on spreading rates along mid-oceanic ridges. Kreemer and Gordon also showed that there is a positive correlation between where the plate is predicted to deform and where intraplate earthquakes occur. Their work was supported by the National Science Foundation.
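
To make the mechanism concrete, here is a minimal toy calculation – not the authors’ model: it uses the textbook half-space cooling solution to estimate how quickly the upper 100 km of oceanic lithosphere cools on average, converts that cooling rate into a horizontal contraction rate with an assumed linear thermal expansivity, and integrates the strain rate along a transect of seafloor ages to get a relative velocity. All parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.special import erf

# Illustrative parameters (assumptions, not values from the paper)
KAPPA = 1e-6           # thermal diffusivity, m^2/s
T_MANTLE = 1350.0      # mantle-to-surface temperature contrast, K
ALPHA_LIN = 1e-5       # linear thermal expansivity, 1/K
L = 100e3              # depth over which the column is averaged, m
SEC_PER_YR = 3.15576e7

def mean_column_temperature(age_s, n=4000):
    """Mean temperature of the upper L metres for half-space cooling."""
    z = np.linspace(0.0, L, n)
    return np.mean(T_MANTLE * erf(z / (2.0 * np.sqrt(KAPPA * age_s))))

def contraction_strain_rate(age_myr, d_myr=0.01):
    """Horizontal strain rate (1/s); negative means shortening."""
    t = age_myr * 1e6 * SEC_PER_YR
    dt = d_myr * 1e6 * SEC_PER_YR
    dT_dt = (mean_column_temperature(t + dt) - mean_column_temperature(t)) / dt
    return ALPHA_LIN * dT_dt

for age in (2, 5, 10, 40, 80):   # young seafloor contracts fastest
    print(f"age {age:2d} Myr: strain rate {contraction_strain_rate(age):+.2e} /s")

# Relative velocity between young and old seafloor: integrate strain rate
# along a transect, assuming a 40 km/Myr spreading half-rate.
ages = np.linspace(1.0, 40.0, 400)                       # Myr
rates = np.array([contraction_strain_rate(a) for a in ages])
dx = 40e3 * np.diff(ages)                                # metres per age step
v = np.sum(0.5 * (rates[1:] + rates[:-1]) * dx)          # m/s
print(f"integrated shortening velocity: {abs(v) * SEC_PER_YR * 1e3:.2f} mm/yr")
```

The 1/√age dependence of the cooling rate is why young parts of the plate contract faster than old ones, producing relative motion within a single plate; the published analysis is, of course, far more detailed than this sketch.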

Results of the study suggest that plate-scale horizontal thermal contraction is significant, and that it may be partly released seismically. The pair of researchers are, as the saying goes, rewriting the textbooks.

“This is plate tectonics 2.0, it revolutionizes the concepts of plate rigidity,” Kreemer, who teaches in the University’s College of Science, said. “We have shown that the Pacific plate deforms, that it is pliable. We are refining the plate tectonic theory and have come up with an explanation for mid-plate seismicity.”

The oceanic plates are shortening due to cooling, which causes relative motion inside the plate, Kreemer said. The oceanic crust of the Pacific plate offshore of California is moving 2 mm to the south every year relative to the Pacific/Antarctic plate boundary.

“It may not sound like much, but it is significant considering that we can measure crustal motion with GPS within a fraction of a millimeter per year,” he said. “Unfortunately, all existing GPS stations on Pacific islands are in the old part of the plate that is not expected nor shown to deform. New measurements will be needed within the young parts of the plate to confirm this study’s predictions, either on very remote islands or through sensors on the ocean floor.”

This work is complementary to Kreemer’s ongoing effort to quantify the deformation in all of the Earth’s plate boundary zones with GPS velocities – data that are in large part processed in the Nevada Geodetic Laboratory. The main goal of the global modeling is to convert the strain rates into earthquake forecast maps.
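
One widely used bridge from strain rates to forecasts – sketched below under loud assumptions – is Kostrov’s relation, which converts the strain rate of a grid cell of area A and seismogenic thickness H into a seismic moment rate via Ṁ₀ = 2μHAε̇, combined with a Gutenberg–Richter magnitude distribution that apportions the moment budget among earthquakes of different sizes. Whether this is the exact pipeline behind the Nevada lab’s forecast maps is an assumption here; the shear modulus, thickness, b-value and maximum magnitude are illustrative.

```python
import math

MU = 3.0e10        # shear modulus, Pa (assumed)
H = 15e3           # seismogenic thickness, m (assumed)
B_VALUE = 1.0      # Gutenberg-Richter b-value (assumed)
SEC_PER_YR = 3.15576e7

def moment_rate(strain_rate, area_m2):
    """Kostrov's relation: seismic moment rate (N*m/s) implied by a strain rate."""
    return 2.0 * MU * H * area_m2 * strain_rate

def moment_of_magnitude(m):
    """Scalar seismic moment (N*m) for moment magnitude m."""
    return 10.0 ** (1.5 * m + 9.05)

def annual_rate_above(m_min, strain_rate, area_m2, m_max=8.0):
    """Annual rate of events with magnitude >= m_min, assuming the whole
    moment budget is released by a truncated Gutenberg-Richter population."""
    beta = (2.0 / 3.0) * B_VALUE
    m0_min, m0_max = moment_of_magnitude(m_min), moment_of_magnitude(m_max)
    a = moment_rate(strain_rate, area_m2) * (1.0 - beta) / (beta * m0_max ** (1.0 - beta))
    return a * m0_min ** (-beta) * SEC_PER_YR

# Example: a 100 km x 100 km cell deforming at 1e-16 per second
rate = annual_rate_above(5.0, strain_rate=1e-16, area_m2=(100e3) ** 2)
print(f"expected rate of M>=5 events: {rate:.4f} per year")
```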

“Because we don’t have GPS data in the right places of the Pacific plate, our prediction of how that plate deforms can supplement the strain rates I’ve estimated in parts of the world where we can quantify them with GPS data,” Kreemer said. “Ultimately, we hope to have a good estimate of strain rates everywhere so that the models not only forecast earthquakes for places like Reno and San Francisco, but also for places where you may expect them the least.”

Felling pine trees to study their wind resistance

Forestry experts from the French Institute for Agricultural Research (INRA), together with technicians from NEIKER-Tecnalia and the Chartered Provincial Council of Bizkaia, felled radiata pine specimens of different ages to determine their resistance to gales and to measure the force the wind must exert to blow down these trees under the particular conditions of the Basque Country.

This experiment is of great interest to forest managers and will help them manage their woodlands better and incorporate the wind variable into decisions such as the distribution of plantations or the best moment for felling trees.

Professionals from the forestry sector – timber growers, foresters, forestry technicians and researchers – gathered to witness the simulation from close quarters. The trees were pulled down with steel cables that stood in for the wind force and were fitted with sensors to measure the force needed to bring the trees down. Each radiata pine had also been fitted with three tiltmeters that recorded its degree of tilt as a function of the force exerted on it. That way it was possible to determine the resistance of the roots and the strength of the trunk, two parameters essential for establishing the capacity of a tree to withstand the thrust of the wind.
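
The data reduction behind such a winching test can be sketched in a few lines. Assuming the cable is attached at height h on the stem and pulls at an angle below horizontal, the horizontal force component times h approximates the overturning moment at the stem base; regressing moment against the recorded tilt gives the rotational stiffness of the root-soil anchorage, and the largest moment reached before failure measures its strength. The attachment height, cable angle and synthetic readings below are illustrative assumptions, not the protocol used in the Basque trials.

```python
import numpy as np

# Synthetic winching-test record (illustrative values only)
h_attach = 8.0                         # cable attachment height on the stem, m
phi = np.radians(20.0)                 # cable angle below horizontal
force_kN = np.linspace(0.0, 35.0, 50)  # cable force read by the load sensor
rng = np.random.default_rng(0)
tilt_deg = 0.12 * force_kN + 0.05 * rng.normal(size=force_kN.size)  # tiltmeter

# Overturning moment at the stem base from the horizontal force component
moment_kNm = force_kN * np.cos(phi) * h_attach

# Rotational stiffness: slope of moment vs. tilt over the near-linear range
mask = tilt_deg < 2.0
stiffness, _ = np.polyfit(tilt_deg[mask], moment_kNm[mask], 1)
print(f"rotational stiffness: {stiffness:.0f} kN*m per degree of tilt")
print(f"maximum anchorage moment: {moment_kNm.max():.0f} kN*m")
```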

The experiment carried out this morning is part of the seminar ‘FORRISK: Wind damage risk in forests’, which took place in the Bizkaia Aretoa in Bilbao and was organised by NEIKER-Tecnalia in collaboration with the Chartered Provincial Council of Bizkaia, HAZI and the Atlantic Regional Office of the EFI (European Forest Institute). The seminar is part of the European project “FORRISK – Network for innovation in silviculture and integrated systems for forest risk management”. This initiative has been co-funded by the ERDF and by the Sub-Ministry for Agriculture, Fisheries and Food Policy of the Government of the Basque Autonomous Community. The seminar took place in Bilbao because of the city’s status as European Forest City 2014.

The seminar was used to present the detailed map of wind characteristics in the Basque Country, which timber growers and forestry managers can now avail themselves of. The map has been produced by researchers at INRA using information from the 57 meteorological stations equipped with anemometers in the network of the Basque Meteorological Authority, Euskalmet.

A tool for estimating wind damage

Those attending the seminar also had the chance to get to know ForestGALES, a computing tool that allows managers to estimate the probability of wind damage in forests. ForestGALES was originally created for Britain and has been adapted to the characteristics of Basque geography by INRA, NEIKER-Tecnalia and HAZI technicians. This application is of great use in specifying concrete actions (for example, spacing, or silvicultural interventions like clearing or thinning) while bearing in mind the probability of wind damage on each plot.

To get the most out of this tool, it is necessary to know the root resistance and trunk strength of the relevant species, as well as the characteristics of the wind where the trees are growing. Today’s simulation and the Basque wind map are therefore two fundamental inputs for developing the ForestGALES model.
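
How those two inputs combine can be illustrated with a hedged sketch: if the tree-resistance side yields a critical wind speed at which a stand breaks or overturns, and the wind map yields a Weibull description of the local wind climate, the annual probability of damage is the chance that at least one storm exceeds the critical speed. The Weibull parameters, critical speed and number of independent storm events below are illustrative assumptions, not values from ForestGALES or the Basque map.

```python
import math

def annual_damage_probability(v_crit, k, lam, n_events=100):
    """Probability that at least one of n_events independent storm maxima
    exceeds v_crit, with wind speeds Weibull-distributed (shape k, scale lam)."""
    p_single = math.exp(-((v_crit / lam) ** k))   # P(one event > v_crit)
    return 1.0 - (1.0 - p_single) ** n_events

# Illustrative stand on a windy site: critical speed 28 m/s, k=1.9, lam=11 m/s
print(f"annual damage probability: {annual_damage_probability(28.0, 1.9, 11.0):.2f}")
```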

Increase in extreme winds owing to climate change

Cyclones like Klaus (2009) and Xynthia (2010) brought down over 200,000 cubic metres of timber as they passed through the Basque Country, with gusts of wind in excess of 228 kilometres per hour. Predictions indicate that the frequency of extreme phenomena like these is set to increase owing to climate change, so the forestry sector needs information and tools that will enable it to tackle the risks posed by the wind.

The breathing sand

An Eddy Correlation Lander analyzes the strength of the oxygen fluxes at the bottom of the North Sea. – Photo: ROV-Team, GEOMAR

A desert at the bottom of the sea? Although the waters of the North Sea are exchanged about every two to three years, there is evidence of decreasing oxygen content. If less of this gas is dissolved in seawater, organisms on and in the seabed produce less energy – with implications for larger creatures and for biogeochemical cycling in the marine ecosystem. Because nutrients, carbon and oxygen circulate very well and are processed quickly in the permeable, sandy sediments that make up two-thirds of the North Sea, measurements of metabolic rates are especially difficult there. Using the new Aquatic Eddy Correlation technique, scientists from GEOMAR Helmholtz Centre for Ocean Research Kiel, the Leibniz Institute of Freshwater Ecology and Inland Fisheries, the University of Southern Denmark, the University of Koblenz-Landau, the Scottish Marine Institute and Aarhus University were able to demonstrate how oxygen flows across the seabed of the North Sea. Their methods and results are presented in the Journal of Geophysical Research: Oceans.

“The so-called ‘Eddy Correlation’ technique detects the flow of oxygen through small turbulent eddies over an area of several square meters. It considers both the mixing of the sediment by the organisms living in it and the hydrodynamics of the water above the rough sea floor”, Dr. Peter Linke, a marine biologist at GEOMAR, explains. “Previous methods covered only short periods or disregarded important parameters. Now we can create a more realistic picture.” The new method also takes into account the fact that even small objects such as shells, or ripples shaped by wave action or currents, can affect the oxygen exchange in permeable sediments.
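
At its core the technique computes a covariance: the vertical oxygen flux is the time average of the product of fluctuations in vertical water velocity and fluctuations in oxygen concentration, F = ⟨w′C′⟩. The sketch below shows this calculation on synthetic data; the window length, detrending choice and units are illustrative, not the processing settings used in the study.

```python
import numpy as np

def eddy_flux(w, c, fs, window_s=900):
    """Eddy covariance flux <w'C'> per averaging window.

    w: vertical velocity (m/s); c: O2 concentration (mmol/m^3);
    fs: sampling rate (Hz). Returns fluxes in mmol m^-2 s^-1."""
    n = int(window_s * fs)
    t = np.arange(n)
    fluxes = []
    for i in range(0, len(w) - n + 1, n):
        ww, cc = w[i:i + n], c[i:i + n]
        # Reynolds decomposition: subtract a linear trend to get fluctuations
        wp = ww - np.polyval(np.polyfit(t, ww, 1), t)
        cp = cc - np.polyval(np.polyfit(t, cc, 1), t)
        fluxes.append(np.mean(wp * cp))
    return np.array(fluxes)

# Synthetic demo: one hour at 8 Hz with O2 fluctuations anti-correlated
# with vertical velocity, as expected over oxygen-consuming sediment
rng = np.random.default_rng(1)
fs, n = 8.0, int(3600 * 8)
w = 0.002 * rng.normal(size=n)                        # m/s
c = 250.0 - 0.5 * (w / 0.002) + rng.normal(size=n)    # mmol/m^3
print(eddy_flux(w, c, fs))   # negative values = oxygen uptake by the seabed
```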

On the expedition CE0913 with the Irish research vessel CELTIC EXPLORER, scientists used the underwater robot ROV KIEL 6000 to place three different instruments within the “Tommeliten” area belonging to Norway: Two “Eddy Correlation Landers” recorded the strength of oxygen fluxes over three tidal cycles. Information about the distribution of oxygen in the sediment was collected with a “Profiler Lander”, a seafloor observatory with oxygen sensors and flow meters. A “Benthic chamber” isolated 314 square centimetres of sediment and took samples from the overlying water over a period of 24 hours to determine the oxygen consumption of the sediment.
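
The benthic chamber yields a complementary, much simpler estimate, sketched below: regress the oxygen concentration of the enclosed water against time, then multiply the slope by the height of the enclosed water column (volume divided by the 314 cm² footprint) to get an uptake rate per unit area. The chamber volume and the synthetic readings are assumptions for illustration.

```python
import numpy as np

AREA_M2 = 314e-4        # enclosed sediment area: 314 cm^2 (from the study)
VOLUME_M3 = 0.0157      # overlying-water volume, m^3 (assumed: 0.5 m column)

# Synthetic hourly O2 readings over the 24-hour deployment (mmol/m^3)
hours = np.arange(25.0)
o2 = 260.0 - 1.2 * hours + np.random.default_rng(2).normal(0.0, 0.4, 25)

slope, _ = np.polyfit(hours, o2, 1)                  # mmol m^-3 h^-1
flux = slope * (VOLUME_M3 / AREA_M2) * 24.0          # mmol m^-2 d^-1
print(f"sediment O2 uptake: {flux:.1f} mmol m^-2 per day")
```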

“The combination of traditional tools with the ‘Eddy Correlation’ technique has given us new insights into the dynamics of the exchange of substances between the sea water and the underlying sediment. A variety of factors determine the timing and amount of oxygen available. Currents that supply the sandy sediment with oxygen, but also the small-scale morphology of the seafloor, ensure that small benthic organisms are able to process carbon and other nutrients. The dependencies are so complex that they can be deciphered only by using special methods”, Dr. Linke summarizes. Therefore, detailed measurements in the water column and at the boundary with the seafloor, as well as model calculations, are absolutely necessary to understand basic functions and to better estimate future changes in the cycling of materials. “With conventional methods, for example, we would never have been able to discover that the loose sandy sediment stores oxygen brought in by the currents for periods of weaker water movement and reduced oxygen supply.”

Original publication:
McGinnis, D. F., S. Sommer, A. Lorke, R. N. Glud, P. Linke (2014): Quantifying tidally driven benthic oxygen exchange across permeable sediments: An aquatic eddy correlation study. Journal of Geophysical Research: Oceans, doi:10.1002/2014JC010303.

Links:

GEOMAR Helmholtz Centre for Ocean Research Kiel

Eddy correlation information page

Leibniz Institute of Freshwater Ecology and Inland Fisheries, IGB

University of Southern Denmark

University of Koblenz-Landau

Scottish Marine Institute

Aarhus University

Images:
High resolution images can be downloaded at http://www.geomar.de/n2110-e.

Video footage is available on request.

Contact:
Dr. Peter Linke (GEOMAR FB2-MG), Tel. 0431 600-2115, plinke@geomar.de

Maike Nicolai (GEOMAR, Kommunikation & Medien), Tel. 0431 600-2807, mnicolai@geomar.de

Researcher receives $1.2 million to create real-time seismic imaging system

This is Dr. WenZhan Song. – Georgia State University

Dr. WenZhan Song, a professor in the Department of Computer Science at Georgia State University, has received a four-year, $1.2 million grant from the National Science Foundation to create a real-time seismic imaging system using ambient noise.

This imaging system for shallow earth structures could be used to study and monitor the sustainability of the subsurface, or area below the surface, and potential hazards of geological structures. Song and his collaborators, Yao Xie of the Georgia Institute of Technology and Fan-Chi Lin of the University of Utah, will use ambient noise to image the subsurface of geysers in Yellowstone National Park.

“This project is basically imaging what’s underground in a situation where there’s no active source, like an earthquake. We’re using background noise,” Song said. “At Yellowstone, for instance, people visit there and cars drive by. All that could generate signals that are penetrating through the ground. We essentially use that type of information to tap into a very weak signal to infer the image of underground. This is very frontier technology today.”

The system will be made up of a large network of wireless sensors that perform in-network computing of 3-D images of the shallow earth structure, based solely on ambient noise.
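
The principle behind imaging with background noise is cross-correlation: correlating long, simultaneous noise records from two sensors and stacking the results concentrates energy at the travel time of waves propagating between them, approximating the response one would see if a source sat at one of the sensors. The sketch below demonstrates this on synthetic data; the sensor spacing, wave speed and stacking scheme are illustrative assumptions, and the project’s in-network processing is necessarily more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 100.0                    # sampling rate, Hz
distance = 500.0              # sensor separation, m (assumed)
velocity = 250.0              # true wave speed, m/s (assumed)
lag_true = int(round(distance / velocity * fs))

# One hour of a shared ambient wavefield; sensor B records it lag_true
# samples later than sensor A, and each adds its own local noise.
n = int(3600 * fs)
field = rng.normal(size=n + lag_true)
a = field[lag_true:] + 0.5 * rng.normal(size=n)
b = field[:n] + 0.5 * rng.normal(size=n)

# Cross-correlate 60-second segments and stack them
seg, max_lag = int(60 * fs), int(5 * fs)
stack = np.zeros(2 * max_lag + 1)
for i in range(0, n - seg + 1, seg):
    sa, sb = a[i:i + seg], b[i:i + seg]
    full = np.correlate(sb - sb.mean(), sa - sa.mean(), mode="full")
    mid = seg - 1                       # zero-lag index of the full output
    stack += full[mid - max_lag: mid + max_lag + 1]

lag = (np.argmax(stack) - max_lag) / fs
print(f"recovered travel time: {lag:.2f} s -> velocity {distance / lag:.0f} m/s")
```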

Real-time ambient noise seismic imaging technology could also inform homeowners if the subsurface below their home, which can change over time, is stable or will sink beneath them.

This technology can also be used in circumstances that don’t need to rely on ambient noise but have an active source that produces signals that can be detected by wireless sensors. It could be used for real-time monitoring and developing early warning systems for natural hazards, such as volcanoes, by determining how close magma is to the surface. It could also benefit oil exploration, which uses methods such as hydrofracturing, in which high-pressure water breaks rocks and allows natural gas to flow more freely from underground.

“As they do that, it’s critical to monitor that in real time so you can know what’s going on under the ground and not cause damage,” Song said. “It’s a very promising technology, and we’re helping this industry reduce costs significantly because previously they only knew what was going on under the subsurface many days and even months later. We could reduce this to seconds.”

Until now, data from oil exploration instruments had to be manually retrieved and uploaded into a centralized database, and it could take days or months to process and analyze the data.

The research team plans to have a field demonstration of the system in Yellowstone and image the subsurface of some of the park’s geysers. The results will be shared with Yellowstone management, rangers and staff. Yellowstone, a popular tourist attraction, is a big volcano that has been dormant for a long time, but scientists are concerned it could one day pose potential hazards.

In the past several years, Song has been developing a Real-time In-situ Seismic Imaging (RISI) system using active sources, under the support of another $1.8 million NSF grant. His lab has built a RISI system prototype that is ready for deployment. The RISI system can be implemented as a general field instrumentation platform for various geophysical imaging applications and incorporate new geophysical data processing and imaging algorithms.

The RISI system can be applied to a wide range of geophysical exploration topics, such as hydrothermal circulation, oil exploration, mining safety and mining resource monitoring, to monitor the uncertainty inherent to the exploration and production process, reduce operation costs and mitigate the environmental risks. The business and social impact is broad and significant. Song is seeking business investors and partners to commercialize this technology.

###

For more information about the project, visit http://sensorweb.cs.gsu.edu/?q=ANSI.

Predicting landslides with light

Optical fiber sensors are used around the world to monitor the condition of difficult-to-access segments of infrastructure, such as the underbellies of bridges, the exterior walls of tunnels, the feet of dams, long pipelines and railways in remote rural areas.

Now, a team of researchers in Italy is expanding the reach of optical fiber sensors “to the hills” by embedding them in shallow trenches within slopes to detect and monitor both large landslides and slow slope movements. The team will present its research at The Optical Society’s (OSA) 98th Annual Meeting, Frontiers in Optics, being held Oct. 19-23 in Tucson, Arizona, USA.

As major disasters around the world this year have shown, landslides can be stark examples of nature at her most unforgiving. Within seconds, a major landslide can completely erase houses and structures that have stood for years, and the catastrophic toll inflicted on communities is felt not just in the destructive loss of property but in the devastating loss of life. The 1999 Vargas tragedy in Venezuela, for instance, killed tens of thousands of people and erased whole towns from the map without warning.

The motivation for an early warning technology like the one the Italian team has devised is to find a way to mitigate such losses, just as hurricane tracking can prompt coastal evacuations and save lives.

Predicting Landslides by Detecting Land Strains

Landslides are failures of a rock or soil mass, and they are always preceded by various types of “pre-failure” strains, known technically as elastic, plastic and viscous volumetric and shear strains. The magnitude of these pre-failure strains depends on the material involved, ranging from fractured rock debris and pyroclastic deposits to fine-grained soils, but they are measurable. The new technology can detect small shifts in soil slopes and thus the onset of a landslide. Electrical sensors have usually been used to monitor landslides, but such sensors are easily damaged; optical fiber sensors are more robust, economical and sensitive. This is where the new technology could make a difference.

“Distributed optical fiber sensors can act as a ‘nervous system’ of slopes by measuring the tensile strain of the soil they’re embedded within,” explained Professor Luigi Zeni, who is in the Department of Industrial & Information Engineering at the Second University of Naples.
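
In practice, distributed strain measurements of this kind are often made with Brillouin scattering: the Brillouin frequency shift of light in a standard fiber grows nearly linearly with tensile strain (on the order of 0.05 MHz per microstrain at 1550 nm) and with temperature. The sketch below, a hedged illustration rather than the team’s actual processing chain, converts a measured shift profile into strain along the fiber and flags zones that exceed a pre-failure threshold.

```python
import numpy as np

C_STRAIN = 0.05     # Brillouin sensitivity, MHz per microstrain (approximate)
C_TEMP = 1.0        # MHz per kelvin (approximate)
ALERT_UE = 500.0    # illustrative pre-failure alert threshold, microstrain

def strain_profile(shift_mhz, baseline_mhz, temp_delta_k=0.0):
    """Convert measured Brillouin shifts (MHz) into microstrain,
    removing a known temperature change along the fiber."""
    return (shift_mhz - baseline_mhz - C_TEMP * temp_delta_k) / C_STRAIN

# Synthetic 2 km fiber at 1 m resolution with one localized straining zone
x = np.arange(2000)                          # position along the fiber, m
baseline = np.full(x.size, 10850.0)          # unstrained Brillouin shift, MHz
measured = baseline.copy()
measured[1200:1260] += C_STRAIN * 800.0      # 800-microstrain patch

strain = strain_profile(measured, baseline)
alerts = np.flatnonzero(strain > ALERT_UE)
if alerts.size:
    print(f"pre-failure strain from {alerts[0]} m to {alerts[-1]} m "
          f"(peak {strain.max():.0f} microstrain)")
```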

Taking it a step further, Zeni and his colleagues worked out a way of combining several types of optical fiber sensors into a plastic tube that twists and moves under the forces of pre-failure strains. Researchers are then able to monitor the movement and bending of the optical fiber remotely to determine if a landslide is imminent.

The use of novel fiber optic sensors “allows us to overcome some limitations of traditional inclinometers, because fiber-based ones have no moving parts and can withstand larger soil deformations,” Zeni said. “These sensors can be used to cover very large areas – several square kilometers – and interrogated in a time-continuous way to pinpoint any critical zones.”

The findings clearly demonstrate the potential of distributed optical fiber sensors as an entirely new tool to monitor areas subject to landslide risk, Zeni said, and to develop early warning systems based on geo-indicators – early deformations – of slope failures.
