New Oso report, rockfall in Yosemite, and earthquake models

From AGU’s blogs: Oso disaster had its roots in earlier landslides

A research team tasked with being among the first scientists and engineers to evaluate extreme events has issued its findings on the disastrous Oso, Washington, landslide. The report examines the conditions and causes of the March 22 mudslide that killed 43 people and destroyed the Steelhead Haven neighborhood. The team from the Geotechnical Extreme Events Reconnaissance (GEER) Association, funded by the National Science Foundation, determined that intense rainfall in the three weeks before the slide was likely a major factor, but that altered groundwater migration, soil weakened by previous landslides, and changes in hillside stresses also played key roles.

From this week’s Eos: Reducing Rockfall Risk in Yosemite National Park

The glacially sculpted granite walls of Yosemite Valley attract 4 million visitors a year, but rockfalls from these cliffs pose substantial hazards. Responding to new studies, the National Park Service recently took actions to reduce the human risk posed by rockfalls in Yosemite National Park.

From AGU’s journals: A new earthquake model may explain discrepancies in San Andreas fault slip

Investigating the earthquake hazards of the San Andreas Fault System requires an accurate understanding of accumulating stresses and the history of past earthquakes. Faults tend to go through an “earthquake cycle”: locking and accumulating stress, rupturing in an earthquake, and locking again, in a well-accepted process known as “elastic rebound.” One of the key factors in preparing for California’s next “Big One” is estimating the fault slip rate, the speed at which one side of the San Andreas Fault moves past the other.

Broadly speaking, there are two ways geoscientists study fault slip. Geologists estimate slip rates through time by studying geologic features at key locations. Geodesists, scientists who measure the size and shape of the planet, use technologies such as GPS and satellite radar interferometry to estimate the slip rate, and their estimates often differ from the geologists’.

In a recent study, Tong et al. develop a new three-dimensional viscoelastic earthquake cycle model that represents 41 major fault segments of the San Andreas Fault System. While previous research has suggested discrepancies between geologically and geodetically measured slip rates along the San Andreas, the authors find no significant differences between the two once the thickness of the tectonic plate and its viscoelasticity are taken into account. They also find that the geodetic slip rate depends on the plate thickness beneath the San Andreas, a variable that earlier models did not account for.
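
To see how a geodetic slip rate is extracted from surface measurements in the first place, the sketch below fits the classic elastic screw-dislocation model of Savage and Burford (1973), in which interseismic fault-parallel velocity varies as v(x) = (s/π)·arctan(x/D) with distance x from the fault, slip rate s, and locking depth D. This is the simple elastic end-member that viscoelastic models like Tong et al.’s generalize; the GPS velocities below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def interseismic_velocity(x_km, slip_rate_mm_yr, locking_depth_km):
    # Fault-parallel surface velocity at distance x from a strike-slip fault,
    # from the elastic screw-dislocation model (Savage & Burford, 1973).
    return (slip_rate_mm_yr / np.pi) * np.arctan(x_km / locking_depth_km)

# Hypothetical GPS velocities (mm/yr) at distances (km) from the fault trace.
x = np.array([-120.0, -60.0, -30.0, -10.0, 10.0, 30.0, 60.0, 120.0])
v = np.array([-16.1, -14.2, -11.0, -5.6, 5.9, 11.3, 14.0, 16.3])

# Fit slip rate and locking depth simultaneously.
(slip_rate, locking_depth), _ = curve_fit(interseismic_velocity, x, v, p0=[30.0, 15.0])
print(f"slip rate ~ {slip_rate:.1f} mm/yr, locking depth ~ {locking_depth:.1f} km")
```

The trade-off between the fitted slip rate and the fitted locking depth in even this toy model echoes the paper’s central point: the slip rate you infer geodetically depends on what you assume about the elastic structure of the plate.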

The team notes that of the 41 faults studied within the San Andreas Fault System, a small number do show disagreement between the geologic and geodetic slip rates. These differences could be attributed to inadequate data coverage or to incomplete knowledge of the fault structures or the chronological sequence of past events.

Team develops a geothermometer for methane formation

John Eiler (left) and Daniel Stolper (right) with the Caltech-led team’s prototype mass spectrometer — the Thermo IRMS 253 Ultra. This instrument is the first equipped to measure abundances of rare isotopic versions of complex molecules, even where combinations of isotopic substitutions result in closely similar masses. This machine enabled the first precise measurements of molecules of methane that contain two heavy isotopes — 13CH3D, which incorporates both a carbon-13 atom and a deuterium atom, and 12CH2D2, which includes two deuterium atoms. – Caltech

Methane is a simple molecule, just one carbon atom bound to four hydrogen atoms. But that simplicity belies the complex role the molecule plays on Earth: it is an important greenhouse gas, it is chemically active in the atmosphere, it serves in many ecosystems as a kind of metabolic currency, and it is the main component of natural gas, an energy source.

Methane also poses a complex scientific challenge: it forms through a number of different biological and nonbiological processes under a wide range of conditions. For example, microbes that live in cows’ stomachs make it; it forms by thermal breakdown of buried organic matter; and it is released by hot hydrothermal vents on the sea floor. And, unlike many other, more structurally complex molecules, simply knowing its chemical formula does not necessarily reveal how it formed. Therefore, it can be difficult to know where a sample of methane actually came from.

But now a team of scientists led by Caltech geochemist John M. Eiler has developed a new technique that can, for the first time, determine the temperature at which a natural methane sample formed. Since methane produced biologically in nature forms below about 80°C, and methane created through the thermal breakdown of more complex organic matter forms at higher temperatures (reaching 160°C to 220°C or more, depending on the depth of formation), this determination can help establish how and where the gas formed.
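
As a back-of-the-envelope illustration of how a formation temperature narrows down a gas’s origin, here is a trivial classifier built from the temperature windows quoted above; the thresholds come from the article, but the function itself is ours, not the team’s.

```python
def likely_origin(formation_temp_c):
    # Rough origin windows quoted in the article (illustrative only).
    if formation_temp_c < 80:
        return "consistent with biological (microbial) methane"
    if formation_temp_c >= 160:
        return "consistent with thermogenic methane"
    return "intermediate: mixed or ambiguous origin"

for t in (40, 115, 190):
    print(f"{t} deg C -> {likely_origin(t)}")
```

A mixed-origin sample, like the roughly 115°C Antrim Shale case discussed below, lands between the two windows.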

A paper describing the new technique and its first applications as a geothermometer appears in a special section about natural gas in the current issue of the journal Science. Former Caltech graduate student Daniel A. Stolper (PhD ’14) is the lead author on the paper.

“Everyone who looks at methane sees problems, sees questions, and all of these will be answered through basic understanding of its formation, its storage, its chemical pathways,” says Eiler, the Robert P. Sharp Professor of Geology and professor of geochemistry at Caltech.

“The issue with many natural gas deposits is that where you find them, where you go into the ground and drill for the methane, is not where the gas was created. Many of the gases we’re dealing with have moved,” says Stolper. “In making these measurements of temperature, we are able to really, for the first time, say in an independent way, ‘We know the temperature, and thus the environment where this methane was formed.’”

Eiler’s group determines the sources and formation conditions of materials by looking at the distribution of heavy isotopes, species of atoms that have extra neutrons in their nuclei and therefore different chemistry. For example, the most abundant form of carbon is carbon-12, which has six protons and six neutrons in its nucleus; however, about 1 percent of all carbon possesses an extra neutron, making carbon-13. Chemicals compete for these heavy isotopes because the heavy isotopes slow molecular motions, making molecules more stable. But the isotopes are also very rare, so there is a chemical tug-of-war between molecules, which ends up concentrating them in the molecules that benefit most from their stabilizing effects. Similarly, the heavy isotopes like to bind, or “clump,” with each other, meaning that there will be an excess of molecules containing two or more heavy isotopes compared to molecules containing just one. This clumping effect is strong at low temperatures and diminishes at higher temperatures. Therefore, determining how many of the molecules in a sample contain heavy isotopes clumped together can tell you something about the temperature at which the sample formed.
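
To make the clumping bookkeeping concrete, the sketch below computes the stochastic (random-distribution) abundance of 13CH3D expected from a sample’s bulk carbon-13 and deuterium content, and expresses a measured enrichment as the conventional per mil excess. All numeric values are placeholders; converting the excess to a temperature requires the laboratory calibration described below (roughly, the excess scales as 1/T² near equilibrium), which we do not reproduce here.

```python
from math import comb

# Bulk isotope fractions (placeholders near natural abundance).
x13 = 0.0111    # fraction of carbon atoms that are carbon-13
xD = 0.00015    # fraction of hydrogen atoms that are deuterium

# Stochastic probability that a methane molecule is 13CH3D: one carbon-13,
# plus exactly one of the four hydrogen sites occupied by deuterium.
p_stochastic = x13 * comb(4, 1) * xD * (1 - xD) ** 3

# A measured abundance slightly above stochastic, as expected when heavy
# isotopes "clump" at low formation temperature (placeholder enrichment).
p_measured = p_stochastic * 1.005

# Conventional clumped-isotope excess, in per mil.
delta_13ch3d = (p_measured / p_stochastic - 1.0) * 1000.0
print(f"Delta 13CH3D = {delta_13ch3d:.2f} per mil")
```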

Eiler’s group has previously used such a “clumped isotope” technique to determine the body temperatures of dinosaurs, ground temperatures in ancient East Africa, and surface temperatures of early Mars. Those analyses looked at the clumping of carbon-13 and oxygen-18 in various minerals. In the new work, Eiler and his colleagues were able to examine the clumping of carbon-13 and deuterium (hydrogen-2).

The key enabling technology was a new mass spectrometer that the team designed in collaboration with Thermo Fisher, mixing and matching existing technologies to piece together a new platform. The prototype spectrometer, the Thermo IRMS 253 Ultra, is equipped to analyze samples in a way that measures the abundances of several rare versions, or isotopologues, of the methane molecule, including two “clumped isotope” species: 13CH3D, which has both a carbon-13 atom and a deuterium atom, and 12CH2D2, which includes two deuterium atoms.
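
The caption’s point about “closely similar masses” can be made concrete with standard atomic masses: both clumped isotopologues sit near mass 18, separated by only about three thousandths of an atomic mass unit, which dictates the very high mass-resolving power the new instrument needed. A quick sketch (masses from standard isotope tables; the Ultra’s actual operating resolution is not given in the article):

```python
# Standard atomic masses in unified atomic mass units (u).
M_H, M_D = 1.007825, 2.014102
M_12C, M_13C = 12.000000, 13.003355

m_13ch3d = M_13C + 3 * M_H + M_D        # 13CH3D
m_12ch2d2 = M_12C + 2 * M_H + 2 * M_D   # 12CH2D2

dm = abs(m_12ch2d2 - m_13ch3d)
print(f"13CH3D  : {m_13ch3d:.6f} u")
print(f"12CH2D2 : {m_12ch2d2:.6f} u")
print(f"resolving power needed, m/dm: ~{m_13ch3d / dm:,.0f}")
```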

Using the new spectrometer, the researchers first tested gases they made in the laboratory to make sure the method returned the correct formation temperatures.

They then moved on to analyze samples taken from environments where much is known about the conditions under which methane likely formed. For example, sometimes when methane forms in shale, an impermeable rock, it is trapped and stored and cannot migrate from its point of origin. In such cases, detailed knowledge of the temperature history of the rock constrains the possible formation temperature of methane in that rock. Eiler and Stolper analyzed samples of methane from the Haynesville Shale, located in parts of Arkansas, Texas, and Louisiana, where the gas is not thought to have moved much after generation. And indeed, the clumped isotope technique returned a range of temperatures (169°C to 207°C) that corresponds well with current reservoir temperatures (163°C to 190°C). The method was also spot-on for methane collected from gas that formed as a product of oil-eating bugs living on top of oil reserves in the Gulf of Mexico. It returned temperatures of 34°C and 48°C, plus or minus 8°C, for those samples, and the known temperatures of the sampling locations were 42°C and 48°C, respectively.

To further validate the technique, the researchers next looked at methane from the Marcellus Shale, a formation beneath much of the Appalachian basin, where the gas-trapping rock is known to have formed at high temperature before being uplifted into a cooler environment. The scientists wanted to be sure that the methane’s recorded temperature did not reset to the colder conditions after formation. Using their clumped isotope technique, the researchers verified this, returning a high formation temperature.

“It must be that once the methane exists and is stable, it’s a fossil remnant of what its formation environment was like,” Eiler says. “It only remembers where it formed.”

An important application of the technique is suggested by the group’s measurements of methane from the Antrim Shale in Michigan, where groundwater contains both biologically and thermally produced methane. Clumped isotope temperatures returned for samples from the area clearly revealed the different origins of the gases, hitting about 40°C for a biologically produced sample and about 115°C for a sample involving a mix of biologically and thermally produced methane.

“There are many cases where it is unclear whether methane in a sample of groundwater is the product of subsurface biological communities or has leaked from petroleum-forming systems,” says Eiler. “Our results from the Antrim Shale indicate that this clumped isotope technique will be useful for distinguishing between these possible sources.”

One final example, from the Potiguar Basin in Brazil, demonstrates another way the new method will serve geologists. In this case the methane was dissolved in oil and had been free to migrate from its original location. The researchers initially thought there was a problem with their analysis because the temperature they returned was much higher than the known temperature of the oil. However, recent evidence from drill core rocks from the region shows that the deepest parts of the system actually got very hot millions of years ago. This has led to a new interpretation suggesting that the methane gas originated deep in the system at high temperatures and then percolated up and mixed into the oil.

“This shows that our new technique is not just a geothermometer for methane formation,” says Stolper. “It’s also something you can use to think about the geology of the system.”

Another concern arises over groundwater contamination from fracking accidents

The oil and gas extraction method known as hydraulic fracturing, or fracking, could contribute more pollutants to groundwater than past research has suggested, according to a new study in ACS’ journal Environmental Science & Technology. The scientists report that when waste fluids from fracking are spilled or deliberately applied to land, they likely pick up tiny particles in the soil that attract heavy metals and other chemicals, with possible health implications for people and animals.

Tammo S. Steenhuis and colleagues note that fracking, which involves injecting huge volumes of fluids underground to release gas and oil, has led to an energy boom in the U.S. But it has also ignited controversy for many reasons. One in particular involves flowback, the fluid that surges back out of fracked wells during the process; it contains water, lubricants, solvents and other substances from the original fracking fluid or extracted from the shale formation. High-profile spills, and in some places the legal application of these liquids to land, have raised alarms. Research has linked fracking to groundwater contamination that could have major health effects. But another factor that no one has really addressed could play a role: colloids. These tiny pieces of minerals, clay and other particles are a concern because they attract heavy metals and other environmental toxins, and they have been linked to groundwater contamination. Steenhuis’ team set out to take a closer look.

To simulate what would happen to colloids in soil after a fracking spill, the researchers flushed flowback fluids through sand with a known amount of colloids. They found that the fluids dislodged about a third of the colloids, far more than deionized water alone. When they increased the flow rate, the fluids picked up an additional 36 percent. “This indicates that infiltration of flowback fluid could turn soils into an additional source of groundwater contaminants such as heavy metals, radionuclides and microbial pathogens,” the scientists conclude. More research with real soils is planned.

Victoria’s volcano count rises

Geologists have discovered three previously unrecorded volcanoes in volcanically active southeast Australia.

The new Monash University research, published in the Australian Journal of Earth Sciences, gives a detailed picture of an area of volcanic centres already known to geologists in the region.

Covering an area of 19,000 square kilometres in Victoria and South Australia, with over 400 volcanoes, the Newer Volcanics Province (NVP) features the youngest volcanoes in Australia including Mount Schank and Mount Gambier.

Focusing on the Hamilton region, lead researcher Miss Julie Boyce from the School of Geosciences said the surprising discovery means additional volcanic centres may yet be discovered in the NVP.

“Victoria’s latest episode of volcanism began about eight million years ago, and has helped to shape the landscape. The volcanic deposits, including basalt, are among the youngest rocks in Victoria, but most people know little about them,” Miss Boyce said.

“Though it’s been more than 5000 years since the last volcanic eruption in Australia, it’s important that we understand where, when and how these volcanoes erupted. The province is still active, so there may be future eruptions.”

The largest of the previously unrecorded volcanoes is a substantial maar-cone volcanic complex – a broad, low-relief volcanic crater formed by explosions when groundwater comes into contact with hot magma – identified 37 kilometres east of Hamilton.

Miss Boyce said the discoveries were made possible only by analysing a combination of satellite photographs, detailed NASA models of the topography of the area and the distribution of magnetic minerals in the rocks, alongside site visits to build a detailed picture of the Hamilton region of the NVP.

“Traditionally, volcanic sites are analysed by one or two of these techniques. This is the first time that this multifaceted approach has been applied to the NVP and potentially it could be used to study other volcanic provinces worldwide.”

The NVP is considered active, as carbon dioxide is released from the Earth’s mantle in several areas, where there is a large heat anomaly at depth. With an eruption frequency of one volcano every 10,800 years or less, future eruptions may yet occur.
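
To see what a recurrence interval of roughly one eruption every 10,800 years implies, a common first-pass hazard calculation treats eruptions as a Poisson process. The recurrence figure is from the article; the model choice is ours, for illustration only.

```python
import math

recurrence_yr = 10_800             # average interval between NVP eruptions
rate_per_yr = 1.0 / recurrence_yr  # Poisson rate, eruptions per year

for horizon_yr in (100, 1_000, 10_000):
    # Probability of at least one eruption within the time horizon.
    p = 1.0 - math.exp(-rate_per_yr * horizon_yr)
    print(f"P(eruption within {horizon_yr:>6,} yr) ~ {p:.1%}")
```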

It’s hoped that this multifaceted approach will lead to a better understanding of the distribution and nature of volcanism, allowing for more accurate hazard analysis and risk estimates for future eruptions.

Breakthrough provides picture of underground water

Superman isn’t the only one who can see through solid surfaces. In a development that could revolutionize the management of precious groundwater around the world, Stanford researchers have pioneered the use of satellites to accurately measure levels of water stored hundreds of feet below ground. Their findings were published recently in Water Resources Research.

Groundwater provides 25 to 40 percent of all drinking water worldwide, and is the primary source of freshwater in many arid countries, according to the National Groundwater Association. About 60 percent of all withdrawn groundwater goes to crop irrigation. In the United States, the number is closer to 70 percent. In much of the world, however, underground reservoirs or aquifers are poorly managed and rapidly depleted due to a lack of water-level data. Developing useful groundwater models, availability predictions and water budgets is very challenging.

Study co-author Rosemary Knight, a professor of geophysics and senior fellow, by courtesy, at the Stanford Woods Institute for the Environment, compared groundwater use to a mismanaged bank account: “It’s like me saying I’m going to retire and live off my savings without knowing how much is in the account.”

Lead author Jessica Reeves, a postdoctoral scholar in geophysics, extended Knight’s analogy to the connection among farmers who depend on the same groundwater source. “Imagine your account was connected to someone else’s account, and they were withdrawing from it without your knowing.”

Until now, the only way a water manager could gather data about the state of water tables in a watershed was to drill monitoring wells. The process is time- and resource-intensive, especially for confined aquifers, which are deep reservoirs separated from the ground surface by multiple layers of impermeable clay. Even with monitoring wells, good data is not guaranteed: much of the data available from monitoring wells across the American West is old and of varying quality and scientific usefulness. Compounding the problem, not all well data is openly shared.

To solve these challenges, Reeves, Knight, Stanford Woods Institute-affiliated geophysics and electrical engineering Professor Howard Zebker, Stanford civil and environmental engineering Professor Peter Kitanidis and Willem Schreüder of Principia Mathematica Inc. looked to the sky.

The basic concept: data from satellites that use electromagnetic waves to monitor changes in the elevation of Earth’s surface to within a millimeter could be mined for clues about groundwater. The technology, Interferometric Synthetic Aperture Radar (InSAR), had previously been used primarily to collect data on volcanoes, earthquakes and landslides.

With funding from NASA, the researchers used InSAR to make measurements at 15 locations in Colorado’s San Luis Valley, an important agricultural region and flyway for migrating birds. Based on observed changes in Earth’s surface, the scientists compiled water-level measurements for confined aquifers at three of the sampling locations that matched the data from nearby monitoring wells.
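
The chain of inference behind these measurements runs in two steps: interferometric phase is converted to line-of-sight surface displacement (half a radar wavelength per 2π of phase), and that displacement is converted to a change in confined-aquifer head through the elastic storage of the aquifer skeleton. Here is a minimal sketch of the conversion; every numeric value is a placeholder rather than a figure from the study.

```python
import math

# Step 1: interferometric phase to line-of-sight (LOS) displacement.
# Each 2*pi of phase corresponds to half a wavelength of LOS motion
# (sign conventions vary between processing systems).
wavelength_m = 0.056    # typical C-band radar wavelength, about 5.6 cm
phase_rad = 1.8         # unwrapped phase change (placeholder)
los_disp_m = -(wavelength_m / (4.0 * math.pi)) * phase_rad

# Step 2: displacement to head change. For a confined aquifer deforming
# elastically, surface displacement ~ S_ske * (head change), where S_ske
# is the skeletal storage coefficient.
S_ske = 5e-4            # placeholder; in practice calibrated against wells
delta_head_m = los_disp_m / S_ske

print(f"LOS displacement   : {los_disp_m * 1000:+.1f} mm")
print(f"implied head change: {delta_head_m:+.1f} m")
```

Calibrating that storage coefficient against nearby monitoring wells, as the team did at three of its sampling locations, is what anchors the conversion to real water levels.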

“If we can get this working in between wells, we can measure groundwater levels across vast areas without using lots of on-the-ground monitors,” Reeves said.

The breakthrough holds the potential for giving resource managers in Colorado and elsewhere valuable data as they build models to assess scenarios such as the effect on groundwater from population increases and droughts.

Just as computers and smartphones inevitably get faster, satellite data will only improve. That means more and better data for monitoring and managing groundwater. Eventually, InSAR data could play a vital role in measuring seasonal changes in groundwater supply and help determine levels for sustainable water use.

In the meantime, Knight envisions a Stanford-based, user-friendly online database that consolidates InSAR findings and a range of other current remote sensing data for soil moisture, precipitation and other components of a water budget. “Very few, if any, groundwater managers are tapping into any of the data,” Knight said. With Zebker, postdoctoral fellow Jingyi Chen and colleagues at the University of South Carolina, Knight recently submitted a grant proposal for this concept to NASA.
