Harmful particles in Icelandic volcanic ash fell first, says new research

The type of particles which are most harmful to jet engines were the first to fall out of the Eyjafjallajökull ash plume following the volcano’s eruption in 2010, delegates at the Goldschmidt conference will be told today (Wednesday 28th August).

The research, led by Dr Bernard Grobety of the University of Fribourg in Switzerland, will help to mitigate the impact of future volcanic eruptions on air travel.

Dr Grobety’s team analysed samples of volcanic ash taken at different points in its journey from the volcano across Europe. They found that the two different forms of the ash particles – crystalline and glassy – behaved differently during transport. As the plume travelled through the air, the denser crystalline particles fell out of the cloud earlier than glassy particles of the same size.

“It’s already known that the larger, heavier particles in an ash cloud will be the first to fall out as the cloud travels away from a volcano,” explains Dr Grobety. “It is also clear that particles of equal size but higher density will fall out faster. Our research, however, is the first to show the faster loss of crystalline particles from a volcanic cloud, and that the overall composition of the ash therefore changes during transport. Since crystalline particles are harder and melt at higher temperatures, they are more harmful to jet engines than glassy particles. Understanding the behaviour of these different forms in the ash cloud will enable the authorities to fine-tune their response should another volcanic eruption take place.”
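The density effect Dr Grobety describes can be illustrated with Stokes’ law for small particles settling in air. The densities and particle size below are illustrative assumptions, not measurements from the study:

```python
# Stokes' law settling velocity for small spheres in air -- a sketch
# of why denser crystalline grains fall out before equal-sized glassy
# shards. All numbers here are assumed for illustration.
def stokes_velocity(rho_particle, radius_m,
                    rho_air=1.2, mu_air=1.8e-5, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere in air."""
    return 2.0 * (rho_particle - rho_air) * g * radius_m**2 / (9.0 * mu_air)

r = 25e-6  # a 50-micron-diameter ash particle (assumed)
v_crystalline = stokes_velocity(3000.0, r)  # denser crystalline grain
v_glassy = stokes_velocity(2400.0, r)       # less dense glassy shard

print(f"crystalline: {v_crystalline:.3f} m/s")
print(f"glassy:      {v_glassy:.3f} m/s")
print(f"ratio:       {v_crystalline / v_glassy:.2f}")
```

With these assumed densities, the crystalline grain settles roughly 25 per cent faster than a glassy shard of the same size, which is the sorting effect observed in the plume.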

The 2010 eruption of Eyjafjallajökull caused air traffic in Europe to be grounded for six days, with widespread disruption in over 20 countries. There has been extensive research since 2010 to reduce the impact of future eruptions, but much of the research treats the ash cloud as homogenous, focusing on its concentration and the size of particles within it. Dr Grobety’s research adds another layer of detail which could reduce the impact of any eruption still further.

“We’re already at the point where we can say that if the ash is at a certain concentration and a certain particle size, it poses no threat to aircraft,” says Dr Grobety. “However, it’s possible that even at a higher concentration, if no crystalline particles are present, planes may still be safe to fly. Monitoring how quickly these particles fall out of the cloud could reduce the area affected or allow restrictions to be lifted sooner.

“However, there are a lot of factors which will determine the impact of any future eruptions – from the nature of the eruption itself, to the prevailing winds and the concentration of the ash. While the detail we’re able to provide may be only one of many factors to take into account, anything that can limit the disruption to air travel has to be worth looking at.”

Study finds earlier peak for Spain’s glaciers

Jane Willenbring (upper right) takes samples to date a boulder in Spain’s Bejar mountain range. Her findings helped show that ancient glaciers in the region reached their maximum size several thousands of years earlier than once believed. – University of Pennsylvania

The last glacial maximum was a time when Earth’s far northern and far southern latitudes were largely covered in ice sheets and sea levels were low. Over much of the planet, glaciers were at their greatest extent roughly 20,000 years ago. But according to a study headed by University of Pennsylvania geologist Jane Willenbring, that wasn’t true in at least one part of southern Europe. Due to local effects of temperature and precipitation, the local glacial maximum occurred considerably earlier, around 26,000 years ago.

The finding sheds new light on how regional climate has varied over time, providing information that could lead to more-accurate global climate models, which predict what changes Earth will experience in the future.

Willenbring, an assistant professor in Penn’s Department of Earth and Environmental Science in the School of Arts and Sciences, teamed with researchers from Spain, the United Kingdom, China and the United States to pursue this study of the ancient glaciers of southern Europe.

“We wanted to unravel why and when glaciers grow and shrink,” Willenbring said.

In the study site in central Spain, it is relatively straightforward to discern the size of ancient glaciers, because the ice carried and dropped boulders at the margin. Thus a ring of boulders marks the edge of the old glacier.

It is not as easy to determine what caused the glacier to grow, however. Glaciers need both moisture and cold temperatures to expand. Studying the boulders that rim the ancient glaciers alone cannot distinguish these contributions. Caves, however, provide a way to differentiate the two factors. Stalagmites and stalactites – the stony projections that grow from the cave floor and ceiling, respectively – carry a record of precipitation because they grow as a result of dripping water.

“If you add the cave data to the data from the glaciers, it gives you a neat way of figuring out whether it was cold temperatures or higher precipitation that drove the glacier growth at the time,” Willenbring said.

The researchers conducted the study in three of Spain’s mountain ranges: the Bejár, Gredos and Guadarrama. The nearby Eagle Cave allowed them to obtain indirect precipitation data.

To ascertain the age of the boulders strewn by the glaciers and thus come up with a date when glaciers were at their greatest extent, Willenbring and colleagues used a technique known as cosmogenic nuclide exposure dating, which measures the chemical residue of supernova explosions. They also used standard radiometric techniques to date stalagmites from Eagle Cave, which gave them information about fluxes in precipitation during the time the glaciers covered the land.
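The principle behind exposure dating can be sketched briefly: cosmic rays (energetic particles ultimately accelerated by supernova explosions) produce rare nuclides such as beryllium-10 in a boulder’s surface at a roughly constant rate, so the measured concentration records how long the rock has been exposed. The production rate and concentration below are illustrative assumptions, not data from the study:

```python
# Hedged sketch of cosmogenic nuclide exposure dating with 10Be.
# N = (P / lambda) * (1 - exp(-lambda * t)), solved for the age t.
# Production rate P and concentration N below are assumed values.
import math

HALF_LIFE_10BE = 1.39e6                 # years
LAMBDA = math.log(2) / HALF_LIFE_10BE   # decay constant, 1/years

def exposure_age(n_atoms_per_g, production_rate):
    """Exposure age in years, accounting for radioactive decay."""
    return -math.log(1.0 - n_atoms_per_g * LAMBDA / production_rate) / LAMBDA

P = 5.0      # atoms of 10Be per gram of quartz per year (assumed)
N = 1.3e5    # measured concentration, atoms per gram (assumed)
age = exposure_age(N, P)
print(f"exposure age: about {age:,.0f} years")
```

With these assumed values the boulder would have been exposed for roughly 26,000 years, the timescale relevant to the Spanish moraines.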

“Previously, people believed the last glacial maximum was somewhere in the range of 19,000-23,000 years ago,” Willenbring said. “Our chronology indicates that it’s more in the range of 25,000-29,000 years ago in Spain.”

The geologists found that, although temperatures were cool in the range of 19,000-23,000 years ago, conditions were also relatively dry, so the glaciers did not regain the size they had attained several thousand years earlier, when rain and snowfall totals were higher. They reported their findings in the journal Scientific Reports.

Given the revised timeline in this region, Willenbring and colleagues determined that the increased precipitation resulted from changes in the intensity of the sun’s radiation reaching the Earth, which depends on the tilt of the planet’s axis and the geometry of its orbit. Such changes can affect patterns of wind, temperature and storms.

“That probably means there was a southward shift of the North Atlantic Polar Front, which caused storm tracks to move south, too,” Willenbring said. “Also, at this time there was a nice warm source of precipitation, unlike before and after when the ocean was colder.”

Willenbring noted that the new date for the glacier maximum in the Mediterranean region, which is several thousand years earlier than the date the maximum was reached in central Europe, will help provide more context for creating accurate global climate models.

“It’s important for global climate models to be able to test under what conditions precipitation changes and when sources for that precipitation change,” she said. “That’s particularly true in some of these arid regions, like the American Southwest and the Mediterranean.”

When glaciers were peaking in the Mediterranean around 26,000 years ago, the American Southwest was experiencing similar conditions. Areas that are now desert were moist. Large lakes abounded, including Lake Bonneville, which covered much of modern-day Utah. The state’s Great Salt Lake is what remains.

“Lakes in this area were really high for 5,000-10,000 years, and the cause for that has always been a mystery,” Willenbring said. “By looking at what was happening in the Mediterranean, we might eventually be able to say something about the conditions that led to these lakes in the Southwest, too.”

Earthquakes and tectonics in Pamir Tien Shan

Earthquake damage to buildings is caused mainly by shear waves, which transfer their energy to structures during an earthquake. These shear waves are significantly influenced by the subsurface and by the topography of the surrounding area. Detailed knowledge of the landform and the near-surface underground structure is therefore an important prerequisite for local seismic hazard assessment and for evaluating the site effect, which can strongly modify and amplify local ground motion.

As described in the latest issue of Geophysical Journal International, a team of scientists from the GFZ German Research Center for Geosciences has shown that it is possible to map complex shear wave velocity structures almost in real time by means of a newly developed tomographic approach.

The method is based on recordings and analysis of ambient seismic noise. “We use small-amplitude, barely noticeable natural ground motions as well as anthropogenic ground vibrations”, Marco Pilz, a scientist at GFZ, explains. “With the help of these small signals we can obtain detailed images of the shallow seismic velocity structure”. In particular, changes in subsurface velocity caused by earthquakes and landslides can be imaged in almost real time.

“What is new about our method is the direct calculation of the shear wave velocity. Moreover, we are working on a local, small-scale level — compared to many other studies”, Marco Pilz continues.

This method has already been successfully applied: many regions of Central Asia are threatened by landslides. Since shear wave velocity usually drops significantly before a landslide slips, the technique offers the chance to monitor landslide-prone areas almost in real time.

The method can also be applied in earthquake research. The authors were able to map in detail a section of the Issyk-Ata fault in Kyrgyzstan, which runs along the southern border of the capital, Bishkek, a city of approximately 900,000 inhabitants. They showed that, close to the surface, the mapped section splits into two small fault branches. This can influence the speed at which a rupture propagates along the main fault, or even halt it.

Central Asia is highly prone to earthquakes; the accompanying processes and risks are investigated by the Central-Asian Institute of Applied Geosciences (CAIAG) in Bishkek, a joint institution established by the GFZ and the Kyrgyz government.

Why do these earthquakes occur?

The Pamir and Tien Shan mountain ranges are the result of the collision of two continental plates: India and Eurasia. This process is still ongoing today and breaks the Earth’s crust, producing earthquakes as a consequence.

A second group of GFZ scientists, together with colleagues from Tajikistan and CAIAG, has investigated the tectonics of the collision in this region. For the first time, they were able to image continental crust descending into the Earth’s mantle. In the scientific journal Earth and Planetary Science Letters, the scientists report that such subduction of continental crust had never before been directly observed. To make their images, they applied a special seismological method (so-called receiver function analysis) to seismograms collected during a two-year field experiment in the Tien Shan-Pamir-Hindu Kush area, where the collision of the Indian and Eurasian plates reaches an extreme scale.

“These extreme conditions cause the Eurasian lower crust to subduct into the Earth’s mantle”, explains Felix Schneider from the GFZ German Research Centre for Geosciences. “Such subduction can normally only be observed during the collision of oceanic crust with continental crust, as ocean floors are heavier than continental rock.”

Findings at the surface of metamorphic rocks that must have formed under ultra-high pressures deep in the Earth’s mantle also provide evidence for subduction of continental crust in the Pamir region. A further question is how the occurrence of numerous earthquakes at unusual depths of down to 300 km in the upper mantle can be explained. The observation of the subducting Eurasian lower crust may, however, solve this puzzle.

Mathematical models help locate raw materials

This image shows Dr. Raimon Tolosana-Delgado. – HZDR

The engineering geologist Dr. Raimon Tolosana-Delgado from the Helmholtz Institute Freiberg for Resource Technology (HIF) at the Helmholtz-Zentrum Dresden-Rossendorf will receive this year’s Felix Chayes Prize from the International Association for Mathematical Geosciences (IAMG). He is to be honored for his mathematical models of the development of rocks and sediments, which might be helpful in indicating where crude oil and other raw material deposits are located. The prize, which is endowed with US $7,000, will be presented at the IAMG’s annual conference to be held in Madrid, Spain, between September 2 and 6, 2013.

Raimon Tolosana-Delgado is particularly delighted since he will receive the Felix Chayes Prize in his home country, Spain. The scientist has been working at the HIF since October 2012. He completed his studies and his doctoral dissertation at different Spanish universities. During this time, he also carried out a five-month predoctoral research stay at the Ernst Moritz Arndt University of Greifswald. He then worked for several years at the Georg-August-Universität Goettingen and at the Technical University of Catalonia (UPC), one of the largest technical universities in Spain.

Raimon Tolosana-Delgado uses statistical calculations to ascertain how rocks are formed and how specific types of sediments are created through erosion and transportation of rocks by glaciers and rivers. Rocks and sediments are the source of many raw materials on Earth, whether energy carriers such as crude oil or mineral and metalliferous resources. Numerous factors play a role in this, for example, moderately high pressures and temperatures; but the climate and the environment also have an impact on the formation of raw material deposits. “If we can understand and model these conditions, then this will help us assess where we can find these raw materials,” he says. The Spanish scientist is above all interested in the formation of these materials. “It’s my goal to develop general models which can be used for different raw materials.”

So far, Raimon Tolosana-Delgado’s primary focus has been on the characteristic properties and the development of sediments as well as the associated chemical processes. They are particularly relevant when it comes to the formation of oil deposits. Similar data are also needed in order to determine which rocks might have minerals in them considered as valuable industrial resources. “In the future, quantitative mathematical models could help improve the exploration of deposits and the assessment of whether a particular source of raw materials is significant enough or too small. These models could also be important because the easy-to-find, near-surface deposits have essentially been found, and future explorations have to be carried out at greater depths,” notes the researcher. But the models could also permit the development of more efficient methods for the extraction of raw materials.

3-D Earth model developed at Sandia Labs more accurately pinpoints source of earthquakes, explosions

Sandia National Laboratories researcher Sandy Ballard and colleagues from Sandia and Los Alamos National Laboratory have developed SALSA3D, a 3-D model of the Earth’s mantle and crust designed to help pinpoint the location of all types of explosions. – Photo by Randy Montoya, Sandia National Laboratories

During the Cold War, U.S. and international monitoring agencies could spot nuclear tests and focused on measuring their sizes. Today, they’re looking around the globe to pinpoint much smaller explosives tests.

Under the sponsorship of the National Nuclear Security Administration’s Office of Defense Nuclear Nonproliferation R&D, Sandia National Laboratories and Los Alamos National Laboratory have partnered to develop a 3-D model of the Earth’s mantle and crust called SALSA3D, or Sandia-Los Alamos 3D. The purpose of the model is to help the U.S. Air Force and the international Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna, Austria, more accurately locate all types of explosions.

The model uses a scalable triangular tessellation and seismic tomography to map the Earth’s “compressional wave seismic velocity,” a property of the rocks and other materials inside the Earth that indicates how quickly compressional waves travel through them and is one way to accurately locate seismic events, Sandia geophysicist Sandy Ballard said. Compressional waves – the first waves to arrive after a seismic event – move the particles in rocks and other materials minute distances backward and forward between the location of the event and the station detecting it.

SALSA3D also reduces the uncertainty in the model’s predictions, an important feature for decision-makers who must take action when suspicious activity is detected, he added.

“When you have an earthquake or nuclear explosion, not only do you need to know where it happened, but also how well you know that. That’s a difficult problem for these big 3-D models. It’s mainly a computational problem,” Ballard said. “The math is not so tough, just getting it done is hard, and we’ve accomplished that.”

A Sandia team has been writing and refining code for the model since 2007 and is now demonstrating SALSA3D is more accurate than current models.

In recent tests, SALSA3D was able to predict the source of seismic events over a geographical area that was 26 percent smaller than the traditional one-dimensional model and 9 percent smaller than a recently developed Regional Seismic Travel Time (RSTT) model used with the one-dimensional model.

GeoTess software release

Sandia recently released SALSA3D’s framework – the triangular tessellated grid on which the model is built – to other Earth scientists, seismologists and the public. By standardizing the framework, the seismological research community can more easily share models of the Earth’s structure and global monitoring agencies can better test different models. Both activities are hampered by the plethora of models available today, Ballard said.

“GeoTess makes models compatible and standardizes everything,” he said. “This would really facilitate sharing of different models, if everyone agreed on it.”

Seismologists and researchers worldwide can now download GeoTess, which provides a common model parameterization for multidimensional Earth models and a software support system that addresses the construction, population, storage and interrogation of data stored in the model. GeoTess is not specific to any particular data, so users have considerable flexibility in how they store information in the model. The free package, including source code, is being released under the very liberal BSD Open Source License. The code is available in Java and C++, with interfaces to the C++ version written in C and Fortran90. GeoTess has been tested on multiple platforms, including Linux, SunOS, MacOSX and Windows.

When an explosion goes off, the energy travels through the Earth as waves that are picked up by seismometers at U.S. and international ground monitoring stations associated with nuclear explosion monitoring organizations worldwide. Scientists use these signals to determine the location.

They first predict the time taken for the waves to travel from their source through the Earth to each station. To calculate that, they have to know the seismic velocity of the Earth’s materials from the crust to the inner core, Ballard said.

“If you have material that has very high seismic velocity, the waves travel very quickly, but the energy travels less quickly through other kinds of materials, so it takes the signals longer to travel from the source to the receiver,” he says.
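The relationship Ballard describes can be sketched with a toy layered model: the predicted travel time is simply the path length divided by the velocity, summed over the materials the wave crosses. The layer thicknesses and velocities below are invented for illustration, not taken from any real Earth model:

```python
# Toy travel-time prediction: path length / velocity, summed over
# the layers a wave crosses. All values here are illustrative.
layers = [
    # (path length through layer in km, compressional velocity in km/s)
    (35.0, 6.0),     # crust: relatively slow
    (600.0, 8.5),    # upper mantle
    (2000.0, 12.0),  # lower mantle: fast
]

travel_time = sum(dist / vel for dist, vel in layers)
print(f"predicted travel time: {travel_time:.1f} s")
```

Most of the time is spent in the thick, fast lower mantle, which is why one-dimensional models do well for distant events but poorly for nearby ones, where the slow, variable crust dominates.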

For the past 100 years, seismologists have predicted the travel time of seismic energy from source to receiver using one-dimensional models. These models, which are still widely used today, account only for radial variations in seismic velocity and ignore variations in geographic directions. They yield seismic event locations that are reasonably accurate, but not nearly as precise as locations calculated with high fidelity 3-D models.

Modern 3-D models of the Earth, like SALSA3D, account for distortions of the seismic wavefronts caused by minor lateral differences in the properties of rocks and other materials.

For example, waves are distorted when they move through a geological feature called a subduction zone, such as the one beneath the west coast of South America where one tectonic plate under the Pacific Ocean is diving underneath the Andes Mountains. This happens at about the rate at which fingernails grow, but, geologically speaking, that’s fast, Ballard said.

One-dimensional models, like the widely used ak135 developed in the 1990s, are good at predicting the travel time of waves when the distance from the source to the receiver is large because these waves spend most of their time traveling through the deepest, most homogenous parts of the Earth. They don’t do so well at predicting travel time to nearby events where the waves spend most of their time in the Earth’s crust or the shallowest parts of the mantle, both of which contain a larger variety of materials than the lower mantle and the Earth’s core.

RSTT, a previous model developed jointly by Sandia, Los Alamos and Lawrence Livermore national laboratories, tried to solve that problem and works best at ranges of about 60-1,200 miles (100-2,000 kilometers).

Still, “the biggest errors we get are close to the surface of the Earth. That’s where the most variability in materials is,” Ballard said.

Seismic tomography gives SALSA3D accuracy

Today, Earth scientists are mapping three dimensions: the radius, latitude and longitude.

Anyone who’s studied a globe or world atlas knows that the traditional grid of longitudinal and latitudinal lines works all right the closer you are to the equator, but at the poles, the lines are too close together. For nuclear explosion monitoring, Earth models must accurately characterize the polar regions even though they are remote, because seismic waves travel under them, Ballard said.

Triangular tessellation solves that with nodes – the intersections of the triangles – that can be modeled accurately even at the poles. The triangles can be smaller where more detail is needed and larger in areas that require less detail, like the oceans. In addition, the model extends into the Earth like columns of stacked pieces of pie without the rounded crust edges.
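The idea can be sketched in a few lines of code. The version below starts from an octahedron rather than SALSA3D’s actual grid construction, which may differ, but it shows the key property: each refinement splits every triangle into four, and node spacing stays even everywhere on the sphere, poles included:

```python
# Illustrative triangular tessellation of the unit sphere: start from
# an octahedron and repeatedly split each triangle into four,
# projecting the new nodes onto the sphere. Not SALSA3D's actual grid.
def normalize(p):
    n = sum(c * c for c in p) ** 0.5
    return tuple(c / n for c in p)

# Octahedron: 6 vertices and 8 triangular faces on the unit sphere.
V = [(1.0, 0, 0), (-1.0, 0, 0), (0, 1.0, 0),
     (0, -1.0, 0), (0, 0, 1.0), (0, 0, -1.0)]
F = [(0, 2, 4), (2, 1, 4), (1, 3, 4), (3, 0, 4),
     (2, 0, 5), (1, 2, 5), (3, 1, 5), (0, 3, 5)]

def subdivide(verts, faces):
    """Split every triangle into four, placing new nodes on the sphere."""
    verts = list(verts)
    cache = {}  # shared edges reuse the same midpoint node
    def mid(i, j):
        key = (min(i, j), max(i, j))
        if key not in cache:
            a, b = verts[i], verts[j]
            verts.append(normalize(tuple((x + y) / 2 for x, y in zip(a, b))))
            cache[key] = len(verts) - 1
        return cache[key]
    new_faces = []
    for a, b, c in faces:
        ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
        new_faces += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return verts, new_faces

for level in range(3):
    print(f"level {level}: {len(V)} nodes, {len(F)} triangles")
    V, F = subdivide(V, F)
```

In a real model the refinement would be applied selectively, so small triangles cover regions needing detail while oceans keep coarse ones.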

The way Sandia calculates the seismic velocities uses the same math that is used to detect a tumor in an MRI, except on a global, rather than a human, scale.

Sandia uses historical data from 118,000 earthquakes and 13,000 current and former monitoring stations worldwide collected by Los Alamos Lab’s Ground Truth catalog.

“We apply a process called seismic tomography where we take millions of observed travel times and invert them for the seismic velocities that would create that data set. It’s mathematically similar to doing linear regression, but on steroids,” Ballard says. Linear regression is a simple mathematical way to model the relationship between a known variable and one or more unknown variables. Because the Sandia team models hundreds of thousands of unknown variables, they apply a mathematical method called least squares to minimize the discrepancies between the data from previous seismic events and the predictions.
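The inversion can be shown in miniature. Travel time along a ray is the sum of path length times slowness (one over velocity) across the cells it crosses, so many observed times form a linear system that least squares can solve. The four-cell “Earth” and ray geometry below are invented purely for illustration:

```python
# Toy seismic tomography: travel times t = G s, where each row of G
# holds the path lengths of one ray through each cell and s is the
# unknown slowness (1/velocity) per cell. Invert with least squares.
import numpy as np

# True slownesses (s/km) of four cells in a 2x2 grid, to be recovered.
s_true = np.array([0.10, 0.125, 0.125, 0.20])

# Each row: path length (km) of one ray through each of the 4 cells.
G = np.array([
    [10.0, 10.0,  0.0,  0.0],   # ray crossing the top row
    [ 0.0,  0.0, 10.0, 10.0],   # ray crossing the bottom row
    [10.0,  0.0, 10.0,  0.0],   # ray down the left column
    [ 0.0, 10.0,  0.0, 10.0],   # ray down the right column
    [14.1,  0.0,  0.0, 14.1],   # diagonal ray
])

t_obs = G @ s_true  # observed travel times (noise-free toy data)

# Least-squares inversion for the slowness of every cell.
s_est, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
print("estimated velocities (km/s):", np.round(1.0 / s_est, 2))
```

The real problem is the same linear algebra with millions of rays and hundreds of thousands of nodes, which is why it needs a distributed computer network rather than one machine.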

With 10 million data points, Sandia uses a distributed computer network with about 400 core processors to characterize the seismic velocity at every node.

Monitoring agencies could use SALSA3D to precompute the travel time from each station in their network to every point on Earth. When it comes time to compute the location of a new seismic event in real-time, source-to-receiver travel times can be computed in a millisecond and pinpoint the energy’s source in about a second, he said.

Uncertainty modeling a SALSA3D feature

But no model is perfect, so Sandia has developed a way to measure the uncertainty in each prediction SALSA3D makes, based on uncertainty in the velocity at each node and how that uncertainty affects the travel time prediction of each wave from a seismic event to each monitoring station.

SALSA3D estimates for the users at monitoring stations the most likely location of a seismic event and the amount of uncertainty in the answer to help inform their decisions.

International test ban treaties limit on-site inspections to a 1,000-square-kilometer (385-square-mile) area surrounding a suspected nuclear test site. Today, 3-D Earth models like SALSA3D are helping to meet, and often do significantly better than, this requirement in most parts of the world.

“It’s extremely difficult to do because the problem is so large,” Ballard said. “But we’ve got to know it within 1,000 square kilometers or they might search in the wrong place.”

Rising mountains, cooling oceans prompted spread of invasive species 450 million years ago

This slab of rock contains fossils of invasive species that populated the continent of Laurentia 450 million years ago after a major ecological shift occurred. Ohio University geologists found that rising mountains and cooling oceans prompted the spread of these invasive species. -  Alycia Stigall
This slab of rock contains fossils of invasive species that populated the continent of Laurentia 450 million years ago after a major ecological shift occurred. Ohio University geologists found that rising mountains and cooling oceans prompted the spread of these invasive species. – Alycia Stigall

New Ohio University research suggests that the rise of an early phase of the Appalachian Mountains and cooling oceans allowed invasive species to upset the North American ecosystem 450 million years ago.

The study, published recently in the journal PLOS ONE, took a closer look at a dramatic ecological shift captured in the fossil record during the Ordovician period. Ohio University scientists argue that major geological developments triggered evolutionary changes in the ancient seas, which were dominated by organisms such as brachiopods, corals, trilobites and crinoids.

During this period, North America was part of an ancient continent called Laurentia that sat near the equator and had a tropical climate. Shifting of the Earth’s tectonic plates gave rise to the Taconic Mountains, which were forerunners of the Appalachian Mountains. The geological shift left a depression behind the mountain range, flooding the area with cool water from the surrounding deep ocean.

Scientists knew that there was a massive influx of invasive species into this ocean basin during this time period, but didn’t know where the invaders came from or how they got a foothold in the ecosystem, said Alycia Stigall, an Ohio University associate professor of geological sciences who co-authored the paper with former Ohio University graduate student David Wright, now a doctoral student at Ohio State University.

“The rocks of this time record a major oceanographic shift, pulse of mountain building and a change in evolutionary dynamics coincident with each other,” Stigall said. “We are interested in examining the interactions between these factors.”

Using the fossils of 53 species of brachiopods that dominated the Laurentian ecosystem, Stigall and Wright created several phylogenies, or trees of reconstructed evolutionary relationships, to examine how individual speciation events occurred.

The invaders that proliferated during this time period were species within the groups of animals that inhabited Laurentia, Stigall explained. Within the brachiopods, corals and cephalopods, for example, some species are invasive and some are not.

As the geological changes slowly played out over the course of a million years, two patterns of survival emerged, the scientists report.

During the early stage of mountain building and ocean cooling, the native organisms became geographically divided, slowly evolving into different species suited for these niche habitats. This process, called vicariance, is the typical method by which new species originate on Earth, Stigall said.

As the geological changes progressed, however, species from other regions of the continent began to directly invade habitats, a process called dispersal. Although biodiversity may initially increase, this process decreases biodiversity in the long term, Stigall explained, because it allows a few aggressive species to populate many sites quickly, dominating those ecosystems.

This is the second time that Stigall and her team have found this pattern of speciation in the geological record. A study published in 2010 on the invasive species that prompted a mass extinction during the Devonian period about 375 million years ago also discovered a shift from vicariance to dispersal that contributed to a decline in biodiversity, Stigall noted.

It’s a pattern that’s happening during our modern biodiversity crisis as well, she said.

“Only one out of 10 invaders truly becomes an invasive species. Understanding the process can help determine where to put conservation resources,” she said.

X-ray vision to detect unseen gold

Powerful x-rays can now be used to rapidly and accurately detect gold in ore samples, thanks to a new technique developed by CSIRO – a move that could save Australia’s minerals industry hundreds of millions of dollars each year.

CSIRO has conducted a pilot study that shows that gamma-activation analysis (GAA) offers a much faster, more accurate way to detect gold than traditional chemical analysis methods.

This will mean mining companies can measure what’s coming in and out of their processing plants with greater accuracy, allowing them to monitor process performance and recover small traces of gold – worth millions of dollars – that would otherwise be discarded.

GAA works by scanning mineral samples – typically weighing around half a kilogram – using high-energy x-rays similar to those used to treat patients in hospitals. The x-rays activate any gold in the sample, and the activation is then picked up using a sensitive detector.

According to project leader Dr James Tickner, CSIRO’s study showed that this method is two-to-three times more accurate than the standard industry technique ‘fire assay’, which requires samples to be heated up to 1200°C.

“The big challenge for this project was to push the sensitivity of GAA to detect gold at much lower levels – well below a threshold of one gram per tonne,” he says.

Dr Tickner explains that a gold processing plant may only recover between 65 and 85 per cent of gold present in mined rock. Given a typical plant produces around A$1 billion of gold each year, this means hundreds of millions of dollars worth of gold is going to waste.

“Our experience suggests that better process monitoring can help reduce this loss by about a third,” he says.

Last year, Australia produced over A$10 billion worth of gold. Even if GAA only led to a modest 5 per cent improvement in recovery, that would be worth half a billion dollars annually to the industry.
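The figures quoted above can be checked with some back-of-the-envelope arithmetic. The sketch below, in Python, uses only the article's numbers; the helper function is illustrative, not part of CSIRO's analysis:

```python
# Back-of-the-envelope check of the gold-recovery figures quoted above.
# All dollar values are Australian dollars.

def unrecovered_value(annual_output, recovery_fraction):
    """Value of gold left unrecovered, given the value of gold a plant
    actually produces and its recovery efficiency."""
    # If the plant recovers gold worth `annual_output` at this efficiency,
    # the gold contained in the feed is worth annual_output / recovery_fraction.
    contained = annual_output / recovery_fraction
    return contained - annual_output

# A typical plant producing ~A$1 billion of gold per year at 65-85% recovery:
loss_at_85 = unrecovered_value(1_000_000_000, 0.85)  # ~A$176 million
loss_at_65 = unrecovered_value(1_000_000_000, 0.65)  # ~A$538 million
# Consistent with "hundreds of millions of dollars" going to waste per plant.

# Industry-wide: a 5 per cent improvement on A$10 billion of annual production.
industry_gain = 10_000_000_000 * 0.05                # A$500 million per year
```

This also makes the "reduce this loss by about a third" remark concrete: a third of A$176–538 million is roughly A$60–180 million per plant per year.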

Dr Tickner says that the other major benefit of GAA is that it is easily automated, allowing for much quicker analysis of ore samples.

“Fire assay usually involves sending samples off to a central lab and waiting several days for the results. Using GAA we can do the analysis in a matter of minutes, allowing companies to respond much more quickly to the data they’re collecting.”

“A compact GAA facility could even be trucked out to remote sites for rapid, on-the-spot analysis.”

Another great advantage of GAA is that it is more sustainable – unlike fire assay it doesn’t require the use of heavy metals such as lead.

It is also very adaptable. “While most of the work we’ve done has been based on the gold industry, the technique can be modified for other valuable commodities such as silver, lead, zinc, tin, copper and the platinum group metals.”

Now that the research team has proved the effectiveness of the technique, their next goal is to partner with local and international companies in order to get a full-scale analysis facility up and running in Australia. They hope to achieve this within the next two years.

Molten magma can survive in upper crust for hundreds of millennia

The formations in the Grand Canyon of the Yellowstone, in Yellowstone National Park, are an example of silica-rich volcanic rock. – Sarah Gelman/University of Washington

Reservoirs of silica-rich magma – the kind that causes the most explosive volcanic eruptions – can persist in Earth’s upper crust for hundreds of thousands of years without triggering an eruption, according to new University of Washington modeling research.

That means an area known to have experienced a massive volcanic eruption in the past, such as Yellowstone National Park, could have a large pool of magma festering beneath it and still not be close to going off as it did 600,000 years ago.

“You might expect to see a stewing magma chamber for a long period of time and it doesn’t necessarily mean an eruption is imminent,” said Sarah Gelman, a UW doctoral student in Earth and space sciences.

Recent research models have suggested that reservoirs of silica-rich magma, or molten rock, form and survive for geologically short time scales – tens of thousands of years – in the Earth’s cold upper crust before they solidify. They also suggested that the magma had to be injected into the Earth’s crust at a high rate to reach a volume and pressure large enough to cause an eruption.

But Gelman and her collaborators took the models further, incorporating changes in the crystallization behavior of silica-rich magma in the upper crust and temperature-dependent heat conductivity. They found that the magma could accumulate more slowly and remain molten for a much longer period than the models previously suggested.

Gelman is the lead author of a paper explaining the research published in the July edition of Geology. Co-authors are Francisco Gutiérrez, a former UW doctoral student now with Universidad de Chile in Santiago, and Olivier Bachmann, a former UW faculty member now with the Swiss Federal Institute of Technology in Zurich.

There are two different kinds of magma and their relationship to one another is unclear. Plutonic magma freezes in the Earth’s crust and never erupts, but rather becomes a craggy granite formation like those commonly seen in Yosemite National Park. Volcanic magma is associated with eruptions, whether continuous “oozing” types of eruption such as Hawaii’s Kilauea Volcano or more explosive eruptions such as Mount Pinatubo in the Philippines or Mount St. Helens in Washington state.

Some scientists have suggested that plutonic formations are what remain in the crust after major eruptions eject volcanic material. Gelman believes it is possible that magma chambers in the Earth’s crust could consist of a core of partially molten material feeding volcanoes surrounded by more crystalline regions that ultimately turn into plutonic rock. It is also possible the two rock types develop independently, but those questions remain to be answered, she said.

The new work suggests that molten magma reservoirs in the crust can persist for far longer than some scientists believe. Silica content is a way of judging how the magma has been affected by being in the crust, Gelman said. As the magma is forced up a column from lower in the Earth to the crust, it begins to crystallize. Crystals start to drop out as the magma moves higher, leaving the remaining molten rock with higher silica content.

“These time scales are in the hundreds of thousands, even up to a million, years and these chambers can sit there for that long,” she said.

Even if the molten magma begins to solidify before it erupts, that is a long process, she added. As the magma cools, more crystals form, giving the rock a kind of mushy consistency. It is still molten and capable of erupting, but it will behave differently than magma that is much hotter and has fewer crystals.

The implications are significant for volcanic “arcs,” found near subduction zones where one of Earth’s tectonic plates is diving beneath another. Arcs are found in various parts of the world, including the Andes Mountains of South America and the Cascades Range of the Pacific Northwest.

Scientists have developed techniques to detect magma pools beneath these arcs, but they cannot determine how long the reservoirs have been there. Because volcanic magma becomes more silica-rich with time, its explosive potential increases.

“If you see melt in an area, it’s important to know how long that melt has been around to determine whether there is eruptive potential or not,” Gelman said. “If you image it today, does that mean it could not have been there 300,000 years ago? Previous models have said it couldn’t have been. Our model says it could. That doesn’t mean it was there, but it could have been there.”

How shale fracking led to an Ohio town’s first 100 earthquakes

Since records began in 1776, the people of Youngstown, Ohio had never experienced an earthquake. From January 2011, however, 109 tremors were recorded, and new research in the Journal of Geophysical Research: Solid Earth reveals how this may be the result of shale fracking.

In December 2010, Northstar 1, a well built to pump wastewater produced by fracking in the neighboring state of Pennsylvania, came online. In the year that followed, seismometers in and around Youngstown recorded 109 earthquakes, the strongest a magnitude 3.9 event on December 31, 2011.

The study authors analyzed the Youngstown earthquakes, finding that their onset, cessation, and even temporary dips in activity were all tied to the activity at the Northstar 1 well. The first earthquake recorded in the city occurred 13 days after pumping began, and the tremors ceased shortly after the Ohio Department of Natural Resources shut down the well in December 2011.

Dips in earthquake activity correlated with Memorial Day, the Fourth of July, Labor Day, and Thanksgiving, as well as other periods when the injection at the well was temporarily stopped.

“In recent years, the amount of waste fluid generated during shale gas production – hydraulic fracturing – has been increasing steadily in the United States. Earthquakes were triggered by injection of this waste fluid at a deep well in Youngstown, Ohio, between January 2011 and February 2012. We found that the onset and cessation of the earthquakes were tied to activity at the Northstar 1 deep injection well. The earthquakes were centered on subsurface faults near the injection well. These shocks were likely due to the increase in pressure from the deep wastewater injection, which caused an existing fault to slip,” said Dr. Won-Young Kim. “Throughout 2011, the earthquakes migrated from east to west down the length of the fault away from the well – indicative of the earthquakes being caused by an expanding pressure front.”

Slow earthquakes may foretell larger events

Scanning electron microscope images showing localized shear surfaces in cross-section and oblique view. Sense of shear is top to the right; note striations on the shear surface. Similar patterns appear with serpentine. – Haines, S. H.; Kaproth, B.; Marone, C.; Saffer, D. and B. A. van der Pluijm

Monitoring slow earthquakes may provide a basis for reliable prediction in areas where slow quakes trigger normal earthquakes, according to Penn State geoscientists.

“We currently don’t have any way to remotely monitor when land faults are about to move,” said Chris Marone, professor of geophysics. “This has the potential to change the game for earthquake monitoring and prediction, because if it is right and you can make the right predictions, it could be big.”

Marone and Bryan Kaproth-Gerecht, a recent Ph.D. graduate, looked at the mechanisms behind slow earthquakes and found that 60 seconds before slow stick-slip began in their laboratory samples, a precursor signal appeared.

Normal stick-slip earthquakes typically move at a rate of three to 33 feet per second, but slow earthquakes, while they still stick and slip as they move, creep at rates of about 0.004 inches per second, taking months or more to rupture. However, slow earthquakes often occur near traditional earthquake zones and may precipitate potentially devastating earthquakes.
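A quick unit conversion, sketched in Python using only the speeds quoted above, puts the contrast in scale:

```python
# Compare normal stick-slip rupture speeds with slow-earthquake slip rates,
# using the figures quoted in the article.

IN_PER_FT = 12

# Normal stick-slip earthquakes: 3 to 33 feet per second, in inches per second.
normal_low = 3 * IN_PER_FT    # 36 in/s
normal_high = 33 * IN_PER_FT  # 396 in/s

slow = 0.004  # slow-earthquake slip rate, inches per second

ratio_low = normal_low / slow    # ~9,000x faster
ratio_high = normal_high / slow  # ~99,000x faster
# So slow earthquakes move roughly four to five orders of magnitude
# slower than ordinary stick-slip rupture.
```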

“Understanding the physics of slow earthquakes and identifying possible precursory changes in fault zone properties are increasingly important goals,” the researchers report online in today’s (Aug. 15) issue of Science Express.

Using serpentine, a common mineral often found in slow earthquake areas, Marone and Kaproth-Gerecht performed laboratory experiments, applying shear stress to rock samples so that they exhibited slow stick-slip movement. The researchers repeated the experiments 50 or more times and found that, at least in the laboratory, slow fault zones undergo a transition from a state that supports slow slip below about 0.0004 inches per second to one that essentially stops movement above that speed.

“We recognize that this is complicated and that velocity depends on the friction,” said Marone. “We don’t know for sure what is happening, but, from our lab experiments, we know that this phenomenon is occurring.”

The researchers think that what produces this unusual pattern of movement is that frictional contact strength goes down as velocity goes up, but only over a small velocity range. Once the speed increases enough, the frictional contact area becomes saturated: it cannot get any smaller, and other physical properties, such as thermal effects, take over. This mechanism limits the speed of slow earthquakes.

Marone and Kaproth-Gerecht also looked at the primary elastic waves and the secondary shear waves produced by their experiments.

“Here we see elastic waves moving and we know what’s going on with P and S waves and the acoustic speed,” said Marone. “This is important because this is what you can see in the field, what seismographs record.”

Marone notes that there are not currently sufficient measuring devices adjacent to known fault lines to make any type of prediction from the precursor signature of the elastic waves. It is, however, conceivable that with the proper instrumentation, a better picture of what happens before a fault moves in stick-slip motion is possible, and that this could perhaps lead to some type of prediction.