How does a volcanic crater grow? Grab some TNT and find out

A new University at Buffalo study in the journal Geophysical Research Letters examines maar craters, which resemble the bowl-like cavities formed by meteorites but are in some ways more mysterious.

Scientists often can discern pertinent details about meteorites — when they struck, how large they were, the angle they approached Earth and other information — by measuring the diameter and volume of the impact crater.

Maar craters, which form when fissures of magma beneath Earth’s surface meet groundwater, causing volcanic explosions, are not as telling, scientists say. The possibility of multiple explosions at varying depths led most scientists to believe that measuring a maar’s size is not the best way to gauge the energy of individual explosions or determine future hazards.

UB geologist Greg A. Valentine, PhD, and other volcano researchers found instead that a maar’s shape and the distance it ejects magma, ash and other debris are a more accurate barometer of the eruption’s force. The findings are important, he said, because they could assist scientists in estimating how big future volcano eruptions might be.

“It’s something that, up until this point, had only been suspected,” said Valentine, a professor of geology and lead author of the Geophysical Research Letters paper. “The simulations we did prove that crater diameter is not a good indicator of explosion energy for these volcanoes.”
The scientists drew their conclusions from a series of UB-funded experiments conducted last summer at a test site in Ashford, N.Y. They built three test beds of gravel, limestone and asphalt. In the first experiment, one charge of TNT and plastic explosive was detonated.

In subsequent experiments, the charge was divided into three parts and detonated individually at different depths. The final dimensions of each crater were about the same. That matters, according to Valentine, because it shows that it’s easy to overestimate the energy of explosions if one assumes that the crater comes from one blast, not several.

The dispersal of ejected material differed depending on the location of the charge. For example, the first experiment launched debris more than 50 feet from the crater. Debris from subsequent experiments simulating deeper blasts mostly went up in the air and fell back into the crater or around its rim, while dusty gas, like the ash that shut down air travel in Iceland and beyond in 2010, was forced into the surrounding air.

Although the experiments provided valuable information, Valentine said they were similar to a practice run. More detailed experiments are being planned for the near future, he said.

Working under extreme conditions

Freezing temperatures, icing, snow and an unpredictable climate have a bigger impact on a platform in the Barents Sea than in the North Sea. -  Illustration: Ole Andre Hauge/ NettOp, UiS

“Try to imagine changing a tire in freezing weather, snow and darkness,” says professor Tore Markeset, a specialist in cold climate technology at the University of Stavanger (UiS).

That is his way of visualising the challenges facing oil companies seeking to produce oil and gas from the far north of the Norwegian continental shelf (NCS).

Weather, winter darkness, vast distances, and safety and emergency response challenges for petroleum facilities in the Barents Sea will all be more extreme than in the North Sea.

“Compared with our expertise from the southern NCS, we know little about how to run an offshore production installation in a cold climate,” adds professor Ove Tobias Gudmestad at the UiS.

Both he and Markeset are active in teaching, research and finding solutions for developing fields in areas with Arctic conditions.


The Norwegian government opened the southern part of Norway’s Barents Sea sector for petroleum operations in the early 1980s. Activity was high but finds were few, so interest declined.

But enthusiasm for these waters has recovered sharply in the recent past, particularly after the Skrugard oil discovery was made during 2011.

Promising seismic data, several large oil and gas finds and new technology mean that a growing number of oil companies are discussing opportunities to explore further north on the NCS.

Norway’s Statoil company announced earlier this autumn that it will more than treble its spending on technology for Arctic waters from NOK 80 million in 2012 to NOK 250 million next year.

The increased interest in the far north reflects not least clear government policies, with petroleum and energy minister Ola Borten Moe talking enthusiastically about Barents Sea activities.

Norway currently has one gas production facility operating in the southern Barents Sea. This Snøhvit field has been on stream since 2007.

And operator Eni aims to bring its Goliat oil development into production in 2014. Both that field and Snøhvit lie north-west of Hammerfest, 85 and 140 kilometres from land respectively.

Skrugard, for its part, is located roughly 100 kilometres further north-west from Snøhvit and is expected to come on stream during 2018.


Climate is the main challenge for impatient oil companies and official agencies, says Gudmestad. “It’s unpredictability for much of the year which is really special about the far north.

“Weather conditions in these waters differ from the North Sea in terms of low temperatures, icing, fog, heavy snowfalls and sudden changes.”

The North Sea weather is easier to forecast, he notes. “When a low pressure area over Iceland moves east, we know it’ll bring bad weather and can plan operations accordingly.”

“In the Barents Sea, however, deep troughs of low pressure develop at the interface between ice and open water. These can’t be predicted, and may create sudden storms and hurricanes.”

Wind and waves have a major bearing on the safety of shipping, fishing and offshore operations, so accurate weather forecasts are important.

With fewer monitoring stations gathering data in the Arctic, however, forecasting in the Barents Sea is less accurate than in the North Sea.


The further north you go, the longer the winter becomes and the shorter the summer season. A good deal of offshore work must necessarily be squeezed into the latter. But the weather in these brief months may not allow planned activities to go ahead.

Arctic conditions can also mean that equipment fails or breaks down in unfamiliar ways, or more frequently than is the case in warmer regions.

Repairing broken equipment is also likely to take more time, or preventive maintenance may be necessary to make sure that the equipment functions properly.

“Things will take longer,” explains Markeset. “If you need to make repairs but the equipment is covered in snow, you’ll have to dig it out before the job can begin.

“It’s also harder to work in a temperature of -30°C than in 10°C. Putting small screws into place is a slow business with gloves on.”

The more frequently equipment breaks down and the longer it takes to repair, the less time will be available for producing oil and gas, he points out.

“When we invest in a production facility for the far north, it must be up and running for as much of the time as possible so that we can produce profitably.

“A huge problem would be faced if an installation failed to cope with the local climate. We’d be left with a massive and expensive machine which yields little.”


Among the issues he and Gudmestad work on is winterisation – in other words, tailoring equipment and workplaces so that they can operate normally in a harsh winter climate.

They say that much could be different in the far north, including the need for special steels and equipment when temperatures fall low enough.

Plastics, rubber, metals and lubricants are examples of materials which change their properties under extreme cold, and which must be adapted to the Arctic environment.

Electrical systems, sensors, cables, valves, motors and pumps must all be specially manufactured. Piping, tanks and pumps containing liquids which could freeze have to be kept warm even if the installation shuts down to avoid being burst by frozen fluids.

An increased need to heat equipment and facilities and to provide lighting will boost energy consumption on installations – from heating cables in corridors and on helidecks, for instance.


The cold means that more equipment must be enclosed on units working in the Barents Sea than further south. And that in turn calls for more fans to prevent gas accumulations.

“Outfitting facilities in the far north will be more complex because of the need for heating and increased use of sensors to measure equipment condition,” observes Markeset.

“The companies will rely more on remote monitoring of and support for equipment via centres located in such places as Stavanger or Tromsø.

“Real-time diagnosis of systems will be crucial, making use of the internet, fibreoptic cables, satellites and specially developed sensors.

“A key role will also be played here by experts who could be located either at suppliers or in service companies in Germany, Italy, the USA or elsewhere.”

On the other hand, he points out, equipment protected through proper winterisation technology (such as being enclosed, with heat tracing and special designs) and better monitored with the aid of sensors may actually be more reliable and break down less frequently than in warmer waters.


Companies must also take account of longer distances to market for delivering oil and gas as well as for providing operational, maintenance and support services, Markeset notes.

“Operators have to make more thorough preparations, keep more spare parts on hand and perhaps have more and better expertise on board the production facility. That’ll all add to costs.”

In addition, facilities operating above the Arctic Circle must take 24-hour winter darkness into account. “We know that more accidents happen at night,” Markeset says.

“An important question which needs to be researched is how the perpetual winter darkness affects work processes on an offshore facility.”


The south-western Barents Sea is not much troubled by sea ice. The UiS scientists are accordingly looking at opportunities for field development and operation in an environment largely free of sea ice, but where drift ice must nevertheless be expected.

“Ice floes may weigh several hundred tonnes,” Gudmestad points out. “They can be tossed around by waves and driven with great force against an installation, causing serious damage.

“Our present equipment isn’t up to a collision with drift ice. That’ll limit the time we can drill in the northern Barents Sea – unlike year-round drilling in the North Sea.”

He points to the Shtokman gas field in the Russian sector of the Barents Sea, which lies in an area affected by both drift and pack ice.

“Pack ice involves extreme forces. It can exert pressure from every direction. A platform placed in the middle of it must be able to withstand such forces.

“In my view, we shouldn’t aim to position facilities on the NCS in pack ice – in other words, in the northern Barents Sea – to start with.”

He accordingly believes in a gradual advance northwards, in order to learn from conditions at each stage and to keep in step with technological progress.

“Although operations have begun in pack ice off Russia and Alaska, we should aim first and foremost to research and develop production solutions for the south-western Barents Sea.

“This is an area with extreme weather conditions and below-freezing temperatures, but where drift ice only appears now and then.”


“Before deciding to build oil installations for the northern Barents Sea, we must know whether this is achievable,” Gudmestad explains.

“Plant availability must be satisfactory so that it’s profitable. We must also think about how to prevent accidents and how to respond should one nevertheless occur.”

His concern is to ensure that equipment, organisation and working methods are tailored to the Arctic environment, and thereby to reduce the probability of undesirable incidents.

“We must, for example, be certain that the evacuation system works in the Arctic. We can’t use freefall lifeboats if there’s ice on the water, to take a case in point.”


Both he and Markeset are concerned that oil spills in the far north pose a big threat to the vulnerable environment in these waters.

“Much of the existing oil spill clean-up equipment hasn’t been designed to operate in a cold climate and drift ice,” says Markeset.

“We must accordingly come up with better methods for collecting any oil spill in waters where sea ice could be encountered.”

The two UiS scientists do not get involved in Norwegian oil policies, and stress that determining how far north petroleum activities should extend on the NCS is a job for the politicians.

“Our role is to make it technically and organisationally possible to operate in the various sea areas when the government decides to set things going,” says Gudmestad.

“We must, for example, build even more safely in the Barents Sea than in the North and Norwegian Seas, and reduce the probability of oil spills even further.

“That’s precisely because the consequences of such discharges are greater in the far north. We must devote extensive resources and work to avoiding oil spills.”

Magnesium oxide: From Earth to super-Earth

The mantles of Earth and other rocky planets are rich in magnesium and oxygen. Due to its simplicity, the mineral magnesium oxide is a good model for studying the nature of planetary interiors. New work from a team led by Carnegie’s Stewart McWilliams studied how magnesium oxide behaves under the extreme conditions deep within planets and found evidence that alters our understanding of planetary evolution. It is published November 22 by Science Express.

Magnesium oxide is particularly resistant to changes when under intense pressures and temperatures. Theoretical predictions claim that it has just three unique states with different structures and properties present under planetary conditions: solid under ambient conditions (such as on the Earth’s surface), liquid at high temperatures, and another structure of the solid at high pressure. The latter structure has never been observed in nature or in experiments.

McWilliams and his team observed magnesium oxide between pressures of about 3 million times normal atmospheric pressure (0.3 terapascals) to 14 million times atmospheric pressure (1.4 terapascals) and at temperatures reaching as high as 90,000 degrees Fahrenheit (50,000 Kelvin), conditions that range from those at the center of our Earth to those of large exo-planet super-Earths. Their observations indicate substantial changes in molecular bonding as the magnesium oxide responds to these various conditions, including a transformation to a new high-pressure solid phase.
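The pressure and temperature figures quoted above can be sanity-checked with simple unit conversions (a back-of-the-envelope sketch, not figures from the paper itself):

```python
# Quick check of the quoted conversions: terapascals to multiples of
# standard atmospheric pressure, and kelvin to degrees Fahrenheit.

ATM_PA = 101325.0  # standard atmosphere in pascals

def tpa_to_atm(tpa):
    """Convert terapascals to multiples of standard atmospheric pressure."""
    return tpa * 1e12 / ATM_PA

def kelvin_to_fahrenheit(k):
    """Convert kelvin to degrees Fahrenheit."""
    return (k - 273.15) * 9.0 / 5.0 + 32.0

print(f"0.3 TPa  ~ {tpa_to_atm(0.3):.2e} atm")            # ~3 million atmospheres
print(f"1.4 TPa  ~ {tpa_to_atm(1.4):.2e} atm")            # ~14 million atmospheres
print(f"50,000 K ~ {kelvin_to_fahrenheit(50000):,.0f} F")  # ~90,000 F
```

The rounded press-release numbers (3 million, 14 million, 90,000 °F) line up with the exact conversions.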

In fact, when melting, there are signs that magnesium oxide changes from an electrically insulating material like quartz (meaning that electrons do not flow easily) to a metal similar to iron (meaning that electrons do flow easily through the material).

Drawing from these and other recent observations, the team concluded that while magnesium oxide is solid and non-conductive under conditions found on Earth in the present day, the early Earth’s magma ocean might have been able to generate a magnetic field. Likewise, the metallic, liquid phase of magnesium oxide can exist today in the deep mantles of super-Earth planets, as can the newly observed solid phase.

“Our findings blur the line between traditional definitions of mantle and core material and provide a path for understanding how young or hot planets can generate and sustain magnetic fields,” McWilliams said.

“This pioneering study takes advantage of new laser techniques to explore the nature of the materials that comprise the wide array of planets being discovered outside of our Solar System,” said Russell Hemley, director of Carnegie’s Geophysical Laboratory. “These methods allow investigations of the behavior of these materials at pressures and temperatures never before explored experimentally.”

GOCE’s second mission improving gravity map

ESA’s GOCE gravity satellite has already delivered the most accurate gravity map of Earth, but its orbit is now being lowered in order to obtain even better results.

The Gravity field and steady-state Ocean Circulation Explorer (GOCE) has been orbiting Earth since March 2009, reaching its ambitious objective to map our planet’s gravity with unrivalled precision.

Although the planned mission has been completed, the fuel consumption was much lower than anticipated because of the low solar activity over the last two years. This has enabled ESA to extend GOCE’s life, improving the quality of the gravity model.

To be able to measure the strength of Earth’s gravity, the satellite was flying in an extraordinarily low orbit about 255 km high – about 500 km lower than most Earth observation satellites.

Based on a clear preference from the GOCE user community, ESA’s Earth Scientific Advisory Committee recommended lowering the orbit to 235 km starting in August.

Lowering the orbit increases the accuracy and resolution of GOCE’s measurements, improving our view of smaller ocean dynamics such as eddy currents.

The control team began the manoeuvres in August, lowering GOCE by about 300 m per day.

After coming down by 8.6 km, the satellite’s performance and new environment were assessed. Now, GOCE is again being lowered while continuing its gravity mapping. Finally, it is expected to reach 235 km in February.
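The timeline described above can be roughly checked from the quoted descent rate (my own back-of-the-envelope estimate, not ESA flight-dynamics figures):

```python
# Rough timeline for lowering GOCE from ~255 km to 235 km at the quoted
# steady rate of ~300 m per day.

start_altitude_km = 255.0   # approximate operational altitude
target_altitude_km = 235.0  # recommended lower orbit
descent_per_day_km = 0.3    # ~300 m per day

total_drop_km = start_altitude_km - target_altitude_km
days_descending = total_drop_km / descent_per_day_km

print(f"Total drop: {total_drop_km:.1f} km")
print(f"~{days_descending:.0f} days of active descent")
```

About 67 days of actual thrusting; with the pause for assessment after the first 8.6 km step, a campaign begun in August plausibly reaches 235 km in February.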

As the orbit drops, atmospheric drag increasingly pulls the satellite towards Earth. But GOCE was designed to fly low, the tiny thrust of its ion engine continuously compensating for any drag.

The expected increase in data quality is so high that scientists are calling it GOCE’s ‘second mission.’

“For us at ESA, GOCE has been a fantastic mission and it continues to surprise us,” said Volker Liebig, ESA’s Director of Earth Observation Programmes.

“What the team of ESA engineers is now doing has not been done before and it poses a challenge. But it will also trigger new research in the field of gravity based on the high-resolution data we are expecting.”

The first ‘geoid’ based on GOCE’s gravity measurements was unveiled in June 2010. It is the surface of an ideal global ocean in the absence of tides and currents, shaped only by gravity.

A geoid is a crucial reference for conducting precise measurements of ocean circulation, sea-level change and ice dynamics.

The mission has also been providing new insight into air density and wind in space, and its information was recently used to produce the first global high-resolution map of the boundary between Earth’s crust and mantle.

USA’s ancient hurricane belt and the US-Canada equator

Mountain and valley near G.B. Schley Fjord, North Greenland: researchers found a strange limestone in a 400 meter high mountain. The 150 metre thick brachiopod coquina is shown in yellow. Note the geologist camp for scale of mountain. At the bottom left is an inserted view showing how the rock appears in close-up. White areas in the dark limestone are brachiopod shells. -  Christian Mac Ørum Rasmussen

The recent storms that have battered settlements on the east coast of America may have been much more frequent in the region 450 million years ago, according to scientists.

New research pinpointing the positions of the Equator and the landmasses of the USA, Canada and Greenland, during the Ordovician Period 450 million years ago, indicates that the equator ran down the western side of North America with a hurricane belt to the east.

The hurricane belt would have affected an area covering modern day New York State, New Jersey and most of the eastern seaboard of the USA.

An international research team led by Durham University, UK, used the distribution of fossils and sediments to map the line of the Ordovician Equator down to southern California.

The study, published in the journal Geology, is the first to accurately locate and map the ancient Equator and adjacent tropical zones. Previous studies had fuelled controversy about the precise location of the ancient equator. The researchers say the new results show how fossils and sediments can accurately track equatorial change and continental shifts over time.

Co-lead author Professor David Harper, Department of Earth Sciences, Durham University, UK, said: “The equator, equatorial zones and hurricane belts were in quite different places in the Ordovician. It is likely that the weather forecast would have featured frequent hurricane-force storms in New York and other eastern states, and warmer, more tropical weather from Seattle to California.”

Since Polar Regions existed 450 million years ago, the scientists believe that there would have been similar climate belts to those of today.

The research team from Durham University, UK, and universities in Canada, Denmark and the USA, discovered a belt of undisturbed fossils and sediments -deposits of shellfish- more than 6000 km long stretching from the south-western United States to North Greenland. The belt also lacks typical storm-related sedimentary features where the deposits are disturbed by bad weather. The researchers say that this shows that the Late Ordovician equatorial zone, like the equatorial zone today, had few hurricane-grade storms.

In contrast, sedimentary deposits recorded on either side of the belt provide evidence of disturbance by severe storms. Hurricanes tend to form in the areas immediately outside of equatorial zones, where sea surface temperatures of at least 26°C combine with the Earth’s rotation to create storms. The researchers believe that hurricane belts would probably have existed on either side of the ancient equator, within the tropics.

The position of the equatorial belt, defined by undisturbed fossil accumulations and sediments, is coincident with the Late Ordovician equator interpreted from magnetic records (taken from rocks of a similar age from the region). This provides both a precise equatorial location and confirms that the Earth’s magnetic field operated much in the same way as it does today.

The scientists pieced together the giant jigsaw map using the evidence of the disturbed and undisturbed sedimentary belts, together with burrows and shells. Using the findings from these multiple sites, they were able to see that North America straddled the Equator.

Co-author Christian Rasmussen, University of Copenhagen, said: “The layers of the earth build up over time and are commonly exposed by plate tectonics. We are able to use these ancient rocks and their fossils as evidence of the past to create an accurate map of the Ordovician globe.”

Professor Harper added: “The findings show that we had the same climate belts of today and we can see where North America was located 450 million years ago, essentially on the Equator.”

“While the Equator has remained in approximately the same place over time, the landmasses have shifted dramatically over time through tectonic movements. The undisturbed fossil belt helps to locate the exact position of the ancient Laurentian landmass, now known as North America.”

Surveying Earth’s interior with atomic clocks

An initial high-precision atomic clock prototype, ACES (Atomic Clock Ensemble in Space), is already due to be taken to the Columbus Space Lab at the International Space Station (ISS) by 2014. -  European Space Agency ESA, D. Ducros

Have you ever thought of using a clock to identify mineral deposits or concealed water resources within the Earth? An international team headed by astrophysicists Philippe Jetzer and Ruxandra Bondarescu from the University of Zurich is convinced that ultraprecise portable atomic clocks will make this a reality in the next decade. The scientists argue that these atomic clocks have already reached the necessary degree of precision to be useful for geophysical surveying. They say that such clocks will provide the most direct measurement of the geoid – the Earth’s true physical form. It will also be possible to combine atomic clock measurements with existing geophysical methods to explore the interior of the Earth.

Determining geoid from general relativity

Today, the Earth’s geoid – the surface of constant gravitational potential that extends the mean sea level – can only be determined indirectly. On continents, the geoid can be calculated by tracking the altitude of satellites in orbit. Picking the right surface is a complicated, multivalued problem. The spatial resolution of the geoid computed this way is low – approximately 100 km.

Using atomic clocks to determine the geoid is an idea based on general relativity that has been discussed for the past 30 years. Clocks located at different distances from a heavy body like our Earth tick at different rates. Similarly, the closer a clock is to a heavy underground structure the slower it ticks – a clock positioned over an iron ore will tick slower than one that sits above an empty cave. “In 2010, ultraprecise atomic clocks measured the time difference between two clocks, one positioned 33 centimeters above the other,” explains Bondarescu, before adding: “Local mapping of the geoid to an equivalent height of 1 centimeter with atomic clocks seems ambitious, but is within the reach of atomic clock technology.”
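The effect described here is gravitational time dilation. In the weak-field limit, two clocks separated by a height dh differ in rate by roughly g·dh/c², a standard general-relativity estimate (my own sketch, not a figure from the paper):

```python
# Fractional frequency shift between two clocks separated in height by dh,
# from the weak-field relation df/f ~ g * dh / c^2.

G_SURFACE = 9.81       # m/s^2, surface gravity
C = 299_792_458.0      # m/s, speed of light

def fractional_shift(dh_m):
    """Fractional rate difference between clocks dh_m apart in height."""
    return G_SURFACE * dh_m / C**2

print(f"33 cm: df/f ~ {fractional_shift(0.33):.1e}")  # ~3.6e-17
print(f" 1 cm: df/f ~ {fractional_shift(0.01):.1e}")  # ~1.1e-18
```

The 33 cm demonstration thus required resolving a rate difference of a few parts in 10^17; 1 cm resolution demands roughly another order of magnitude, which is why the article calls it ambitious but within reach.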

Geophysical surveying with atomic clocks

According to Bondarescu, if an atomic clock is placed at sea level, i.e., at the exact altitude of the geoid, a second clock could be positioned anywhere on the continent as long as it is synchronized with the first clock. The connection between the clocks can be made with fibre-optic cable or via telecommunication satellite, provided that the transmission is reliable enough. The second clock will tick faster or slower, depending on whether it is above or beneath the geoid. The local measurement of the geoid can then be combined with other geophysical measurements, such as those from gravimeters, which measure the acceleration of the gravitational field, to get a better idea of the underground structure.

Mappings possible to great depths

In principle, atomic clock surveying is possible to great depths, provided that the heavy underground structure to be studied is large enough to affect the tick rates of clocks in a measurable manner. The smallest structure that atomic clocks accurate to 1 centimeter in geoid height can detect is a buried sphere with a radius of about 1.5 kilometers at a depth of 2 kilometers below the surface, provided it has a density contrast of about 20% with the surrounding upper crust. The scientists estimate that the same clocks would also be sensitive to a buried sphere with a radius of 4 kilometers at a depth of about 30 kilometers for the same density contrast.
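These sensitivity figures can be checked to order of magnitude with a point-mass approximation, N ≈ G·ΔM/(g·d), where ΔM is the excess mass of the sphere. The upper-crust density of 2700 kg/m³ below is my assumption; the paper’s exact model will differ:

```python
# Order-of-magnitude geoid perturbation of a buried sphere, treating the
# excess mass as a point mass at depth d: N ~ G * dM / (g * d).
import math

G = 6.674e-11       # m^3 kg^-1 s^-2, gravitational constant
g = 9.81            # m/s^2, surface gravity
rho_crust = 2700.0  # kg/m^3, assumed upper-crust density (my assumption)
contrast = 0.20     # 20% density contrast, as quoted in the article

def geoid_anomaly_cm(radius_m, depth_m):
    """Approximate geoid-height perturbation of a buried sphere, in cm."""
    dM = (4.0 / 3.0) * math.pi * radius_m**3 * rho_crust * contrast
    return 100.0 * G * dM / (g * depth_m)

print(f"r=1.5 km, d=2 km:  ~{geoid_anomaly_cm(1500, 2000):.1f} cm")   # ~2.6 cm
print(f"r=4 km,  d=30 km:  ~{geoid_anomaly_cm(4000, 30000):.1f} cm")  # ~3.3 cm
```

Both cases come out at a few centimeters of geoid height, i.e., just above the 1 cm clock sensitivity quoted in the article, which is consistent with these being roughly the smallest detectable structures.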

Currently, ultraprecise atomic clocks only work in labs. In other words, they are not transportable and thus cannot be used for measurements in the field. However, this is all set to change in the next few years: various companies and research institutes, including the Centre Suisse d’Electronique et de Microtechnique CSEM based in Neuchâtel, are already working on the development of portable ultraprecise atomic clocks. “By 2022 at the earliest, one such ultraprecise portable atomic clock will fly into space on board an ESA satellite,” says Professor Philippe Jetzer, the Swiss delegate for the STE-QUEST satellite mission aimed at testing general relativity very precisely. As early as 2014 or 2015, the Atomic Clock Ensemble in Space (ACES) is to be taken to the International Space Station (ISS). ACES is an initial prototype that does not yet have the precision of STE-QUEST.

At least 6 major earthquakes on the Alhama de Murcia fault in the last 300,000 years

This is the Church of Santiago de Lorca, destroyed in the earthquake of 2011. -  Antonio Periago Miñarro.

With Spanish participation, an international group of researchers has analysed the most recent history of the Alhama de Murcia fault. They discovered that it has experienced six major earthquakes above 7 on the Richter scale. According to the scientists, this provides “convincing evidence” that the maximum earthquake magnitudes in the area are higher than originally thought.

Since 2001, researchers from the Universities of Barcelona, Leon, Complutense de Madrid (UCM), Coimbra (Portugal) and Aarhus (Denmark) and the National Autonomous University of Mexico have been working on the Alhama de Murcia fault in order to identify the high-magnitude earthquakes that occurred during the Quaternary period – the most recent of geological ages.

“Due to lack of information, up until just ten years ago there were no geological data on paleoseismic activity for the active faults in Spain and very little had been invested in studying their geology. The evidence used to understand active faults came merely from historical seismic records that hardly collected data on the largest of earthquakes related to these faults. As geology can go even further back in time, the earthquakes that we have found are bigger and of a greater magnitude,” as explained to SINC by Jose J. Martínez Díaz, researcher at the UCM and coauthor of the study published in the journal ‘Geological Society of America Bulletin‘.

The fault is a fracture plane that crosses the entire earth’s crust. To identify the prehistoric earthquakes recorded in its walls, the scientists therefore had to make surface excavations perpendicular to the fault (trenches between 20 and 30 meters long and 4 meters deep). This allowed them to obtain an exceptionally extensive paleoseismic record.

As Martínez adds, “when large earthquakes exceed magnitude 6, they usually break at the surface and as a result, we have been able to identify this in their walls.” These tectonic deformations were dated using carbon-14 and infrared stimulated luminescence techniques.

In order to understand the behavioural patterns of the Alhama de Murcia fault, the researchers had to reconstruct hundreds of thousands of years of history, “much more than the Americans or the Japanese, who can understand their fault patterns by studying just 10,000 years.”

This is because faults in Spain are slow-moving and there is therefore much more time between major earthquakes (to the tune of thousands of years) compared to much faster-moving faults like San Andreas in California. According to their estimations, the Alhama fault would have formed more than 9 million years ago and would have caused earthquakes from the outset, thus shaping the landscape of the region.

“It was in our interest to detect the seismic activity from the Quaternary period – in other words, earthquakes that have occurred within the last 1.8 million years. In total, we have identified a minimum of six earthquakes of high magnitude during the period studied (more than 300,000 years), but we know that the real number is actually much higher. In some cases, sedimentary evidence could have disappeared, or it may be found in parts of the fault that have yet to be studied,” outlines the researcher.

An underestimated seismic danger?

Another revelation of the article is that the area could suffer a stronger earthquake than originally thought. “During earthquakes, the entire length of the fault does not break; it does so in segments. We have shown that this fault could break at once along its two western segments, from Góñar (Almería) to Totana (Murcia), causing an earthquake of magnitude above 7,” explains Martínez.

“This fault has already produced an earthquake of magnitude 6.5 or 7 thousands of years ago, and could do so again tomorrow. As a result, it is vital to factor this into the earthquake risk calculations and building codes for the area,” the researchers conclude.

The seismic hazard map forming the basis of the Spanish Seismic Resistance Construction Standard assigns the Lorca area a maximum acceleration for construction design of 0.19 g. However, the recent earthquake, despite reaching a magnitude of only 5.2, generated a much higher acceleration of 0.36 g.

“The area’s hazard level was underestimated because until now estimations have been based on the historical earthquake catalogue which only records events from the last 2000 years,” points out Martínez. The researcher believes that the fault activity parameters obtained through paleoseismic studies like this can help to improve risk calculations.

But will they be capable of predicting the next high magnitude movement? The authors stress that it is indeed possible to determine the maximum magnitude as well as the location of the earthquake. However, at present there is still no way of predicting the moment it will strike, as this involves a complex geological phenomenon governed by non-linear physical processes.

“Earthquakes like the one in Lorca and those before it produce stress changes which build up at certain points of the fault. We know this thanks to models and results that we have published in previous studies. The next earthquake is more likely to occur in these areas. However, estimating when is impossible,” the scientist asserts.

With regards to the study recently published in the ‘Nature Geoscience’ journal that alludes to water extraction being the possible cause of the 2011 earthquake in Lorca, the researcher is doubtful.

“There is much scientific discussion on the matter; we are part of various groups that have been working in the area for some time, and I am not the only one sceptical of the idea. The 2011 earthquake in Lorca was similar to those that took place in 1674 and 1818, at a time when aquifer exploitation was not practiced. I believe that there is no need to search for any unusual reason behind the earthquake. It was down to the fault’s natural tectonic evolution. It was a completely normal earthquake from a geological point of view – the small-magnitude kind which occurs on a fault every so often,” concludes Martínez.

Taking the ‘pulse’ of volcanoes using satellite images

This image shows an averaged 2006-2009 ground velocity map of the west Sunda volcanic region from the Japan Aerospace Exploration Agency’s ALOS satellite. Positive velocities (red colors) represent movement towards the satellite (e.g. uplift) and negative velocities (blue colors) movement away from the satellite (e.g. subsidence). Volcanoes are marked by black triangles, historically active volcanoes by red triangles. Insets show six inflating volcanoes. - Estelle Chaussard, University of Miami

A new study by scientists at the University of Miami (UM) Rosenstiel School of Marine & Atmospheric Science uses Interferometric Synthetic Aperture Radar (InSAR) data to investigate deformation prior to the eruption of active volcanoes in Indonesia’s west Sunda arc. Led by geophysicist Estelle Chaussard and UM Professor Falk Amelung, the study uncovered evidence that several volcanoes did in fact ‘inflate’ prior to eruptions due to the rise of magma. The fact that such deformation could be detected by satellite is a major step forward in volcanology; it is the first unambiguous evidence that remotely detected ground deformation could help to forecast eruptions at volcanoes.
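For context, the geometry behind such velocity maps follows the standard InSAR relation (a textbook result, not specific to this study): one full cycle of interferometric phase corresponds to half a radar wavelength of motion along the satellite's line of sight, because the signal travels to the ground and back. A minimal sketch, assuming the ~23.6 cm L-band wavelength of ALOS PALSAR:

```python
import math

# ALOS PALSAR is an L-band radar; wavelength ~23.6 cm (assumed here).
WAVELENGTH_M = 0.236

def los_displacement_m(phase_rad):
    """Line-of-sight displacement for a given interferometric phase
    change. The factor 4*pi reflects the two-way travel path: one full
    fringe (2*pi of phase) equals half a wavelength of ground motion.
    Sign conventions (toward vs. away from the satellite) vary by
    processing software and are omitted here."""
    return phase_rad * WAVELENGTH_M / (4.0 * math.pi)

# One fringe corresponds to ~11.8 cm of line-of-sight motion:
print(round(los_displacement_m(2.0 * math.pi), 3))  # -> 0.118
```

The long L-band wavelength is one reason ALOS works well over vegetated volcanic terrain: the coarser fringes stay coherent where shorter-wavelength radars decorrelate.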

“Surveying entire volcanic regions using satellite data is of primary importance to the detection of ground deformation prior to the onset of eruptions. If volcanic inflation is observed, it can help us to predict where the next eruption may occur. Moreover, in regions like Indonesia, where volcanoes are prevalent and pose a threat to millions of people, and where ground-based monitoring is sparse, remote sensing via satellite could become a major forecasting tool,” said Chaussard.

Analyzing more than 800 InSAR images from the Japan Aerospace Exploration Agency’s ALOS satellite, the team surveyed 79 volcanoes in Indonesia between 2006 and 2009. They detected deformation at six volcanic centers, three of which erupted after the observation period, confirming that inflation is a common precursor of volcanic eruptions at west Sunda volcanoes.

“The notion of detecting deformation prior to a volcanic eruption has been around for a while,” said Amelung, who has been studying active volcanoes for 15 years. “Because this region is so volcanically active, our use of InSAR has been very successful. We now have a tool that can tell us where eruptions are more likely to occur.”

The team will now study other parts of Indonesia and then the Philippines, which is also prone to volcanic activity, using data from the Japan Aerospace Exploration Agency’s ALOS-2 satellite, scheduled for launch next year.

“The monitoring of changes to the Earth’s surface helps us to better predict the onset of volcanic activity, which can have devastating impacts on human life,” said Amelung. “Like with earthquakes and tsunamis, however, we cannot predict activity with certainty, but we hope that new tools like satellite remote sensing will help us to gather critical information in near real-time so we can anticipate the risk of eruptions and deploy resources in a timely manner.”

This study also reveals regional trends in the depths of magma storage. Indonesian volcanoes have magma reservoirs at shallow depths, probably due to the tectonic setting of the region, which accounts for the way the region is deforming. A magma chamber close to the surface is usually associated with a higher risk of significant eruption, so these observations play a major role in volcanic hazard assessment.

Field geologists (finally) going digital

Not very long ago a professional geologist’s field kit consisted of a Brunton compass, rock hammer, magnifying glass, and field notebook. No longer. In the field and in the labs and classrooms, studying Earth has undergone an explosive change in recent years, fueled by technological leaps in handheld digital devices, especially tablet computers and cameras.

Geologist Terry Pavlis’ digital epiphany came almost 20 years ago when he was in a museum looking at a 19th-century geology exhibit that included a Brunton compass. “Holy moly!” he remembers thinking, “We’re still using this tool.” This is despite the fact that technological changes over the last 10 years have not only made the Brunton compass obsolete, but swept away paper field notebooks as well (the rock hammer and hand-lens magnifier remain unchallenged, however).

The key technologies replacing the 19th-century field tools are the smartphone, PDA, handheld GPS unit, and tablets such as the tablet PC and iPad. Modern tablets, in particular, can do everything a Brunton compass can, plus take pictures, act as both notebook and mapping device, and gather precise location data using GPS. They can even be equipped with open-source GIS software.

Pavlis, a geology professor at The University of Texas at El Paso, and Stephen Whitmeyer of James Madison University will be presenting the 21st-century way to do field geology on Monday, 5 Nov., at the meeting of the Geological Society of America (GSA) in Charlotte, N.C. The presentations are a part of a digital poster Pardee Keynote Symposium titled, “Digital Geology Speed-Dating: An Innovative Coupling of Interactive Presentations and Hands-On Workshop.”

“I had a dream we would not be touching paper anymore,” says Pavlis. “I’m now sort of an evangelist on this subject.”

That’s not to say the conversion to digital field geology is anywhere near complete. The new technology has been slow to catch on in some university field courses because it is more expensive and becomes obsolete quickly, says Pavlis.

“Field geology courses are expensive enough for students,” he notes. As a result, the matter of teaching field geology with digital tools is actually rather controversial among professors.

Meanwhile, on the classroom side of earth science education, there are new digital tools that bring the field into the classroom. One of them is GigaPans – gigantic panorama images.

“A GigaPan is basically a really big picture that’s made of lots of full-resolution zoomed-in photos,” explains geologist Callan Bentley of Northern Virginia Community College. To make a GigaPan, you need a GigaPan robot that surveys the scene, breaks it into a grid, then shoots the grid. That can result in hundreds or even thousands of images, which the GigaPan system then stitches together. The resulting stitched image is uploaded to the GigaPan website, where everybody can see it.
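The grid arithmetic is easy to sketch (this is not GigaPan's actual software; the field-of-view and overlap figures below are assumptions for illustration): each frame advances by its angular field of view minus an overlap margin that lets neighbouring shots be stitched.

```python
import math

def gigapan_grid(pan_deg, tilt_deg, frame_h_deg, frame_v_deg, overlap=0.3):
    """Rows, columns, and total photos needed to cover a
    pan_deg x tilt_deg scene with frames of the given angular size,
    overlapping neighbours by the given fraction for stitching."""
    cols = math.ceil(pan_deg / (frame_h_deg * (1.0 - overlap)))
    rows = math.ceil(tilt_deg / (frame_v_deg * (1.0 - overlap)))
    return rows, cols, rows * cols

# A 180 x 45 degree outcrop shot through a telephoto lens that sees
# only 3 x 2 degrees per frame, with 30% overlap:
print(gigapan_grid(180, 45, 3, 2))  # -> (33, 86, 2838)
```

Nearly three thousand frames for a single outcrop is exactly the "hundreds or even thousands of images" Bentley describes, which is why a robotic mount rather than a hand-held camera is required.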

“In geology, we look at things in multiple scales,” says Bentley. “A well-composed GigaPan is very useful.” Bentley will be presenting GigaPans at the same GSA meeting session as Pavlis, along with others using the latest technology to study and teach geology.

GigaPans were developed by Google, NASA, and the robotics lab at Carnegie Mellon University. Bentley got involved when the “Fine Outreach for Science” program recruited him. Since then, he has been documenting the geology of the Mid-Atlantic region.

“I have used some of it in the classroom,” said Bentley. “I have students look at a scene, make a hypothesis then look closer to test the hypothesis.”

Geologist calls for advances in restoration sedimentology

This is Douglas Edmonds. -  Indiana University

Rapid advances in the new and developing field of restoration sedimentology will be needed to protect the world’s river deltas from an array of threats, Indiana University Bloomington geologist Douglas A. Edmonds writes in the journal Nature Geoscience.

The commentary, published this week in the November issue, addresses the fact that land is disappearing from river deltas at alarming rates. And deltas are extraordinarily important: They are ecologically rich and productive, and they are home to about 10 percent of the world’s population.

“There’s a lot of talk about ecological restoration of the coast,” Edmonds said. “But with delta environments, before ecological restoration can happen you have to stabilize the coastline.”

Under naturally occurring processes, coastal land is both created and destroyed at river deltas. River sediment is deposited at the delta, building land. Erosion takes some of the land away. The rate of land growth or loss depends on the balance between “sources” and “sinks,” which is influenced by the complex interaction of floods, ocean waves and tides, vegetative decay and wind.
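That balance can be written as a one-line budget (the function and every number below are hypothetical, for illustration only):

```python
def net_land_change(deposition, erosion, drowning):
    """Net rate of delta land growth (+) or loss (-), in km^2/year.

    deposition : land built by river sediment ("sources")
    erosion    : land removed by waves and tides ("sinks")
    drowning   : land lost to subsidence and sea-level rise
    """
    return deposition - erosion - drowning

# Hypothetical delta: sediment builds 5 km^2/yr, erosion removes 3,
# and subsidence plus sea-level rise drown another 4:
print(net_land_change(5.0, 3.0, 4.0))  # -> -2.0 (net land loss)
```

Restoration sedimentology, in these terms, is about engineering the first term upward (river diversions that deliver more sediment) faster than the sinks grow.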

But sea-level rise and coastal subsidence have tilted the scales toward land loss, and dams and levees built for flood control have interfered with the delivery of sediment. In the Mississippi River delta, the chief focus of the article, an expanse of land the size of a football field disappears every hour.
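As a back-of-envelope check on that rate (assuming an American football field including end zones is roughly 5,350 m²):

```python
# Convert "a football field of land lost every hour" into an annual
# figure. The field area is an assumed round number, ~5,350 m^2.
FIELD_M2 = 5350
HOURS_PER_YEAR = 24 * 365

loss_km2_per_year = FIELD_M2 * HOURS_PER_YEAR / 1e6
print(round(loss_km2_per_year, 1))  # -> 46.9 km^2 of delta land per year
```

Roughly 47 km² a year, an area comparable to a small city, which conveys why Edmonds calls the rate of loss alarming.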

Edmonds says there is potential for restoring deltas by designing river diversions that direct sediment from rivers to areas where it can do the most good.

“The main challenges for restoration sedimentology,” he writes, “are understanding the sources and sinks, and predicting the rate of land growth under any given river diversion scenario.”

For example, river sediment must be deposited near the shore, not carried into the deep ocean, to help create land. Hurricanes and waves carry away that sediment in some circumstances but in others they encourage deposition.

Because of dams and flood-control barriers, the Mississippi River doesn’t appear to carry enough sediment to offset sea-level rise and coastal subsidence. “From today’s perspective,” Edmonds says, “the future of the Mississippi River delta is grim. But river diversions have proven successful, and there is a lot we don’t know about the sedimentological processes of land-building that may change projections.”

For instance, much remains to be learned about the interaction of forces that affect delta sedimentology. The “most significant unknown,” he says, is the contribution of organic matter from decomposing plants to land building — it is estimated to be as high as 34 percent in the Mississippi delta.

“The idea is to better understand the pathways by which sedimentology constructs delta land and the sinks by which that land is lost,” Edmonds said. “It’s all about that balance. And the more we know, the better we can engineer scenarios to tip the balance in favor of building land as opposed to drowning land.”

Edmonds holds the Robert R. Schrock Professorship in Sedimentary Geology and is an assistant professor in the IU Bloomington Department of Geological Sciences in the College of Arts and Sciences. His research focuses on the sedimentology, stratigraphy and geomorphology of depositional systems, which he studies using mathematical modeling, field observation and occasionally experimentation.