When the Earth’s mantle meets its core

This is a scanning electron microscopy image of a ‘mantle’ sample after transformation, mounted on a copper grid and thinned by a focused ion beam (FIB). It makes it possible to identify the different minerals and liquids synthesized during these experiments: the matrix, consisting of a phase with a perovskite structure ((Mg,Fe)SiO3), the most abundant mineral in the Earth because it is the most stable in the lower mantle, is shown in light gray. Veins and liquid pockets enriched in iron and calcium are visible in dark gray. The horizontal scale bar is 2 micrometers. – G. Fiquet, IMPMC.

The Earth’s mantle and core meet 2,900 km beneath our feet, in a zone that remains mysterious. A team of geophysicists has just verified that partial melting of the mantle is possible in this region when the temperature reaches 4200 Kelvin. This reinforces the hypothesis of the presence of a deep magma ocean. The originality of this work, carried out by scientists of the Institut de minéralogie et de physique des milieux condensés (UPMC/Université Paris Diderot/Institut de Physique du Globe/CNRS/IRD), lies in the use of X-ray diffraction at the European Synchrotron Radiation Facility in Grenoble (France). The results will advance understanding of the dynamics, composition and formation of the depths of our planet.

Above the Earth’s core of liquid iron lies the solid mantle, made up essentially of oxides of magnesium, iron and silicon. The boundary between the core and the mantle, located 2,900 km beneath our feet, is highly intriguing to geophysicists. At a pressure of around 1.4 million times atmospheric pressure and a temperature of more than 4000 Kelvin, this zone hosts chemical reactions and changes in the state of matter that are still unknown. Seismologists who have studied this region have recorded an abrupt reduction in the speed of seismic waves, sometimes reaching 30%, close to this boundary. This observation has led scientists, over the last 15 years, to hypothesize partial melting of the Earth’s mantle at the core-mantle boundary. Today that hypothesis has been confirmed.

To probe the depths of the Earth, scientists have at their disposal not only seismological images but also a precious experimental technique: diamond anvil cells coupled with laser heating. This instrument makes it possible to re-create, on samples a few microns across, the same pressure and temperature conditions as those in the Earth’s interior. This is the technique the researchers of the Institut de minéralogie et de physique des milieux condensés applied to natural samples representative of the Earth’s mantle, subjecting them to pressures of more than 140 gigapascals (1.4 million times atmospheric pressure) and temperatures of more than 5000 Kelvin.
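As a quick arithmetic check on those figures, here is a minimal sketch converting the quoted pressure to multiples of the standard atmosphere (101,325 Pa); it is an illustration, not part of the study itself:

```python
# Sanity check on the quoted pressure figures (not from the study itself).
GPA_TO_PA = 1e9       # pascals per gigapascal
ATM_IN_PA = 101_325   # standard atmosphere, in pascals

pressure_gpa = 140    # experimental pressure quoted above
pressure_atm = pressure_gpa * GPA_TO_PA / ATM_IN_PA

print(f"{pressure_gpa} GPa is about {pressure_atm / 1e6:.2f} million atmospheres")
# -> 140 GPa is about 1.38 million atmospheres, matching the
#    "1.4 million times the atmospheric pressure" figure in the text.
```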

The new element in this study was the use of X-ray diffraction at the European synchrotron, the ESRF. This allowed the scientists to determine which mineral phases melt first, and to establish, without extrapolation, melting curves of the deep Earth mantle, that is, to characterize the passage from a solid state to a partially molten state. Their observations show that partial melting of the mantle is possible when the temperature approaches 4200 Kelvin. These experiments also demonstrate that the liquid produced during this partial melting is dense and can incorporate many chemical elements, among them important markers of the dynamics of the Earth’s mantle. These studies will give geophysicists and geochemists a deeper understanding of the mechanisms of the Earth’s differentiation and of the history of its formation, which began around 4.5 billion years ago.

Technology in the extreme

Radio transmitters that can withstand temperatures of up to 900 °C could soon be dropped into the depths of the earth to provide early warning of a volcanic eruption.

The state-of-the-art technology being pioneered by experts at Newcastle University uses Silicon Carbide electronics that can withstand temperatures equal to the inside of a jet engine.

Measuring subtle changes in the levels of key volcanic gases such as carbon dioxide and sulphur dioxide, the wireless sensor would feed back real-time data to the surface, providing vital information about volcanic activity and any impending eruption.
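A minimal sketch of what one such telemetry record and a naive trend check might look like; the field names, units and threshold below are illustrative assumptions, not Newcastle University’s actual design:

```python
from dataclasses import dataclass
import time

# Hypothetical shape of one reading from a downhole volcanic gas sensor.
@dataclass
class GasReading:
    timestamp: float      # seconds since epoch
    co2_ppm: float        # carbon dioxide concentration
    so2_ppm: float        # sulphur dioxide concentration
    temperature_c: float  # ambient temperature at the sensor

def rising_trend(readings, value, threshold):
    """Flag a sustained rise in a gas level between the first and last reading."""
    return len(readings) >= 2 and value(readings[-1]) - value(readings[0]) > threshold

log = [
    GasReading(time.time() - 3600, co2_ppm=410.0, so2_ppm=0.2, temperature_c=650.0),
    GasReading(time.time(),        co2_ppm=455.0, so2_ppm=0.9, temperature_c=675.0),
]
if rising_trend(log, lambda r: r.co2_ppm, threshold=25.0):
    print("CO2 rising: possible increase in volcanic activity")
```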

And because of its unique molecular structure – which is more stable than silicon – Silicon Carbide also has a high radiation tolerance, opening up possibilities for its use in the nuclear industry.

The team has developed the necessary components and is now working to integrate them into a device about the size of an iPhone that could be used in a variety of locations such as power plants, aircraft engines and even volcanoes.

The device, featured today in The Engineer, is one of a number of technologies which have been developed by experts at the university’s Centre for Extreme Environment Technology, which was set up to ‘go where no technology has gone before’ and unlock the secrets of some of the world’s harshest environments.

Building reliable components that will continue to work under these conditions has been an on-going challenge for electronic engineers and the team at Newcastle University is recognised as a world leader in the field.

Dr Alton Horsfall, who leads the Silicon Carbide work alongside Professor Nick Wright, explains: “At the moment we have no way of accurately monitoring the situation inside a volcano and in fact most data collection actually goes on post-eruption. With an estimated 500 million people living in the shadow of a volcano this is clearly not ideal.

“We still have some way to go but using silicon carbide technology we hope to develop a wireless communication system that could accurately collect and transmit chemical data from the very depths of a volcano.”

And the device has other uses. “If someone sets off a bomb on the underground, for example, this will still sit on the wall and tell you what’s going on,” says Dr Horsfall.

Volcanic monitoring is just one of the strands of research being carried out at the Centre for Extreme Environment Technology.

With expertise in underwater communications, Professor Bayan Sharif, Jeff Neasham and Dr Charalampos Tsimenidis have developed a micro Remotely-Operated Vehicle that can be used to feed back environmental data about our coastlines. The team is also working on through-metal communications, which involves transmitting a signal through almost 10 cm of steel, and on wireless sensor networks.

Professor Nick Wright, pro-vice chancellor for innovation and research at Newcastle University, added: “The situations we are planning to use our technology in means it’s not enough for the electronics to simply withstand extremes of temperature, pressure or radiation – they have to continue operating absolutely accurately and reliably.

“Increasingly mankind is spreading out into harsher and more extreme environments as our population grows and we explore new areas for possible sources of energy and food in order to sustain it.

“But with this comes new challenges and this is why research into extreme technologies is becoming ever more important.”

The biggest crash on Earth

During the collision of India with the Eurasian continent, the Indian plate is pushed about 500 kilometers under Tibet, reaching a depth of 250 kilometers. This largest collision on Earth has produced the world’s highest mountain range, but earthquakes generated by the collision also triggered the 2004 Indian Ocean tsunami. The clash of the two continents is very complex: the Indian plate, for example, is compressed where it collides with the very rigid plate of the Tarim Basin at the north-western edge of Tibet. On the eastern edge of Tibet, the Wenchuan earthquake of May 2008 claimed over 70,000 lives. Scientists at the GFZ German Research Center for Geosciences report in the latest issue of the journal Science (vol. 329, Sept. 17, 2010) on the results of a new seismic method used to investigate the collision process.

Through international cooperation, it was possible to follow the route of the approximately 100-kilometer-thick Indian continental plate beneath Tibet. To achieve this, a series of large seismic experiments was carried out in Tibet, during which naturally occurring earthquakes were recorded. By evaluating weak waves scattered at the lower edge of the continental plate, this edge was made visible in detail. The boundary between the rigid lithosphere and the softer asthenosphere proved to be much more pronounced than was previously believed.

The entire Indian subcontinent has been moving continuously northward for millions of years, and has moved 2 meters beneath Tibet in the last 50 years alone. The Himalayas and the highlands of Tibet, the highest and largest plateau in the world, were formed this way. But the recurring catastrophic earthquakes in China are also caused by this collision of two continents. Through a better understanding of the processes involved in the collision of the two plates, it is hoped ultimately to reduce the earthquake risk for the millions of people across the entire collision zone.

Watch your seas: Marine scientists call for European marine observatory network

More than 100 marine scientists, policy makers and members of industry, meeting at the Marine Board-ESF Forum ‘Towards a European Network of Marine Observatories’, have unanimously called for action towards an integrated network of observatories monitoring Europe’s seas. Such a network would provide reliable, long-term data to underpin science and policy regarding the use of the seas for fisheries, aquaculture, energy and shipping, as well as tourism and recreation.

“We should not take for granted the wealth and well-being provided by the seas and oceans” said Lars Horn from the Research Council of Norway and chair of the Marine Board. “This call needs to be heard by national and European decision makers and budget holders. Stable, long-term marine observations are essential so that we can interact with our marine environment in a sustainable way.”

Participants in the Forum will adopt a joint vision for a European network of long-term marine observatories. A wide range of issues related to marine observatories will be discussed: the role of observatories in providing marine knowledge, the technical challenges, the funding schemes and the innovative governance structures needed.

Long-term sets of data from the marine environment would enable us to better understand ocean, earth and climate system processes. They are also critical for monitoring the scale and extent of environmental change which affects global society and economy. According to the European Commission, a 25% reduction in uncertainty about future sea-level rise alone could save €100 million annually in European coastal defences.

Professor Peter Haugan from the University of Bergen, Norway is moderating the discussions and will carry the message to the EurOCEAN 2010 Conference, a high level science policy event organised by the Belgian EU Presidency on 12-13 October 2010 in Ostend, Belgium. At this conference the European marine and maritime research community is expected to call on the Member and Associated States of the European Union and the EU institutions, to recognise that “The seas and oceans are one of the Grand Challenges for the 21st Century”. A European network of long-term marine observatories for monitoring and research would provide an effective tool to address this challenge.

Arctic sea ice reaches lowest 2010 extent, third lowest in satellite record

Arctic sea ice reached what appears to be the lowest 2010 extent, making it the third lowest extent in the satellite record. – CU-Boulder/National Snow and Ice Data Center

The Arctic sea ice cover appears to have reached its minimum extent for the year, the third-lowest recorded since satellites began measuring sea ice extent in 1979, according to the University of Colorado at Boulder’s National Snow and Ice Data Center.

While this year’s September minimum extent was greater than those of 2007 and 2008, the record-setting and near-record-setting low years, it is still significantly below the long-term average and well outside the range of natural climate variability, according to CU-Boulder’s NSIDC scientists. Most researchers believe the shrinking Arctic sea ice is tied to warming temperatures caused by an increase in human-produced greenhouse gases being pumped into Earth’s atmosphere.

On Sept. 10 the sea ice extent dropped to 1.84 million square miles, or 4.76 million square kilometers, and is likely the lowest ice extent of the year as sea ice appears to have begun its annual cycle of growth.

The 2010 minimum ice extent is 93,000 square miles, or 240,000 square kilometers, above the 2008 figure and 240,000 square miles, or 630,000 square kilometers, above the record low of 2007. The 2010 sea ice extent is 130,000 square miles, or 340,000 square kilometers, below that of 2009, according to NSIDC Director Mark Serreze.

“We are still looking at summers with an ice-free Arctic Ocean in perhaps 20 to 30 years,” said Serreze, also a professor in CU-Boulder’s geography department.

The 2010 minimum is 753,000 square miles, or 1.95 million square kilometers, below the 1979-2000 average minimum and 625,000 square miles, or 1.62 million square kilometers, below the 1979 to 2010 average minimum.
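All of these extents convert between square miles and square kilometers with a single factor; here is a minimal sketch checking the quoted pairs (the press release rounds its figures, so the outputs agree only approximately):

```python
# Check the unit conversions quoted above (1 sq mi ≈ 2.58999 sq km).
SQ_MI_TO_SQ_KM = 2.58999

figures_sq_mi = {
    "2010 minimum extent": 1_840_000,
    "margin above 2008": 93_000,
    "margin above 2007 record low": 240_000,
    "amount below 2009": 130_000,
    "amount below 1979-2000 average": 753_000,
}

for label, sq_mi in figures_sq_mi.items():
    print(f"{label}: {sq_mi:,} sq mi ≈ {sq_mi * SQ_MI_TO_SQ_KM:,.0f} sq km")
```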

Since NSIDC researchers determine the minimum sea ice extent using a five-day running average, there is still a small chance the sea ice extent could fall slightly, said Serreze. CU-Boulder’s NSIDC will provide more detailed information in early October with a full analysis of the 2010 Arctic ice conditions, including aspects of the melt season and conditions heading into the winter ice-growth season.

Glaciers help high-latitude mountains grow taller

This is the south flank of glaciated Cordillera Darwin and Bahia Pia (Pia Bay), highest point on Tierra del Fuego, Chile, taken from the Beagle Channel. These peaks and fjords were named after exploration undertaken during Charles Darwin’s Voyage of the Beagle to this region in the 1830s. – Stuart N. Thomson.

Glaciers can help actively growing mountains become higher by protecting them from erosion, according to a University of Arizona-led research team.

The finding is contrary to the conventional view of glaciers as powerful agents of erosion that carve deep fjords and move massive amounts of sediment down mountains. Mountains grow when movements of the Earth’s crust push the rocks up.

The research is the first to show that the erosion effect of glaciers – what has been dubbed the “glacial buzzsaw” – reverses on mountains in colder climates.

The researchers were surprised, said first author Stuart N. Thomson, a research scientist in the UA department of geosciences. “We were expecting to see the buzzsaw.”

The team discovered the protective effects of glaciers by studying the Andes Mountains in the southernmost region of South America, known as Patagonia.

UA co-author Peter W. Reiners said, “It’s been thought that glaciers limit the height of mountain ranges worldwide.”

The key is climate. Glaciers atop mountains in temperate latitudes flow downhill, scouring away the surface of the mountain. Over millennia, such erosion can reduce the height and width of a mountain range by miles.

However, in very cold climates such as the Patagonian Andes, the team found that rather than scraping away the surface of the mountain, glaciers protect the mountain’s top and sides from erosion.

The team dubs the action of the cold-climate glaciers “glacial armoring.”

“Climate, especially through glaciers, has a really big impact on how big mountains get,” said Reiners, a UA professor of geosciences.

“What we’re seeing is that below certain latitudes, glacial buzzsaws clearly and efficiently operate, but south of about 45 degrees, it not only doesn’t work – it has the opposite effect,” he said. “The glaciers actually protect the surface and allow the mountains to grow higher.”

He and his colleagues anticipate that glacial armoring also occurs on cold-climate mountains very far north, such as those in Alaska.

The team’s paper, “Glaciation as a destructive and constructive control on mountain building,” is scheduled for publication in the Sept. 16 issue of the journal Nature. Additional co-authors are Mark T. Brandon and Nathaniel J. Wilson of Yale University in New Haven, Conn.; Jonathan H. Tomkin of the University of Illinois at Urbana-Champaign; and Cristián Vásquez of the Universidad de Chile in Santiago. The National Science Foundation and the Chilean Fondecyt funded the work.

The Andes are the textbook example of actively growing mountains that are limited in height and size by glaciers, Thomson said. The Andes are actively being pushed higher by movements of the Earth’s crust. However, if the glacial buzzsaw is active, the mountains also are ground down.

“We’re trying to understand how mountains are built and destroyed,” Thomson said. “Why are mountains high?”

In actively growing mountains, hot rocks from deep in the Earth are being thrust up. At the same time, erosion sands away the tops and sides of the mountains, bringing those once-hot rocks closer to the surface. The speed at which the rocks cool indicates how rapidly the surface material above them was removed by erosion.

To figure out how fast the glaciers had scoured the Andes, Thomson and his colleagues needed to analyze rocks now exposed on the mountains. The scientists sailed up glacially-cut fjords to the foot of remote glaciers and collected soccer-ball-sized rocks. The team collected rocks from latitude 38 degrees south to 56 degrees south, for a total of 146 samples.

The researchers analyzed the rocks in laboratories at the UA and at Yale University to determine what geologists call the “cooling age” of the rocks. The cooling age tells how fast the rock was exposed by erosion.

The researchers used two independent dating methods, apatite uranium-thorium-helium and fission-track dating, to determine cooling ages. Both methods showed the same result — that the rocks cooled faster in the north and slower in the south. The slower the cooling, the more slowly the mountains are eroding.
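As a rough illustration of how a cooling age translates into an erosion (exhumation) rate, here is a minimal sketch of the standard first-order relation: the depth of the closure isotherm divided by the time since the rock cooled through it. The closure temperature and geothermal gradient below are generic textbook values, not numbers from the study:

```python
# First-order erosion-rate estimate from a thermochronometric cooling age.
# Assumes a steady geothermal gradient and a nominal closure temperature for
# apatite (U-Th)/He; these are illustrative values, not the paper's data.

CLOSURE_TEMP_C = 70.0       # approx. apatite (U-Th)/He closure temperature
SURFACE_TEMP_C = 0.0        # assumed mean surface temperature
GEOTHERM_C_PER_KM = 25.0    # assumed geothermal gradient, deg C per km

def erosion_rate_km_per_myr(cooling_age_myr):
    """Depth of the closure isotherm divided by the time since closure."""
    closure_depth_km = (CLOSURE_TEMP_C - SURFACE_TEMP_C) / GEOTHERM_C_PER_KM
    return closure_depth_km / cooling_age_myr

# Younger cooling ages imply faster erosion; older ages, slower erosion:
for age_myr in (2.0, 10.0):
    print(f"cooling age {age_myr:4.1f} Myr -> ~{erosion_rate_km_per_myr(age_myr):.2f} km/Myr")
```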

Reiners said, “What corroborates this is that the mountains are higher in the south than in the north. Uplift is winning in the south, and the glacial buzzsaw is winning in the north.”

The importance of climate in the formation of mountains is currently a matter of scientific debate, Thomson said.

The new finding indicates that climate plays a key role.

Said Thomson: “Climate determines the size of a mountain range – whether there is a glacial buzzsaw or glacial armoring.”

Commercial-scale test of new technology to recover coal from sludge successful

Roe-Hoan Yoon

A new technology for removing water from ultrafine coal slurry has been successfully tested at commercial scale at an operating coal cleaning plant. The technology offers the possibility of reducing the coal slurry impoundment problem at its source. A peer-reviewed paper on the new technology was presented Sept. 15 at the 13th Australian Coal Preparation Society Conference in Cairns, Queensland.

Cleaning coal after it has been mined is done with water. The bulk of the coal mined is relatively coarse in size and, therefore, can be readily washed of impurities and subsequently dewatered. However, a portion of mined coal is smaller than approximately 30-40 microns – something like the size of talcum powder – and is difficult to dewater after cleaning, said Roe-Hoan Yoon, the Nicholas T. Camicia Professor of Mining and Mineral Engineering in Virginia Tech’s College of Engineering. As a result, finer coal is often discarded to slurry impoundments. There are hundreds of sludge impoundments in the U.S., mostly in Appalachia, creating environmental and safety concerns, said Yoon.

Yoon presented the paper in Australia with co-author Wally Schultz, executive vice president of Decanter Machine Inc. of Johnson City, Tenn., the largest supplier of screen-bowl centrifuges internationally.

Yoon, Gerald Luttrell, Massey Professor of Mining and Mineral Engineering, and their colleagues at the Center for Advanced Separation Technologies (CAST) at Virginia Tech have developed a hyperbaric centrifuge that was patented by Virginia Tech Intellectual Properties Inc. and sublicensed to Decanter Machine. “The new technology complements what Decanter already has,” said Yoon.

Encouraged by the results of a pilot-scale test conducted in 2009, Jim Walter Resources Inc. of Brookwood, Ala. (Walter Energy Inc.) tested a full-scale commercial unit successfully. “Everything has performed as promised by Decanter,” said Joel Franklin, preparation engineer for Jim Walter Resources.

In the pilot-scale test, coal slurries consisting of ultrafine coal were dewatered to less than 20 percent moisture. “The product coal feels like dry powder when you touch it because the water left with the coal is spread so thinly across its large surface area,” Luttrell said.

According to a National Research Council report, the U.S. coal industry discards 70 to 90 million tons of fine refuse annually to slurry impoundments. “The dewatering technologies developed by CAST will help coal companies recover all of the mined coal. The technology can also be used to recover the coal in existing impoundments, which can help clean up the environment and create jobs in coal-producing regions like Southwest Virginia,” said Congressman Rick Boucher (D-VA 9th District), who has supported funding for CAST and other energy projects.

“We are very optimistic,” said Decanter Machine Inc. President Ken Robinette.

The centrifuge technology is the most recent of the advanced technologies developed by CAST. The Microcel™ flotation column was the first major separation technology developed. It uses microbubbles to separate fine coal from other impurities, including the mineral matter that becomes ash when burned at power plants, and is used widely in Australia.

As part of a project funded by the National Energy Technology Laboratory (NETL), CAST has developed two other advanced dewatering processes. One is a set of novel dewatering aids currently marketed by the Nalco Company. The other is a technology that may be more useful for recovering and dewatering ultrafine coal from existing impoundments, according to Yoon. Virginia Tech has applied for a patent for this new technology.

In June 2010, Yoon testified before the subcommittee of the West Virginia Legislature’s Joint Standing Committee that was charged with addressing the issues concerning coal slurry impoundments. Yoon suggested that the CAST research funded by NETL can offer technological solutions.

A tectonic zip

The complex fracture pattern created by the earthquake in Concepción (Chile) on 27 February 2010 was to a certain extent predictable. GPS observations from the years before the earthquake showed the pattern of stresses that had accumulated through the plate movements during the past 175 years in this area. The stress distribution derived from the observations correlates highly with the subsequent fracture distribution. In all likelihood the tremor removed all the stress that had accumulated since the last earthquake in this region, which was observed by Charles Darwin in 1835. An earthquake of similar magnitude in this area is therefore unlikely in the near future. This result was presented by scientists of the GFZ German Centre for Geosciences (Helmholtz Association) in the latest edition of the scientific journal Nature (09 September 2010).

“The Maule earthquake near Concepción, Chile, on 27 February registered a moment magnitude of 8.8, making it one of the largest earthquakes to have been recorded in its entirety by a modern network of space-geodetic and geophysical instruments on the ground,” says Professor Onno Oncken, head of the Department of Geodynamics at GFZ. “It thus offers a unique opportunity to compare detailed observations prior to the earthquake with those taken during and after it, and to re-evaluate hypotheses regarding the predictability of such events.”

Measurements using the GPS satellite navigation system showed that the seafloor of the Nazca plate in the Pacific Ocean does not slide evenly under the western boundary of the South American continent. Rather, the GPS measurements indicate that some patches of the ocean floor became locked against the underside of the continent, while in the gaps between them the Nazca plate continued to push under South America. The resulting uneven stress pattern was released by the earthquake of 27 February in such a way that, just like a zipper, the locked patches ruptured one after the other. As a result, this seismic gap off the Chilean west coast is now closed; one last gap remains in northern Chile. There, the GFZ scientists have set up a plate boundary observatory in order to use the entire range of geoscientific instruments to record the conditions before, during and after an earthquake, an important step in understanding the processes of plate tectonics.

Modern Earth science is still unable to predict the location, time and magnitude of an earthquake. But the present study offers an optimistic perspective on the predictability of the likely fracture patterns and magnitudes of expected earthquakes.

Study adds new clue to how last ice age ended

Thick ice once filled New Zealand’s Irishman Basin. – Aaron Putnam, University of Maine

As the last ice age was ending, about 13,000 years ago, a final blast of cold hit Europe, and for a thousand years or more, it felt like the ice age had returned. But oddly, despite bitter cold winters in the north, Antarctica was heating up. For the two decades since ice core records revealed that Europe was cooling at the same time Antarctica was warming over this thousand-year period, scientists have looked for an explanation.

A new study in Nature brings them a step closer by establishing that New Zealand was also warming, indicating that the deep freeze up north, called the Younger Dryas for the white flower that grows near glaciers, bypassed much of the southern hemisphere.

“Glaciers in New Zealand receded dramatically at this time, suggesting that much of the southern hemisphere was warming with Antarctica,” said study lead author, Michael Kaplan, a geochemist at Columbia University’s Lamont-Doherty Earth Observatory. “Knowing that the Younger Dryas cooling in the northern hemisphere was not a global event brings us closer to understanding how Earth finally came out of the ice age.”

Ice core records show that warming of the southern hemisphere, starting 13,000 years ago, coincided with rising levels of the heat-trapping gas, carbon dioxide. The study in Nature is the first to link this spike in CO2 to the impressive shrinking of glaciers in New Zealand. The scientists estimate that glaciers lost more than half of their extent over a thousand years, and that their creep to higher elevations was a response to the local climate warming as much as 1 degree C.

To reconstruct New Zealand’s past climate, the study’s authors tracked one glacier’s retreat on South Island’s Irishman Basin. When glaciers advance, they drag mounds of rock and dirt with them. When they retreat, cosmic rays bombard these newly exposed ridges of rock and dirt, called moraines. By crushing this material and measuring the build-up of the cosmogenic isotope beryllium 10, scientists can pinpoint when the glacier receded. The beryllium-10 method allowed the researchers to track the glacier’s retreat upslope through time and indirectly calculate how much the climate warmed.
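A minimal sketch of the exposure-age calculation that underlies this method, in its simplest form (no erosion or shielding corrections): the beryllium-10 concentration N grows as N(t) = (P/λ)(1 − e^(−λt)) for production rate P and decay constant λ, which inverts to t = −ln(1 − Nλ/P)/λ. The production rate and sample concentration below are placeholders, not the study’s measurements:

```python
import math

# Simplified cosmogenic Be-10 surface-exposure age (no erosion, no shielding).
BE10_HALF_LIFE_YR = 1.387e6                # Be-10 half-life in years
LAM = math.log(2) / BE10_HALF_LIFE_YR      # decay constant, 1/yr

def exposure_age_yr(conc_atoms_per_g, production_rate_atoms_per_g_yr):
    """Invert N(t) = (P/lam) * (1 - exp(-lam*t)) for the exposure age t."""
    return -math.log(1 - conc_atoms_per_g * LAM / production_rate_atoms_per_g_yr) / LAM

# Hypothetical moraine boulder: local production rate of 10 atoms/g/yr,
# measured concentration of 1.3e5 atoms per gram of quartz.
print(f"~{exposure_age_yr(1.3e5, 10.0):,.0f} years of exposure")
# -> roughly 13,000 years, i.e. the Younger Dryas timeframe discussed above
```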

The overall trigger for the end of the last ice age came as Earth’s orientation toward the sun shifted, about 20,000 years ago, melting the northern hemisphere’s large ice sheets. As fresh melt water flooded the North Atlantic Ocean, the Gulf Stream weakened, driving the north back into the ice age. During this time, temperatures in Greenland dropped by about 15 degrees C. For years, scientists have tried to explain how the so-called Younger Dryas cooling fit with the simultaneous warming of Antarctica that eventually spread across the globe.

The Nature paper discusses the two dominant explanations without taking sides. In one, the weakening of the Gulf Stream reconfigures the planet’s wind belts, pushing warm air and seawater south, and pulling carbon dioxide from the deep ocean into the air, causing further warming. In the other, the weakened Gulf Stream triggers a global change in ocean currents, allowing warm water to pool in the south, heating up the climate.

Bob Anderson, a geochemist at Lamont-Doherty who argues the winds played the dominant role, says the Nature paper adds another piece to the puzzle. “This is one of the most pressing problems in paleoclimatology because it tells us about the fundamental processes linking climate changes in the northern and southern hemispheres,” he said. “Understanding how regional changes influence global climate will allow scientists to more accurately predict regional variations in rain and snowfall.”

Oil remains below surface, will come ashore in pulses

Gregory Stone, director of LSU’s WAVCIS Program and also of the Coastal Studies Institute in the university’s School of the Coast & Environment, disagrees with published estimates that more than 75 percent of the oil from the Deepwater Horizon incident has disappeared.

Stone recently participated in a three-hour flyover of the affected area in the Gulf, where he said that subsurface oil was easily visible from overhead.

“It’s most definitely there,” said Stone. “It’s just a matter of time before it makes itself known again.”

Readings from WAVCIS indicate that ocean currents near the middle and bottom of the water column are directed offshore; in other words, this submerged oil will be pushed out to sea, where it will then rise higher into the water column and be washed onto land, particularly during storms.

“It is going to come on shore not consistently, but rather in pulses because it is beneath the surface,” he said. “You may get one or two, maybe even five or 10 waves coming ashore with absolutely no oil – but eventually, it’s going to come ashore.” He also cautions that whatever oil doesn’t remain suspended in the water column may simply sit atop the seafloor, waiting to be mixed back into the currents.

“It will simply be stirred up during rough seas or changing currents and reintroduced into the water column,” he explained.

Another timely concern is hurricane season since September is generally one of the most active months of the year. “Storm surge, when combined with storm waves from a hurricane, could stir up this submerged oil and bring it – lots of it – onshore and into the wetlands,” Stone said. “Even a tropical storm could result in more oil on the shoreline. And that’s a reality we need to consider and be prepared for.”

Formally known as the Wave-Current-Surge Information System, WAVCIS is based on a network of buoys, oil-platform sensors and ADCPs, or Acoustic Doppler Current Profilers, in the Gulf of Mexico. The ADCPs are exceptionally sensitive. Housed on the seafloor, they send acoustic signals up through the water, measuring the entire water column for everything from current direction to speed and temperature. The system is also integrated with the National Data Buoy Center, or NDBC, providing researchers worldwide with a comprehensive look at the Gulf environment, an invaluable research tool during hurricane season and during disasters such as the Deepwater Horizon tragedy.
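As an illustration of the kind of data an ADCP returns, here is a minimal sketch that depth-averages per-bin current velocities into a single speed and direction; the bin values are hypothetical, not actual WAVCIS output:

```python
import math

# Depth-averaged current from ADCP velocity bins (hypothetical values).
# Each bin holds eastward (u) and northward (v) velocity components in m/s.
bins = [
    (0.12, -0.30),   # near-surface bin
    (0.08, -0.25),
    (0.05, -0.22),
    (0.02, -0.18),   # near-bottom bin
]

u_mean = sum(u for u, _ in bins) / len(bins)
v_mean = sum(v for _, v in bins) / len(bins)

speed = math.hypot(u_mean, v_mean)                         # m/s
# Oceanographic convention: direction the current flows TOWARD, clockwise from north.
heading = math.degrees(math.atan2(u_mean, v_mean)) % 360

print(f"depth-averaged current: {speed:.2f} m/s toward {heading:.0f} degrees")
```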

“WAVCIS is among the most sensitive ocean observing systems in the entire nation,” said Stone. “We measure a wide variety of physical parameters at the water surface, water column and on the sea bed. This information is extremely helpful in predicting or determining where the oil is – and where it’s going to go. Because our information is updated hourly and available to the public, our lab has played a primary role in providing facts about the situation surrounding the oil’s movement and location.”

Stone, whose experience with WAVCIS has spanned everything from natural to manmade disasters, knows that only time will tell the severity of the oil’s impact.

“This is a long-term problem. It’s not simply going to go away. I was in Prince William Sound 10 years after the Exxon Valdez event, and when I lifted up a rock, there was still residual oil beneath it,” he said. “Thus, the residence time of oil in the coastal environment can be substantial, although ecosystem conditions along the northern Gulf are very different and will likely recover quicker than in Alaska. We here at WAVCIS can at least track Gulf conditions to monitor the situation as closely as possible.”