New oil detection technique

CSIRO scientists have developed a revolutionary technique for the rapid on-site detection and quantification of petroleum hydrocarbons (commonly derived from crude oil) in soil, silt, sediment, or rock.

Developed in collaboration with waste technology specialist Ziltek Pty Ltd, the technique means that the presence of petroleum hydrocarbons can now be quantified simply by using a hand-held infrared spectrometer to take readings at the site of interest, without the need to collect samples or perform any kind of processing.

The technique could be used for oil exploration purposes. It will also be particularly useful in assessing and monitoring contaminated sites such as coastal land following off-shore oil spills and industrial sites planned for urban redevelopment.

“Petroleum hydrocarbons are a valuable resource, but can also be pretty nasty environmental contaminants,” says CSIRO scientist Sean Forrester.

“They can remain in the environment for extended periods of time and can be harmful to wildlife, plants and humans. Better tools to detect them make a rapid response possible.”

The technique uses an infrared signal to detect the presence of petroleum hydrocarbons in samples.

By contrast, current methods rely on sampling and processing techniques that are labor-intensive and time-consuming, require sensitive equipment and are not well suited to on-site analysis.

“The ability of this new technique to rapidly detect the presence of contaminants at the site has the potential to provide significant cost advantages, in terms of reduced testing costs and the avoidance of delays,” Mr Forrester says.

“Rapid analysis allows immediate measures to be undertaken to prevent further contamination or to limit contaminant spread.”

A significant portion of the time and financial costs involved in assessing and remediating contaminated sites is consumed by monitoring and analysis.

By decreasing analysis time and reducing costs, this new technique can assist in the fast and effective identification of oil and other petroleum products in the environment, as well as in the treatment and protection of environmental assets threatened by petroleum contamination.

Pinpointing where volcanic eruptions could strike

A better way to pinpoint where volcanic eruptions are likely to occur has been produced by an international team of geophysicists.

Scientists from the universities of Leeds, Purdue, Indiana and Addis Ababa investigated volcanic activity occurring in the remote Afar desert of northern Ethiopia between 2005 and 2009.

By studying a rare sequence of 13 magmatic events – where hot molten rock was intruded into a crack between the African and Arabian plates – they found that the location of each intrusion was not random. They showed that the events were linked because each one changed the amount of tension in the earth’s crust.

The findings, published in Nature Geoscience, will help scientists to more accurately predict where volcanic eruptions could strike and contribute to efforts to limit the damage they can cause.

Lead author Dr Ian Hamling, who completed the analysis as part of his PhD in the School of Earth and Environment at the University of Leeds, said: “It’s been known for some time that a large earthquake has a role to play in triggering subsequent earthquakes, but until now, our knowledge of volcanic events has been based on isolated cases. We have demonstrated that volcanic eruptions can influence each other. This will help us predict where future volcanic eruptions are likely to happen.”

The team studied the region around a large volcanic dyke – a vertical crack created when magma seeps from underground through rifts in the surface of the earth – which erupted in the Afar desert in September 2005.

The magma – hot molten rock – was injected along the dyke between depths of 2 and 9 km, altering the tension in the earth’s crust. The team was then able to monitor the 12 smaller dyke intrusions that subsequently took place in the same region over a four-year period.

By monitoring levels of tension in the ground near where each dyke was intruded, they found that subsequent intrusions were more likely in places where the tension had increased.

Dr Hamling said: “If you look at this year’s eruptions at Eyjafjallajökull in Iceland, by estimating the tension in the crust at other volcanoes nearby, you could estimate whether the likelihood of them erupting has increased or decreased. Knowing the state of stress in this way won’t tell you when an eruption will happen, but it will give a better idea of where it is most likely to occur.”

Groundwater depletion rate accelerating worldwide

In recent decades, the rate at which humans worldwide are pumping dry the vast underground stores of water that billions depend on has more than doubled, say scientists who have conducted an unusual, global assessment of groundwater use.

These fast-shrinking subterranean reservoirs are essential to daily life and agriculture in many regions, while also sustaining streams, wetlands, and ecosystems and helping to resist land subsidence and saltwater intrusion into fresh water supplies. Today, people are drawing so much water from below that they are adding enough of it to the oceans (mainly by evaporation, then precipitation) to account for about 25 percent of the annual sea level rise across the planet, the researchers find.

Soaring global groundwater depletion portends a potential disaster for an increasingly globalized agricultural system, says Marc Bierkens of Utrecht University in Utrecht, the Netherlands, and leader of the new study.

“If you let the population grow by extending the irrigated areas using groundwater that is not being recharged, then you will run into a wall at a certain point in time, and you will have hunger and social unrest to go with it,” Bierkens warns. “That is something that you can see coming for miles.”

He and his colleagues will publish their new findings in an upcoming issue of Geophysical Research Letters, a journal of the American Geophysical Union.

In the new study, which compares estimates of groundwater added by rain and other sources to the amounts being removed for agriculture and other uses, the team taps a database of global groundwater information including maps of groundwater regions and water demand. The researchers also use models to estimate the rates at which groundwater is both added to aquifers and withdrawn. For instance, to determine groundwater recharging rates, they simulate a groundwater layer beneath two soil layers, exposed at the top to rainfall, evaporation, and other effects, and use 44 years’ worth of precipitation, temperature, and evaporation data (1958-2001) to drive the model.

Applying these techniques worldwide to regions ranging from arid areas to those with the wetness of grasslands, the team finds that the rate at which global groundwater stocks are shrinking more than doubled between 1960 and 2000, increasing the amount lost from 126 to 283 cubic kilometers (30 to 68 cubic miles) of water per year. Because the total amount of groundwater in the world is unknown, it’s hard to say how fast the global supply would vanish at this rate. But if water were siphoned that rapidly from the Great Lakes, they would go bone-dry in around 80 years.
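
The Great Lakes comparison is easy to check with back-of-envelope arithmetic. A minimal sketch, assuming the commonly cited combined Great Lakes volume of roughly 22,700 cubic kilometers (a reference figure, not a number from the study):

```python
# Rough check of the Great Lakes comparison.
# The ~22,700 km^3 combined volume is a commonly cited figure (assumption).
GREAT_LAKES_VOLUME_KM3 = 22_700
DEPLETION_RATE_KM3_PER_YR = 283  # global groundwater loss circa 2000 (study figure)

years_to_empty = GREAT_LAKES_VOLUME_KM3 / DEPLETION_RATE_KM3_PER_YR
print(f"{years_to_empty:.0f} years")  # -> 80 years
```

The quotient lands almost exactly on the "around 80 years" quoted in the article.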

Groundwater represents about 30 percent of the available fresh water on the planet, with surface water accounting for only one percent. The rest of the potable, agriculture-friendly supply is locked up in glaciers or the polar ice caps. This means that any reduction in the availability of groundwater supplies could have profound effects on a growing human population.

The new assessment shows the highest rates of depletion in some of the world’s major agricultural centers, including northwest India, northeastern China, northeast Pakistan, California’s Central Valley, and the midwestern United States.

“The rate of depletion increased almost linearly from the 1960s to the early 1990s,” says Bierkens. “But then you see a sharp increase which is related to the increase of upcoming economies and population numbers; mainly in India and China.”

As groundwater is increasingly withdrawn, the remaining water “will eventually be at a level so low that a regular farmer with his technology cannot reach it anymore,” says Bierkens. He adds that some nations will be able to use expensive technologies to get fresh water for food production through alternative means like desalinization plants or artificial groundwater recharge, but many won’t.

Most water extracted from underground stocks ends up in the ocean, the researchers note. The team estimates the contribution of groundwater depletion to sea level rise to be 0.8 millimeters per year, which is about a quarter of the current total rate of sea level rise of 3.1 millimeters per year. That’s about as much sea-level rise as caused by the melting of glaciers and icecaps outside of Greenland and Antarctica, and it exceeds or falls into the high end of previous estimates of groundwater depletion’s contribution to sea level rise, the researchers add.
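
The 0.8 mm per year figure and the "about a quarter" fraction are mutually consistent, as a quick calculation shows. The ocean surface area used below (about 3.61 × 10⁸ km²) is a standard reference value, not taken from the study:

```python
# Sanity check: spread the annual groundwater loss over the ocean surface.
OCEAN_AREA_KM2 = 3.61e8        # approximate ocean surface area (assumption)
DEPLETION_KM3_PER_YR = 283     # groundwater lost per year (study figure)
TOTAL_RISE_MM_PER_YR = 3.1     # current total sea level rise (study figure)

rise_mm = DEPLETION_KM3_PER_YR / OCEAN_AREA_KM2 * 1e6  # km -> mm
print(f"{rise_mm:.1f} mm/yr")                   # -> 0.8 mm/yr
print(f"{rise_mm / TOTAL_RISE_MM_PER_YR:.0%}")  # -> 25%
```

Both printed values match the researchers' estimates quoted above.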

High pressure experiments reproduce mineral structures 1,800 miles deep

X-ray diffraction image of the post-perovskite phase of the mineral magnesium silicate glass (MgSiO3), produced in a diamond-anvil cell under 1.85 million times atmospheric pressure and heated to 3500 Kelvin. -  Advanced Light Source/LBNL

University of California, Berkeley, and Yale University scientists have recreated the tremendous pressures and high temperatures deep in the Earth to resolve a long-standing puzzle: why some seismic waves travel faster than others through the boundary between the solid mantle and fluid outer core.

Below the earth’s crust stretches an approximately 1,800-mile-thick mantle composed mostly of a mineral called magnesium silicate perovskite (MgSiO3). Below this depth, the pressures are so high that perovskite is compressed into a phase known as post-perovskite, which comprises a layer 125 miles thick at the core-mantle boundary. Below that lies the earth’s iron-nickel core.

Understanding the physics of post-perovskite, and therefore the physics of the core-mantle boundary, has proven tough because of the difficulty of recreating the extreme pressure and temperature at such depths.

The researchers, led by Yale post-doctoral fellow Lowell Miyagi, a former UC Berkeley graduate student, used a diamond-anvil cell to compress an MgSiO3 glass to nearly 1.4 million times atmospheric pressure and heated it to 3,500 Kelvin (more than 3,000 degrees Celsius, or nearly 6,000 degrees Fahrenheit) to create a tiny rock of post-perovskite. They then further compressed this to 2 million times atmospheric pressure and zapped the substance with an intense X-ray beam from the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory to obtain a diffraction picture that reveals the deformation behavior of post-perovskite.

They found that the orientation of post-perovskite’s crystals in the deformed rock allowed some seismic waves – those polarized parallel to the core-mantle boundary – to travel faster than those polarized perpendicular to it. This anisotropic structure may explain the observations of seismologists using seismic waves to probe the earth’s interior.

“For the first time, we can use mineral physics with diamond-anvil cells at the ALS to get information about how this mineral, post-perovskite, performs under intense pressure,” said co-author Hans-Rudolf Wenk, a Professor of the Graduate School in UC Berkeley’s Department of Earth and Planetary Science and Miyagi’s Ph.D. thesis advisor. “People had suggested this as an explanation for the anisotropy, but now we have experimental evidence.”

“Understanding how post-perovskite behaves is a good start to understanding what’s happening near the mantle’s lower reaches,” Miyagi said. “We can now begin to interpret flow patterns in this deep layer in the earth.”

The study, which appears in the Sept. 24 issue of the journal Science, has important implications for understanding how the earth’s internal heating and cooling processes work.

“This will give seismologists confidence in their models by matching what these observations predict with the seismic data they get,” said coauthor Waruntorn “Jane” Kanitpanyacharoen, a UC Berkeley graduate student.

Post-perovskite was first recognized as a high-pressure phase in the mantle in 2004, and subsequent experiments in diamond-anvil cells have produced the mineral. Wenk and his colleagues in 2007 conducted experiments that they thought had determined the deformation behavior of post-perovskite, but which now appear to have been related to the phase transformation to post-perovskite. This transition takes place at about 1,300,000 times atmospheric pressure (127 gigapascals) and 2,500 Kelvin (4,000 degrees Fahrenheit).
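
The quoted transition conditions can be cross-checked with standard unit conversions (1 atm = 101,325 Pa); the exact results round to the approximate figures the article uses:

```python
# Cross-check of the quoted phase-transition conditions using standard
# unit conversions (the conversion constants are not from the study).
ATM_IN_PA = 101_325                        # pascals per standard atmosphere

pressure_atm = 127e9 / ATM_IN_PA           # 127 GPa in atmospheres
temp_f = (2_500 - 273.15) * 9 / 5 + 32     # 2,500 K in degrees Fahrenheit

print(f"{pressure_atm / 1e6:.2f} million atm")  # -> 1.25 million atm
print(f"{temp_f:.0f} F")                        # -> 4040 F
```

That is, 127 GPa corresponds to about 1.25 million atmospheres (quoted as "about 1,300,000 times atmospheric pressure") and 2,500 K to about 4,040 °F (quoted as 4,000 °F).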

The current experiment showed that post-perovskite’s crystal structure is deformed by pressure into a more elongated shape. Because seismic waves travel faster in the stretched direction, this matches the observed difference in velocity between seismic waves polarized horizontally and vertically traveling through the post-perovskite zone above the earth’s core.

If scientists can gain a better understanding of the core-mantle boundary’s behavior, it will give them clues about the workings of Earth’s internal convection, in which cool tectonic plates descend from the ocean floor through the mantle, heat up as they near the dense, liquid-iron outer core, and begin moving upward again in a repeated cycle that mixes material and heat through the mantle.

Arctic soil study turns up surprising results

Professor Paul Grogan's findings have been accepted for publication in the journal Environmental Microbiology.

Across the globe, the diversity of plant and animal species generally increases from the North and South Poles toward the Equator but, surprisingly, that rule doesn’t hold for soil bacteria, according to a new study by Queen’s University biology professor Paul Grogan.

“It appears that the rules determining the patterns for plant and animal diversity are different than the rules for bacteria,” says Professor Grogan.

The finding is important because one of the goals in ecology is to explain patterns in the distribution of species and understand the biological and environmental factors that determine why species occur where they do.

Researchers examined the composition and genetic differences of soil bacterial communities from 29 remote arctic locations scattered across Canada, Alaska, Iceland, Greenland and Sweden.

The report also had a second surprising finding. The researchers expected that soil samples taken 20 meters apart would be more similar in terms of bacterial diversity than soil samples taken 5,500 kilometers apart because, in theory, plant or animal communities from nearby locations are likely to be more genetically similar than those from distant locations.

Generally, they found that each soil sample contained thousands of bacterial types, about 50 per cent of which were unique to each sample.

“It turns out that there is no similarity pattern in relation to distance at all, even in comparing side-by-side samples with samples taken from either side of a continent – this really amazed me,” says Professor Grogan.

Scientist joins global study of decomposing permafrost

The Lapland gate connects one side of Lapland to another; as seen from Abisko, Sweden, north of the Arctic Circle. -  Courtesy Jeff Chanton, FSU Department of Earth, Ocean and Atmospheric Science

Florida State University oceanographer Jeff Chanton is part of an international team embarking on a new study of permafrost decomposition in arctic Sweden. What he and his fellow researchers discover there may be critical given the permafrost’s key role in climate change, and vice versa.

It is all part of an ominous feedback loop, Chanton says.

The warming climate is causing the Swedish permafrost to thaw and decompose and, as it does, the greenhouse gases carbon dioxide and methane are released into the atmosphere, creating a feedback loop of further warming temperatures and accelerating permafrost decomposition.

“There are 1,672 gigatons of carbon stored in the permafrost as soil and peat organic matter,” Chanton said. “To put that quantity in perspective, it is three times the amount of carbon found in our atmosphere, which contains 550 gigatons in the form of carbon dioxide. What will happen if all the permafrost thaws, releasing its gigantic store of carbon into the atmosphere? Will the respiration of that decomposing organic matter by bacteria produce not only carbon dioxide but also methane, a greenhouse gas 25 times more potent?

“We know that increasing carbon dioxide and methane in the atmosphere creates a positive feedback to global warming,” he said. “Our new study will shed vital additional light on how the thawing affects the atmosphere, which affects warming, and how the thawing of the permafrost affects the organic carbon stored there.”
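
The "three times" comparison in Chanton's quote is straightforward to verify from the two quoted figures:

```python
# Check the "three times" comparison using the quantities Chanton cites.
PERMAFROST_CARBON_GT = 1_672   # gigatons of carbon stored in permafrost (quoted)
ATMOSPHERE_CARBON_GT = 550     # gigatons of carbon in the atmosphere (quoted)

ratio = PERMAFROST_CARBON_GT / ATMOSPHERE_CARBON_GT
print(f"{ratio:.1f}x")  # -> 3.0x
```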

A three-year, $2.8 million grant from the U.S. Department of Energy will fund the collaborative investigation, to be undertaken by researchers from five universities on three continents. University of Arizona scientists are leading the team, which includes Florida State’s Chanton and research colleagues at the universities of New Hampshire, Stockholm (Sweden), and Queensland (Australia).

Chanton will receive a $300,000 share of the DOE grant and will also share in a larger portion of the award that will be used to purchase lasers and other field instruments for the entire team.

The study will periodically find Chanton north of the Arctic Circle, where he kicked off his research in August near Abisko, Sweden, amid the mosquitoes and black flies typical of the arctic summer there. While this is his first foray into Sweden’s remote arctic realms, Chanton is no stranger to permafrost research. His previous studies focused on Alaska and Siberia.

Video simulations of real earthquakes made available to worldwide network

The researchers have created videos of earthquakes that incorporate both real data and computer simulations known as synthetic seismograms. These simulations fill the gaps between the actual ground motion recorded at specific locations in the region, providing a more complete view of the earthquake. This video illustrates the January 2010 earthquake that devastated Haiti.

A Princeton University-led research team has developed the capability to produce realistic movies of earthquakes based on complex computer simulations that can be made available worldwide within hours of a disastrous upheaval.

The videos show waves of ground motion spreading out from an epicenter. In making them widely available, the team of computational seismologists and computer scientists aims to aid researchers working to improve understanding of earthquakes and develop better maps of the Earth’s interior.

“In our view, this could truly change seismic science,” said Princeton’s Jeroen Tromp, a professor of geosciences and applied and computational mathematics, who led the effort. “The better we understand what happens during earthquakes, the better prepared we can be. In addition, advances in understanding seismic waves can aid basic science efforts, helping us understand the underlying physics at work in the Earth’s interior. These visualizations, we believe, will add greatly to the research effort.”

In a scientific paper describing the system, which appeared online Sept. 16 and will be published in the October 2010 Geophysical Journal International, the team describes how it creates the videos. The movies will be made available for free to scientists and members of the public and news organizations interested in featuring such images on television and the Internet. The easily downloadable videos can be viewed at: http://global.shakemovie.princeton.edu. They tell the story in a language that is easy to understand, said Tromp, who also is the director of the Princeton Institute for Computational Science and Engineering (PICSciE).

When an earthquake takes place, data from seismograms measuring ground motion are collected by a worldwide network of more than 1,800 seismographic stations operated by members of the International Federation of Digital Seismograph Networks. The earthquake’s location, depth and intensity also are determined. The ShakeMovie system at Princeton will now collect these recordings automatically using the Internet.

The scientists will input the recorded data into a computer model that creates a “virtual earthquake.” The videos will incorporate both real data and computer simulations known as synthetic seismograms. These simulations fill the gaps between the actual ground motion recorded at specific locations in the region, providing a more complete view of the earthquake.

The animations rely on software that produces numerical simulations of seismic wave propagation in sedimentary basins. The software computes the motion of the Earth in three dimensions based on the actual earthquake recordings, as well as what is known about the subsurface structure of the region. The shape of underground geological structures in the area, which is not captured on seismograms, is key, Tromp said, as the structures can greatly affect wave motion by bending, speeding, slowing or simply reflecting energy. The simulations are created on a parallel processing computer cluster built and maintained by PICSciE and on a computer cluster located at the San Diego Supercomputer Center.

After the three-dimensional simulations are computed, the software program plugs in data capturing surface motion, including displacement, velocity and acceleration, and maps it onto the topography of the region around the earthquake. The movies then are automatically published via the ShakeMovie portal. An e-mail also is sent to subscribers, including researchers, news media and the public.

The simulations will be made available to scientists through the data management center of the Incorporated Research Institutions for Seismology (IRIS) in Seattle. The organization distributes global scientific data to the seismological community via the Internet. Scientists can visit the IRIS website and download information. Due to the research team’s work, they now will be able to compare seismograms directly with synthetic versions.

Advanced computing power made the synthetic seismograms possible, according to Dennis McRitchie, another author on the paper and a lead high-performance computing analyst for Princeton’s Office of Information Technology. “This is computationally intensive — it takes five hours to produce a 100-minute simulation,” McRitchie said. The effort to numerically solve the differential equations that govern how the waves propagate through these complicated earth models requires 384 computers operating in parallel to analyze and process the numbers.

When an earthquake occurs, seismic waves are generated that propagate away from the fault rupture and course along the Earth’s surface. The videos show the up-and-down motion of the waves in red (up) and blue (down). Strong red waves indicate rapid upward motion. Strong blue waves indicate the Earth’s surface is moving quickly downward. The simulation shows that the waves are of uneven strength in different areas, depending on the quality of the soil and the orientation of the earthquake fault. When the waves pass through soft, sedimentary soils, they slow down and gain strength. Waves speed up through hard rock, lessening the impact on surface areas above. A clock in the video shows the time since the earthquake occurred.

The ShakeMovie portal showing earthquakes around the world is similar to one maintained at the California Institute of Technology that routinely does simulations of seismic events in the Southern California region.

Earthquake movies will be available for download about 1.5 hours after the occurrence of a quake of magnitude 5.5 or greater.

Florida institutions to host Gulf of Mexico oil spill conference

The University of South Florida, Florida Institute of Oceanography, Mote Marine Laboratory, and the State of Florida Oil Spill Academic Task Force will host a major oil spill research conference, February 9-11, 2011, at the Hilton St. Petersburg Bayfront in St. Petersburg, Florida.

Oral or poster presentations are invited on substantial and original research on all aspects of the Gulf Oil Spill disaster and its impact. Abstract submission deadline is October 25, 2010. Abstracts may be submitted online at http://oilspill.usf.edu/.

The Deepwater Horizon oil spill will forever change the Gulf of Mexico, significantly affecting the citizens, environment, economy and policy of the region and beyond. As efforts are considered to mitigate effects of the spill, plans must also be made to prepare for a different Gulf of Mexico 5, 10 and 20 years out. This disaster is of global importance and demands new approaches and methods, as well as the shared experience and insight of those who have been engaged in such disasters worldwide (e.g., in Alaska, Brazil, India, Europe, Saudi Arabia, Mexico, Nigeria). One goal is to ensure that the tools and models are in place to deal with similar crises globally.

To deliberate these issues, the conference will bring together representatives from academia, government, NGOs and the private sector to inform each other about the state of research in relevant topical areas and to translate research into policy and management for predicting and adapting to a changed future, the extent of which is unknown.

Key topics will include:

  • Geotechnical Engineering
  • Regional Oceanography
  • Chemical Weathering – Biological Consumption
  • Dispersants
  • Ecological Consequences and Toxicity
  • Economic and Social Impacts
  • Human Health Issues
  • Stakeholders, Science and Policy

The conference co-chairs are Robert H. Weisberg, Distinguished University Professor of Physical Oceanography, University of South Florida; William T. Hogarth, Dean of the USF College of Marine Science and Acting Director of the Florida Institute of Oceanography; and Michael P. Crosby, Senior Vice President for Research, Mote Marine Laboratory.

Earth’s highest coastal mountain on the move

The Earth's highest coastal mountain, Colombia's Sierra Nevada de Santa Marta, moved north from Peru and rotated to form a new basin. -  Camilo Montes

The rocks of Colombia’s Sierra Nevada de Santa Marta, the highest coastal mountain on Earth, tell a fascinating tale: the mountain collides with and then separates from former supercontinents. Volcanoes are born and die. The mountain travels from Peru to northern Colombia and finally rotates in a clockwise direction to open up an entirely new geological basin. Smithsonian scientists were part of a four-year project to study Santa Marta’s geological evolution. Their findings are published in the October 2010 special issue of the Journal of South American Earth Sciences.

The study involved state-of-the-art geological, structural, paleomagnetic, geochemical and geochronological techniques applied by collaborators from universities and research institutions in several European countries and the Americas. “This integrated study represents a long-awaited contribution, particularly for the international scientific community working in the circum-Caribbean, and fills a notorious gap in the picture of the region’s geology,” said Agustin Cardona, postdoctoral fellow at the Smithsonian Tropical Research Institute.

The diverse rock record exposed in Santa Marta rests on an ancient foundation that is more than 1 billion years old. One of the studies links the foundation to other old massifs in the Americas. Using the ancient magnetic field recorded in these rocks, the Smithsonian research group revealed Santa Marta’s 2,200-kilometer journey from northern Peru to its modern position on the Caribbean coast of Colombia during the past 170 million years.

Sophisticated laboratory analyses of Santa Marta rock samples also offered scientists an explanation of their origin as remnants of extinct volcanoes and mountains that once existed but were later obliterated by powerful geologic forces.

Other studies revealed observations pertaining to recent dislocations along the Sierra’s bounding faults: evidence of historic earthquakes and of a large submarine canyon carved in the floor of the Caribbean Sea. “We hope that this contribution will serve as a catalyst to accelerate the pace of geological research along this margin of South America,” said German Ojeda, co-leader of the research team and geologist at Colombia’s Ecopetrol energy company. Sponsoring agencies included the geological and marine science research institutes of the Colombian government.

Tsunami detection improves, but coastal areas still vulnerable


The nation’s ability to detect and forecast tsunamis has improved since the 2004 Indian Ocean tsunami, but current efforts are still not sufficient to meet challenges posed by tsunamis generated near land that leave little time for warning, says a new congressionally requested report from the National Research Council. The report calls for a comprehensive national assessment of tsunami risk and improved communication and coordination among the two federal Tsunami Warning Centers, emergency managers, media, and the public.

“For a tsunami warning system to be effective, it must operate flawlessly, and emergency officials must coordinate seamlessly and communicate clearly,” said John Orcutt, chair of the committee that wrote the report and a professor at the Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics at Scripps Institution of Oceanography in La Jolla, Calif. “However, if a large earthquake near shore triggers a tsunami, it could reach the coast within minutes, allowing hardly any time to disseminate warnings and for the public to react. Education and preparation are necessary to ensure that people know how to recognize natural cues — such as earthquake tremors or receding of the water line — and take appropriate action, even if they do not receive an official warning.”

At the request of Congress for a review of the nation’s tsunami efforts, the report finds that many enhancements have been made since 2004, including an increase in the amount and quality of hazard and evacuation maps and the expansion of the Deep-Ocean Assessment and Reporting of Tsunamis (DART) sensor network that indicates the size of tsunamis. There have also been improvements in coastal sea-level stations and the Global Seismic Network operated and maintained by the U.S. Geological Survey and the National Science Foundation. Moreover, various states have evaluated select tsunami-prone communities and initiated several tsunami education and awareness efforts. However, improvements to the DART network’s reliability, station coverage, and operations are needed.

In addition to expanding detection and forecasting, an ideal and comprehensive tsunami program will require risk assessments, public education, and a well-coordinated response, the committee concluded. Persistent progress in these areas will be needed. To gauge how to prioritize efforts, the report recommends first completing a comprehensive risk assessment that characterizes the hazards, inventories the threatened populations and assets, measures the preparedness and ability of individuals and communities to successfully evacuate, and estimates expected losses. Tsunami education and preparation could also be improved by undertaking periodic and comprehensive vulnerability assessments, establishing a review strategy and leadership chain for post-tsunami events, and developing new tsunami detection and analysis techniques.

Improving Warning Messages and Coordination

The committee determined that the likelihood of individuals responding appropriately to tsunami warnings increases if they receive consistent, clear, and accurate warning messages from the two Tsunami Warning Centers (TWCs), stationed in Alaska and Hawaii, and local and state emergency managers. Operating under the National Oceanic and Atmospheric Administration’s National Weather Service, the TWCs monitor seismic activity and sea levels to detect tsunamis and warn emergency managers in their respective regional “areas of responsibility.” However, even though the TWCs issue the warning messages, they cannot order evacuations because they are part of the federal government. Rather, local officials are responsible for transmitting alerts throughout their respective jurisdictions, issuing evacuation orders, managing evacuations, and declaring all-clears. Therefore, close coordination between the TWCs, states, and local jurisdictions is needed to ensure that the public receives consistent information about the threat and proper protective action.

Although multiple, consistent tsunami warning messages increase public responsiveness, the organizational model of two TWCs is problematic because the two sets of warnings often conflict, causing confusion among the media, some local officials, and the public. For example, in June 2005, media outlets in the Pacific Northwest received messages from both TWCs that seemingly contradicted each other because their respective areas of responsibility were not understood. The committee recommended that the message content be improved or that the two TWCs release one message that includes information for all areas under their responsibility.

The TWCs are also designed to be backups for each other, but they do not operate as such, creating an illusion of redundancy that could prove dangerous and costly, the report says. Significant changes need to occur in the management, operations, software and hardware architecture, and organizational culture for the two to become functionally redundant. Additionally, the committee recommended that as part of the long-range planning efforts, the TWC organizational structure should be evaluated — including deciding whether multiple TWCs should issue a single message or whether a single, centrally managed center should be created, such as the National Hurricane Center.