Paired earthquakes separated in time and space


Earthquakes occurring at the edges of tectonic plates can trigger events at a distance and much later in time, according to a team of researchers reporting in today’s (Jan. 31) issue of Nature. These doublet earthquakes may hold an underestimated hazard, but may also shed light on earthquake dynamics.



“The last great outer rise earthquakes that occurred were in the 1930s and 1970s,” said Charles J. Ammon, associate professor of geoscience, Penn State. “We did not then have the equipment to record the details of those events.” The outer rise is the region seaward of the deep-sea trench that marks the top of the plate boundary.



In late 2006 and early 2007, two large earthquakes occurred near Japan, separated by about 60 days. Both took place along the Kuril Islands, which extend from the westernmost point of the Japanese island of Hokkaido to the southern tip of the Kamchatka Peninsula. The first event struck on Nov. 15, 2006, when the edge of the Pacific plate thrust under the arc of the Kuril Islands, initiating a magnitude 8.3 earthquake that caused some damage in Japan and generated a small tsunami that did minor damage in Crescent City, California. About 60 days later, on Jan. 13, 2007, a magnitude 8.1 earthquake occurred in “the upper portion of the Pacific plate, producing one of the largest recorded shallow extensional earthquakes.”



This second earthquake was not at a plate boundary and was not directly caused by subduction – the moving of one plate beneath the other. Rather, it was a normal faulting event, where the Pacific plate stretched, bent and broke.



While Japan and the Kamchatka Peninsula are active earthquake areas, the region of the Kuril Islands where the large November earthquake occurred had not had a large earthquake since 1915, and researchers are unsure of the exact nature of that event.



Working with Hiroo Kanamori, the John E. and Hazel S. Smits professor of geophysics, emeritus, California Institute of Technology, and Thorne Lay, professor of Earth & planetary sciences, University of California, Santa Cruz, the Penn State researcher looked at the sequence of seismic activity that links these two earthquakes into a doublet.



“Such large doublet earthquakes, though rare, could be an underestimated hazard,” says Ammon. “We are also interested in what these events tell us about how earthquakes interact, how the stresses and interactions allow one earthquake to trigger another.”


Looking at the seismic record, the researchers found a series of smaller foreshock earthquakes beginning about 45 days before Nov. 15. On Nov. 15 came the magnitude 8.3 earthquake on the plate boundary, the largest event of 2006.



“Within minutes of the Nov. 15 earthquake, seismic activity began on the Pacific plate in the area where the January earthquake would take place,” says Ammon. “This large second earthquake generated a larger amplitude of shaking in the frequency range that affects human-made structures than the first earthquake.”



Usually, aftershocks from a large earthquake are at least one order of magnitude less than the main event and taper off rapidly. In this case, the events within the Pacific plate east of the plate boundary did not taper off, and the second event that occurred in January was about the same size as the first earthquake.
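The usual taper the researchers describe is often modeled in seismology with the modified Omori law, under which the aftershock rate falls off roughly as a power of the time since the mainshock. A minimal sketch, with parameter values chosen purely for illustration (they are not from this study):

```python
# Modified Omori law: aftershock rate n(t) = K / (c + t)**p.
# K, c and p are empirical constants fit to each sequence; the
# values below are illustrative defaults, not from the Kuril study.
def omori_rate(t_days, K=100.0, c=0.1, p=1.1):
    """Expected number of aftershocks per day, t_days after the mainshock."""
    return K / (c + t_days) ** p

# Rates normally drop off quickly -- compare day 1 with day 30.
day1 = omori_rate(1.0)
day30 = omori_rate(30.0)
print(f"day 1: {day1:.1f}/day, day 30: {day30:.1f}/day")
```

The point of contrast with the Kuril sequence is that the activity east of the trench did not show this rapid decay before the January event.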



Earthquakes at plate boundaries in subduction zones occur when the plate that is going under – being subducted – gets temporarily stuck, and stress builds in the plate away from the edge. When the plate overcomes the friction holding it, it moves downward, slipping under the top plate and causing an earthquake. According to the researchers, the second earthquake on the Pacific plate happened because of the bending the plate experiences before it subducts beneath the upper plate. As the front edge of the plate slipped, the plate east of the November earthquake bent, cracked and broke in January.



Like a pie crust, the Earth’s crust develops small cracks when it bends – these were the small shocks that began immediately after the first earthquake – but when the bending becomes severe, a larger region of the crust breaks, creating the second, very large event.



In the United States, subduction zones exist only in the Pacific Northwest, Alaska and the area around Puerto Rico. The researchers note, “Triggering of a large outer rise rupture with strong high-frequency shaking constitutes an important potential seismic hazard that needs to be considered in other regions.”



The National Science Foundation and the U.S. Geological Survey funded this research.

Innovative Method Improves Tsunami Warning Systems, Offers New Insights





Using GPS data (purple arrows) to measure ground displacements, scientists replicated the December 2004 Indian Ocean tsunami, whose crests and troughs are shown here in reds and blues, respectively. The research showed GPS data can be used to reliably estimate a tsunami's destructive potential within minutes. (Credit: NASA/JPL)

A wave of new NASA research on tsunamis has yielded an innovative method to improve existing tsunami warning systems, and a potentially groundbreaking new theory on the source of the December 2004 Indian Ocean tsunami.



In one study, published last fall in Geophysical Research Letters, researcher Y. Tony Song of NASA’s Jet Propulsion Laboratory, Pasadena, Calif., demonstrated that real-time data from NASA’s network of global positioning system (GPS) stations can detect ground motions preceding tsunamis and reliably estimate a tsunami’s destructive potential within minutes, well before it reaches coastal areas. The method could lead to development of more reliable global tsunami warning systems, saving lives and reducing false alarms.



Conventional tsunami warning systems rely on estimates of an earthquake’s magnitude to determine whether a large tsunami will be generated. Earthquake magnitude is not always a reliable indicator of tsunami potential, however. The 2004 Indian Ocean quake generated a huge tsunami, while the 2005 Nias (Indonesia) quake did not, even though both had almost the same magnitude from initial estimates. Between 2005 and 2007, five false tsunami alarms were issued worldwide. Such alarms have negative societal and economic effects.



Song’s method estimates the energy an undersea earthquake transfers to the ocean to generate a tsunami by using data from coastal GPS stations near the epicenter. With these data, ocean floor displacements caused by the earthquake can be inferred. Tsunamis typically originate at undersea boundaries of tectonic plates near the edges of continents.



“Tsunamis can travel as fast as jet planes, so rapid assessment following quakes is vital to mitigate their hazard,” said Ichiro Fukumori, a JPL oceanographer not involved in the study. “Song and his colleagues have demonstrated that GPS technology can help improve both the speed and accuracy of such analyses.”


Song’s method works as follows: an earthquake’s epicenter is located using seismometer data. GPS displacement data from stations near the epicenter are then gathered to derive seafloor motions. Based upon these data, local topography data and new theoretical developments, a new “tsunami scale” measurement from one to 10 is generated, much like the Richter scale used for earthquakes. Song proposes using the scale to distinguish earthquakes capable of generating destructive tsunamis from those unlikely to do so.
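As a rough illustration of that pipeline, here is a toy sketch: seafloor displacements inferred from GPS are turned into a crude potential-energy proxy, which is then mapped onto a logarithmic one-to-10 scale. The energy formula, the anchor constants and the function name are all invented for illustration; this is not Song’s actual algorithm:

```python
import math

def tsunami_scale(displacements_m, rho=1025.0, g=9.81, area_m2=1.0e10):
    """Map seafloor displacements (metres) to a 1-10 scale via an energy proxy.

    A crude proxy: potential energy of the water column displaced over
    an assumed source area. All constants are illustrative.
    """
    energy_j = sum(0.5 * rho * g * d * d * area_m2 for d in displacements_m)
    # Logarithmic scale clipped to 1..10 (the offset 10.0 is a made-up anchor).
    scale = math.log10(max(energy_j, 1.0)) - 10.0
    return max(1, min(10, round(scale)))

small = tsunami_scale([0.5, 1.2, 0.8])   # modest displacements
large = tsunami_scale([5.0, 8.0, 12.0])  # large displacements
print(small, large)
```

The key design idea mirrored here is that the scale responds to the energy transferred to the ocean rather than to earthquake magnitude alone.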



To demonstrate his methodology on real earthquake-tsunamis, Song examined three historical tsunamis with well-documented ground motion measurements and tsunami observations: Alaska in 1964; the Indian Ocean in 2004; and Nias Island, Indonesia in 2005. His method successfully replicated all three. The data compared favorably with conventional seismic solutions that usually take hours or days to calculate.



Song said many coastal GPS stations are already in operation, measuring ground motions near earthquake faults in real time once every few seconds. “A coastal GPS network established and combined with the existing International GPS Service global sites could provide a more reliable global tsunami warning system than those available today,” he said.



The theory behind the GPS study was published in the December 20 issue of Ocean Modelling. Song and his team from JPL; the California Institute of Technology, Pasadena, Calif.; University of California, Santa Barbara; and Ohio State University, Columbus, Ohio, theorized most of the height and energy generated by the 2004 Indian Ocean tsunami resulted from horizontal, not vertical, faulting motions. The study uses a 3-D earthquake-tsunami model based on seismograph and GPS data to explain how the fault’s horizontal motions might be the major cause of the tsunami’s genesis.



Scientists have long believed tsunamis form from vertical deformation of seafloor during undersea earthquakes. However, seismograph and GPS data show such deformation from the 2004 Sumatra earthquake was too small to generate the powerful tsunami that ensued. Song’s team found horizontal forces were responsible for two-thirds of the tsunami’s height, as observed by three satellites (NASA’s Jason, the U.S. Navy’s Geosat Follow-on and the European Space Agency’s Environmental Satellite), and generated five times more energy than the earthquake’s vertical displacements. The horizontal forces also best explain the way the tsunami spread out across the Indian Ocean. The same mechanism was also found to explain the data observed from the 2005 Nias earthquake and tsunami.



Co-author C.K. Shum of Ohio State University said the study suggests horizontal faulting motions play a much more important role in tsunami generation than previously believed. “If this is found to be true for other tsunamis, we may have to revise some early views on how tsunamis are formed and where mega tsunamis are likely to happen in the future,” he said.

35-year glacier study reveals looming crisis





Glaciers in the Swiss Alps are disappearing at an alarming rate, forcing scientists to conclude that smaller glaciers will disappear altogether in the next decade

A 35-year University of Salford study of river flow from glaciers in the Swiss Alps has revealed that a lack of winter snow, as well as warming air temperatures, is causing glaciers to melt at an alarming rate – with some smaller glaciers likely to disappear in the next decade.



Professor David Collins from the University’s School of Environment & Life Sciences has been measuring river flow from Gornergletscher, near Zermatt, since 1974 – and the project now has the longest, most continuous and most detailed records of meltwater discharge and glacier river water quality in the world.



River flows have doubled since the 1970s, but are now starting to reduce. Declining flows will allow river temperatures to increase and dissolved oxygen levels to fall – changing the Alpine meltwater eco-system forever.


Professor Collins said: “The contribution of river flow from the loss of ice can’t go on forever. When these glaciers disappear completely there will be no meltwater at all and flow will be greatly decreased and will have changed nutrient characteristics.”



The ongoing experiment, which measures water flow, water temperature and solute chemistry, now runs with automatic data collection throughout the year. Professor Collins began his study as a PhD student back in 1974. Since then he has been taking undergraduate and postgraduate students from Salford and other universities on field trips every summer.



“The result of these changes to our eco-system will simply be disastrous for hydropower and for aquatic life,” warned Professor Collins.

For geoscientist Simons, Earth’s deepest secrets may come from the sea





The MERMAID is designed to hang between the surface and sea bottom at a depth of up to 1,500 meters, depending on where its opportunities for detecting earthquakes are thought to be optimal. Upon sensing a quake, it will surface, transmit its findings to a satellite, and return to its post. (Photo: Courtesy of Frederik Simons)

Princeton Earth scientist Frederik Simons believes the answers to questions about such unpredictable and destructive acts of nature as earthquakes and volcanoes might best be found floating in the ocean.



Despite hundreds of seismometers and geological studies, scientists still have an imperfect understanding of what happens deep within the planet where these phenomena begin. Simons has developed a custom-made, free-floating sensor that could provide a clearer picture of the Earth’s interior by using the additional perspective only the oceans can provide.



“We’d essentially like to take a CAT scan of the Earth,” said Simons, an assistant professor of geosciences. “But all we really have are land-based seismometers. That leaves more than 70 percent of the Earth’s surface uncovered — it’s kind of like putting your head in the CAT scanner with a metal plate blocking most of the sensors. You’re just not going to get a very good image of what’s inside.”



Land-based seismometers have been used for decades to record and measure the movements of the Earth’s surface, especially the minute vibrations produced by distant, deep earthquakes and other seismic events. As these waves pass through the planet, they are bent, reflected and refracted by the different types of material in the interior, similar to the way light rays are altered when they pass from air to water. Careful analysis of seismic waves has helped in determining the planet’s makeup, but seismometers’ limited coverage of the surface has provided only blurry, incomplete images.



Simons’ creation, called MERMAID — short for Mobile Earthquake Recorder in Marine Areas by Independent Divers — might improve science’s vision. The torpedo-shaped device is designed to dive up to a kilometer beneath the waves and hang between the surface and sea floor, listening for the slow vibrations that come from earthquakes across the planet. When it hears something interesting, it will surface, transmit its data to a satellite, then submerge again — up to 150 times before its batteries need a manual recharge.



“If we can perfect the MERMAIDs, we might be able to start filling in some of the blank spots on our picture of the planet’s interior,” Simons said. “Underwater seismometers that sit on the ocean bottom exist, but they are expensive and often unreliable. We think that with a bunch of floating hydrophones, we might be able to do better.”



Though they are not in direct contact with the Earth’s solid surface, hydrophones, which are also the listening devices used in sonar, can detect the low-frequency sound waves that emanate from the ocean floor. A group of hydrophones spread through the oceans would not only allow Simons to probe the many subterranean regions that remain unexplored, but also to improve the clarity with which his colleagues can “see” regions that are familiar.



Only recently has technology sufficiently evolved to build MERMAIDs that can accomplish these tasks, though the inspiration for them first stirred in another Princetonian’s mind a generation ago.


Scanning a deeper ocean



“When I was on sabbatical at the Scripps Institution of Oceanography in 1988, I had the idea to explore some of the data they acquired with underwater instruments,” said Guust Nolet, the George J. Magee Professor of Geoscience and Geological Engineering. “I did discover one weak earthquake signal, which was enough to raise my interest.”



Nolet, who has an abiding interest in exploring the Earth with seismic waves, thought it might be possible to build sea-based equipment for his research, but found that the instrumentation at the time was too expensive.



“The time wasn’t ripe yet,” said Nolet, who is retiring from Princeton this year. “But when Frederik came to Princeton a few years ago as a postdoctoral fellow, oceanographers had developed floats that were a lot more affordable, so I suggested to him that this might be something to explore.”



Nolet has praised Simons’ subsequent efforts to develop the MERMAIDs, which may help solve mysteries that have existed since Nolet’s career began.



The Earth’s oceans lie on top of the planet’s crust, a layer of relatively cool and solid rock that stretches down a few kilometers. Below the continents, the crust is a few dozen kilometers thick. This outer shell of the Earth is fragmented into about 50 irregularly shaped chunks called tectonic plates — all of which grind against one another as they float upon a far deeper, very different sort of ocean.



This “ocean” is not made of water at all, but of warm, malleable rock. Called the mantle, it is the thickest of the planet’s layers, stretching some 2,900 kilometers toward the Earth’s core, about as far as the distance from Manhattan to Salt Lake City. This murky region is not quite warm enough to liquefy completely, but the intense heat radiating from the planet’s core of molten iron gives the mantle an oozy consistency that geophysicists liken to Silly Putty.





The prototype MERMAID, about 1.5 meters in length, is shown in the laboratory. Inside the metal tube are communications and recording equipment as well as pumps for submerging and surfacing. Mounted on the side is a hydrophone capable of hearing the faint rumbles of earthquakes and other seismic events that ripple through the planet and up into the oceans from the sea floor. (Photo: Courtesy of Frederik Simons)

“The rock down there is flowing, but it’s not fast and hot like lava,” Simons said. “It only moves a few centimeters a year at most.”



The convective currents in this slow-motion ocean can take millions of years to push rock an appreciable distance, but they are responsible for the drift of continents across the planet and the rumbling of earthquakes as the plates collide. And while the mantle is not liquid through and through, rising “plumes” of hot buoyant material sometimes make it all the way up to the surface, melting in places where volcanoes then sprout as “hot spots” such as Hawaii.



Simple though plumes are to imagine, the uncertain details of their development have sprouted quite a volcanic debate among Earth scientists, including a contingent at Princeton. Do these plumes rise from all the way down, like elevator shafts from the Earth’s core, or might they develop at shallower depths? Are they essentially straight, or is their formation more complicated?



Seismometers help, but only sometimes. Waves from earthquakes are altered as they pass by a plume, offering clues to its nature — if only scientists can capture them.



“It’s easy to imagine magma penetrating the crust and creating a place like Iceland,” Simons said, referring to the island’s still-sputtering volcanoes. “But we still don’t know very well at all how deep Iceland’s plume goes. Take a few seismometers and plant them right atop the island, and you’ll only get a narrow view of the deep Earth. It’s a bit like closing one eye — you lose your depth perception.”



This is the irony of confining seismometers to small patches of dry land: They are meant to explore the mantle’s deeps, but are often limited in their sense of depth. Simons said that Earth scientists have argued passionately for years over whether there is a deep-seated mantle plume beneath Iceland, but do not have enough information to settle the debate — a problem that a few MERMAIDs might solve.



“With a dozen or so of these instruments, we could get a more nuanced sense of what’s happening,” Simons said. “We could gain the sort of three-dimensional perspective that a second eye provides, and finally complete the CAT scan of the Earth in all those difficult areas.”


Tuning their ears



Before members of Simons’ team can answer questions about Iceland, let alone the rest of the planet, they need reliable MERMAIDs to do their listening. At this point, his team is still refining its single prototype, which was built with the assistance of engineers and hydrophone specialists at Scripps, a part of the University of California-San Diego.



“We have deployed the MERMAID three times on shakedown missions off the San Diego coast,” said Jeff Babcock, a project scientist at Scripps who works on the hydrophone. “It’s an appealing project because if you put a single station at the bottom of the ocean, it would cost millions of dollars to install and maintain. If our team can surmount the technical challenges of the MERMAID, though, it should make for a much more viable solution.”



So far, the team’s challenges have proved numerous but surmountable: Their underwater listening device is a bit too good at hearing nearby earthquakes, and can get sidetracked by ship noise and the local marine life.



“The mechanism picks up nonseismological sounds, such as whale songs,” Simons said, “and local quakes drown out the sound of the faraway ones that we really need to image the Earth’s deep interior. It’s tough because we are trying to hear very faint sounds that have their energy below one hertz, about as low as you can go.”



Simons likens it to trying to hear a song being played in a dance club while standing in noisy traffic outside. “The thick walls are like the kilometers of Earth the more distant vibrations have to penetrate — only the deep bass notes can get through,” he said. “Noise from nearby earthquakes — and the whales as well — masks the low throb of a vibration coming in from across the planet. But we’re getting there. Fortunately, we know what songs we’re listening for, and digital signal processing techniques have become very good in the last decade or so.”
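The filtering problem Simons describes can be sketched with a toy example: a synthetic sub-one-hertz “distant quake” signal buried under louder high-frequency local noise, recovered with a simple frequency-domain low-pass. All of the numbers below are invented for illustration:

```python
import numpy as np

fs = 100.0                                  # samples per second
t = np.arange(0, 60, 1 / fs)                # one minute of synthetic "hydrophone" data
teleseism = np.sin(2 * np.pi * 0.2 * t)     # 0.2 Hz distant-quake signal
noise = 3.0 * np.sin(2 * np.pi * 15.0 * t)  # much louder 15 Hz local noise

# Zero out all spectral energy above 1 Hz, then transform back.
spectrum = np.fft.rfft(teleseism + noise)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spectrum[freqs > 1.0] = 0.0
recovered = np.fft.irfft(spectrum, n=t.size)

# The residual against the clean signal is tiny: the slow throb survives.
print(np.max(np.abs(recovered - teleseism)))
```

Real teleseismic detection uses far more sophisticated detectors than a brick-wall filter, but the principle is the same: the information of interest lives below one hertz.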



Simons said they aim to drop their prototype into the ocean and tweak its hydrophone a few more times to perfect its hearing. Once they have finished, the team’s goal is to obtain enough funding to build more. The entire Pacific Ocean, encircled by enough volcanoes and other geological activity to be labeled the “ring of fire,” awaits their exploration, not to mention other hot spots around the globe.



“The next project would be to take 20-odd MERMAIDs and go right to one of the blank spots on the map,” Simons said. “There are still a lot of them, and it would be great to start filling them in at last.”

Earthquake seer wins accolade from star gazers


An ANU seismologist whose work could help forecast the damage path of future earthquakes has been honoured by one of the world’s top scientific organisations.



The Royal Astronomical Society (RAS) in London has awarded its 2008 Gold Medal for Geophysics to Professor Brian Kennett, Director of the Research School of Earth Sciences at ANU. In the citation, the researcher is described as “one of the most complete seismologists of his generation”.



Professor Kennett works on determining the structure of the Earth using the waves generated by earthquakes and man-made sources. His reference models for the structure of the Earth have been adopted as standards for the location of earthquakes across the globe, and are widely used in imaging the planet’s interior.



“It’s an honour to be recognised for my work by the Royal Astronomical Society, but the best reward is learning more about the powerful forces at play within the Earth,” Professor Kennett said.


“Some of my recent work in cooperation with Japanese colleagues has shown how the ground motion generated by a large earthquake is affected by Earth’s structure.



“With three-dimensional computational models it’s now possible to simulate the effect of possible earthquakes to assess likely patterns of damage. For example, puzzling observations – such as strong ground motion on the east coast of Japan from earthquakes at 600 km depth almost beneath China – can be explained by guided waves trapped within the descending Pacific plate.”



Professor Kennett’s work combines theoretical, computational and observational approaches to gain comprehensive information about the three-dimensional structure within the Earth on scales from local, through regional, to global. He has worked on the techniques of seismic tomography, in which structure is reconstructed from the properties of the seismic waves passing through it, in much the same way as a CAT scan in medical work. “Such studies reveal the detailed properties of the subduction zones that generate great earthquakes, such as the 2004 Sumatran-Andaman event that produced the devastating tsunami across Southeast Asia,” Professor Kennett said.
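The tomography idea can be shown in miniature: a travel time is the line integral of slowness (the reciprocal of velocity) along a ray path, so with enough crossing rays the slowness of each cell of a grid can be recovered by least squares. The 2x2 grid of 10 km cells and the ray geometry below are invented for illustration:

```python
import numpy as np

# "True" slowness (s/km) in four cells: [top-left, top-right, bottom-left, bottom-right].
true_slowness = np.array([0.5, 0.4, 0.6, 0.45])

# Each row gives one ray's path length (km) through each cell.
G = np.array([
    [10.0, 10.0,  0.0,  0.0],   # ray crossing the top row
    [ 0.0,  0.0, 10.0, 10.0],   # ray crossing the bottom row
    [10.0,  0.0, 10.0,  0.0],   # ray crossing the left column
    [ 0.0, 10.0,  0.0, 10.0],   # ray crossing the right column
    [14.1,  0.0,  0.0, 14.1],   # diagonal ray through top-left and bottom-right
])
travel_times = G @ true_slowness  # the "observed" travel times (s)

# Inversion: recover the cell slownesses from the travel times.
recovered, *_ = np.linalg.lstsq(G, travel_times, rcond=None)
print(np.round(recovered, 3))
```

Without the diagonal ray the four row/column measurements are degenerate (their sums coincide), which is the toy analogue of why crossing ray coverage – and hence station coverage – matters so much in real tomography.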



He has also led a major program of work on the structure beneath the Australian region, using deployments of portable instruments that provide high-fidelity recordings of earthquakes. The analysis of the seismograms recorded by these instruments has revealed strong contrasts at depth beneath Australia that link to the ages of rocks exposed at the surface.

Man-Made Changes Bring About New Epoch in Earth’s History


Geologists from the University of Leicester propose that humankind has so altered the Earth that it has brought about an end to one epoch of Earth’s history and marked the start of a new epoch.



Jan Zalasiewicz and Mark Williams at the University of Leicester and their colleagues on the Stratigraphy Commission of the Geological Society of London have presented their research in the journal GSA Today.



In it, they suggest humans have so changed the Earth that the Holocene epoch has ended and we have entered a new epoch – the Anthropocene.


They have identified human impact through phenomena such as:



  • Transformed patterns of sediment erosion and deposition worldwide



  • Major disturbances to the carbon cycle and global temperature



  • Wholesale changes to the world’s plants and animals



  • Ocean acidification


The scientists analysed a proposal made by Nobel Prize-winning chemist Paul Crutzen in 2002. He suggested the Earth had left the Holocene and entered the Anthropocene epoch because of the global environmental effects of increased human population and economic development.



The researchers argue that the dominance of humans has so physically changed Earth that there is increasingly less justification for linking pre- and post-industrialized Earth within the same epoch – the Holocene.



The scientists said their findings present the scholarly groundwork for consideration by the International Commission on Stratigraphy for formal adoption of the Anthropocene as the youngest epoch of, and most recent addition to, the Earth’s geological timescale.



They state: “Sufficient evidence has emerged of stratigraphically significant change (both elapsed and imminent) for recognition of the Anthropocene – currently a vivid yet informal metaphor of global environmental change – as a new geological epoch to be considered for formalization by international discussion.”

Natural Gas Formation By Bacteria Linked To Climate Change And Renewable Energy


Natural gas reservoirs in Michigan’s Antrim Shale are providing new information about global warming and the Earth’s climate history, according to a recent study by Steven Petsch, a geoscientist at the University of Massachusetts Amherst. The study is also good news for energy companies hoping to make natural gas a renewable resource.



Petsch found that carbon-hungry bacteria trapped deep in the rock beneath ice sheets produced the gas during the ice age, as glaciers advanced and retreated over Michigan. “Bacteria digested the carbon in the rocks and made large amounts of natural gas in a relatively short time, tens of thousands of years instead of millions,” says Petsch. “This suggests that it may be possible to seed carbon-rich environments with bacteria to create natural gas reservoirs.”



The study also helps explain high levels of methane in the atmosphere that occurred between ice ages, a trend recorded in ice cores taken from Greenland and Antarctica. “When the ice sheets retreated, it was like uncapping a soda bottle,” says Petsch. “Natural gas, which is mostly methane, was released from the shale into the atmosphere.”



“This research can be used in current climate change models to account for the effects of melting glaciers,” says Petsch. “Climate scientists haven’t focused on the role that geologic sources of methane play in global warming.”


Petsch used the chemistry of water and rock samples from the shale, which sits like a bowl beneath northern Michigan, to recreate the past. For most of its history, the Antrim Shale contained water that was too salty to allow bacteria to grow. But areas rich in natural gas showed an influx of fresh water that was chemically different from modern rainfall. “This water, which is similar to meltwater from glaciers formed during the ice age, was injected into the rock by the pressure of the overlying ice sheets,” says Petsch.



Glacial meltwater diluted the salt water already present in the shale, allowing the bacteria to thrive and quickly digest available carbon. The natural gas they produced was chemically similar to the surrounding water and had a unique carbon chemistry that proved its bacterial origin. Petsch calculated that trillions of cubic feet of natural gas were eventually stored in the shale under pressure.



At least 75 percent of the gas was released into the atmosphere as the ice sheets retreated, adding to methane from other sources such as tropical wetlands. While methane from the Antrim Shale accounts for a small fraction of the rise in methane observed between ice ages, there are many natural gas deposits that were formed in the same geologic setting. The cumulative effect may have caused large emissions of methane to the atmosphere.



Klaus Nüsslein of the UMass Amherst microbiology department analyzed DNA from water samples and identified bacteria capable of breaking down hydrocarbons in the rock. Other microbes were present that produced methane from the breakdown products. Both of these groups can live without oxygen. Identifying and studying the needs of these microbes, which are capable of living deep in the Earth, is an important step in creating new natural gas reserves.



Results were published in the February 2008 issue of Geology. Additional members of the team include post-doctoral researcher Michael Formolo and undergraduate student Jeffrey Salacup of the University of Massachusetts Amherst and Anna Martini, a professor of geology at Amherst College. The research was funded by the National Science Foundation and the Research Partnership to Secure Energy for America.

Antarctic ice loss speeds up, nearly matches Greenland loss


Ice loss in Antarctica increased by 75 percent in the last 10 years due to a speed-up in the flow of its glaciers and is now nearly as great as that observed in Greenland, according to a new, comprehensive study by UC Irvine and NASA scientists.



In a first-of-its-kind study, an international team led by Eric Rignot, professor of Earth system science at UCI and a scientist with NASA’s Jet Propulsion Laboratory, Pasadena, Calif., estimated changes in Antarctica’s ice mass between 1996 and 2006 and mapped patterns of ice loss on a glacier-by-glacier basis. They detected a sharp jump in Antarctica’s ice loss, from enough ice to raise global sea level by 0.3 millimeters (0.01 inches) a year in 1996, to 0.5 millimeters (0.02 inches) a year in 2006.



Rignot said the losses, which were primarily concentrated in West Antarctica’s Pine Island Bay sector and the northern tip of the Antarctic Peninsula, are caused by ongoing and past acceleration of glaciers into the sea. This is mostly a result of warmer ocean waters, which bathe the buttressing floating sections of glaciers, causing them to thin or collapse. “Changes in Antarctic glacier flow are having a significant, if not dominant, impact on the mass balance of the Antarctic ice sheet,” he said.



Results of the study are published in the February issue of Nature Geoscience.



To infer the ice sheet’s mass, the team measured ice flowing out of Antarctica’s drainage basins over 85 percent of its coastline. They used 15 years of satellite radar data from the European Earth Remote Sensing-1 and -2, Canada’s Radarsat-1 and Japan’s Advanced Land Observing satellites to reveal the pattern of ice sheet motion toward the sea. These results were compared with estimates of snowfall accumulation in Antarctica’s interior derived from a regional atmospheric climate model spanning the past quarter century.


The team found that the net loss of ice mass from Antarctica increased from 112 (plus or minus 91) gigatonnes a year in 1996 to 196 (plus or minus 92) gigatonnes a year in 2006. A gigatonne is one billion metric tons, or more than 2.2 trillion pounds. These new results are about 20 percent higher over a comparable time frame than those of a NASA study of Antarctic mass balance released last March, which used data from the NASA/German Aerospace Center Gravity Recovery and Climate Experiment. The difference is within the margin of error of both techniques, each of which has its strengths and limitations.
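As a rough check on those figures, a mass-loss rate in gigatonnes per year can be converted to sea-level rise by spreading the equivalent water volume over the global ocean. The ocean area used below (about 3.61 x 10^8 square kilometers) is a standard textbook value, not a number from the study:

```python
# Back-of-envelope conversion from ice-mass loss to sea-level rise.
# Assumes a global ocean area of ~3.61e8 km^2 (standard value, not
# from the article itself).
OCEAN_AREA_KM2 = 3.61e8

def gigatonnes_to_mm_sea_level(gt_per_year):
    """Convert an ice-mass loss rate (Gt/yr) to sea-level rise (mm/yr).

    One gigatonne of water occupies ~1 km^3, so spreading it over the
    ocean surface gives a layer of thickness volume / area.
    """
    volume_km3 = gt_per_year            # 1 Gt of water ~ 1 km^3
    thickness_km = volume_km3 / OCEAN_AREA_KM2
    return thickness_km * 1e6           # km -> mm

print(round(gigatonnes_to_mm_sea_level(112), 2))  # ~0.31 mm/yr (1996)
print(round(gigatonnes_to_mm_sea_level(196), 2))  # ~0.54 mm/yr (2006)
```

The results match the 0.3 and 0.5 millimeters per year quoted earlier in the article.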



Rignot says the increased contribution of Antarctica to global sea level rise indicated by the study warrants closer monitoring.



“Our new results emphasize the vital importance of continuing to monitor Antarctica using a variety of remote sensing techniques to determine how this trend will continue and, in particular, of conducting more frequent and systematic surveys of changes in glacier flow using satellite radar interferometry,” Rignot said. “Large uncertainties remain in predicting Antarctica’s future contribution to sea level rise. Ice sheets are responding faster to climate warming than anticipated.”



Rignot said scientists are now observing these climate-driven changes over a significant fraction of the West Antarctic Ice Sheet, and the extent of the glacier ice losses is expected to keep rising in the years to come. “Even in East Antarctica, where we find ice mass to be in near balance, ice loss is detected in its potentially unstable marine sectors, warranting closer study,” he said.



Other organizations participating in the NASA-funded study are Centro de Estudios Cientificos, Valdivia, Chile; University of Bristol, United Kingdom; Institute for Marine and Atmospheric Research, Utrecht University, Utrecht, The Netherlands; University of Missouri, Columbia, Mo.; and the Royal Netherlands Meteorological Institute, De Bilt, The Netherlands.

Earth’s getting ‘soft’ in the middle





Earth cutaway from core to exosphere.

Since we can’t sample the deepest regions of the Earth, scientists measure the velocity of seismic waves as they travel through the planet to infer the composition and density of the material there. Now a new study suggests that material in part of the lower mantle has unusual electronic characteristics that make sound propagate more slowly, implying that the material is softer than previously thought. The results call into question the traditional techniques for understanding this region of the planet. The authors, including Alexander Goncharov of the Carnegie Institution’s Geophysical Laboratory, present their results in the January 25, 2008, issue of Science.



The lower mantle extends from about 400 to 1,800 miles (660-2,900 kilometers) below the surface and sits atop the outer core. Pressures and temperatures there are so brutal that materials take on forms that don’t exist in rocks at the planet’s surface and must be studied under carefully controlled conditions in the laboratory. The pressures range from 230,000 times the atmospheric pressure at sea level (23 GPa) to 1.35 million times sea-level pressure (135 GPa). The heat is equally extreme: from about 2,800 to 6,700 degrees Fahrenheit (1800 K to 4000 K).
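A quick sketch confirms the unit conversions quoted above; the constants used (1 atm = 101,325 Pa exactly, and the standard kelvin-to-Fahrenheit formula) are textbook values, not taken from the article:

```python
# Verify the pressure and temperature figures quoted for the lower mantle.
ATM_PA = 101_325.0  # one standard atmosphere in pascals (exact)

def gpa_to_atm(gpa):
    """Pressure in gigapascals expressed in standard atmospheres."""
    return gpa * 1e9 / ATM_PA

def kelvin_to_f(kelvin):
    """Temperature in kelvin expressed in degrees Fahrenheit."""
    return (kelvin - 273.15) * 9.0 / 5.0 + 32.0

print(f"{gpa_to_atm(23):,.0f}")     # ~226,992 atm, i.e. ~230,000x sea level
print(f"{gpa_to_atm(135):,.0f}")    # ~1,332,346 atm, i.e. ~1.35 million x
print(f"{kelvin_to_f(1800):,.0f}")  # ~2,780 degrees F
print(f"{kelvin_to_f(4000):,.0f}")  # ~6,740 degrees F
```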


Iron is abundant in the Earth and is a major component of the minerals ferropericlase and silicate perovskite in the lower mantle. In previous work, researchers found that the outermost electrons of iron in ferropericlase are forced to pair up under the extreme pressures, creating a so-called spin-transition zone within the lower mantle.



“What happens when unpaired electrons – called a high-spin state – are forced to pair up is that they transition to what is called a low-spin state. And when that happens, the conductivity, density, and chemical properties change,” explained Goncharov. “What’s most important for seismology is the acoustic properties – the propagation of sound. We determined the elasticity of ferropericlase through the pressure-induced high-spin to low-spin transition. We did this by measuring the velocity of acoustic waves propagating in different directions in a single crystal of the material and found that over an extended pressure range (from about 395,000 to 590,000 atmospheres) the material became ‘softer’ – that is, the waves slowed down more than expected from previous work. Thus, at high temperature the corresponding distributions will become very broad, which will result in a wide range of depths having subtly anomalous properties that perhaps extend through most of the lower mantle.”



The results suggest that scientists may have to go back to the drawing board to model this region of the Earth.

New radar satellite technique sheds light on ocean current dynamics





Radial surface velocity over the Amazon delta from ASAR image mode obtained on 25 January and 10 May 2006. - Credits: ESA - BOOST Technologies

Ocean surface currents have long been the focus of research due to the role they play in weather, climate and transportation of pollutants, yet essential aspects of these currents remain unknown.



By applying a new technique – based on the same principle as police speed-measuring radar guns – to satellite radar data, scientists can now obtain the information needed to better understand the strength and variability of surface current regimes and their relevance for climate change.



Scientists at the SeaSAR 2008 workshop, held this week at ESRIN, ESA’s European Centre for Earth Observation in Frascati, Italy, demonstrated how this new method, applied to data from the Advanced Synthetic Aperture Radar (ASAR) instrument aboard ESA’s Envisat, enables measurements of the speed of the moving ocean surface.



Synthetic Aperture Radar (SAR) instruments, such as ASAR, record microwave radar backscatter in order to identify roughness patterns, which are linked to varying surface winds, waves and currents of the ocean surface. However, interpreting radar images to identify and quantify surface currents had proven very difficult.



By using the new information embedded in the radar signal – the Doppler shift of the electromagnetic waves reflected from the water surface – Dr Bertrand Chapron of the French Research Institute for Exploitation of the Sea (IFREMER), Dr Johnny Johannessen of Norway’s Nansen Environmental and Remote Sensing Centre (NERSC) and Dr Fabrice Collard of France’s BOOST Technologies were able to determine how surface winds and currents contribute to the Doppler shift.



The Doppler shift occurs due to changing relative velocities, experienced in everyday life in the way the pitch of an ambulance siren rises as the vehicle approaches, then falls as it recedes.



The shift is introduced by the relative motion between the satellite platform, the rotation of the Earth and the velocity of the particular facets of the sea surface from which the SAR signal scatters back to orbit. The first two contributions are well known – particularly for Envisat, with its very stable orbit and attitude – and can simply be subtracted to extract the useful sea-surface velocity information.
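The scale of the remaining signal can be illustrated with a short sketch (not ESA’s actual processing chain). The two-way Doppler shift picked up by a radar echo from a scatterer moving along the line of sight is f_d = 2vf/c; ASAR’s C-band carrier near 5.331 GHz is a published figure, while the 1 m/s example current is just an assumed value for illustration:

```python
# Illustrative magnitude of the Doppler shift a moving sea surface
# imparts on a C-band radar echo. Two-way shift: f_d = 2 * v * f / c.
C = 299_792_458.0   # speed of light, m/s
F_RADAR = 5.331e9   # Envisat ASAR carrier frequency, Hz (C-band)

def doppler_shift_hz(radial_velocity_ms):
    """Two-way Doppler shift (Hz) for a scatterer moving at the given
    radial (line-of-sight) velocity in m/s."""
    return 2.0 * radial_velocity_ms * F_RADAR / C

# A 1 m/s line-of-sight surface current shifts the echo by only ~36 Hz,
# tiny next to the shifts from platform motion; hence the well-known
# orbit and Earth-rotation terms must be subtracted first.
print(round(doppler_shift_hz(1.0), 1))  # ~35.6 Hz
```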



Chapron first demonstrated the concept in 2005 with initial tests carried out over the Gulf Stream. Although the results were promising, repeat acquisitions and careful validation were not possible at the time. Based on these early results, ESA upgraded its ASAR ground segment in July 2007 to systematically process and disseminate a Doppler grid product – a regularly spaced collection of individual Doppler measurements – for all Wide Swath acquired images.


The Doppler grid, embedded in ESA standard products, is now regularly tested on a number of so-called super-sites, including regions of the Gulf Stream and the greater Agulhas Current, both among the strongest western boundary currents of the world’s oceans.



“These measurements are very useful for advancing the understanding of surface current dynamics and mesoscale variability, as well as for determining surface drift, important for oil dispersion and pollution transport and for wave-current interaction, probably influencing the existence of extreme waves,” Johannessen said.



“The method at this very high resolution could also complement the use of additional information sources to improve 3-D ocean models. Its use for sensor synergy with radiometry, spectrometry and altimetry is very promising,” Chapron added.



The ground segment upgrade is also allowing the scientists to examine the anticipated Doppler shift signal of the river outflow at the mouth of the Amazon delta to monitor river runoff and improve our understanding of hydrological processes.



Chapron and Collard also presented their Near Real Time global swell wave observations to the workshop, attended by 150 participants from 25 countries. Using standard processed SAR ESA wave mode products, the team produces three hourly animations every morning for the Atlantic, Pacific and Indian Oceans and makes them available online.



Tracking swell waves from space is very important because they are generally preceded by calm water, making it impossible to detect their arrival from shore. Envisat’s Wave Mode acquires 10 by 5 km small images, or ‘imagettes’, of the sea surface every 100 km along the satellite orbit. These small images, which depict the wave groups, are then mathematically transformed into wave energy and direction, called ocean-wave spectra.
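As a rough illustration of that last step – not ESA’s wave-mode processor – a two-dimensional Fourier transform turns the periodic crests in an imagette into peaks in wavenumber space, from which wavelength (and, with the peak’s orientation, direction) can be read off. The grid size, pixel spacing and 200 m synthetic swell below are all invented for the sketch:

```python
# Minimal sketch of imagette -> wave-spectrum processing using a
# synthetic sea-surface image in place of real SAR data.
import numpy as np

n, dx = 128, 25.0                 # 128 x 128 pixels, 25 m spacing (assumed)
x = np.arange(n) * dx
X, Y = np.meshgrid(x, x)
wavelength = 200.0                # metres, synthetic swell
k = 2 * np.pi / wavelength
image = np.sin(k * X)             # crests aligned with the y-axis

# Power spectrum: periodic crests become peaks in wavenumber space.
spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
kfreq = np.fft.fftshift(np.fft.fftfreq(n, d=dx))  # spatial freq, cycles/m

# Locate the spectral peak and recover the wavelength from it.
iy, ix = np.unravel_index(np.argmax(spectrum), spectrum.shape)
peak_k = abs(kfreq[ix])           # cycles per metre along x
print(round(1.0 / peak_k))        # recovered wavelength: 200 (m)
```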



ESA has provided SAR data to some 500 oceanography projects since 1998 and remains committed to providing continuity to its SAR missions. As part of its Global Monitoring for Environment and Security (GMES) programme, the agency will launch the Sentinels – the first series of operational satellites responding to the Earth Observation needs of GMES, a joint initiative of the European Commission and ESA.



Sentinel-1, expected to be launched in 2011, will ensure the continuity of C-band SAR data with ESA’s ERS-2 and Envisat satellites. Important marine applications driving the mission concept include vessel detection, oil spill mapping and sea ice mapping. With these new findings, Sentinel-1 is expected to provide additional information, such as consistent wind, wave and current products.