Arctic ice on the verge of another all-time low





Envisat ASAR mosaic from mid-August 2008 showing an almost ice-free Northwest Passage. The direct route through the Northwest Passage is highlighted in the picture by an orange line. The orange dotted line shows the indirect route, called the Amundsen Northwest Passage, which has been passable for almost a month. - Credits: ESA

Following last summer’s record minimum ice cover in the Arctic, current observations from ESA’s Envisat satellite suggest that the extent of polar sea-ice may again shrink to a level very close to that of last year.



Envisat observations from mid-August indicate that a new record low in sea-ice coverage could be reached in a matter of weeks. The animation above is a series of mosaics of the Arctic Ocean created from images acquired between early June and mid-August 2008 by the Advanced Synthetic Aperture Radar (ASAR) instrument aboard Envisat. Dark grey represents ice-free areas, while blue represents areas covered with sea ice.



Ice coverage in the Arctic has already shrunk to the second-lowest extent recorded since satellite observations began 30 years ago. Because ice cover usually reaches its minimum around mid-September, this year’s extent could still fall far enough to set another record low.



Each year, the Arctic Ocean experiences the formation and then melting of vast amounts of ice that floats on the sea surface. An area of ice the size of Europe melts away every summer, reaching a minimum in September. Since satellites began surveying the Arctic in 1978, the area covered by ice in summer has steadily decreased, with ice cover shrinking to its lowest level on record and opening up the most direct route through the Northwest Passage in September 2007.



The direct route through the Northwest Passage – highlighted in the image above by an orange line – is currently almost free of ice, while the indirect route, called the Amundsen Northwest Passage, has been passable for almost a month. This is the second year in a row that the most direct route through the Northwest Passage has opened up.



Prof. Heinrich Miller from the Alfred Wegener Institute (AWI) in Bremerhaven, Germany commented that, “Our ice-breaking research vessel ‘Polarstern’ is currently on a scientific mission in the Arctic Ocean. Departing from Iceland, the route has taken the ship through the Northwest Passage into the Canadian Basin where geophysical and geological studies will be carried out along profiles into the Makarov Basin to study the tectonic history and submarine geology of the central Arctic Ocean. In addition, oceanographic as well as biological studies will be carried out. Polarstern will circumnavigate the whole Arctic Ocean and exit through the Northeast Passage.”


Regarding the use of satellite data for polar research Miller continues, “The polar regions, especially the Arctic, are very sensitive indicators of climate change. The UN’s Intergovernmental Panel on Climate Change has shown that these regions are highly vulnerable to rising temperatures and predicted that the Arctic would be virtually ice-free in the summer months by 2070. Other scientists claim it could become ice-free as early as 2040. Latest satellite observations suggest that the Arctic could be mainly ice-free even earlier.”



Miller added, “At AWI we place particular emphasis on studying Arctic sea-ice. Along with in-situ studies of sea-ice thickness change, satellite data have been used extensively – not only for the regular observation of changes in the Arctic and Antarctic, but also for optimising the operation of Polarstern in regions covered by sea ice.”



The Arctic is one of the most inaccessible regions on Earth, so obtaining measurements of sea ice was difficult before the advent of satellites. For more than 20 years, ESA has been providing satellite data for the study of the cryosphere and hence revolutionising our understanding of the polar regions.



Radar instruments aboard satellites can acquire images through clouds and at night. This capability is especially important in areas prone to long periods of bad weather and extended darkness – conditions frequently encountered in the polar regions.



By making available a comprehensive dataset from its Earth Observation satellites and other ground- and air-based capabilities, ESA is currently also contributing to one of the most ambitious coordinated science programmes ever undertaken in the Arctic and Antarctic – the International Polar Year 2007-2008.



Further exploitation of data collected over the Arctic since 1991 is part of an ESA Initiative on Climate Change that will be proposed to the ESA Member States at its Ministerial Conference in November 2008. The proposal aims to ensure delivery of appropriate information on climate variables derived from satellites.



In 2009, ESA will make another significant contribution to research into the cryosphere with the launch of CryoSat-2. The observations made over the three-year lifetime of the mission will provide conclusive evidence on the rates at which ice thickness and cover are diminishing.

Crystals improve understanding of volcanic eruptions





Santorini, in Greece

Scientists have exploited crystals from lavas to unravel the records of volcanic eruptions.



The team, from Durham University and the University of Leeds, studied crystals formed in a volcano on Santorini, in Greece, to calculate the timescale between the triggering of volcanic activity and the volcano’s eruption.



They say the technique can be applied to other volcanoes – such as Vesuvius, near Naples, in Italy – and will help inform the decisions of civil defence agencies.



Worldwide, it is estimated that between 50 and 70 volcanoes erupt each year, but due to the long gaps between eruptions at most volcanoes it is hard to understand how any individual volcano behaves. This work allows scientists to better understand this behaviour.



The research, funded by the Natural Environment Research Council (NERC), is published this week in the prestigious scientific journal Science.



The scientists looked at crystals from the 1925-28 eruption of Nea Kameni, in Santorini.



Lead author Dr Victoria Martin, of Durham University, showed that the crystal rims reacted with molten rock, or magma, as it moved into the volcano’s shallow chamber prior to eruption. This process is thought to be associated with shallow level earthquake activity, as shown by modern volcano monitoring.



By studying the area between the crystal core and the rim the team then worked out how long the rims had existed – revealing how long the magma was in the shallow chamber before it erupted.


The crystals showed the 1925-28 eruption at Nea Kameni took place three to ten weeks after the magma entered the shallow system.



As magma movement typically causes seismic activity, if any future seismic or inflation activity at Nea Kameni can be linked to magma recharge of the volcano, the scientists predict an eruption could follow within a similar timescale.



They hope this method can be applied to other volcanoes, allowing the pre-eruption behaviour to be better understood – and understanding of volcanoes to be extended back further in time.



Professor Jon Davidson, Chair of Earth Sciences at Durham University, said: “We hope that what we find in the crystals in terms of timescales can be linked with phenomena such as earthquakes.



“If we can relate the timescales we measure to such events we may be able to say when we could expect a volcano to erupt.



“This is an exciting new method that will help us understand the timescales of fundamental volcanic processes driving eruptions.”



Co-author Dr Dan Morgan, from the School of Earth and Environment, at the University of Leeds, said: “We hope to develop these techniques further and apply them to more volcanoes worldwide.



“Potentially, these techniques could extend our knowledge of volcanic recharge considerably, as they can be applied to material erupted before volcanic monitoring was commonplace.”

Yellowstone’s Ancient Supervolcano: Only Lukewarm?





Yellowstone National Park and its famous geysers are the remnants of an ancient supervolcano. - Credit: U.S. Geological Survey

Molten plume of material beneath Yellowstone cooler than expected



The geysers of Yellowstone National Park owe their existence to the “Yellowstone hotspot” – a region of molten rock buried deep beneath Yellowstone, geologists have found.



But how hot is this “hotspot,” and what’s causing it?



In an effort to find out, Derek Schutt of Colorado State University and Ken Dueker of the University of Wyoming took the hotspot’s temperature.



The scientists published the results of their research, funded by the National Science Foundation (NSF)’s Division of Earth Sciences, in the August 2008 issue of the journal Geology.



“Yellowstone is located atop one of the few large volcanic hotspots on Earth,” said Schutt. “But though the hot material is a volcanic plume, it’s cooler than others of its kind, such as one in Hawaii.”



When a supervolcano last erupted at this spot more than 600,000 years ago, it covered half of today’s United States with volcanic ash. Details of the cause of the Yellowstone supervolcano’s periodic eruptions through history are still unknown.


Thanks to new seismometers in the Yellowstone area, however, scientists are obtaining new data on the hotspot.



Past research found that in rocks far beneath southern Idaho and northwestern Wyoming, seismic energy from distant earthquakes slows down considerably.



Using the recently deployed seismometers, Schutt and Dueker modeled the effects of temperature and other processes that affect the speed at which seismic energy travels. They then used these models to make an estimate of the Yellowstone hotspot’s temperature.
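The models Schutt and Dueker used account for much more than temperature, but the core idea of reading a temperature anomaly out of a seismic slowdown can be caricatured with a linearized sensitivity. The sketch below assumes an illustrative d(ln Vs)/dT value and reference velocity; neither number is from the study:

```python
# Toy inversion: convert a shear-wave slowdown into a temperature
# anomaly via an assumed linear sensitivity (illustrative value only;
# real models also handle composition, anelasticity and partial melt).
DLNVS_DT = -1.0e-4   # assumed d(ln Vs)/dT in 1/K

def temperature_anomaly(vs_observed, vs_reference):
    """Temperature excess (K) implied by a fractional Vs reduction."""
    dlnvs = (vs_observed - vs_reference) / vs_reference
    return dlnvs / DLNVS_DT

# A 1.5% slowdown relative to an assumed 4.5 km/s reference mantle
print(temperature_anomaly(4.5 * 0.985, 4.5))  # ~150 K hotter
```

With this toy sensitivity, a slowdown of a few percent maps to the 50-200 degree Celsius range the study reports, which is why a "lukewarm" plume still produces a clear seismic signature.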



They found that the hotspot is “only” 50 to 200 degrees Celsius hotter than its surroundings.



“Although Yellowstone sits above a plume of hot material coming up from deep within the Earth, it’s a remarkably ‘lukewarm’ plume,” said Schutt, comparing Yellowstone to other plumes.



Although the Yellowstone volcano’s continued existence is likely due to the upwelling of this hot plume, the plume may have become disconnected from its heat source in Earth’s core.



“Disconnected, however, does not mean extinct,” said Schutt. “It would be a mistake to write off Yellowstone as a ‘dead’ volcano. A hot plume, even a slightly cooler one, is still hot.”

Why is Greenland covered in ice?





Computer models show that while (tectonic) uplift of the Rocky Mountains may have contributed to increased ice cover on Greenland, this change was small in comparison with the ice sheet caused by a decrease in carbon dioxide. - Credit: Dan Lunt, University of Bristol

Only changes in carbon dioxide levels are able to explain the transition from the mostly ice-free Greenland of three million years ago, to the ice-covered Greenland of today



There have been many reports in the media about the effects of global warming on the Greenland ice-sheet, but there is still great uncertainty as to why there is an ice-sheet there at all.



Reporting today (28 August) in the journal Nature, scientists at the University of Bristol and the University of Leeds show that only changes in atmospheric carbon dioxide are able to explain the transition from the mostly ice-free Greenland of three million years ago, to the ice-covered Greenland of today.



Understanding why the ice formed on Greenland three million years ago will help understand the possible response of the ice sheet to future climate change.



Dr Dan Lunt from the University of Bristol and funded by the British Antarctic Survey, explained: “Evidence shows that around three million years ago there was an increase in the amount of rock and debris deposited on the ocean floor around Greenland. These rocks could not have got there until icebergs started to form and could transport them, indicating that large amounts of ice on Greenland only began to form about three million years ago.


“Prior to that, Greenland was largely ice-free and probably covered in grass and forest. Furthermore, atmospheric carbon dioxide levels were relatively high. So the question we wanted to answer was why did Greenland become covered in an ice-sheet?”



There are several competing theories, ranging from changes in ocean circulation and the tectonic uplift of the Rocky Mountains to changes in the Earth’s orbit and natural changes in atmospheric greenhouse gas concentrations. Using state-of-the-art computer climate and ice-sheet models, Lunt and colleagues set out to test which, if any, of these theories was the most credible.



While the results suggest that climatic shifts associated with changes in ocean circulation and tectonic uplift did affect the amount of ice cover, and that the ice waxed and waned with changes in the Earth’s orbit, none of these changes were large enough to contribute significantly to the long-term growth of the Greenland ice sheet.



Instead, the new research suggests that the dominant cause of the Greenland glaciation was the fall from high atmospheric carbon dioxide levels to levels closer to that of pre-industrial times. Today concentrations are approaching the levels that existed while Greenland was mostly ice-free.



Dr Alan Haywood from the University of Leeds added: “So why did elevated atmospheric carbon dioxide concentrations fall to levels similar to the pre-industrial era? That is the million dollar question which researchers will no doubt be trying to answer during the next few years.”

New Analysis of Earthquake Zone off Oregon Coast Raises Questions


Oregon State University scientists have completed a new analysis of an earthquake fault that extends some 200 miles off the southern and central Oregon coast and say it is more active than the San Andreas Fault in California.



The Blanco Transform Fault Zone likely won’t produce the huge earthquake many have predicted for the Pacific Northwest because it isn’t a subduction zone fault. But the scientists say an earthquake of magnitude 6.5 to 7.0 is possible, if not probable, in the near future, and their analysis suggests that the region may be under some tectonic stress that could potentially affect the Cascadia Subduction Zone.



Results of the study were just published in the Journal of Geophysical Research.



During the past 40 years, there have been some 1,500 earthquakes of magnitude 4.0 or greater along the Blanco Transform Fault Zone, and many thousands of smaller quakes. The Blanco fault is the boundary between the Juan de Fuca and the Pacific plates. As the Juan de Fuca plate moves to the east, it is subducted beneath the North American plate at the rate of about 1.5 inches per year. But as it moves, it must break free of the adjacent Pacific plate.



This slippage causes the numerous earthquakes, according to John Nabelek, an associate professor in OSU’s College of Oceanic and Atmospheric Sciences and one of the authors of the study. When the earthquakes that relieve stress do not account for predicted motion rates, he added, it raises questions.



“The eastern portion of the fault has moved at a predictable rate and the earthquake activity associated with it has been what we would expect,” Nabelek said. “But the western part of the fault has been lagging in terms of the number and size of earthquakes. It seems to be straining, absorbing the motion.



“It could mean that the fault is getting ready for a large earthquake, or it could mean that the movement has been so gradual that we couldn’t detect it,” he added.



The OSU study is important because the Blanco Transform Fault has become the most intensely studied ocean transform fault in the world. Its close proximity to the Oregon coastline puts it within reach of land-based seismographs that can detect moderate ocean earthquakes. Another key is the research done at OSU’s Hatfield Marine Science Center, where marine geologist Bob Dziak monitors undersea seismic activity using a hydrophone system deployed by the U.S. Navy.


In April of this year, Dziak reported on a swarm of 600 earthquakes in 10 days in this region, including magnitude 5.4 and 5.0 events.



“Land stations also detected a four-fold increase in the number of earthquakes along the Blanco fault in 2008 compared to background rates,” Nabelek said, “with the largest anomaly in the enigmatic western part.”



Jochen Braunmiller, a research associate in OSU’s College of Oceanic and Atmospheric Sciences and lead author on the paper, says land-based seismographs can detect earthquakes of 4.0 or greater along the Blanco fault, and the ocean hydrophones monitored by Dziak can pick up quakes down to a magnitude of 3.0 and sometimes smaller, depending on location.



“Our monitoring may be missing a lot of earthquakes that are less than 3.0,” Braunmiller said. “The western side of the fault may be experiencing a series of mini-quakes that we can’t detect, or it could be slowly creeping along in a way we cannot measure.



“But we can’t discount the possibility that its energy hasn’t been released and it will some day in the form of a good-sized earthquake,” Braunmiller added.



The risk of a major tsunami from an earthquake in this transform fault is slim, the scientists point out, because the plates move sideways past each other. “You need quite a bit of vertical displacement on the ocean floor to generate a tsunami,” Braunmiller said, “and earthquakes along the Blanco fault don’t generate it.”



The Blanco Transform Fault Zone begins at a point about 100 miles off of Cape Blanco, south of Bandon, Ore., and extends in a northwest direction to a point about 300 miles off of Newport. Of all the world’s ocean transform faults – or those that lie between tectonic plates – it is the closest to shore and can be monitored more readily by land-based seismographs.



Northwest scientists have approximately 60 such land-based seismographs deployed from British Columbia to California that can pick up moderate offshore quakes.



“Between the land-based network, the hydrophones and other instruments, the threshold of detection for earthquakes has definitely lowered over the past 20 years,” Nabelek said. “But we still can’t tell whether the western part of the fault has thousands, or even millions of infinitesimal slips – or it is building up to a major earthquake.”

Researcher Uncovering Secrets of Ancient Eruptions and the Atmosphere


New Mexico Tech researcher Nelia Dunbar is uncovering ancient secrets about global climate by studying the chemical composition of volcanic ash in the West Antarctic Ice Sheet.



Dunbar, a geochemist and lab director at New Mexico Tech, presented her findings at an Antarctic Earth Science meeting in Santa Barbara in August 2007, and again at a Geochemical Society meeting in Vancouver in early July 2008. Along with fellow researchers at New Mexico Tech and the University of Maine, Dunbar is using her laboratory sleuthing skills to find the record of volcanic eruptions in ice that preserves a record of fluctuations in global temperatures as far back as 100,000 years ago.



From November 2007 to January 2008 – that’s summer in Antarctica – an army of scientists, engineers, technicians and students extracted a cylinder of ice from the 11,000-foot-thick ice sheet. The ice core is like a living record of precipitation, volcanic eruptions, greenhouse gases and other naturally occurring atmospheric particles.



Dunbar’s specialty is volcanology. She is using her knowledge of volcanic activity and processes and her laboratory acumen to help the nation’s community of geologists and geophysicists deduce a broad picture of Earth’s climatic movements over the eons.



“We’re looking at specific layers of dust from specific volcanic eruptions,” Dunbar said. “This ice core contains a frozen record of the earth’s atmospheric history.”



The ice core is expected to be the first section of an 11,000-foot column of ice detailing 100,000 years of Earth’s climate history. So far, the first core of 1,800 feet has been drilled and transported to the National Ice Core facility in Denver.



The top section of ice can be visually broken down to year-by-year layers going back about 40,000 years. Below the first few hundred feet, however, the ice is too compressed to visually count the layers. At that point, other methods must be used to determine the age of the ice. One of the methods is to identify volcanic ash layers using an electron microprobe to identify the chemical profile of ash.



One main reason for drilling the ice core is to examine the history of atmospheric carbon-dioxide content and global temperatures over the last 100,000 years.



“The ice core contains a beautiful, detailed climate record,” Dunbar said. “But we need to know when changes in the climate happened. The volcanic record helps us to understand the chronology of the core.”



New Mexico Tech researchers Matt Heizler and Bill McIntosh have built a state-of-the-art Argon Geochronology Lab, and one of the lab’s specialties is dating volcanic rocks by vaporizing them with a laser beam.



When volcanic rocks are formed, many contain crystals that are made, in part, of the element potassium. Over time, the potassium decays to an isotope of argon at a known rate. By vaporizing the crystals and releasing the argon, McIntosh can analyze the argon to determine a rock’s age. From a similar sample, Dunbar can determine the chemical composition of rocks associated with specific eruptions because each volcanic eruption has its own chemical signature.
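The decay relationship McIntosh exploits can be written down directly: the longer a crystal has existed, the more of its potassium-40 has decayed to argon-40. The sketch below uses the standard 40K decay constants; it is a simplified illustration of the potassium-argon age equation, not the laboratory procedure actually used:

```python
import math

# Standard 40K decay constants (conventional values, in 1/yr)
LAMBDA_TOTAL = 5.543e-10   # total decay constant of 40K
LAMBDA_EC = 0.581e-10      # electron-capture branch (40K -> 40Ar)

def k_ar_age(ar40_star_over_k40):
    """Age in years implied by a measured radiogenic 40Ar*/40K ratio."""
    return (1.0 / LAMBDA_TOTAL) * math.log(
        1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_star_over_k40
    )

def expected_ratio(age_years):
    """Inverse: the 40Ar*/40K ratio a sample of a given age should show."""
    return (LAMBDA_EC / LAMBDA_TOTAL) * (math.exp(LAMBDA_TOTAL * age_years) - 1.0)

# Round trip for a hypothetical 100,000-year-old eruption
r = expected_ratio(100_000)
print(k_ar_age(r))  # recovers ~100,000 years
```

Because the decay is so slow, even a 100,000-year-old crystal contains only a minute trace of radiogenic argon, which is why the measurement requires vaporizing the crystal and analyzing the released gas.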



The new WAIS ice core contains ash particles that are far too small and sparse to be dated in the Argon Lab. However, the chemical composition of the ash particles can be measured and matched to ash samples collected near the source volcanoes, where the deposits are thick and can be dated in the New Mexico Tech Argon Lab. Only Dunbar’s lab sleuthing can truly determine the chemical composition – and, hence, the age – of the deeply buried layers of volcanic ash in the ice. From ash pieces as small as 10 microns (1/100th of a millimeter) found in the ice core, she can determine the chemical composition of the ash, then associate that composition with a known, dated, volcanic eruption in West Antarctica.



“It’s a puzzle and that’s why a lot of science is intriguing,” Dunbar said. “We gather information, put it together and figure out something that you can’t observe directly. By putting together the pieces of the puzzle that we find in this ice core, we can learn about ancient occurrences.”



Dunbar said the West Antarctic Ice Sheet Divide is one of the best spots on the planet to recover ancient ice containing trapped air bubbles – samples of the Earth’s atmosphere as old as 100,000 years.



Finding good places to sample volcanic ash near a source volcano in West Antarctica can be challenging, because of the thick ice cover over most of the area. However, at certain locations on the ice sheet, ancient deposits of volcanic ash have been lifted to the surface by natural flows. In the summit crater of an extinct West Antarctic volcano, Mount Moulton, Dunbar and colleagues located and sampled a section of such ice.


“It’s the Rosetta Stone of volcanic ash,” she said. “We found a blue ice field on Mount Moulton where the ice has captured evidence of 40 separate and distinct eruptions of a nearby active volcano, Mount Berlin.”



Using the New Mexico Tech argon lab, McIntosh has dated a number of well-preserved crystals.



Scientists at the University of Maine are slowly, painstakingly dismantling and analyzing the ice core. When they find a layer of ash, they preserve a piece for Dunbar.



“This ice is very precious,” Dunbar said. “They can’t give me a huge section. They take a little portion of ice from that horizon of the core, melt it, filter it and send me the filter paper which has trapped the volcanic ash.”



Dunbar can then remove the ash and place it in an epoxy disk about 1 inch across. She then polishes the disk, which can be placed in the electron microprobe.



That device works like a microscope, but uses electrons instead of light. A microprobe can determine the chemical composition of a 1 micron spot on a sample – a feat no other instrument can do. The electron microprobe also can show spatial resolution of chemical variability on a sample surface.



In 1996 New Mexico Tech became the first university in the nation to install an electron microprobe. The $400,000 instrument – which costs about $750,000 today – is now a staple of research universities. The device functions as both a microscope and a mass spectrometer, providing an image of tiny objects and determining the objects’ chemical composition.



Dunbar needs at least six ash fragments from one layer to get a good chemical fingerprint. She prefers to analyze up to 30 fragments to minimize the possibility of natural contamination.



“We want a good, homogeneous population of ash to look at the chemical composition,” Dunbar said. “If you find a single, random ash shard from South America, you would get the wrong impression. This shard might have been blown in by wind, rather than being brought by the volcanic eruption. Some eruptions have a very simple chemical composition, while others have a range. We can look at a chemical analysis and say it’s homogenous or we might find a sample that’s chemically variable.”
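The homogeneity screen Dunbar describes can be illustrated with a toy single-oxide version: flag any shard whose composition sits far from the rest of the population. A real screen would compare many oxides at once, and both the cutoff and the data below are invented for illustration:

```python
def flag_outlier_shards(sio2_wt_pct, z_cut=2.0):
    """Return indices of shards whose SiO2 content deviates strongly
    from the population mean -- a crude stand-in for a full
    multi-oxide chemical-fingerprint comparison."""
    n = len(sio2_wt_pct)
    mean = sum(sio2_wt_pct) / n
    std = (sum((v - mean) ** 2 for v in sio2_wt_pct) / n) ** 0.5
    return [i for i, v in enumerate(sio2_wt_pct)
            if std > 0 and abs(v - mean) / std > z_cut]

# Nine shards clustered near 74 wt% SiO2, plus one wind-blown stray at 58
shards = [73.8, 74.1, 74.0, 73.9, 74.2, 74.0, 73.7, 74.1, 73.9, 58.0]
print(flag_outlier_shards(shards))  # [9] -- the stray shard's index
```

This is also why Dunbar prefers up to 30 fragments: with a larger population, a single wind-blown shard stands out clearly instead of dragging the whole fingerprint off target.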



Dunbar is using her laboratory wizardry to help the scientific community expand its understanding of the past.



Scientists from more than 30 universities and government agencies around the nation are studying different aspects of the WAIS ice core – natural dust, biological materials, carbon-dioxide levels, methane levels and many other aspects.



“Each investigator offers information on a different aspect of the core,” Dunbar said. “You put it all together and you get a very complete picture of what Earth’s climate was doing. … No one person can do all that. That’s why it’s so multi-faceted.”



Together, they are putting together a large puzzle. In October, they will all convene in Denver to share their preliminary findings at the West Antarctic Ice Sheet Divide Ice Project.



“Scientists get together and share papers. There’s a huge amount of discussion. It’s exciting because we’re presenting data that is hot off the presses,” she said.



While other ice cores from Greenland have been used to develop longer records of Earth’s atmosphere, the record from Dunbar’s expedition will allow a more detailed study of the interaction of previous increases in greenhouse gases and climate change. Eventually, this information will improve computer models that are used to predict how the current high levels of greenhouse gases in the atmosphere caused by human activity might influence future climate, she said.

Drier, Warmer Springs in U.S. Southwest Stem From Human-Caused Changes in Winds





The late-winter/early-spring storm activity in the western U.S. has shifted north since the late 1970s. This graphic shows how the peak winter storm tracks have shifted poleward since 1978. The blue line shows the storm track for February, March and April of 1978. The red line shows the track for the same months during the year 1997. (Credit: Stephanie McAfee, The University of Arizona, 2008)

Human-driven changes in the westerly winds are bringing hotter and drier springs to the American Southwest, according to new research from The University of Arizona.



Since the 1970s the winter storm track in the western U.S. has been shifting north, particularly in the late winter. As a result, fewer winter storms bring rain and snow to Southern California, Arizona, Nevada, Utah, western Colorado and western New Mexico.



“We used to have this season from October to April where we had a chance for a storm,” said Stephanie A. McAfee. “Now it’s from October to March.”



The finding is the first to link the poleward movement of the westerly winds to the changes observed in the West’s winter storm pattern. The change in the westerlies is driven by the atmospheric effects of global warming and the ozone hole combined.



“When you pull the storm track north, it takes the storms with it,” said McAfee, a doctoral candidate in the UA’s department of geosciences.



“During the period it’s raining less, it also tends to be warmer than it used to be,” McAfee said. “We’re starting to see the impacts of climate change in the late winter and early spring, particularly in the Southwest. It’s a season-specific kind of drought.”



Having drier, warmer conditions occur earlier in the year will affect snowpack, hydrological processes and water resources, McAfee said.



Other researchers, including the UA’s Laboratory of Tree-Ring Research Director Tom Swetnam, have linked warmer, drier springs to more and larger forest fires.



McAfee’s co-author Joellen L. Russell said, “We’re used to thinking about climate change as happening sometime in the future to someone else, but this is right here and affects us now. The future is here.”


McAfee and Russell, a UA assistant professor of geosciences, will publish their paper, “Northern Annular Mode Impact on Spring Climate in the Western United States,” in Geophysical Research Letters, a journal of the American Geophysical Union.



The National Oceanic and Atmospheric Administration funded the research via the Climate Assessment for the Southwest program at the UA.



Atmospheric scientists have documented that the westerly winds, or storm track, have been shifting poleward for several decades. The southwestern U.S. has experienced less winter precipitation during the same period.



Computer models of future climate and atmospheric conditions suggest the storm track will continue to move north and that precipitation will continue to decrease in the southwestern U.S.



The timing of the change from wet, cool winter weather to the warmer dry season is important for many ecological processes in the arid Southwest. Therefore, McAfee wanted to know how the shift in the storm track affected precipitation during the transition from winter to spring.



For the period 1978 to 1998, the researchers compared the month-to-month position of the winter storm track, temperature and precipitation records from the western U.S., and pressure at different levels in the atmosphere.



The team used a statistical method called Monte Carlo simulations to test whether the coincidence of storm track and weather patterns had occurred by chance. Russell said the results of the simulation showed, “It’s very rare that you get this distribution by chance.”
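The logic of such a test can be sketched in a few lines: shuffle one record many times and count how often a chance pairing matches the observed association as strongly as the real data do. This is an illustrative permutation-style version with invented toy data, not the authors’ exact procedure:

```python
import random

def monte_carlo_pvalue(x, y, trials=10_000, seed=0):
    """Estimate how often a random re-pairing of two series correlates
    at least as strongly as the observed pairing (a permutation test)."""
    rng = random.Random(seed)

    def corr(a, b):
        # Pearson correlation coefficient, computed from scratch
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
        sa = sum((ai - ma) ** 2 for ai in a) ** 0.5
        sb = sum((bi - mb) ** 2 for bi in b) ** 0.5
        return cov / (sa * sb)

    observed = abs(corr(x, y))
    hits = 0
    for _ in range(trials):
        shuffled = y[:]
        rng.shuffle(shuffled)
        if abs(corr(x, shuffled)) >= observed:
            hits += 1
    return hits / trials

# Toy series: a northward-drifting storm track and tightly coupled
# (declining) precipitation -- a chance match should be very rare
track = [float(i) for i in range(20)]
precip = [-v + 0.1 * ((i * 7) % 5) for i, v in enumerate(track)]
print(monte_carlo_pvalue(track, precip))  # small p-value
```

A small fraction of chance matches corresponds to a small p-value, which is the formal content of Russell’s remark that “it’s very rare that you get this distribution by chance.”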



Therefore, she said, the changes in late winter precipitation in the West from 1978 to 1998 are related to the changes in the storm track path for that same time period.



McAfee said her next step is investigating whether western vegetation has changed as the storm track has changed.

GOCE Earth explorer satellite to look at the Earth’s surface and core





GOCE positioned on the rotary table for alignment check, during launch campaign at the Plesetsk Cosmodrome. - Credits: ESA

The European Space Agency is about to launch the most sophisticated mission ever to investigate the Earth’s gravitational field and to map the reference shape of our planet – the geoid – with unprecedented resolution and accuracy.



The Gravity field and steady-state Ocean Circulation Explorer (GOCE) will be placed onto a low altitude near sun-synchronous orbit by a Russian Rockot vehicle launched from the Plesetsk Cosmodrome in Northern Russia, some 800 km north of Moscow. Lift-off is scheduled to take place at 16:21 CEST (14:21 UTC) on Wednesday 10 September. The launcher is operated by Eurockot Launch Services, a joint venture between EADS Astrium and the Khrunichev Space Centre (Russia).



ESA’s 1-tonne spacecraft carries a set of six state-of-the-art high-sensitivity accelerometers to measure the components of the gravity field along all three axes. The data collected will provide a high-resolution map of the geoid (the reference surface of the planet) and of gravitational anomalies. Such a map will not only greatly improve our knowledge and understanding of the Earth’s internal structure, but will also serve as a much better reference for ocean and climate studies, including sea-level change, ocean circulation and ice-cap dynamics. Numerous applications are expected in climatology, oceanography and geophysics, as well as for geodetic and positioning activities.



To make this mission possible, ESA, its industrial partners (45 European companies led by Thales Alenia Space) and the science community had to overcome an impressive technical challenge: designing a satellite that will orbit the Earth close enough to gather high-accuracy gravitational data while filtering out disturbances caused by the remaining traces of the atmosphere in low Earth orbit (at an altitude of only 260 km). The result is a slender, 5-m-long arrowhead shape for aerodynamics, with low-power ion thrusters to compensate for atmospheric drag.


GOCE is the first Core Mission of the Earth Explorer programme undertaken by ESA in 1999 to foster research on the Earth’s atmosphere, biosphere, hydrosphere, cryosphere and interior, on their interactions and on the impact of human activities on these natural processes. It will be the first in a whole series of Earth Explorer missions with five launches to take place within the next two years.



Two more Core Missions, selected to address specific topics of major public concern, are already under development: ADM-Aeolus for atmospheric dynamics (2010), and EarthCARE to investigate the Earth’s radiative balance (2013). Three smaller Earth Explorer Opportunity Missions are also in preparation: CryoSat-2 to measure ice sheet thickness (2009), SMOS to study soil moisture and ocean salinity (2009) and Swarm to survey the evolution of the magnetic field (2010).



On the occasion of the launch of GOCE, ESA will open a Press Centre at ESA/ESRIN in Frascati, Italy from 14:00 to 20:00, hosting a launch event from 15:30 to 18:15.



A live televised transmission of the launch will bring images from Plesetsk and from mission control at ESA/ESOC in Darmstadt, Germany to broadcasters (further details on the TV transmission at http://television.esa.int). ESA senior management and programme specialists will be on hand at ESRIN for explanations and interviews. The general public can also follow the video transmission web-streamed at: http://www.esa.int/goce.

Earthquakes may endanger New York more than thought, says study





All known quakes, greater New York-Philadelphia area, 1677-2004, graded by magnitude (M). Peekskill, NY, near Indian Point nuclear power plant, is denoted as Pe. - Credit: Adapted from Sykes et al.

Indian Point nuclear power plant seen as particular risk



A study by a group of prominent seismologists suggests that a pattern of subtle but active faults makes the risk of earthquakes to the New York City area substantially greater than formerly believed. Among other things, they say that the controversial Indian Point nuclear power plants, 24 miles north of the city, sit astride the previously unidentified intersection of two active seismic zones. The paper appears in the current issue of the Bulletin of the Seismological Society of America at http://www.earth.columbia.edu/sitefiles/File/pressreleases/1696.pdf.



Many faults and a few mostly modest quakes have long been known around New York City, but the research casts them in a new light. The scientists say the insight comes from sophisticated analysis of past quakes, plus 34 years of new data on tremors, most of them perceptible only by modern seismic instruments. The evidence charts unseen but potentially powerful structures whose layout and dynamics are only now coming clearer, say the scientists. All are based at Columbia University’s Lamont-Doherty Earth Observatory, which runs the network of seismometers that monitors most of the northeastern United States: http://www.ldeo.columbia.edu/LCSN/.



Lead author Lynn R. Sykes said the data show that large quakes are infrequent around New York compared to more active areas like California and Japan, but that the risk is high, because of the overwhelming concentration of people and infrastructure. “The research raises the perception both of how common these events are, and, specifically, where they may occur,” he said. “It’s an extremely populated area with very large assets.” Sykes, who has studied the region for four decades, is known for his early role in establishing the global theory of plate tectonics.



The authors compiled a catalog of all 383 known earthquakes from 1677 to 2007 in a 15,000-square-mile area around New York City. Coauthor John Armbruster estimated sizes and locations of dozens of events before 1930 by combing newspaper accounts and other records. The researchers say magnitude 5 quakes, strong enough to cause damage, occurred in 1737, 1783 and 1884. There was little settlement around to be hurt by the first two quakes, whose locations are vague due to a lack of good accounts; but the last, thought to be centered under the seabed somewhere between Brooklyn and Sandy Hook, toppled chimneys across the city and New Jersey, and panicked bathers at Coney Island. Based on this, the researchers say such quakes should be routinely expected, on average, about every 100 years. “Today, with so many more buildings and people, a magnitude 5 centered below the city would be extremely attention-getting,” said Armbruster. “We’d see billions in damage, with some brick buildings falling. People would probably be killed.”



Starting in the early 1970s Lamont began collecting data on quakes from dozens of newly deployed seismometers; these have revealed further potential, including distinct zones where earthquakes concentrate, and where larger ones could come. The Lamont network, now led by coauthor Won-Young Kim, has located hundreds of small events, including a magnitude 3 every few years, which can be felt by people at the surface, but is unlikely to cause damage. These small quakes tend to cluster along a series of small, old faults in harder rocks across the region. Many of the faults were discovered decades ago when subways, water tunnels and other excavations intersected them, but conventional wisdom said they were inactive remnants of continental collisions and rifting hundreds of millions of years ago. The results clearly show that they are active, and quite capable of generating damaging quakes, said Sykes.



One major previously known feature, the Ramapo Seismic Zone, runs from eastern Pennsylvania to the mid-Hudson Valley, passing within a mile or two northwest of Indian Point. The researchers found that this system is not so much a single fracture as a braid of smaller ones, where quakes emanate from a set of still ill-defined faults. East and south of the Ramapo zone, and possibly more significant in terms of hazard, is a set of nearly parallel northwest-southeast faults. These include Manhattan’s 125th Street fault, which seems to have generated two small 1981 quakes, and could have been the source of the big 1737 quake; the Dyckman Street fault, which carried a magnitude 2 in 1989; the Mosholu Parkway fault; and the Dobbs Ferry fault in suburban Westchester, which generated the largest recent shock, a surprising magnitude 4.1, in 1985. Fortunately, it did no damage. Given the pattern, Sykes says the big 1884 quake may have hit on a yet-undetected member of this parallel family further south.



The researchers say that frequent small quakes occur in predictable ratios to larger ones, and so can be used to project a rough time scale for damaging events. Based on the lengths of the faults, the detected tremors, and calculations of how stresses build in the crust, the researchers say that magnitude 6 quakes, or even 7 (respectively 10 and 100 times bigger than magnitude 5), are quite possible on the active faults they describe. They calculate that magnitude 6 quakes take place in the area about every 670 years, and sevens, every 3,400 years. The corresponding probabilities of occurrence in any 50-year period would be 7% and 1.5%. After less specific hints of these possibilities appeared in previous research, a 2003 analysis by The New York City Area Consortium for Earthquake Loss Mitigation put the cost of quakes this size in the metro New York area at $39 billion to $197 billion. A separate 2001 analysis for northern New Jersey’s Bergen County estimates that a magnitude 7 would destroy 14,000 buildings and damage 180,000 in that area alone. The researchers point out that no one knows when the last such events occurred, and say no one can predict when they next might come.
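Those 50-year probabilities follow from treating large quakes as a Poisson process with the stated mean recurrence intervals; that modeling assumption is mine, but it is the standard reading of such figures, and it reproduces the numbers quoted:

```python
import math

def poisson_prob(recurrence_years, window_years):
    """Probability of at least one event in a time window, assuming a
    Poisson process with the given mean recurrence interval."""
    rate = 1.0 / recurrence_years
    return 1.0 - math.exp(-rate * window_years)

# Recurrence intervals from the study: M6 every ~670 years, M7 every ~3,400
print(round(poisson_prob(670, 50) * 100, 1))   # about 7.2%, quoted as 7%
print(round(poisson_prob(3400, 50) * 100, 1))  # about 1.5%
```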



“We need to step backward from the simple old model, where you worry about one large, obvious fault, like they do in California,” said coauthor Leonardo Seeber. “The problem here comes from many subtle faults. We now see there is earthquake activity on them. Each one is small, but when you add them up, they are probably more dangerous than we thought. We need to take a very close look.” Seeber says that because the faults are mostly invisible at the surface and move infrequently, a big quake could easily hit one not yet identified. “The probability is not zero, and the damage could be great,” he said. “It could be like something out of a Greek myth.”



The researchers found concrete evidence for one significant previously unknown structure: an active seismic zone running at least 25 miles from Stamford, Conn., to the Hudson Valley town of Peekskill, N.Y., where it passes less than a mile north of the Indian Point nuclear power plant. The Stamford-Peekskill line stands out sharply on the researchers’ earthquake map, with small events clustered along its length, and to its immediate southwest. Just to the north, there are no quakes, indicating that it represents some kind of underground boundary. It is parallel to the other faults beginning at 125th Street, so the researchers believe it is a fault in the same family. Like the others, they say it is probably capable of producing at least a magnitude 6 quake. Furthermore, a mile or so on, it intersects the Ramapo seismic zone.



Sykes said the existence of the Stamford-Peekskill line had been suggested before, because the Hudson takes a sudden unexplained bend just to the north of Indian Point, and definite traces of an old fault can be seen along the north side of the bend. The seismic evidence confirms it, he said. “Indian Point is situated at the intersection of the two most striking linear features marking the seismicity and also in the midst of a large population that is at risk in case of an accident,” says the paper. “This is clearly one of the least favorable sites in our study area from an earthquake hazard and risk perspective.”



The findings come at a time when Entergy, the owner of Indian Point, is trying to relicense the two operating plants for an additional 20 years, a move being fought by surrounding communities and the New York State Attorney General. Last fall the attorney general, alerted to the then-unpublished Lamont data, told a Nuclear Regulatory Commission panel in a filing: “New data developed in the last 20 years disclose a substantially higher likelihood of significant earthquake activity in the vicinity of [Indian Point] that could exceed the earthquake design for the facility.” The state alleges that Entergy has not presented new data on earthquakes after 1979. However, in a little-noticed decision this July 31, the panel rejected the argument on procedural grounds. A source at the attorney general’s office said the state is considering its options.



The characteristics of New York’s geology and human footprint may increase the problem. Unlike in California, many New York quakes occur near the surface, in the upper mile or so, and they occur not in the broken-up, more malleable formations common where quakes are frequent, but rather in the extremely hard, rigid rocks underlying Manhattan and much of the lower Hudson Valley. Such rocks can build large stresses, then suddenly and efficiently transmit energy over long distances. “It’s like putting a hard rock in a vise,” said Seeber. “Nothing happens for a while. Then it goes with a bang.” Earthquake-resistant building codes were not introduced to New York City until 1995, and are not in effect at all in many other communities. Sinuous skyscrapers and bridges might get by with minimal damage, said Sykes, but many older, unreinforced three- to six-story brick buildings could crumble.



Art Lerner-Lam, associate director of Lamont for seismology, geology and tectonophysics, pointed out that the region’s major highways including the New York State Thruway, commuter and long-distance rail lines, and the main gas, oil and power transmission lines all cross the parallel active faults, making them particularly vulnerable to being cut. Lerner-Lam, who was not involved in the research, said that the identification of the seismic line near Indian Point “is a major substantiation of a feature that bears on the long-term earthquake risk of the northeastern United States.” He called for policymakers to develop more information on the region’s vulnerability, to take a closer look at land use and development, and to make investments to strengthen critical infrastructure.



“This is a landmark study in many ways,” said Lerner-Lam. “It gives us the best possible evidence that we have an earthquake hazard here that should be a factor in any planning decision. It crystallizes the argument that this hazard is not random. There is a structure to the location and timing of the earthquakes. This enables us to contemplate risk in an entirely different way. And since we are able to do that, we should be required to do that.”

NEW YORK EARTHQUAKE BRIEFS AND QUOTES:



Existing U.S. Geological Survey seismic hazard maps show New York City as facing more hazard than many other eastern U.S. areas. Three areas are somewhat more active (northernmost New York State, New Hampshire and South Carolina), but they have much lower populations and fewer structures. The wider forces at work include pressure exerted from continuing expansion of the mid-Atlantic Ridge thousands of miles to the east; slow westward migration of the North American continent; and the area’s intricate labyrinth of old faults, sutures and zones of weakness caused by past collisions and rifting.



Due to New York’s history, population density and fragile, interdependent infrastructure, a 2001 analysis by the Federal Emergency Management Agency ranks it the 11th most at-risk U.S. city for earthquake damage. Among those ahead: Los Angeles, San Francisco, Seattle and Portland. Behind: Salt Lake City, Sacramento, Anchorage.



New York’s first seismic station was set up at Fordham University in the 1920s. Lamont-Doherty Earth Observatory, in Palisades, N.Y., has operated stations since 1949, and now coordinates a network of about 40.



Dozens of small quakes have been felt in the New York area. A Jan. 17, 2001 magnitude 2.4, centered in the Upper East Side, the first ever detected in Manhattan itself, may have originated on the 125th Street fault. Some people thought it was an explosion, but no one was harmed.



The most recent felt quake, a magnitude 2.1 on July 28, 2008, was centered near Milford, N.J. Houses shook and a woman at St. Edward’s Church said she felt the building rise up under her feet, but no damage was done.



Questions about the seismic safety of the Indian Point nuclear power plant, which lies amid a metropolitan area of more than 20 million people, were raised in previous scientific papers in 1978 and 1985.



Because the hard rocks under much of New York can build up a lot of strain before breaking, researchers believe that modest faults as short as 1 to 10 kilometers can cause magnitude 5 or 6 quakes.



In general, magnitude 3 quakes occur about 10 times more often than magnitude fours; 100 times more than magnitude fives; and so on. This principle is called the Gutenberg-Richter relationship.
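The Gutenberg-Richter relationship is usually written log10 N = a − b·M with b ≈ 1, so each step down in magnitude multiplies the expected count roughly tenfold. A minimal sketch (the a and b values here are illustrative, not from the study):

```python
def gutenberg_richter_count(a, b, magnitude):
    """Expected number of quakes at or above a given magnitude per unit
    time, from the Gutenberg-Richter relation: log10 N = a - b*M."""
    return 10 ** (a - b * magnitude)

# With b = 1, magnitude 3 quakes are ten times as common as magnitude 4s
m3 = gutenberg_richter_count(4.0, 1.0, 3)
m4 = gutenberg_richter_count(4.0, 1.0, 4)
print(m3 / m4)  # 10.0
```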


LEAD AUTHOR LYNN SYKES



On the study and earthquake risk: “New York is not as prone to earthquakes as California and Japan, but they do happen. This study takes a more realistic look at the possibility of larger ones, and why earthquakes concentrate in certain places. To understand risk, you have to multiply hazard by assets, and vulnerability. When you factor that in, our risk is high. Too much attention has been paid to the level of hazard, and not enough to the risk. Earthquake hazard is about the same today as in 1609 when Henry Hudson sailed up the River. But earthquake risk is much, much higher today, since the number of people, assets and their vulnerability are so much greater.”



On faults near Indian Point nuclear plant: “We think that the intersection of these two features being so close to Indian Point makes it a place of greater risk than most other points on the map.”


COAUTHOR LEONARDO SEEBER



On estimating hazard: “Most people underestimate the hazard here. Any conservative approach will look at geologically similar environments. If you do that, we are similar to Bhuj, India [where a 2001 magnitude 7 quake killed over 15,000 people]. There was no obvious sign of strain there. There is a mystery here to be solved, and we better step back and do our homework.”



On preparing: “Once you accept that one fault in a family is active, you better consider that all the faults in that family could be active. We need to adapt our structures with that in mind.”


COAUTHOR JOHN ARMBRUSTER



On past and future quakes: “You could debate whether a magnitude 6 or 7 is possible, but we’ve already had three magnitude fives, so that is very realistic. There is no one alive now to remember the last one, so people tend to forget. And having only a partial 300-year history, we may not have seen everything we could see. There could be surprises, things bigger than we have ever seen.”

Satellite Images Show Continued Breakup Of Two Of Greenland’s Largest Glaciers, Predict Disintegration In Near Future





A 29 sq. km (11 sq. mi.) area of the Petermann Glacier in northern Greenland (80°N, 60°W) broke away between July 10th and July 24th. Petermann has a floating section 16 km (10 mi) wide and 80 km (50 mi) long, that is, 1,295 sq. km (500 sq. mi); the longest floating glacier in the Northern Hemisphere. Photo courtesy Byrd Polar Research Center, Ohio State University.

Researchers monitoring daily satellite images here of Greenland’s glaciers have discovered break-ups at two of the largest glaciers in the last month.



They expect that part of the Northern Hemisphere’s longest floating glacier will continue to disintegrate within the next year.



A massive 11-square-mile (29-square-kilometer) piece of the Petermann Glacier in northern Greenland broke away between July 10th and July 24th. The loss to that glacier is equal to half the size of Manhattan Island. The last major ice loss to Petermann occurred when the glacier lost 33 square miles (86 square kilometers) of floating ice between 2000 and 2001.



Petermann has a floating section of ice 10 miles (16 kilometers) wide and 50 miles (80.4 kilometers) long which covers 500 square miles (1,295 square kilometers).



What worries Jason Box, an associate professor of geography at Ohio State, and his colleagues, graduate students Russell Benson and David Decker, all with the Byrd Polar Research Center, even more about the latest images is what appears to be a massive crack further back from the margin of the Petermann Glacier.



That crack may signal an imminent and much larger breakup.


“If the Petermann glacier breaks up back to the upstream rift, the loss would be as much as 60 square miles (160 square kilometers),” Box said, representing a loss of one-third of the massive ice field.



Meanwhile, the margin of the massive Jakobshavn glacier has retreated further inland than at any time in the 150 years it has been observed. Researchers believe that the glacier has not retreated to where it is now in at least the last 4,000 to 6,000 years.



The northern branch of the Jakobshavn broke up in the past several weeks, and the glacier has lost at least three square miles (10 square kilometers) since the end of the last melt season.



The Jakobshavn Glacier dominates the approximately 130 glaciers flowing from Greenland’s inland ice into the sea. It alone is responsible for producing at least one-tenth of the icebergs calving off into the sea from the entire island of Greenland, making it the island’s most productive glacier.



Between 2001 and 2005, a massive breakup of the Jakobshavn glacier erased 36 square miles (94 square kilometers) from the ice field and raised worldwide awareness of glacial response to global climate change.



The researchers are using images updated daily from National Aeronautics and Space Administration satellites and from time-lapse photography from cameras monitoring the margin of these and other Greenland glaciers. Additional support for this project came from NASA.