Large earthquakes may broadcast warnings, but is anyone tuning in to listen?


Like geological ninjas, earthquakes can strike without warning. But there may be a way to detect the footfalls of large earthquakes before they strike, alerting their potential victims a week or more in advance. A Stanford professor thinks a method to provide just such warnings may have been buried in the scientific literature for over 40 years.



In October, Japan instituted a nationwide earthquake warning system that heralds the advance of a big earthquake; its sophisticated machinery senses the shaking deep in the earth and transmits a warning signal that can beat the tremors to the surface by seconds. Antony Fraser-Smith, professor emeritus of electrical engineering and of geophysics, has evidence that big temblors emit a burst of ultra-low-frequency electromagnetic radio waves days or even weeks before they hit. The problem is that nobody is paying enough attention.



Fraser-Smith has been interested in electromagnetic signals for decades. Most of these waves come from space, he said, generated in the upper atmosphere by the sun and then beamed down to Earth.



In 1989, Fraser-Smith and his research team were monitoring ultra-low-frequency radio waves in a remote location in the Santa Cruz Mountains as part of a long-term study of the signals reaching Earth from space. On Oct. 5, 1989, their equipment suddenly reported a large signal, and the signal stayed up for the next 12 days. At 2:00 p.m. on Oct. 17, 1989, the signal jumped even higher, about 20 to 30 times higher than what the instruments would normally ever measure, Fraser-Smith said. At 5:04 p.m. the 7.1 magnitude Loma Prieta earthquake hit the Monterey Bay and San Francisco Bay areas, killing 63 people and causing severe damage across the region.



Fraser-Smith originally thought there was something wrong with the equipment. After ruling out the possibility of technical malfunctions, he and his research team started to think the Loma Prieta quake had quietly announced its impending arrival, and that their equipment just happened to be in the right place at the right time to pick up the message.



“Most scientists necessarily make measurements on small earthquakes because that’s what occurs all the time,” Fraser-Smith said. “To make a measurement on a large earthquake you have to be lucky, which we were.”


Along with Stephen Park, earth sciences professor at the University of California-Riverside, and Frank Morrison, professor emeritus of earth and planetary science at UC-Berkeley, Fraser-Smith continued to study the phenomenon of earthquakes emitting electromagnetic waves through a study funded by the U.S. Geological Survey (USGS).



When the USGS terminated the funding in 1999, he decided to move on to other things. But he was recently drawn back into this issue by a local private company that wanted to use his methods to develop earthquake warning systems.



“I took a new look at the measurements, concentrating entirely on large earthquakes,” Fraser-Smith said, “and all of a sudden I could see the forest through the trees.”



He found three other studies describing electromagnetic surges before large earthquakes, just as he had found at the Loma Prieta site. The earliest report was from the Great Alaska earthquake (M9.2) in 1964. Up until now, most of the focus for earthquake warnings and predictions has been on seismological studies, but no seismic measurements have ever shown this kind of warning before a big quake, Fraser-Smith said.



This technique will probably only yield results for earthquakes of approximately magnitude 7 or higher, because background waves from the atmosphere will tend to mask any smaller signals. But these are the quakes people are most concerned about anyway, from a safety and damage point of view.



Some seismologists doubt that these results are real, Fraser-Smith said, but it would take relatively little effort to verify or disprove them. He is calling for federal funding for a mission-oriented study that would place approximately 30 of the ultra-low-frequency detectors around the world at hotspots for big quakes. Buying 30 of these machines would cost around $3 million, he said, which is cheap compared with the cost of many other large studies.



Every year, there are on average 10 earthquakes of magnitude 7 or higher around the world. So within just a few years, he said, you could potentially have 10 new measurements of electromagnetic waves before big quakes, surely enough to determine whether the previous four findings were real.
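As a rough illustration of that arithmetic, the sketch below works through the expected tally. The figures of 10 magnitude-7-plus quakes per year and $3 million for 30 instruments come from the article; the fraction of those quakes that would strike near an instrumented hotspot is purely an assumed number for illustration.

```python
# Back-of-the-envelope estimate of how quickly 30 ultra-low-frequency stations
# might record large quakes. The coverage fraction is an assumption made for
# illustration; it does not come from Fraser-Smith.
global_m7_per_year = 10                      # average M7+ quakes worldwide per year (from the article)
stations = 30                                # proposed instruments at big-quake hotspots
cost_per_station = 3_000_000 / stations      # ~$100,000 each, from the $3 million figure

coverage_fraction = 0.3                      # ASSUMED: share of M7+ quakes striking near a station

for years in (1, 3, 5):
    expected = global_m7_per_year * coverage_fraction * years
    print(f"{years} yr: ~{expected:.0f} expected recordings "
          f"(instruments cost about ${cost_per_station:,.0f} each)")
```

Even with modest coverage, a handful of new recordings would accumulate within a few years, which is the crux of Fraser-Smith's pitch.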



Fraser-Smith will present his findings at the American Geophysical Union meeting this week in San Francisco. His talk is scheduled for 8:45 a.m. Thursday, Dec. 13.

As waters clear, scientists seek to end a muddy debate





Schieber uses a camera to track the growth and movement of mud formations – Photo by: Chris Meyer

Geologists have long thought muds will only settle when waters are quiet, but new research by Indiana University Bloomington and Massachusetts Institute of Technology geologists shows muds will accumulate even when currents move swiftly. Their findings appear in this week’s Science.



This may seem a trifling matter at first, but understanding the deposition of mud could significantly impact a number of public and private endeavors, from harbor and canal engineering to oil reservoir management and fossil fuel prospecting.



“Mudstones make up two-thirds of the sedimentary geological record,” said IU Bloomington geologist Juergen Schieber, who led the study. “One thing we are very certain of is that our findings will influence how geologists and paleontologists reconstruct Earth’s past.”



Previously geologists had thought that constant, rapid water flow prevented mud’s constituents — silts and clays — from coalescing and gathering at the bottoms of rivers, lakes and oceans. This has led to a bias, Schieber explains, that wherever mudstones are encountered in the sedimentary rock record, they are generally interpreted as quiet water deposits.



“But we suspected this did not have to be the case,” Schieber said. “All you have to do is look around. After the creek on our university’s campus floods, you can see ripples on the sidewalks once the waters have subsided. Closely examined, these ripples consist of mud. Sedimentary geologists have assumed up until now that only sand can form ripples and that mud particles are too small and settle too slowly to do the same thing. We just needed to demonstrate that it can actually happen under controlled conditions.”



Schieber and IU graduate student Kevin Thaisen used a specially designed “mud flume” to simulate mud deposition in natural flows. The oval-shaped apparatus resembles a race track. A motorized paddle belt keeps water moving in one direction at a pre-determined speed, say, 26 centimeters per second (about 0.6 miles per hour). The concentration of dispersed sediment, temperature, salinity, and a dozen other parameters can be controlled. M.I.T. veteran sedimentologist John Southard provided advice on the construction and operation of the mud flume used in the experiments.


For their experiments, the scientists used calcium montmorillonite and kaolinite, extremely fine clays that in dry form have the feel of facial powder. Most geologists would have predicted that these tiny mineral grains could not settle easily from rapidly moving water, but the flume experiments showed that mud was traveling on the bottom of the flume after a short time period. Experiments with natural lake muds showed the same results.
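A simple settling-velocity estimate helps explain why that expectation was so entrenched. The sketch below applies Stokes' law to a clay-sized grain; the grain size and fluid properties are typical textbook values assumed here, not measurements from the study.

```python
# Stokes'-law settling speed for a fine clay grain, compared with the flume's
# flow speed. All parameter values are illustrative assumptions.
g = 9.81          # gravity, m/s^2
rho_p = 2650.0    # clay mineral grain density, kg/m^3
rho_f = 1000.0    # water density, kg/m^3
mu = 1.0e-3       # water viscosity, Pa*s
d = 2.0e-6        # assumed grain diameter: 2 micrometres

v_settle = (rho_p - rho_f) * g * d**2 / (18 * mu)   # Stokes' law
flow = 0.26                                          # flume flow speed, m/s (26 cm/s)

print(f"settling speed: ~{v_settle * 1e6:.1f} micrometres per second")
print(f"the current moves ~{flow / v_settle:,.0f} times faster")
```

On that naive estimate a single grain sinks only a few micrometres per second, tens of thousands of times more slowly than the current, which is why deposition from fast-moving water seemed implausible before the flume experiments.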



“We found that mud beds accumulate at flow velocities that are much higher than what anyone would have expected,” said Schieber, who, because of the white color of the clay suspensions, calls this ongoing work the “sedimentology of milk.”



The mud accumulates slowly at first, in the form of heart- or arrowhead-shaped ripples that point upstream. These ripples slowly move with the current while maintaining their overall shapes.



Understanding how and when muds deposit will aid engineers who build harbors and canals, Schieber says, by providing them with new information about the rates at which mud can accumulate from turbid waters. Taking into account local conditions, engineers can build waterways in a way that truly minimizes mud deposition by optimizing tidal and wave-driven water flow. Furthermore, Schieber explains, the knowledge that muds can deposit from moving waters could expand the possible places where oil companies prospect for oil and gas. Organic matter and muds are both sticky and are often found together.



“If anything, when organic matter is present in addition to mud, it enhances mud deposition from fast moving currents,” he said.



The finding feels like something of a vindication, Schieber says. He and his colleagues have (genially) argued about whether muds could deposit from rapidly flowing water. Schieber had posited the possibility after noting an apparent oddity in the sedimentary rock record.



“In many ancient mudstones, you see not only deposition, but also erosion and rapid re-deposition of mud — all in the same place,” Schieber said. “The erosive features are at odds with the notion that the waters must have been still all or most of the time. We needed a better explanation.”



This research was supported by a grant from the National Science Foundation.

Microbes in oil reservoirs





Jennifer Adams (left), a PhD student in the Department of Geoscience who did the geofluids modelling, and Steve Larter, a petroleum geologist in geoscience, explain their discovery at a news conference

An international team that includes University of Calgary scientists has shown how crude oil in deposits around the world, including Alberta’s oil sands, is naturally broken down by microbes in the reservoir.



Their discovery, published in the prestigious science journal Nature, could revolutionize heavy oil and oil sands production by leading to more energy-efficient, environmentally friendly ways to produce this valuable resource.



Understanding how crude oil biodegrades into methane, or natural gas, opens the door to being able to recover the clean-burning methane directly from deeply buried, or in situ, oil sands deposits, says Steve Larter, a petroleum geologist in the U of C Department of Geoscience who headed the Calgary contingent of the research team.



The oil sands industry would no longer have to use costly and polluting thermal, or heat-based, processes (such as injecting steam into reservoirs) to loosen the tar-like bitumen so it flows into wells and can be pumped to the surface.



“The main thing is you’d be recovering a much cleaner fuel,” says Larter, Canada Research Chair in Petroleum Geology. “Methane is, per energy unit, a much lower carbon dioxide emitter than bitumen. Also, you wouldn’t need all the upgrading facilities and piping on the surface.”



Biodegradation of crude oil into heavy oil in petroleum reservoirs is a problem worldwide for the petroleum industry. The natural process, caused by bacteria that consume the oil, makes the oil viscous, or thick, and contaminates it with pollutants such as sulphur. This makes recovering and refining heavy oil difficult and costly.


Some studies have suggested that biodegradation could be caused by aerobic bacteria, which use oxygen. But Larter and colleagues from the U of C, the University of Newcastle in the U.K., and Norsk Hydro Oil & Energy in Norway report in Nature that the dominant process is, in fact, fermentation. It is caused by anaerobic bacteria that live in oil reservoirs and don’t use oxygen.



“This is the main process that’s occurring all over the Earth, in any oil reservoir where you’ve got biodegradation,” Larter says.



Using a combination of microbiological studies, laboratory experiments and oilfield case studies, the team demonstrated the anaerobic degradation of hydrocarbons to produce methane. The findings offer the potential of ‘feeding’ the microbes and rapidly accelerating the breaking down of the oil into methane.



“Instead of 10 million years, we want to do it in 10 years,” Larter says. “We think it’s possible. We can do it in the laboratory. The question is: can we do it in a reservoir?”



Doing so would revolutionize the heavy oil/oil sands industry, which now manages to recover only about 17 per cent of a resource that consists of six trillion barrels worldwide. Oil sands companies would be able to recover only the clean-burning natural gas, leaving the hard-to-handle bitumen and contaminants deep underground.



Understanding biodegradation also provides an immediate tool for predicting where the less-biodegraded oil is located in reservoirs, enabling companies to increase recovery by targeting higher-quality oil. “It gives us a better understanding of why the fluid properties are varying within the reservoir,” Larter says. “That will help us with thermal recovery processes such as SAGD (steam-assisted gravity drainage).”



The research team also discovered an intermediate step in the biodegradation process. It involves a separate family of microbes that produce carbon dioxide and hydrogen from partly degraded oil before the oil is turned into methane. This paves the way for using microbes to convert this CO2 into methane, which could then be recycled as fuel in a closed-loop energy system. That would keep the CO2, a greenhouse gas blamed for global warming and climate change, out of the atmosphere.



The petroleum industry already has expressed interest in trying to accelerate biodegradation in a reservoir, Larter says. “It is likely there will be field tests by 2009.”



The multidisciplinary team, with the U.K. contingent led by Ian Head and Martin Jones, included petroleum geologists, microbiologists, organic geochemists and reservoir modellers.

Geologist probes undersea seismic zone as part of new deep-drilling experiment





The science team on IODP Expedition 314 poses on board the world’s largest research drilling vessel, the Chikyu Hakken. UCSC geologist Casey Moore is the sixth person from the left in the back row. Photo courtesy of IODP.

The first effort to drill into an undersea zone where massive earthquakes and tsunamis are generated has yielded new data on the stresses that build up there, according to Casey Moore, a professor of Earth and planetary sciences at the University of California, Santa Cruz.



Moore took part in an eight-week scientific drilling expedition off the Pacific Coast of Japan, as part of the launch of the Nankai Trough Seismogenic Zone Experiment (NanTroSEIZE), an international research initiative supported by the Integrated Ocean Drilling Program (IODP).



“What we’re trying to understand is the earthquake cycle–what makes these faults move, when they do,” said Moore, who was involved in the conceptual planning of the experiment. “Right now the fault is locked, the rubber band is stretching, and it’s going to break probably in the next 50 years. We want to know what brings it to that breaking point.”



Unlike previous ocean drilling projects that explored basic plate tectonics, NanTroSEIZE specifically aims to understand the triggers and mechanisms that lead to earthquakes and tsunamis. NanTroSEIZE will also attempt the deepest ever drilling penetration in a seismogenic fault zone, conducting both sampling of the rocks and monitoring of the seismic cycle there. The drilling site, an area called the Nankai Trough, is located off the coast of Shingu, Japan–a sister city of Santa Cruz.



The Nankai Trough marks the boundary between two of Earth’s major crustal plates, where the Philippine Sea plate rams into and slides under the Eurasian plate. Centuries of Japanese records suggest that this subduction zone has generated many large earthquakes, including one of magnitude 8.1 in 1944 and another of magnitude 8.3 in 1946.



According to Moore, the combination of a thrust fault and an off-branching splay fault at this location creates the potential for a huge and sudden uplift of the seafloor–on the order of 10 to 20 meters. Such a disturbance can lead to a giant tsunami wave, similar to the one that struck the Indian Ocean region on December 26, 2004.



“A subduction zone like the Nankai Trough can generate the world’s biggest and most damaging earthquakes and tsunamis, so there’s good reason to drill there,” Moore said.



Researchers working with NanTroSEIZE will be the first to drill deeply in this type of seismic hotbed, Moore said. A different type of seismogenic zone is being explored on land by U.S. Geological Survey researchers who have drilled into California’s San Andreas Fault, a strike-slip boundary where two plates slide past each other.


As part of NanTroSEIZE Expedition 314, Moore worked aboard the world’s largest research drilling vessel, Chikyu Hakken (Japanese for “Earth Discovery”). The new vessel and its crew left Shingu Port on September 21. Moore helicoptered off the ship on November 16, but the Chikyu will continue this drilling program into February 2008 with other scientists on board.



In this part of the experiment’s first phase, 16 onboard scientists from six countries conducted logging-while-drilling operations at five sites. They drilled between 400 and 1,400 meters below the seabed and logged data on the physical properties of the rocks and sediments that have piled up at the fault zone to form an undersea mountain range.



Moore measured the stresses pressing in on the boreholes by collecting images and analyzing the boreholes’ failure patterns. His goal was to figure out how the strain is distributed and how the faults are oriented.



“The most exciting thing was getting an unprecedented record of the stress accumulating between major earthquakes in a subduction zone,” Moore said. “From the failure patterns, we can determine the orientation of the stress. We have the directions nailed right now.”



Findings from Expedition 314 were presented at the annual meeting of the American Geophysical Union in San Francisco, where Moore participated in a news conference on December 12 entitled, “Undersea drilling reveals mechanics of earthquakes, tsunamis.” The panel discussion was led by Harold Tobin, co-chief scientist of the expedition. Tobin is a geologist at the University of Wisconsin-Madison and an alumnus of UCSC.



Ultimately, NanTroSEIZE is expected to give researchers access to the “seismogenic zone,” where earthquakes are generated. To do this, the experiment will proceed in four phases through 2012, eventually drilling through the overlying rock and into the plate-boundary fault zone at a depth of about 6 kilometers below the seafloor.



“It’s a learning process,” Moore noted. “We have to take small steps because it’s a hostile environment that’s difficult to work in. When we get to 6 kilometers the challenges will be huge.”



In 2009, Moore is slated to serve as co-chief scientist of an expedition, sharing duties with a researcher from the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), a leading research institution in Japan. By then, the team expects to begin using a technique called riser drilling to probe the mega-splay fault zone at 3,500 meters below the seafloor, he said.



At that point, Moore said, he looks forward to making another set of measurements that have never been collected before. “More than anything, we’d like to know what the variation of water pressure in that fault zone is,” he explained. “That’s critical because fluids push the two plates apart and can also allow the faults to slip.”



In the final phase of the drilling experiment, researchers plan to install observatories–sets of instruments deep in the seismogenic zone that will allow them to monitor changes. The scientists hope to identify precursors that signal when a major earthquake is about to happen.

Researcher part of critical team studying glaciers and climate change





Brad Danielson and Alex Gardner during time-lapse camera setup in spring 2007.

University of Ottawa geography professor Luke Copland is among researchers from 17 countries studying 19 Arctic tidewater glaciers to better understand how they react to climate change.



The international GLACIODYN project, part of the International Polar Year, examines how climate change affects tidewater glacier flow, which increases the amount of freshwater pouring into the oceans and thus contributes to rising sea levels.



Tidewater glaciers drain directly into the ocean from large ice caps and ice sheets such as Greenland. Over the past decade, these glaciers have experienced accelerated ice flow and increased rates of mass loss by iceberg break-up. GLACIODYN researchers want to find out why.



In Canada, seven glaciologists have been working on the Belcher Glacier, a tidewater outlet of the Devon Island ice cap in the Arctic. The research team spent three months in the spring and summer of 2007 on the Belcher Glacier and will be going back in 2008. Their main task is to collect field data to develop a numerical model of the glacier, which will allow them to simulate and explain how it responds to climate change. The field research is augmented by remote sensing data, which will be used to apply field measurements to the scale of the whole glacier system.


“When we look at how glaciers and ice caps respond to climate change, we know that their surface melting will increase,” explains co-PI Dr. Luke Copland of the University of Ottawa, “but until recently we didn’t realize that glaciers may also respond by speeding up due to increased meltwater lubrication of the glacier bed. This can result in a much more rapid loss of ice from an ice cap than melting alone.”



So far the team has used ice-penetrating radar to measure ice thickness and map the topography under the ice. Researchers also installed time-lapse cameras to monitor the development of surface meltwater drainage systems and the break-up of icebergs during the summer melt season. They used global positioning systems (GPS) to measure ice movement, and installed automated weather stations to record local conditions.



The data will allow scientists to explore how the increased amount of meltwater draining into the glacier as the climate warms will affect the rate of ice flow and the subsequent break-up of icebergs.



A team led by Dr. Luke Copland, Director of the University of Ottawa’s new Laboratory for Cryospheric Research, will be heading to the Devon Ice Cap in May 2008 to make measurements for the GLACIODYN project. One of three teams to visit the glacier next summer, Copland’s will include two graduate students and one northern undergraduate student who will be using the fieldwork results towards their degrees. This will be the first major field expedition for the Laboratory for Cryospheric Research, which opened in September 2007 with funding from the Canada Foundation for Innovation and the Ontario Research Fund.



The Canadian GLACIODYN team also received substantial support from researchers using the Canadian research icebreaker CCGS Amundsen. They mapped the seabed topography and measured ocean conditions in the water at the glacier snout, giving the team information about what’s happening under the front of the glacier, and how it affects the ice surface.

New Tibetan Ice Cores Missing A-Bomb Blast Markers; Suggest Himalayan Ice Fields Haven’t Grown In Last 50 Years





Operation Redwing was a United States series of 17 nuclear test detonations from May to July 1956

Ice cores drilled last year from the summit of a Himalayan ice field lack the distinctive radioactive signals that mark virtually every other ice core retrieved worldwide.



That missing radioactivity, originating as fallout from atmospheric nuclear tests during the 1950s and 1960s, routinely provides researchers with a benchmark against which they can gauge how much new ice has accumulated on a glacier or ice field.



In 2006, a joint U.S.-Chinese team drilled four cores from the summit of Naimona’nyi, a large glacier 6,050 meters (19,849 feet) high on the Tibetan Plateau.



The researchers routinely analyze ice cores for a host of indicators – particulates, dust, oxygen isotopes and the like – that can paint a picture of past climate in that region.



Scientists believe that the missing signal means that this Tibetan ice field has been shrinking at least since the A-bomb tests of half a century ago. If true, this could foreshadow a future in which the region’s stores of freshwater dwindle and vanish, seriously affecting the lives of more than 500 million people on the Indian subcontinent.



“There’s about 12,000 cubic kilometers (2,879 cubic miles) of fresh water stored in the glaciers throughout the Himalayas – more freshwater than in Lake Superior,” explained Lonnie Thompson, distinguished university professor of earth sciences at Ohio State University and a researcher with the Byrd Polar Research Center on campus.



“Those glaciers release meltwater each year and feed the rivers that support nearly a half-billion people in that region. The loss of these ice fields might eventually create critical water shortages for people who depend on glacier-fed streams.”



Thompson and his colleagues worry that this massive loss of meltwater would drastically impact major Indian rivers like the Ganges, Indus and Brahmaputra that provide water for one-sixth of the world’s population.


Thompson outlined his findings in an address at the annual meeting of the American Geophysical Union in San Francisco this week.



The beta radioactivity signals – from strontium-90, cesium-137, tritium (hydrogen-3) and chlorine-36 – are the remnants of radioactive fallout from the 1950s-60s atomic tests. They are “present in ice cores retrieved from both polar regions and from tropical glaciers around the globe and they suggest that those ice fields have retained snow (mass) that fell during the last 50 years,” he said.



“In ice cores drilled in 2000 from Kilimanjaro’s northern ice field (5,890 meters high), the radioactive fallout from the 1950s atomic tests was found only 1.8 meters below the surface.



“By 2006 the surface of that ice field had lost more than 2.5 meters of solid ice (and hence recorded time) – including ice containing that signal. Had we drilled those cores in 2006 rather than 2000, the radioactive horizon would be absent – like it is now on Naimona’nyi in the Himalayas,” he said.
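A back-of-the-envelope check using the Kilimanjaro figures quoted above shows why the horizon disappears so quickly. The assumption that the fallout layer dates to the early-1960s peak of atmospheric testing is introduced here only to fix the time interval.

```python
# Rough rates implied by the Kilimanjaro figures quoted above. The 1962 date
# for the fallout horizon is an assumption (peak of atmospheric testing).
years_of_buildup = 2000 - 1962        # assumed age of the radioactive layer at drilling
horizon_depth_m = 1.8                 # depth of the fallout layer below the 2000 surface

net_gain = horizon_depth_m / years_of_buildup
print(f"net build-up above the fallout layer: ~{net_gain * 100:.0f} cm of ice per year")

ice_lost_m = 2.5                      # surface lowering observed between 2000 and 2006
loss_rate = ice_lost_m / (2006 - 2000)
print(f"recent surface loss: ~{loss_rate * 100:.0f} cm of ice per year")
```

Losing roughly 40 centimetres of ice a year against a build-up of only a few centimetres strips away the fallout-bearing layer within a few years, consistent with Thompson's point that a core drilled in 2006 would no longer show the signal.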



In 2002 Thompson predicted that the ice fields capping Kilimanjaro would disappear between 2015 and 2020.



“If what is happening on Naimona’nyi is characteristic of the other Himalayan glaciers, glacial meltwater will eventually dwindle with substantial consequences for a tremendous number of people,” he said.



Scientists estimate that there are some 15,000 glaciers nested within the Himalayan mountain chain forming the main repository for fresh water in that part of the world. The total area of glaciers in the Tibetan Plateau is expected to shrink by 80 percent by the year 2030.



The work is supported in part by the National Science Foundation.



Working on the project along with Thompson were Yao Tandong, Institute for Tibetan Plateau Research, Chinese Academy of Sciences; Ellen Mosley-Thompson, professor of geography at Ohio State and research scientist at the Byrd Center; Mary E. Davis, a research associate with the Byrd Center; doctoral student Natalie M. Kehrwald; Jürg Beer, Swiss Federal Institute of Aquatic Science and Technology; Ulrich Schotterer, University of Bern; and Vasily Alfimov, Paul Scherrer Institute and ETH in Zurich.

Arctic Impact Crater Lake Reveals Interglacial Cycles in Sediments





The coring equipment and other instrumentation was set up using a tripod over the hole in the ice. The scientists were able to extract a core of the topmost 8.5 meters of sediment.

A University of Arkansas researcher and a team of international scientists have taken cores from the sediments of a Canadian Arctic lake and found an interglacial record indicating two ice-free periods that could pre-date the Holocene Epoch.



Sonja Hausmann, assistant professor of geosciences in the J. William Fulbright College of Arts and Sciences at the University of Arkansas, and her colleagues will report their preliminary findings at the American Geophysical Union meeting this week.



The researchers traveled by increasingly smaller planes, Ski-doos and finally sleds dragged on foot to arrive at the Pingualuit Crater, located in the Parc National des Pingualuit in northern Quebec. The crater formed about 1.4 million years ago as the result of a meteorite impact, and today it hosts a lake about 267 meters deep. Its unique setting – the lake has no surface connection to other surrounding water bodies – makes it a prime candidate for the study of lake sediments.



Scientists study lake sediments to determine environmental information beyond historical records. Hausmann studies diatoms, unicellular algae with shells of silica, which remain in the sediments. Diatoms make excellent bioindicators, Hausmann said, because the diatom community composition changes with environmental changes in acidity, climate, nutrient availability and lake circulation.



By examining relationships between modern diatom communities and their environment, Hausmann and her colleagues can reconstruct various historic environmental changes quantitatively.



However, most sediments of lakes in previously glaciated areas have limitations – they only date back to the last ice age.


“Glaciers are powerful. They polish everything,” Hausmann said. Glaciers typically carve out any sediments in a lake bed, meaning any record before the ice age is swept away.



However, the unique setting of the Pingualuit Crater Lake led Michel A. Bouchard to speculate in 1989 that the sediments beneath its ice cover might have escaped glacial sculpting. So in May of this year, Hausmann and her colleagues donned parkas, hauled equipment on Ski-doos and slogged through sub-zero temperatures for three weeks to core sediments and collect data from the lake.



They carefully carved squares of ice out to make a small hole for equipment, then began a series of investigations that included pulling up a core of the topmost 8.5 meters of sediment. An echosounder indicated that the lake bottom may have more than 100 meters of relatively fine-grained sediments altogether. During the time since the expedition, researchers have examined the physical, magnetic and sedimentological properties of the sediment core.



The sediment core contains mostly faintly laminated silts or sandy mud with frequent pebble-size rock fragments, which is typical of deposits found in water bodies covered by an ice sheet. Sandwiched in the middle of the faintly laminated silts and sandy mud, the researchers found two distinct and separate layers containing organically rich material that most likely date back well before the Holocene, representing earlier ice-free periods. The samples they found contain the remains of diatoms and other organic material, suggesting that they represent ice-free conditions and possibly interglacial periods.



“There are no paleolimnological studies of lakes that cover several warm periods in this area,” Hausmann said. The terrestrial record will be complementary to marine records or to long ice-core records from Greenland.



The international team of researchers in the field included Guillaume St-Onge; Reinhard Pienitz, principal investigator; Veli-Pekka Salonen of the University of Helsinki, Finland; and Richard Niederreiter, coring expert. Please visit http://www.cen.ulaval.ca/pingualuit/index.html for more information.

Early warning system predicted shaking from Oct. 30 quake





The Alum Rock earthquake of Oct. 30, 2007, was the largest in the area since 1989’s destructive Loma Prieta quake, and produced ground shaking in San Francisco, Oakland and points north. An early warning system called ElarmS accurately predicted the shaking a few seconds before it happened, showing the potential of such warning systems. (Berkeley Seismological Laboratory)

A California earthquake early warning system now being tested accurately predicted the ground shaking in San Francisco a few seconds before the city felt the Oct. 30, 2007, magnitude 5.4 quake near San Jose, according to a statewide team of seismologists.



Early warning systems already operating in places such as Japan, Taiwan, Mexico and Turkey automatically stop elevators at the nearest floor, halt trains, isolate hazardous chemical systems and machinery, and prompt people to move to a safer location or position.



According to Richard Allen, an assistant professor of earth and planetary science at the University of California, Berkeley, and one of the leaders of the early warning testing being conducted by the California Integrated Seismic Network (CISN), the California system, if fully implemented, could provide similar services through warnings transmitted via TV and radio networks, the internet, cell phones and other closed-circuit systems.



Allen discussed the ongoing tests of a statewide early warning system today (Monday, Dec. 10) during a media briefing at the annual meeting of the American Geophysical Union in San Francisco. He and his California colleagues will present further details during three scientific sessions today and tomorrow.



Current testing of the system should be completed by July 2009, providing the state with an estimate of the likely accuracy and warning times that the system could provide in future earthquakes should a public warning system be built.



CISN is a collaboration that includes UC Berkeley’s Seismological Laboratory, the U.S. Geological Survey, the California Institute of Technology, the Southern California Earthquake Center based at the University of Southern California and the Swiss Seismological Service.



Earthquake early warning systems are designed to provide a warning that arrives seconds to tens of seconds prior to earthquake shaking. The Japan Meteorological Agency recently turned on its first national earthquake warning system, in October 2007, bringing to four the number of nations providing rapid warning to mitigate the impacts of earthquakes, Allen said.


In California, CISN is testing three early warning algorithms on real-time seismic networks across the state to determine how effective such warnings could be. ElarmS, the algorithm being tested in Northern California, detected the Oct. 30 Alum Rock quake – the largest earthquake in the San Francisco Bay region since the 1989 Loma Prieta earthquake – and estimated the magnitude of the event to within 0.5 magnitude units using only 3 or 4 seconds’ worth of data. ElarmS also predicted the distribution of ground shaking across the region with errors of less than one unit on the Modified Mercalli Intensity scale, Allen said.



Even with the algorithm’s 15-second processing delay, ElarmS predicted the ground shaking intensity a few seconds before the peak ground shaking occurred in San Francisco. Work by the CISN early warning group has shown that this delay could be reduced to less than 5 seconds, Allen said, meaning that in a repeat earthquake, Oakland and San Francisco could have about a 10-second warning.



In a repeat of the 1989 Loma Prieta earthquake, on the other hand, warning systems like those being tested by the CISN could provide a 10- to 20-second warning to Oakland and San Francisco, where 84 percent of the casualties and much of the damage occurred. San Jose would have less warning time. Because the amount of warning depends on the distance of the quake’s epicenter, earthquakes to the north of the Bay Area, such as those on the San Andreas or Hayward-Rodgers Creek faults, would provide San Jose with more warning than Oakland or San Francisco.
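For a sense of where those warning times come from, here is a minimal sketch of the underlying arithmetic. The idea of dividing epicentral distance by the speed of the damaging S waves and subtracting the processing delay is standard; the wave speed and city distances below are illustrative assumptions, not CISN's published values.

```python
# Rough early-warning time: strong shaking travels with the S waves at roughly
# 3.5 km/s, so the warning available at a city is about (distance / S-wave
# speed) minus the time the system needs to detect and characterize the quake.
S_WAVE_SPEED_KM_S = 3.5  # assumed typical crustal shear-wave speed

def warning_seconds(epicentral_distance_km: float, processing_delay_s: float) -> float:
    """Seconds of warning before strong shaking arrives (negative means none)."""
    return epicentral_distance_km / S_WAVE_SPEED_KM_S - processing_delay_s

# Assumed distances from an Alum Rock (east San Jose) epicentre.
for city, dist_km in [("San Francisco", 60), ("Oakland", 55)]:
    print(f"{city}: ~{warning_seconds(dist_km, 15):.0f} s with a 15-s delay, "
          f"~{warning_seconds(dist_km, 5):.0f} s with a 5-s delay")
```

With the current 15-second delay the warning shrinks to a couple of seconds, and with a 5-second delay it grows to roughly 10 seconds, matching the estimates Allen describes; a more distant epicentre, as in a repeat of Loma Prieta, buys proportionally more time.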



Allen noted that the success of a statewide warning system hinges on the density of early warning-capable seismic stations. Although there are many seismic stations in the San Francisco and Los Angeles areas, almost none of them has the dataloggers needed to deliver data fast enough for early warning. Similarly, the lack of early warning-capable stations along the state’s fault systems limits the ability to warn metropolitan regions about more distant earthquakes that could still cause substantial damage there.



The CISN study will recommend technical requirements for such a system.



Japan’s system, for example, uses around 900 seismic stations. California, by comparison, currently has approximately 260 stations that could be upgraded to an early warning capability.



The CISN, part of the Advanced National Seismic System, is funded by the United States Geological Survey.

Study May Solve Age-old Mystery of Missing Chemicals From Earth’s Mantle


Observations about the early formation of Earth may answer an age-old question about why the planet’s mantle is missing some of the matter that should be present, according to UBC geophysicist John Hernlund.



Earth formed from chondrite, the very primitive rock of meteorites that date from the earliest time of the solar system, before the Earth itself took shape. However, scientists have been puzzled as to why the composition of Earth’s mantle and core differs from that of chondrite.



Hernlund’s findings suggest that an ancient magma ocean swirling deep beneath the Earth’s surface would account for the discrepancy.



“As the thick melted rock cooled and crystallized, the solids that resulted had a different composition than the melt,” explains Hernlund, a post-doctoral fellow at UBC Earth and Ocean Sciences.



“The melt held onto some of the elements. This would be where the missing elements of chondrite are stored.”


He says this layer of molten rock would have been around 1,000 km thick and located some 2,900 km beneath the surface.



Published in today’s edition of the journal Nature, Hernlund’s study explores the melting and crystallization processes that have controlled the composition of the Earth’s interior over geological time. Co-authors are Stéphane Labrosse, Ecole Normale Superieure de Lyon and Nicolas Coltice, Université de Lyon.



The centre of the Earth is a fiery core of melted heavy metals, mostly iron. The core makes up about 30 per cent of the planet, while the remaining 70 per cent is the solid rock of the overlying mantle.



Traditional views hold that a shallow ocean of melted rock (magma) extended down to about 1,000 km below the Earth’s surface, but that it was short-lived, disappearing within 10 million years of the formation of Earth.



In contrast, Hernlund’s evolutionary model predicts that during Earth’s hotter past shortly after its formation 4.5 billion years ago, at least one-third of the mantle closest to the core was also melted.



The partially molten patches now observed at the base of the Earth’s mantle could be the remnants of such a deep magma ocean, says Hernlund.

ANDRILL’s 2nd Antarctic drilling season exceeds all expectations


A second season in Antarctica for the Antarctic Geological Drilling (ANDRILL) Program has exceeded all expectations, according to the co-chief scientists of the program’s Southern McMurdo Sound Project.



One week ago (Nov. 21), the drilling team passed the 1,000-meter mark in rock core pulled from beneath the sea floor in McMurdo Sound, and with a remarkable recovery rate of more than 98 percent. The end of drilling is scheduled for this weekend, and only a few tens of meters of core remain to be recovered for an expected final total of more than 1,100 meters (3,600 feet). It’s the second-deepest rock core drilled in Antarctica, surpassed only by the 1,285 meters (more than 4,215 feet) recovered by last year’s ANDRILL effort, the McMurdo Ice Shelf Project.



As the job nears completion for the Southern McMurdo Sound Project drillers, the co-chief scientists, David Harwood of the University of Nebraska-Lincoln and Fabio Florindo of Italy’s National Institute of Geophysics and Volcanology in Rome, said they couldn’t be more pleased with the results. They said the efforts of the program’s nearly 80 scientists, drillers, engineers, technicians, students and educators in Antarctica, with the operations and logistics support provided by Antarctica New Zealand, have given the world’s scientists more than a kilometer of pristine rock core that records the history of climate and glacial fluctuations in Antarctica over the past 20 million years.



“It’s everything we hoped for,” Harwood said. “Combine the drill hole we recovered last year with this one, from a time period right below it, and it’s more than 2 kilometers (1 1/4 miles) of geological history. It’s phenomenal what we’ve recovered. There’s a lot of diversity in the core, indeed more than we can digest right now. It will take some time to fully resolve the paleoenvironmental and dynamic paleoclimate information in the core.”



The goal of this drilling project was sediment core retrieval from the middle Miocene Epoch when, for an extended period, Earth was warmer than today. Florindo and Harwood said they are especially pleased to have recovered such high-quality core from this target period.



“We now have a more complete core record from the middle Miocene and a step into a colder period of time, and that was one of our key targets,” Florindo said. “It will tell an important story when we put together our recovery with the record of last season. This is exciting science and it will echo loudly in the scientific community.”



The middle Miocene has long been held as one of the fundamental time intervals in development of the modern Antarctic ice sheets. It encompassed a change from a warm climate optimum approximately 17 million years ago to the onset of major cooling approximately 14 million years ago, and the formation of a quasi-permanent ice sheet on East Antarctica. Florindo and Harwood said fossils and sediments deposited during this year’s ANDRILL target interval suggest the persistence of warmer-than-present conditions over an extended period of the middle and late Miocene when the western Ross Sea and McMurdo Sound resembled the modern climate conditions of southernmost South America, southwestern New Zealand, and southern Alaska, rather than the cold polar climate of today.


“Until now, most climatic interpretations for this time period have been based on measurements of oxygen isotopes in the deep sea, far from Antarctica,” Harwood said. “The cores we’ve recovered will give us a high-resolution history of paleoclimate change directly from the Antarctic continent.”



The sediment cores reflect deposition close to or beneath grounded glaciers, alternating with fine-grained sediments, which provide clear evidence for ice advance and substantial retreat during main climate transitions, Florindo and Harwood said. They said programs like ANDRILL are extremely important because of the uncertainties about the future behavior of Antarctic ice sheets. This stratigraphic record will be used to determine the behavior of ancient ice sheets, and to better understand the factors driving past ice sheet, ice shelf and sea-ice growth and decay. This new knowledge will enhance our understanding of Antarctica’s potential responses to future global climate changes.



After a seven-week setup period by Antarctica New Zealand during late winter in the Southern Hemisphere, drilling began Oct. 9 and continued until last week, with the drillers recovering 25 to 70 meters of core each day. There was only one major interruption, occurring in early November when sand and water flowed into the drill hole, but Harwood said the drill team “did an awesome job” of fixing the problem.



Following the planned drilling stoppage at the end of last week, scientists lowered a variety of scientific instruments into the deep drill hole over several days to get a better understanding of the physical properties of the geologic layers under pressure and to obtain an acoustic image of the inside of the borehole. Drilling resumed this week and will continue until probably Sunday to recover about 100 meters of additional core.



The first stop for each core section after recovery is the Crary Science and Engineering Center, operated by the U.S. National Science Foundation at McMurdo Station. After preliminary examination by on-ice scientists, the cores are shipped to Florida State University’s Antarctic Marine Geology Research Facility in Tallahassee for storage and long-term study.



ANDRILL is a multinational collaboration of scientists, students and educators from four partner nations (Germany, Italy, New Zealand and the United States) formed to recover stratigraphic records from the Antarctic continental margin. ANDRILL is one of about 220 projects endorsed by the fourth International Polar Year, 2007-2009, one of the largest collaborative science programs ever attempted. Operations and logistics for ANDRILL are managed by Antarctica New Zealand. Scientific research is administered and coordinated through the ANDRILL Science Management Office at the University of Nebraska-Lincoln. For more information, visit http://andrill.org.



Funding support for ANDRILL comes from the U.S. National Science Foundation, the New Zealand Foundation for Research, Science and Technology, the Royal Society of New Zealand Marsden Fund, Antarctica New Zealand, the Italian National Program for Research in Antarctica, the German Science Foundation and the Alfred Wegener Institute for Polar and Marine Research.