The depths of winter: How much snow is in fact on the ground?

Stellar dendrites are tree-like snow crystals that have branches upon branches. – Kenneth Libbrecht, Caltech

Equipped with specialized lasers and GPS technology, scientists are working to address a critical wintertime weather challenge: how to accurately measure the amount of snow on the ground.

Transportation crews, water managers and others who make vital safety decisions need precise measurements of how snow depth varies across wide areas.

But traditional measuring devices such as snow gauges and yardsticks are often inadequate for capturing snow totals that may vary even within a single field or neighborhood.

Now scientists at the National Center for Atmospheric Research (NCAR) in Boulder, Colo., and at other institutions are finding that prototype devices that use light pulses, satellite signals and other technologies offer the potential to almost instantly measure large areas of snow.

In time, such devices might provide a global picture of snow depth.

“We’ve been measuring rain accurately for centuries, but snow is much harder because of the way it’s affected by wind and sun and other factors,” says NCAR researcher Ethan Gutmann.

“It looks like new technology, however, will finally give us the ability to say exactly how much snow is on the ground.”

NCAR is conducting the effort with several collaborating organizations, including the National Oceanic and Atmospheric Administration (NOAA) and the University of Colorado Boulder.

The work is supported by NCAR’s sponsor, the National Science Foundation (NSF).

“Snow represents both a hazard and a water resource in the western states,” says Thomas Torgersen, NSF program director for hydrologic sciences. “Both require detailed assessments of snow amounts and depth. This technology will provide new and important guidance.”

Emergency managers rely on snowfall measurements when mobilizing snow plows or deciding whether to shut down highways and airports during major storms.

They also use snow totals when determining whether a region qualifies for disaster assistance.

In mountainous areas, officials need accurate reports of snowpack depth to assess the threat of avalanches or floods, and to anticipate the amount of water available from spring and summer runoff.

But traditional approaches to measuring snow can greatly underreport or overreport snow totals, especially in severe conditions.

Snow gauges may miss almost a third of the snow in a windy storm, even when they are protected by specialized fencing designed to cut down on the wind’s effects.

Snow probes or yardsticks can reveal snow depth within limited areas. But such tools require numerous in-person measurements at different locations, a method that may not keep up with totals during heavy snowfalls.

Weather experts also sometimes monitor the amount of snow that collects on flat, white pieces of wood known as snow boards, but this is a time-intensive approach that requires people to check the boards and clear them off every few hours.

The nation’s two largest volunteer efforts–the National Weather Service’s Cooperative Observer Program, and the Community Collaborative Rain, Hail, and Snow Network (CoCoRaHS)–each involve thousands of participants nationwide using snow boards, but their reports are usually filed just once a day.

More recently, ultrasonic devices have been deployed in some of the world’s most wintry regions.

Much like radar, these devices measure the length of time needed for a pulse of ultrasonic energy to bounce off the surface of the snow and return to the transmitter.

However, the signal may be affected by shifting atmospheric conditions, including temperature, humidity and winds.
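
In rough terms, such a sensor times a pulse's round trip and converts it to a distance using the speed of sound, which itself varies with air temperature. Below is a minimal sketch of that calculation with made-up numbers; it is not the processing of any particular gauge, and the simple temperature formula ignores the humidity and wind effects just mentioned.

```python
import math

def ultrasonic_snow_depth(round_trip_s, sensor_height_m, air_temp_c):
    """Estimate snow depth from an ultrasonic sensor's pulse round-trip time.

    The speed of sound depends strongly on air temperature, which is one reason
    these sensors need correction (humidity and wind add further error).
    """
    # Approximate speed of sound in dry air (m/s) from air temperature.
    c = 331.3 * math.sqrt(1.0 + air_temp_c / 273.15)
    # One-way distance from the sensor down to the snow (or bare ground) surface.
    distance_to_surface = c * round_trip_s / 2.0
    # Snow depth is the part of the mounting height the echo no longer has to travel.
    return sensor_height_m - distance_to_surface

# Example: sensor mounted 3 m above bare ground; echo returns after ~15.9 ms at -5 C.
print(round(ultrasonic_snow_depth(0.0159, 3.0, -5.0), 2), "m of snow")  # ~0.39 m
```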

The specialized laser instruments under development at NCAR can correct for such problems.

Once set up at a location, they can automatically measure snow depth across large areas. Unlike ultrasonic instruments, lasers rely on light pulses that are not affected by atmospheric conditions.

New tests by Gutmann indicate that a laser instrument installed high above treeline in the Rocky Mountains west of Boulder can measure 10 feet or more of snow to within half an inch or better.

In a little more than an hour, the instrument measures snow at more than 1,000 points across an area almost the size of a football field to produce a three-dimensional image of the snowpack and its variations in depth.
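
As an illustration of how such a scan could be reduced to a depth map (an assumption about the workflow, not NCAR's actual processing chain), the laser returns from the snow surface can be binned onto a grid and differenced against a snow-free reference scan of the same ground; the grid size and point counts below are made up.

```python
import numpy as np

def grid_snow_depth(points_xyz, bare_ground_z, cell_m=1.0, extent_m=100.0):
    """Average laser returns (x, y, z) onto a grid and difference against bare ground.

    points_xyz    : (N, 3) array of snow-surface returns in local coordinates (m)
    bare_ground_z : 2-D array of snow-free surface elevations on the same grid (m)
    """
    n = int(extent_m / cell_m)
    z_sum = np.zeros((n, n))
    hits = np.zeros((n, n))

    # Accumulate every return into the grid cell it falls in.
    ix = np.clip((points_xyz[:, 0] / cell_m).astype(int), 0, n - 1)
    iy = np.clip((points_xyz[:, 1] / cell_m).astype(int), 0, n - 1)
    np.add.at(z_sum, (iy, ix), points_xyz[:, 2])
    np.add.at(hits, (iy, ix), 1)

    snow_surface = np.where(hits > 0, z_sum / np.maximum(hits, 1), np.nan)
    return snow_surface - bare_ground_z   # snow depth per cell; NaN where no returns

# Illustrative use: ~1,000 returns over a 100 m x 100 m area with flat ground at z = 0.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 100, 1000),
                       rng.uniform(0, 100, 1000),
                       rng.uniform(0.5, 3.0, 1000)])   # snow surface 0.5-3 m above ground
print(np.nanmean(grid_snow_depth(pts, np.zeros((100, 100)))))
```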

Gutmann’s next step will be to build and test a laser instrument that can measure snow over several square miles. Tracking such a large area would require a new instrument capable of taking more than 12,000 measurements per second.

“If we’re successful, these types of instruments will reveal a continually updated picture of snow across an entire basin,” he says.

One limitation for the lasers, however, is that light pulses cannot penetrate through objects such as trees and buildings.

This could require development of networks of low-cost laser installations that would each record snow depths within a confined area.

Alternatively, future satellites equipped with such lasers might be capable of mapping the entire world from above.

Gutmann and Kristine Larson, a scientist at the University of Colorado, are also exploring how to use GPS sensors for snowfall measurements.

GPS sensors record satellite signals that reach them directly and signals that bounce off the ground.

When there is snow on the ground, the GPS signal bounces off the snow with a different frequency than when it bounces off bare soil, enabling scientists to determine how high the surface of the snow is above the ground.
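
The technique is generally known as GPS interferometric reflectometry: the direct and reflected signals interfere, so the receiver’s signal-to-noise ratio oscillates as a satellite rises, and the frequency of that oscillation depends on the height of the antenna above the reflecting surface. The sketch below demonstrates that relationship on synthetic data; the plain zero-padded FFT stands in for the more careful spectral analysis used in practice, and all numbers are illustrative.

```python
import numpy as np

L1_WAVELENGTH = 0.1903   # GPS L1 carrier wavelength in metres

def reflector_height(sin_elev, snr_detrended, wavelength=L1_WAVELENGTH):
    """Estimate antenna height above the reflecting (snow or soil) surface.

    Interference between the direct and ground-reflected signals makes detrended
    SNR oscillate versus sin(elevation) with frequency 2*h/wavelength, so the
    dominant frequency gives the height h. Assumes uniform sampling in sin(elevation).
    """
    dx = sin_elev[1] - sin_elev[0]
    n_fft = 8192                                     # zero-pad for finer frequency resolution
    spectrum = np.abs(np.fft.rfft(snr_detrended, n=n_fft))
    freqs = np.fft.rfftfreq(n_fft, d=dx)             # cycles per unit sin(elevation)
    f_peak = freqs[np.argmax(spectrum[1:]) + 1]      # skip the zero-frequency bin
    return f_peak * wavelength / 2.0

# Synthetic example: antenna 1.5 m above the snow surface, satellite rising 5-25 degrees.
h_true = 1.5
sin_e = np.linspace(np.sin(np.radians(5)), np.sin(np.radians(25)), 400)
snr = np.cos(4 * np.pi * h_true * sin_e / L1_WAVELENGTH)
print(round(reflector_height(sin_e, snr), 2), "m")   # ~1.5; snow depth = height over bare soil minus this
```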

Such units could be a cost-effective way of measuring snow totals; meteorologists could tap into the existing global network of ground-based GPS receivers.

However, researchers are seeking to fully understand how the density of the snow and the roughness of its surface alter GPS signals.

“Our hope is to develop a set of high-tech tools that will enable officials to continually monitor snow depth, even during an intense storm,” Larson says.

“While we still have our work cut out for us, the technology is very promising.”

Fukushima at increased earthquake risk

This is a map of Japan’s islands indicating the area of study (black box). The purple star marks the epicentre of the March 11 earthquake and the red star the Iwaki epicentre. Fukushima Daiichi is highlighted by a red square. Black triangles indicate active volcanoes. Numbers on the side of the image represent latitude and longitude. – Ping Tong, Dapeng Zhao and Dinghui Yang

Seismic risk at the Fukushima nuclear plant increased after the magnitude 9 earthquake that hit Japan last March, scientists report. The new study, which uses data from over 6,000 earthquakes, shows the 11 March tremor caused a seismic fault close to the nuclear plant to reactivate. The results are now published in Solid Earth, an open-access journal of the European Geosciences Union (EGU).

The research suggests authorities should strengthen the security of the Fukushima Daiichi nuclear power plant to withstand large earthquakes that are likely to directly disturb the region. The power plant witnessed one of the worst nuclear disasters in history after it was damaged by the 11 March 2011 magnitude 9 earthquake and tsunami. But this tremor occurred about 160 km from the site, and a much closer one could occur in the future at Fukushima.

“There are a few active faults in the nuclear power plant area, and our results show the existence of similar structural anomalies under both the Iwaki and the Fukushima Daiichi areas. Given that a large earthquake occurred in Iwaki not long ago, we think it is possible for a similarly strong earthquake to happen in Fukushima,” says team-leader Dapeng Zhao, geophysics professor at Japan’s Tohoku University.

The 11 April 2011 magnitude 7 Iwaki earthquake was the strongest of the 11 March quake’s aftershocks to have an inland epicentre. It occurred 60 km southwest of the Fukushima nuclear power plant, or 200 km from the 11 March epicentre.

The research now published in EGU’s Solid Earth shows that the Iwaki earthquake was triggered by fluids moving upwards from the subducting Pacific plate to the crust. The Pacific plate is moving beneath northeast Japan, which increases the temperature and pressure of the minerals in it. This leads to the removal of water from minerals, generating fluids that are less dense than the surrounding rock. These fluids move up to the upper crust and may alter seismic faults.

“Ascending fluids can reduce the friction of part of an active fault and so trigger it to cause a large earthquake. This, together with the stress variations caused by the 11 March event, is what set off the Iwaki tremor,” says Ping Tong, lead author of the paper.

The number of earthquakes in Iwaki increased greatly after the March earthquake. The movements in the Earth’s crust induced by the event caused variations in the seismic pressure or stress of nearby faults. Around Iwaki, Japan’s seismic network recorded over 24,000 tremors from 11 March 2011 to 27 October 2011, up from under 1,300 detected quakes in the nine years before, the scientists report.

The more than 6,000 earthquakes selected for the study were recorded by 132 seismographic stations in Japan from June 2002 to October 2011. The researchers analysed these data to take pictures of the Earth’s interior, using a technique called seismic tomography.

“The method is a powerful tool to map out structural anomalies, such as ascending fluids, in the Earth’s crust and upper mantle using seismic waves. It can be compared to a CT or CAT scan, which relies on X-rays to detect tumours or fractures inside the human body,” explains Zhao.
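
As a toy illustration of the principle (a simplification, not the authors’ actual method), travel-time tomography can be posed as a linear inverse problem: divide the region into cells, write each ray’s travel time as path length times “slowness” (inverse wave speed) summed over the cells it crosses, and invert for the cells where waves run anomalously slowly, as they do through fluid-rich rock. The grid, ray geometry and numbers below are purely illustrative.

```python
import numpy as np

# Toy travel-time tomography on a 4x4 grid of 10-km cells.
# Each ray's travel time is t_i = sum_j L_ij * s_j (path length in a cell times its slowness).
n, cell_km = 4, 10.0
background = 1.0 / 6.0                        # slowness of a 6 km/s background
true_slowness = np.full((n, n), background)
true_slowness[2, 1] = 1.0 / 5.0               # a slow (e.g. fluid-rich) anomaly

# Straight rays along every row, every column, and the two main diagonals.
rays = []
for i in range(n):
    row = np.zeros((n, n)); row[i, :] = cell_km; rays.append(row)
    col = np.zeros((n, n)); col[:, i] = cell_km; rays.append(col)
rays.append(np.eye(n) * cell_km * np.sqrt(2))             # 45-degree path through each cell
rays.append(np.fliplr(np.eye(n)) * cell_km * np.sqrt(2))

G = np.array([r.ravel() for r in rays])       # path-length matrix (rays x cells)
t_obs = G @ true_slowness.ravel()             # "observed" travel times
dt = t_obs - G @ np.full(n * n, background)   # delays relative to the background model

# Damped least squares for the slowness perturbation (the system is underdetermined).
dm = np.linalg.solve(G.T @ G + 1e-3 * np.eye(n * n), G.T @ dt)
velocity = 1.0 / (background + dm.reshape(n, n))
print(np.round(velocity, 2))                  # the slow cell shows up, smeared along crossing rays
```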

While the scientists cannot predict when an earthquake will strike near Fukushima Daiichi, they state that the ascending fluids observed in the area indicate such an event is likely to occur in the near future. They warn that more attention should be paid to the site’s ability to withstand strong earthquakes in order to reduce the risk of another nuclear disaster.

The scientists also note that the results may be useful for reviewing seismic safety in other nuclear facilities in Japan, such as nearby Fukushima Daini, Onagawa to the north of Fukushima, and Tōkai to the south.

3-D laser map shows earthquake zone before and after

This image of post-earthquake topography shows Mexico’s Pescadores Fault cutting along a ridge. – Michael Oskin et al., www.keckcaves.org

Geologists have a new tool to study how earthquakes change the landscape–down to a few inches. It’s giving scientists insights into how earthquake faults behave.

In this week’s issue of the journal Science, a team of scientists from the United States, Mexico and China reports the most comprehensive before-and-after picture yet of an earthquake zone, using data from the magnitude 7.2 event that struck near Mexicali, Mexico, in April 2010.

“We can learn so much about how earthquakes work by studying fresh fault ruptures,” said Michael Oskin, a geologist at the University of California, Davis, and lead author of the paper.

The team, working with the National Center for Airborne Laser Mapping (NCALM), flew over the area with LiDAR (light detection and ranging), which bounces a stream of laser pulses off the ground.

New airborne LiDAR equipment can measure surface features to within a few inches. The researchers were able to make a detailed scan over about 140 square miles in less than three days, Oskin said.

Oskin said that they knew the area had been mapped with LiDAR in 2006 by the Mexican government.

When the earthquake occurred, Oskin and Ramon Arrowsmith at Arizona State University received rapid-response funding from the National Science Foundation (NSF) to carry out an immediate aerial survey to compare the results.

Paper co-authors John Fletcher and Orlando Teran from Mexico’s Ensenada Center for Scientific Research and Higher Education (CICESE) carried out a traditional ground survey of the fault rupture, which helped guide planning of the aerial LiDAR survey and interpretation of the results.

“This study is an excellent demonstration of an emerging tool for Earth science,” said Greg Anderson, NSF program director for EarthScope, which funded the research.

EarthScope scientists conduct research using data from instruments that measure motions of the Earth’s surface, record seismic waves, and recover rock samples from the depths at which earthquakes originate.

“LiDAR-based models of fault rupture and off-fault deformation from large earthquakes can provide new insights into fault behavior,” said Anderson, “with implications for estimating seismic hazards.”

From the ground, features like the five-foot escarpment created when part of a hillside abruptly moved up and sideways are readily visible.

But the LiDAR survey further reveals warping of the ground surface adjacent to faults that previously could not easily be detected, Oskin said.

For example, it reveals folding above the Indiviso Fault running beneath agricultural fields in the floodplain of the Colorado River.

“That would be very hard to see in the field,” Oskin said.

The survey also revealed deformation around a system of small faults that caused the earthquake and allowed measurements that provide clues to understanding how multi-fault earthquakes occur.

Team members used the “virtual reality” facility at UC Davis’s W. M. Keck Center for Active Visualization in Earth Sciences to handle and view the data from the survey.

By comparing pre- and post-earthquake surveys, they could see exactly where the ground moved and by how much.
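
Conceptually, once both surveys are gridded onto the same digital elevation model, the comparison reduces to a per-cell subtraction of pre-earthquake from post-earthquake elevations, with data gaps masked out. The sketch below shows that step on synthetic grids; real processing must also co-register the surveys and remove vegetation returns, which this ignores.

```python
import numpy as np

def vertical_displacement(pre_dem, post_dem):
    """Vertical ground movement between two co-registered LiDAR elevation grids.

    pre_dem, post_dem : 2-D arrays of ground elevation (m) on the same grid.
    Positive values mean the surface moved up between surveys; negative, down.
    """
    diff = post_dem - pre_dem
    # Mask cells missing from either survey (water, dense vegetation, swath edges).
    diff[np.isnan(pre_dem) | np.isnan(post_dem)] = np.nan
    return diff

# Illustrative example: a block east of a north-south fault uplifted by ~0.5 m.
y, x = np.mgrid[0:200, 0:200]
pre = 0.01 * x                                  # gently sloping pre-earthquake surface
post = pre + np.where(x > 100, 0.5, 0.0)        # co-seismic uplift east of the fault
dz = vertical_displacement(pre, post)
print(dz[100, 50], dz[100, 150])                # 0.0 west of the fault, 0.5 m east of it
```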

The 2010 Mexicali earthquake did not occur on a major fault, like the San Andreas, but ran through a series of smaller faults in the Earth’s crust.

These minor faults are common around major faults but are “underappreciated,” Oskin said. “This sort of earthquake happens out of the blue.”

The new LiDAR survey shows that seven of these small faults came together to cause a major earthquake, Oskin said.

Ken Hudnut, a geophysicist with the U.S. Geological Survey and co-author of the paper, made the first use of airborne LiDAR about ten years ago to document surface faulting from the Hector Mine earthquake.

But “pre-earthquake” data were lacking. Since then, NCALM has carried out LiDAR scans of the San Andreas system and other active faults in the Western United States (a component of EarthScope), thereby setting a trap for future earthquakes.

“In this case, fortunately, our CICESE colleagues had set such a trap and this earthquake fell right into it and became the first ever to be imaged by ‘before’ and ‘after’ LiDAR,” said Hudnut.

The post-event dataset collected by the team is publicly available on the Open Topography website.

Study shows global glaciers, ice caps, shedding billions of tons of mass annually

A new CU-Boulder study using the NASA/Germany GRACE satellite shows Earth is losing roughly 150 billion tons of ice annually. – NASA

Earth’s glaciers and ice caps outside of the regions of Greenland and Antarctica are shedding roughly 150 billion tons of ice annually, according to a new study led by the University of Colorado Boulder.

The research effort is the first comprehensive satellite study of the contribution of the world’s melting glaciers and ice caps to global sea level rise and indicates they are adding roughly 0.4 millimeters annually, said CU-Boulder physics Professor John Wahr, who helped lead the study. The measurements are important because the melting of the world’s glaciers and ice caps, along with the Greenland and Antarctic ice sheets, poses the greatest threat of future sea level rise, Wahr said.

The researchers used satellite measurements taken with the Gravity Recovery and Climate Experiment, or GRACE, a joint effort of NASA and Germany, to calculate that the world’s glaciers and ice caps had lost about 148 billion tons, or about 39 cubic miles of ice annually from 2003 to 2010. The total does not count the mass from individual glaciers and ice caps on the fringes of the Greenland and Antarctic ice sheets — roughly an additional 80 billion tons.
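
That mass-to-volume conversion can be checked with quick back-of-the-envelope arithmetic, assuming a typical glacier-ice density of about 917 kilograms per cubic metre:

```python
# Back-of-the-envelope check: ~148 billion tons of ice a year is ~39 cubic miles.
GT_TO_KG = 1e12              # one billion metric tons (a gigaton) in kilograms
ICE_DENSITY = 917.0          # kg per cubic metre, typical for glacier ice
KM3_PER_CUBIC_MILE = 4.168

volume_km3 = 148.0 * GT_TO_KG / ICE_DENSITY / 1e9      # cubic metres -> cubic kilometres
print(round(volume_km3), "km^3 per year,",
      round(volume_km3 / KM3_PER_CUBIC_MILE), "cubic miles per year")   # ~161 km^3, ~39 mi^3
```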

“This is the first time anyone has looked at all of the mass loss from all of Earth’s glaciers and ice caps with GRACE,” said Wahr. “The Earth is losing an incredible amount of ice to the oceans annually, and these new results will help us answer important questions in terms of both sea rise and how the planet’s cold regions are responding to global change.”

A paper on the subject is being published in the Feb. 9 online edition of the journal Nature. The first author, Thomas Jacob, did his research at CU-Boulder and is now at the Bureau de Recherches Géologiques et Minières, in Orléans, France. Other paper co-authors include Professor Tad Pfeffer of CU-Boulder’s Institute of Arctic and Alpine Research and Sean Swenson, a former CU-Boulder physics doctoral student who is now a researcher at the National Center for Atmospheric Research in Boulder.

“The strength of GRACE is that it sees everything in the system,” said Wahr. “Even though we don’t have the resolution to look at individual glaciers, GRACE has proven to be an exceptional tool.” Traditional estimates of Earth’s ice caps and glaciers have been made using ground-based measurements from relatively few glaciers to infer what all of the unmonitored glaciers around the world were doing, he said. Only a few hundred of the roughly 200,000 glaciers worldwide have been monitored for a decade or more.

Launched in 2002, two GRACE satellites whip around Earth in tandem 16 times a day at an altitude of about 300 miles, sensing subtle variations in Earth’s mass and gravitational pull. Separated by roughly 135 miles, the satellites measure changes in Earth’s gravity field caused by regional changes in the planet’s mass, including ice sheets, oceans and water stored in the soil and in underground aquifers.

A positive change in gravity during a satellite approach over Greenland, for example, tugs the lead GRACE satellite away from the trailing satellite, speeding it up and increasing the distance between the two. As the satellites straddle Greenland, the front satellite slows down and the trailing satellite speeds up. A sensitive ranging system allows researchers to measure changes in the distance between the two satellites to within about 1 micron — about 1/100 the width of a human hair — and to calculate ice and water amounts from particular regions of interest around the globe using their gravity fields.

For the global glaciers and ice cap measurements, the study authors created separate “mascons,” large, ice-covered regions of Earth of various ovate-type shapes. Jacob and Wahr blanketed 20 regions of Earth with 175 mascons and calculated the estimated mass balance for each mascon.

The CU-led team also used GRACE data to calculate that the ice loss from both Greenland and Antarctica, including their peripheral ice caps and glaciers, was roughly 385 billion tons of ice annually. The total mass ice loss from Greenland, Antarctica and all Earth’s glaciers and ice caps from 2003 to 2010 was about 1,000 cubic miles, about eight times the water volume of Lake Erie, said Wahr.

“The total amount of ice lost to Earth’s oceans from 2003 to 2010 would cover the entire United States in about 1 and one-half feet of water,” said Wahr, also a fellow at the CU-headquartered Cooperative Institute for Research in Environmental Sciences.

The vast majority of climate scientists agree that human activities like pumping huge amounts of greenhouse gases into the atmosphere are warming the planet, an effect that is most pronounced in the polar regions.

One unexpected study result from GRACE was that the estimated ice loss from high Asia mountains — including ranges like the Himalaya, the Pamir and the Tien Shan — was only about 4 billion tons of ice annually. Some previous ground-based estimates of ice loss in the high Asia mountains have ranged up to 50 billion tons annually, Wahr said.

“The GRACE results in this region really were a surprise,” said Wahr. “One possible explanation is that previous estimates were based on measurements taken primarily from some of the lower, more accessible glaciers in Asia and were extrapolated to infer the behavior of higher glaciers. But unlike the lower glaciers, many of the high glaciers would still be too cold to lose mass even in the presence of atmospheric warming.”

“What is still not clear is how these rates of melt may increase and how rapidly glaciers may shrink in the coming decades,” said Pfeffer, also a professor in CU-Boulder’s civil, environmental and architectural engineering department. “That makes it hard to project into the future.”

According to the GRACE data, total sea level rise from all land-based ice on Earth, including Greenland and Antarctica, was roughly 1.5 millimeters per year, or about 12 millimeters (roughly one-half inch) from 2003 to 2010, said Wahr. The sea rise amount does not include the expansion of water due to warming, which is the second key sea-rise component and is roughly equal to melt totals, he said.

“One big question is how sea level rise is going to change in this century,” said Pfeffer. “If we could understand the physics more completely and perfect numerical models to simulate all of the processes controlling sea level — especially glacier and ice sheet changes — we would have a much better means to make predictions. But we are not quite there yet.”

3-D laser map shows earthquake before and after

This is a visualization of LiDAR data from the April, 2010 earthquake near Mexicali. Blue shows where ground surface moved down, red shows upward movement compared to the previous survey. – Michael Oskin, UC Davis

Geologists have a new tool to study how earthquakes change the landscape down to a few inches, and it’s giving them insight into how earthquake faults behave. In the Feb. 10 issue of the journal Science, a team of scientists from the U.S., Mexico and China reports the most comprehensive before-and-after picture yet of an earthquake zone, using data from the magnitude 7.2 event that struck near Mexicali, northern Mexico, in April 2010.

“We can learn so much about how earthquakes work by studying fresh fault ruptures,” said Michael Oskin, geology professor at the University of California, Davis and lead author on the paper.

The team, working with the National Center for Airborne Laser Mapping (NCALM), flew over the area with LiDAR (light detection and ranging), which bounces a stream of laser pulses off the ground. New airborne LiDAR equipment can measure surface features to within a few inches. The researchers were able to make a detailed scan over about 140 square miles in less than three days, Oskin said.

Oskin said that they knew the area had been mapped with LiDAR in 2006 by the Mexican government. When the earthquake occurred, Oskin and Ramon Arrowsmith at Arizona State University applied for and got funding from the National Science Foundation to carry out an immediate aerial survey to compare the results.

Coauthors John Fletcher and graduate student Orlando Teran from the Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE) carried out a traditional ground survey of the fault rupture, which helped guide planning of the aerial LiDAR survey and the interpretation of the results.

From the ground, features like the five-foot escarpment created when part of a hillside abruptly moved up and sideways are readily visible. But the LiDAR survey further reveals warping of the ground surface adjacent to faults that previously could not easily be detected, Oskin said. For example, it revealed the folding above the Indiviso fault running beneath agricultural fields in the floodplain of the Colorado River.

“This would be very hard to see in the field,” Oskin said.

Team members used the “virtual reality” facility at UC Davis’s W. M. Keck Center for Active Visualization in Earth Sciences to handle and view the data from the survey. By comparing pre- and post-earthquake surveys, they could see exactly where the ground moved and by how much.

The survey revealed deformation around the system of small faults that caused the earthquake, and allowed measurements that provide clues to understanding how these multi-fault earthquakes occur.

The 2010 Mexicali earthquake did not occur on a major fault, like the San Andreas, but ran through a series of smaller faults in the Earth’s crust. These minor faults are common around major faults but are “underappreciated,” Oskin said.

“This sort of earthquake happens out of the blue,” he said.

The new LiDAR survey shows how seven of these small faults came together to cause a major earthquake, Oskin said.

Ken Hudnut, a geophysicist with the U.S. Geological Survey and coauthor on the paper, made the first use of airborne LiDAR about 10 years ago to document surface faulting from the Hector Mine earthquake. But “pre-earthquake” data were lacking. Since then, NCALM has carried out LiDAR scans of the San Andreas system (the “B4 Project”) and other active faults in the western U.S. (a component of the EarthScope Project), thereby setting a trap for future earthquakes, he said.

“In this case, fortunately, our CICESE colleagues had set such a trap, and this earthquake fell right into it and became the first ever to be imaged by ‘before’ and ‘after’ LiDAR. It is a thrill for me to be on the team that reached this important milestone,” Hudnut said.

Unearthing Antarctica’s mysterious mountains

Buried more than a kilometer beneath the East Antarctic Ice Sheet, the Gamburtsev Subglacial Mountains have proven to be a geological puzzle for more than five decades. How did these mountains form? When did they form? And what makes this ancient mountain range one of the least-understood tectonic features on Earth?

The Gamburtsevs lie under the highest point in Antarctica: the 4,000-meter-high Dome Argus Plateau. The mountain range, in the middle of an ancient continental craton, has a thick crustal root and high topography. Locked under the ice, frozen in time, what secrets could the Gamburtsevs reveal about the evolution of our planet? Look below the ice and read the rest of the story, available online at http://www.earthmagazine.org/article/unearthing-antarcticas-mysterious-mountains.

Read this story and more in the February issue of EARTH Magazine, available online now at http://www.earthmagazine.org/. Learn how minute particles in our atmosphere affect clouds and rainfall; unlock the mystery to the moon’s magnetism; and read about boron, EARTH’s mineral resource of the month.

Researchers uncover a mechanism to explain dune field patterns

In a study of the harsh but beautiful White Sands National Monument in New Mexico, University of Pennsylvania researchers have uncovered a unifying mechanism to explain dune patterns. The new work represents a contribution to basic science, but the findings may also hold implications for identifying when dune landscapes like those in Nebraska’s Sand Hills may reach a “tipping point” under climate change, going from valuable grazing land to barren desert.

The study was conducted by Douglas Jerolmack, an assistant professor in the Department of Earth and Environmental Science; postdoctoral researcher Federico Falcini; graduate students Raleigh Martin, Colin Phillips and Meredith Reitz; and undergraduate researcher Claire Masteller. The Penn researchers also collaborated with Ryan Ewing of the University of Alabama and Ilya Buynevich of Temple University.

Their paper was published in Nature Geoscience.

Much of the study’s data was collected during field trips taken by students in an undergraduate and graduate course Jerolmack teaches at Penn, Geology 305: Earth Systems Processes. Each year, the class has traveled to White Sands to do fieldwork during spring break.

“It’s a magnificent place to go, and one of the reasons I take my students there is really because it’s so visually striking and compelling,” Jerolmack said. “I want it to be memorable for them.”

White Sands National Monument, located near Alamogordo in south-central New Mexico, is an enclosed basin that housed an ancient lake during the last ice age. Unlike most dune fields, which are composed of quartz sand, it’s the world’s largest dune field made of gypsum. Its blindingly white dunes cover 275 square miles.

The dune field’s groundwater table is located just a meter below the surface.

“So it means you’re in a very hot arid place, but when you walk around you feel moisture on your feet,” Jerolmack said.

The moisture creates a somewhat “sticky” surface, he added, “so, if the sand blows off a dune and lands, it sticks to the surface and can get deposited and left behind.”

White Sands has long been the site of geologic inquiry. Scientists have put forward theories to explain individual elements of the dunes, including their shape, their movements over time and the presence or absence of plants. The novelty of this study lies in showing how all of these problems are a consequence of the interaction of wind with the dunes.

While the majority of Jerolmack’s work examines how water moves sediment, wind becomes the dominant shaping force in deserts.

The researchers began by analyzing high-resolution elevation maps, measured each year for five years using aerial laser scans of the dune field surface. These data showed that dunes migrated fastest at the upwind (western) edge of the dune field, where the field transitioned into a flat and barren plain. Moving along the prevailing wind direction (northeast) into the dune field, the speed of the moving dunes consistently slowed down. The researchers reasoned that the friction resulting from the dunes was likely causing the wind to slow down over the dune field. They employed a simple theory to provide quantitative confirmation of this idea, demonstrating that aerodynamics was the cause of the dune migration pattern.
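
One simple way to extract such migration rates (an illustrative stand-in, not necessarily the team’s method) is to cross-correlate each year’s elevation profile along the wind direction with the previous year’s and take the offset that lines them up best; the transect and dune shape below are synthetic.

```python
import numpy as np

def migration_distance(profile_t0, profile_t1, spacing_m=1.0, max_shift_m=30.0):
    """Estimate how far a dune profile moved downwind between two surveys.

    profile_t0, profile_t1 : elevation along the same downwind transect (m).
    Returns the downwind shift (m) that best aligns the later profile with the earlier one.
    """
    max_lag = int(max_shift_m / spacing_m)
    lags = range(0, max_lag + 1)
    # For each trial shift, compare the earlier profile with the later profile moved back by that amount.
    scores = [np.corrcoef(profile_t0[:-lag or None], profile_t1[lag:])[0, 1] for lag in lags]
    return lags[int(np.argmax(scores))] * spacing_m

# Synthetic example: a 5-m-high dune on a 500-m transect that migrated 12 m in a year.
x = np.arange(0, 500, 1.0)

def dune(center_m):
    return 5.0 * np.exp(-((x - center_m) / 30.0) ** 2)

print(migration_distance(dune(200.0), dune(212.0)), "m per year")   # -> 12.0
```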

Small specks in the high-resolution images, which indicate where plants grow, also showed that the wind and dune migration activity appeared to impact vegetative growth. “There is a rapid transition from bare dunes to dunes that are almost entirely covered with vegetation,” said Jerolmack. “We recognized that this transition occurs because the dunes are slowing down, and slowing down, and slowing down; eventually the dunes are moving so slowly that plants can grow on them.”

According to the researchers’ observations, dunes that are hit with stronger winds have fewer plants, as the plants cannot grow roots quickly enough to keep up with the shifting sands. By contrast, the dunes that experience the slower-moving winds are stable enough to support plants.

The plants then exert their own influence on dune shapes, as their root systems help stabilize the sand in which they grow. Because plants generally take hold first on a dune’s ‘horns’ – the narrow slopes of boomerang-shaped dunes – before reaching the center, the researchers observed that dunes with plant-stabilized horns inverted as the wind blew the center inside out.

Where plants grew, the underlying groundwater was fresher and farther below the surface than areas bare of plants. The Penn researchers demonstrated that plants impacted the groundwater, rather than the other way around. By taking up water, the plants draw the groundwater table down. This also lowers the evaporation at the groundwater table, leaving the groundwater less salty than in unvegetated areas with high evaporation rates.

“What makes this so interesting is that, by understanding the changes in the wind pattern over the dunes, we can also understand the migration of the dunes, the plant and groundwater dynamics and even the long term deposition rate within the dune field,” Jerolmack said. “This helps us to understand very well what’s going on at White Sands, but these are all fundamental mechanisms that we think can apply in many other places.”

North-central Nebraska’s Sand Hills, located on a grass-stabilized dune field, is one example where this mechanism may apply. Under some climate change predictions, rainfall could decline in the upper Midwest. Even a small reduction in rainwater could mean that the grasses that stabilize the Sand Hills’ dunes would no longer be able to survive. The dunes would then go back to being a barren migrating dune field, no longer serving the half-a-million cattle that now graze there.

“It happened during the Dust Bowl and it could happen again,” Jerolmack said.

Google Earth ocean terrain receives major update

Internet information giant Google updated ocean data in its Google Earth application this week, reflecting new bathymetry data assembled by Scripps Institution of Oceanography, UC San Diego, NOAA researchers and many other ocean mapping groups from around the world.

The newest version of Google Earth includes more accurate imagery in several key areas of ocean using data collected by research cruises over the past three years.

“The original version of Google Ocean was a newly developed prototype map that had high resolution but also contained thousands of blunders related to the original archived ship data,” said David Sandwell, a Scripps geophysicist. “UCSD undergraduate students spent the past three years identifying and correcting the blunders as well as adding all the multibeam echosounder data archived at the National Geophysical Data Center in Boulder, Colorado.”

“The Google map now matches the map used in the research community, which makes the Google Earth program much more useful as a tool for planning cruises to uncharted areas,” Sandwell added.

For example, the updated, more precise data corrects a grid-like artifact on the seafloor that was misinterpreted in the popular press as evidence of the lost city of Atlantis off the coast of North Africa.

Through several rounds of upgrades, Google Earth now has 15 percent of the seafloor image derived from shipboard soundings at 1-kilometer resolution. Previous versions only derived about 10 percent of their data from ship soundings and the rest from depths predicted by Sandwell and NOAA researcher Walter Smith using satellite gravity measurements. The two developed the prediction technique in 1994. The satellite and sounding data are combined with land topography from the NASA Shuttle Radar Topography Mission (SRTM) to create a global topography and bathymetry grid called SRTM30_PLUS.
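
The merging step can be pictured as a simple rule on a shared grid: where a cell has a shipboard sounding, keep it; elsewhere, fall back on the depth predicted from satellite gravity. The sketch below illustrates that idea with made-up numbers and is not the actual SRTM30_PLUS processing.

```python
import numpy as np

def merge_bathymetry(predicted_depth, sounding_depth):
    """Combine gravity-predicted depths with shipboard soundings on a common grid.

    predicted_depth : 2-D array of depths (m) estimated from satellite gravity
    sounding_depth  : 2-D array of the same shape, NaN where no ship has surveyed
    Cells with real soundings override the prediction; the rest keep the estimate.
    """
    return np.where(np.isnan(sounding_depth), predicted_depth, sounding_depth)

# Illustrative use: a 4x4 patch where only a single survey line has been run.
predicted = np.full((4, 4), -4000.0)                     # smooth gravity-based estimate
soundings = np.full((4, 4), np.nan)
soundings[2, :] = [-3950.0, -3890.0, -4120.0, -4075.0]   # one multibeam track
print(merge_bathymetry(predicted, soundings))
```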

This new version includes all of the multibeam bathymetry data collected by U.S. research vessels over the past three decades including 287 Scripps expeditions from research vessels Washington, Melville and Revelle. UCSD undergraduate student Alexis Shakas processed all the U.S. multibeam data and then worked with Google researchers on the global integration.

The next major upgrade to the grid will occur later this year using a new gravity model having twice the accuracy of previous models. The new gravity information is being collected by a European Space Agency satellite called CryoSat that was launched in February 2010.

Scientists will install first real-time seafloor earthquake observatory at Cascadia Fault

One of the most dangerous faults in North America is the Pacific Northwest’s Cascadia fault – an offshore, subduction zone fault capable of producing a magnitude 9 earthquake that would damage Portland, Tacoma, Seattle, and Victoria, British Columbia, and generate a large tsunami. Yet there are currently no instruments installed offshore, directly above the fault, for measuring the strain that is currently building up along the fault.

But a recent $1 million grant from the W. M. Keck Foundation to scientists at the Woods Hole Oceanographic Institution (WHOI) will change that. An interdisciplinary project led by WHOI geologist Jeff McGuire, an expert in global earthquake seismology and geodesy, and John Collins, director of WHOI’s Ocean Bottom Seismometer Lab, will build and install the first seafloor geodesy observatory above the expected rupture zone of the next great Cascadia earthquake.

“I think all scientists agree there will be another magnitude 9 earthquake off Oregon and Washington,” said McGuire. “What we’re doing is trying to understand what that will look like. That information is critically important for modeling how much the fault will slip – and hence how much the ground will shake – and for predicting the maximum height of the tsunami that could be generated.”

“We are immensely grateful for the support from the Keck Foundation, which has a long track record of supporting important research and technology innovations to understand the Earth and its systems for the benefit of society,” said WHOI President and Director Susan Avery. “The real-time data flowing from the fault on the seafloor will not only advance our understanding of earthquakes but can help city planners and emergency response managers.”

The Cascadia subduction zone is a very long sloping fault that stretches from mid-Vancouver Island to Northern California. It separates the Juan de Fuca and North American plates. For many years, according to conventional wisdom, the Cascadia subduction zone slipped without earthquakes. But in the last 30 years, geologists have uncovered sedimentary records as well as historical records in Japan showing that “indeed, the fault repeatedly had these huge earthquakes with big tsunamis,” said McGuire.

Cascadia’s last big event occurred in 1700 and was likely very similar to the March 2011 Japanese earthquake – a magnitude 9 quake and tsunami that traveled all the way across the Pacific. This similarity is foreboding for earthquake scientists, as a key scientific lesson of the Japanese earthquake has been that the standard datasets collected onshore are completely inadequate for characterizing the upcoming ruptures on an offshore subduction zone thrust fault.

One key limitation in the seismic hazard estimation for subduction zones is the use of geodetic data recorded onshore – primarily GPS data – to determine the extent to which offshore faults are locked and building up strain for the next big earthquake. GPS can detect surface motion to unprecedented precision – a fraction of a millimeter per year – but land-based GPS is too far away from offshore faults to be sensitive enough to that motion.

“So you have to have instruments out there to be really sensitive to it,” said McGuire. “We know the fault is locked around the coast but we don’t know how far offshore it’s locked. So one of our goals is to determine if the fault really is locked all the way to the trench or not. One reason that’s important is for understanding what the next tsunami will be like. The March Japan earthquake had such a big tsunami because most of the fault motion was really shallow and close to the seafloor. So figuring out exactly where the locking starts at the shallow end of the fault is one of our primary goals.”

To do this, McGuire and Collins will install tiltmeters at a location approximately 4 kilometers above the Cascadia subduction zone thrust interface. “Tiltmeters are standard instruments on land – most volcano observatories have them,” said McGuire. “These instruments are very, very sensitive to tiny little deformations that occur in the rock,” adds Collins. “The movements can be subtle. They can be slow. Something a seismometer is not sensitive to.”

The tiltmeters will be located within a 300-meter-deep borehole, a study site established by the Integrated Ocean Drilling Program, and will take advantage of an existing seafloor cable infrastructure – NEPTUNE Canada – enabling immediate access to the data collected by the instrument. The instrument array should be installed and returning data by summer 2013.

If such a data stream had been available in real time from the Japanese subduction zone in the days preceding the March 11 quake, the scientific community might have known that the potential for a large earthquake was very high because the fault was already slipping slowly.

McGuire says the co-location of the instruments in the IODP borehole, where scientists study the fluid pressure in the Earth, will also enable the scientists to collaborate across disciplines in new ways.

“Part of the reason we’re installing a tiltmeter in a borehole is because of interesting signals collected in boreholes in the past,” he said – signals that provide clues to a better understanding of earthquakes. “It all feeds back into understanding the fault system – how the stress changes over time in the fault system.”

Sediments from the Enol lake reveal more than 13,500 years of environmental history

This is a research campaign in the Enol lake. – Ana Moreno et al./IPE(CSIC)

A team of Spanish researchers has used geological samples extracted from the Enol lake in Asturias to show that the Holocene, the period that started 11,600 years ago, did not have a climate as stable as was believed.

The Holocene period, which includes the last 11,600 years of our history, has always been described as a stable period in terms of climatic conditions, especially when compared to the abrupt changes that occurred in the last ice age, which ended around 10,000 years ago, giving way to the Holocene.

A study carried out by researchers from the Pyrenean Institute of Ecology (IPE) at the Spanish National Research Council (CSIC), in collaboration with other scientists from the universities of Zaragoza, La Coruña, Valencia and Cádiz, and published in the Journal of Paleolimnology, has found climatic differences within the supposedly “stable” last 13,500 years.

The study specifically focused on the Enol lake (Asturias), where various sediment samples were extracted from the bottom. These samples provide data about the regional humidity and temperature changes in the area over more than 135 centuries.

The project, together with a previous study that details the last ice age and another, more recent one that examines the last few centuries, represents “the first time glacial evolution and climate change have been registered over the last 40,000 years in the Picos de Europa National Park,” says Ana Moreno, a researcher at the IPE-CSIC and lead author of the study.

The Enol lake was formed 40,000 years ago following the retreat of a glacier, which dug a trough that allowed sediments and water to accumulate. By 18,000 years ago it was already a lake, and the organic sediments now being studied were starting to be deposited.

From the lake sediments, the researchers were able to analyse physical properties and the amounts of organic carbon, carbonate and other elements, as well as biological indicators such as diatom and ostracod fossils.

Vegetation cover evolution


Furthermore, the detailed study of pollen accumulated in this material allowed the researchers to reconstruct variations in vegetation cover, crucial information in the context of climate change and human impact.

The researchers recognised at least four different stages in the record: the first, cold and dry, lasted from 13,500 to 11,600 years ago (cal years BP) and included a brief return to the icy conditions known as the Younger Dryas. This was followed by a period of higher temperature and humidity, between 11,600 and 8,700 years ago, which coincided with the beginning of the Holocene.

The third period had a drier climate, between 8,700 and 4,650 years ago, followed by a return to a more humid climate from then up to 2,200 years ago. The study also highlights the changes in this latest period caused by human activity, specifically pasture and deforestation.

The study’s conclusions therefore report significant environmental changes throughout the last 13,500 years. They also show how, at the beginning of the Holocene, the area’s vegetation cover, which until then had consisted of Pinus (pine), Betula (birch) and Quercus (oak), became a forest of mainly Quercus.

The researchers also highlight an increase in precipitation lasting nearly twelve centuries (between 9,750 and 8,600 years ago), which led to an increase in Corylus, or hazel. Although the record of these geological traces from the Enol lake only extends to 2,200 years ago, studying the pollen makes it possible to determine the environmental impact of the region’s inhabitants at that time.

Former use of mountain pastures

Moreno said: “The use of mountain pastures is possibly the oldest documented human activity in the area. As we have seen, the long record from Lago Enol shows that an opening-up of the landscape began 4,650 years ago, and became most pronounced from 2,700 years ago.”

The results also show that, from 4,650 years ago, humans contributed to a greater presence of herbaceous species (such as Plantago and Rumex acetosella) and to a decrease in the area’s woodlands.

Those in charge of the study claim that the hydrological and landscape stages recorded in the Enol lake sediments capture the biggest climate changes registered during the Holocene in southern Europe. This is what the Cantabrian Mountains looked like 2,200 years ago, a date that coincides with the Roman occupation and the start of the Second Punic War against the Carthaginians led by Hannibal.

In a more recent study, these researchers found from the pollen record that human activity has caused many alterations to the landscape over the last 200 years. For example, they detected a change in the abundance of coprophilous fungi (which feed on the faeces of the livestock that graze there) throughout the twentieth century.

According to the researchers, this is because “the indigenous bovine livestock were replaced by Alpine Brown cattle and, before that, by high-milk-yielding Friesians. In this way, farming changed from an extensive type on the mountain, with the indigenous cattle, to an intensive one, with stables at the bottom of the valley. Another change that the pollen shows is the introduction of eucalyptus plants in 1930.”