Hidden movements of Greenland Ice Sheet and its runoff revealed

For years NASA has tracked changes in the massive Greenland Ice Sheet. This week scientists using NASA data released the most detailed picture ever of how the ice sheet moves toward the sea and new insights into the hidden plumbing of meltwater flowing under the snowy surface.

The results of these studies are expected to improve predictions of the future of the entire Greenland ice sheet and its contribution to sea level rise as researchers revamp their computer models of how the ice sheet reacts to a warming climate.

“With the help of NASA satellite and airborne remote sensing instruments, the Greenland Ice Sheet is finally yielding its secrets,” said Tom Wagner, program scientist for NASA’s cryosphere program in Washington. “These studies represent new leaps in our knowledge of how the ice sheet is losing ice. It turns out the ice sheet is a lot more complex than we ever thought.”

University at Buffalo geophysicist Beata Csatho led an international team that produced the first comprehensive study of how the ice sheet is losing mass based on NASA satellite and airborne data at nearly 100,000 locations across Greenland. The study found that the ice sheet shed about 243 gigatons of ice per year from 2003-09, which agrees with other studies using different techniques. The study was published today in the Proceedings of the National Academy of Sciences.
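For scale (a back-of-the-envelope conversion, not a figure from the study itself), a mass loss of that size can be translated into a sea-level contribution using the commonly cited equivalence of roughly 362 gigatons of ice per millimeter of global sea-level rise:

```python
# Convert Greenland's reported mass loss to a global sea-level contribution.
# Assumes the commonly used equivalence of ~362 Gt of ice per 1 mm of
# sea-level rise (ocean area ~3.62e8 km^2; 1 Gt of water ~ 1 km^3).
GT_PER_MM_SLR = 362.0          # gigatons of ice per mm of sea-level rise
ice_loss_gt_per_year = 243.0   # mass loss reported for 2003-09

slr_mm_per_year = ice_loss_gt_per_year / GT_PER_MM_SLR
print(f"{slr_mm_per_year:.2f} mm/yr of sea-level rise")  # about 0.67 mm/yr
```

That works out to roughly two-thirds of a millimeter of sea-level rise per year, in line with other published estimates of Greenland's contribution for that period.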

The study suggests that current ice sheet modeling is too simplistic to accurately predict the future contribution of the Greenland ice sheet to sea level rise, and that current models may underestimate ice loss in the near future.

The project was a massive undertaking, using satellite and aerial data from NASA’s ICESat spacecraft, which measured the elevation of the ice sheet starting in 2003, and the Operation IceBridge field campaign that has flown annually since 2009. Additional airborne data from 1993-2008, collected by NASA’s Program for Arctic Regional Climate Assessment, were also included to extend the timeline of the study.

Current computer simulations of the Greenland Ice Sheet use the activity of four well-studied glaciers — Jakobshavn, Helheim, Kangerlussuaq and Petermann — to forecast how the entire ice sheet will dump ice into the oceans. The new research shows that activity at these four locations may not be representative of what is happening with glaciers across the ice sheet. In fact, glaciers undergo patterns of thinning and thickening that current climate change simulations fail to address, Csatho says.

As a step toward building better models of sea level rise, the research team divided Greenland’s 242 glaciers into seven major groups based on their behavior from 2003-09.

“Understanding the groupings will help us pick out examples of glaciers that are representative of the whole,” Csatho says. “We can then use data from these representative glaciers in models to provide a more complete picture of what is happening.”

The team also identified areas of rapid shrinkage in southeast Greenland that today’s models don’t acknowledge. This leads Csatho to believe that the ice sheet could lose ice faster in the future than today’s simulations would suggest.

In separate studies presented today at the American Geophysical Union annual meeting in San Francisco, scientists using data from Operation IceBridge found permanent bodies of liquid water in the porous, partially compacted firn layer just below the surface of the ice sheet. Lora Koenig at the National Snow and Ice Data Center in Boulder, Colorado, and Rick Forster at the University of Utah in Salt Lake City, found signatures of near-surface liquid water using ice-penetrating radar.

Across wide areas of Greenland, water can remain liquid, hiding in layers of snow just below the surface, even through cold, harsh winters, researchers are finding. The discoveries by the teams led by Koenig and Forster mean that scientists seeking to understand the future of the Greenland ice sheet need to account for relatively warm liquid water retained in the ice.

Although the total volume of water is small compared to overall melting in Greenland, the presence of liquid water throughout the year could help kick off melt in the spring and summer. “More year-round water means more heat is available to warm the ice,” Koenig said.

Koenig and her colleagues found that bodies of sub-surface liquid water are common on the western edges of the Greenland Ice Sheet. At roughly the same time, Forster used similar ground-based radars to find a large aquifer in southeastern Greenland. These studies show that liquid water can persist near the surface around the perimeter of the ice sheet year-round.

Another researcher participating in the briefing found that near-surface layers can also contain masses of solid ice that can lead to flooding events. Michael MacFerrin, a scientist at the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder, and colleagues studying radar data from IceBridge and surface-based instruments found near-surface patches of ice known as ice lenses more than 25 miles farther inland than previously recorded.

Ice lenses form when firn collects surface meltwater like a sponge. When these lenses melt, as was seen during July 2012, they can release large amounts of water that can lead to flooding. Warm summers and resulting increased surface melt in recent years have likely caused ice lenses to grow thicker and spread farther inland. “This represents a rapid feedback mechanism. If current trends continue, the flooding will get worse,” MacFerrin said.




Video
This animation (from March 2014) portrays the changes occurring in the surface elevation of the Greenland Ice Sheet since 2003 in three drainage areas: the southeast, the northeast and the Jakobshavn regions. In each region, the time advances to show the accumulated change in elevation, 2003-2012.

Downloadable video: http://svs.gsfc.nasa.gov/cgi-bin/details.cgi?aid=4022 – NASA SVS, NASA’s Goddard Space Flight Center

Migrating ‘supraglacial’ lakes could trigger future Greenland ice loss

Supraglacial lakes on the Greenland ice sheet can be seen as dark blue specks in the center and to the right of this satellite image. -  USGS/NASA Landsat

Predictions of Greenland ice loss and its impact on rising sea levels may have been greatly underestimated, according to scientists at the University of Leeds.

The finding follows a new study, published today in Nature Climate Change, in which the future distribution of lakes that form on the ice sheet surface from melted snow and ice – called supraglacial lakes – has been simulated for the first time.

Previously, the impact of supraglacial lakes on Greenland ice loss had been assumed to be small, but the new research has shown that they will migrate farther inland over the next half century, potentially altering the ice sheet flow in dramatic ways.

Dr Amber Leeson from the School of Earth and Environment and a member of the Centre for Polar Observation and Modelling (CPOM) team, who led the study, said: “Supraglacial lakes can increase the speed at which the ice sheet melts and flows, and our research shows that by 2060 the area of Greenland covered by them will double.”

Supraglacial lakes are darker than ice, so they absorb more of the Sun’s heat, which leads to increased melting. When the lakes reach a critical size, they drain through ice fractures, allowing water to reach the ice sheet base which causes it to slide more quickly into the oceans. These changes can also trigger further melting.

Dr Leeson explained: “When you pour pancake batter into a pan, if it rushes quickly to the edges of the pan, you end up with a thin pancake. It’s similar to what happens with an ice sheet: the faster it flows, the thinner it will be.

“When the ice sheet is thinner, it is at a slightly lower elevation and at the mercy of warmer air temperatures than it would have been if it were thicker, increasing the size of the melt zone around the edge of the ice sheet.”

Until now, supraglacial lakes have formed at low elevations around the coastline of Greenland, in a band that is roughly 100 km wide. At higher elevations, today’s climate is just too cold for lakes to form.

In the study, the scientists used observations of the ice sheet from the European Remote Sensing (ERS) satellites operated by the European Space Agency and estimates of future ice melting drawn from a climate model to drive simulations of how meltwater will flow and pool on the ice surface to form supraglacial lakes.

Since the 1970s, the band in which supraglacial lakes can form on Greenland has crept 56 km farther inland. From the results of the new study, the researchers predict that, as Arctic temperatures rise, supraglacial lakes will spread much farther inland – up to 110 km by 2060 – doubling the area of Greenland that they cover today.
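Those figures imply rough inland-migration rates. The dates below are assumptions for illustration (“since the 1970s” is taken as 1975, and the observed 56 km shift is taken as running up to the 2014 study), not values from the paper:

```python
# Rough inland-migration rates implied by the article's figures.
# Assumed dates: "since the 1970s" ~ 1975; observed shift measured to 2014.
observed_km, year_start, year_obs = 56.0, 1975, 2014
projected_km, year_proj = 110.0, 2060

past_rate = observed_km / (year_obs - year_start)               # km per year
future_rate = (projected_km - observed_km) / (year_proj - year_obs)
print(f"observed: {past_rate:.1f} km/yr; projected: {future_rate:.1f} km/yr")
# → observed: 1.4 km/yr; projected: 1.2 km/yr
```

Under these assumed dates, the projected pace of inland migration is broadly comparable to the pace already observed.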

Dr Leeson said: “The location of these new lakes is important; they will be far enough inland so that water leaking from them will not drain into the oceans as effectively as it does from today’s lakes that are near to the coastline and connected to a network of drainage channels.”

“In contrast, water draining from lakes farther inland could lubricate the ice more effectively, causing it to speed up.”

Ice losses from Greenland had been expected to contribute 22 cm to global sea-level rise by 2100. However, the models used to make this projection did not account for changes in the distribution of supraglacial lakes, which Dr Leeson’s study reveals will be considerable.

If new lakes trigger further increases in ice melting and flow, then Greenland’s future ice losses and its contribution to global sea-level rise have been underestimated.

The Director of CPOM, Professor Andrew Shepherd, who is also from the School of Earth and Environment at the University of Leeds and is a co-author of the study, said: “Because ice losses from Greenland are a key signal of global climate change, it’s important that we consider all factors that could affect the rate at which it will lose ice as climate warms.

“Our findings will help to improve the next generation of ice sheet models, so that we can have greater confidence in projections of future sea-level rise. In the meantime, we will continue to monitor changes in the ice sheet losses using satellite measurements.”

Further information:


The study was funded by the Natural Environment Research Council (NERC) through their support of the Centre for Polar Observation and Modelling and the National Centre for Earth Observation.

The research paper, Supraglacial lakes on the Greenland ice sheet advance inland under warming climate, is published in Nature Climate Change on 15 December 2014.

Dr Amber Leeson and Professor Andrew Shepherd are available for interview. Please contact the University of Leeds Press Office on 0113 343 4031 or email pressoffice@leeds.ac.uk

Worldwide retreat of glaciers confirmed in unprecedented detail

The worldwide retreat of glaciers is confirmed in unprecedented detail. This new book presents an overview and detailed assessment of changes in the world's glaciers by using satellite imagery -  Springer

Taking their name from the old Scottish term glim, meaning a passing look or glance, in 1994 a team of scientists began developing a worldwide initiative to study glaciers using satellite data. Now 20 years later, the international GLIMS (Global Land Ice Measurements from Space) initiative observes the world’s glaciers primarily using data from optical satellite instruments such as ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) and Landsat.

More than 150 scientists from all over the world have contributed to the new book Global Land Ice Measurements from Space, the most comprehensive report to date on global glacier changes. While the shrinking of glaciers on all continents is already known from ground observations of individual glaciers, by using repeated satellite observations GLIMS has firmly established that glaciers are shrinking globally. Although some glaciers are maintaining their size, most glaciers are dwindling. The foremost cause of the worldwide reductions in glaciers is global warming, the team writes.

Full color throughout, the book has 25 regional chapters that illustrate glacier changes from the Arctic to the Antarctic. Other chapters provide a thorough theoretical background on glacier monitoring and mapping, remote sensing techniques, uncertainties, and interpretation of the observations in a climatic context. The book highlights many other glacier research applications of satellite data, including measurement of glacier thinning from repeated satellite-based digital elevation models (DEMs) and calculation of surface flow velocities from repeated satellite images.

These tools are key to understanding local and regional variations in glacier behavior, the team writes. The high sensitivity of glaciers to climate change has substantially decreased their volume and changed the landscape over the past decades, affecting both regional water availability and the hazard potential of glaciers. The growing GLIMS database about glaciers also contributed to the Intergovernmental Panel on Climate Change (IPCC)’s Fifth Assessment Report issued in 2013. The IPCC report concluded that most of the world’s glaciers have been losing ice at an increasing rate in recent decades.

More than 60 institutions across the globe are involved in GLIMS. Jeffrey S. Kargel of the Department of Hydrology and Water Resources at the University of Arizona coordinates the project. The GLIMS glacier database and GLIMS web site are developed and maintained by the National Snow and Ice Data Center (NSIDC) at the University of Colorado in Boulder.

Global Land Ice Measurements from Space

Hardcover: $279.00; £180.00; €199.99

Springer and Praxis Publishing (2014) ISBN 978-3-540-79817-0

Also available as an eBook

Researchers resolve the Karakoram glacier anomaly, a cold case of climate science

The researchers found that low-resolution models and a lack of reliable observational data obscured the Karakoram’s dramatic shifts in elevation over a small area and heavy winter snowfall. They created a higher-resolution model that showed the elevation and snow water equivalent for (inlaid boxes, from left to right) the Karakoram range and northwest Himalayas, the central Himalayas that include Mount Everest, and the southeast Himalayas and the Tibetan Plateau. For elevation (left), the high-resolution model showed the sharp variations between roughly 2,500 and 5,000 meters above sea level (yellow to brown) for the Karakoram, while other areas of the map have comparatively more consistent elevations. The model also showed that the Karakoram receives much more annual snowfall (right) than other Himalayan ranges, an average of 100 centimeters (brown). The researchers found that the main precipitation season in the Karakoram occurs during the winter and is influenced by cold winds coming from Central Asian countries, as opposed to the heavy summer monsoons that provide the majority of precipitation to the other Himalayan ranges. -  Image by Sarah Kapnick, Program in Atmospheric and Oceanic Sciences

Researchers from Princeton University and other institutions may have hit upon an answer to a climate-change puzzle that has eluded scientists for years, one that could help predict the future availability of water for hundreds of millions of people.

In a phenomenon known as the “Karakoram anomaly,” glaciers in the Karakoram mountains, a range within the Himalayas, have remained stable and even increased in mass while many glaciers nearby — and worldwide — have receded during the past 150 years, particularly in recent decades. Himalayan glaciers provide freshwater to a densely populated area that includes China, Pakistan and India, and are the source of the Ganges and Indus rivers, two of the world’s major waterways.

While there have been many attempts to explain the stability of the Karakoram glaciers, the researchers report in the journal Nature Geoscience that the ice is sustained by a unique and localized seasonal pattern that keeps the mountain range relatively cold and dry during the summer. Other Himalayan ranges and the Tibetan Plateau — where glaciers have increasingly receded as Earth’s climate has warmed — receive most of their precipitation from heavy summer monsoons out of hot South and Southeast Asian nations such as India. The main precipitation season in the Karakoram, however, occurs during the winter and is influenced by cold winds coming from Central Asian countries such as Afghanistan to the west, while the main Himalayan range blocks the warmer air from the southeast throughout the year.

The researchers determined that snowfall, which is critical to maintaining glacier mass, will remain stable and even increase in magnitude at elevations above 4,500 meters (14,764 feet) in the Karakoram through at least 2100. On the other hand, snowfall over much of the Himalayas and Tibet is projected to decline even as the Indian and Southeast Asian monsoons increase in intensity under climate change.

First author Sarah Kapnick, a postdoctoral research fellow in Princeton’s Program in Atmospheric and Oceanic Sciences, said that a shortage of reliable observational data and the use of low-resolution computer models had obscured the subtleties of the Karakoram seasonal cycle and prevented scientists from unraveling the causes of the anomaly.

For models, the complication is that the Karakoram features dramatic shifts in elevation over a small area, Kapnick said. The range boasts four mountains that are more than 8,000 meters (26,246 feet) high — including K2, the world’s second highest peak — and numerous summits that exceed 7,000 meters, all of which are packed into a length of about 500 kilometers (300 miles).

Kapnick and her co-authors overcame this obstacle with a high-resolution computer model that broke the Karakoram into 50-kilometer pieces, meaning that those sharp fluctuations in altitude were better represented.

In their study, the researchers compared their model with climate models from the United Nations’ Intergovernmental Panel on Climate Change (IPCC), which have an average resolution of 210-kilometer squares, Kapnick said. At that scale, the Karakoram is reduced to an average height that is too low, producing modeled temperatures that are too warm to sustain sufficient snowfall throughout the year and too sensitive to future temperature increases.
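The effect of that averaging can be sketched with a toy calculation (the elevations and the ~6.5 °C-per-kilometer lapse rate below are illustrative assumptions, not values from the study): smearing sharp peaks into one coarse grid cell lowers the cell’s effective elevation, and the lapse rate then turns the lost height into a warm bias at the summits.

```python
# Toy illustration of why a coarse grid warms modeled mountain terrain.
# Hypothetical elevations (m) inside one coarse cell, spanning valleys
# to an 8,000 m peak; ~6.5 C per km is a standard assumed lapse rate.
LAPSE_C_PER_KM = 6.5
elevations_m = [2500, 3500, 5000, 7000, 8000]

cell_mean_m = sum(elevations_m) / len(elevations_m)   # what the model "sees"
bias_c = (max(elevations_m) - cell_mean_m) / 1000 * LAPSE_C_PER_KM
print(f"cell mean: {cell_mean_m:.0f} m; warm bias at peak: {bias_c:.1f} C")
# → cell mean: 5200 m; warm bias at peak: 18.2 C
```

A finer grid narrows each cell’s elevation spread, so the same arithmetic yields a smaller bias, which is, in simplified form, why a higher-resolution model can sustain snowfall at elevations where coarse models cannot.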

Thus, by the IPCC’s models, it would appear that the Karakoram’s glaciers are imperiled by climate change due to reduced snowfall, Kapnick said. This region has been a great source of controversy ever since the IPCC’s last major report, in 2007, when the panel misreported that Himalayan glaciers would likely succumb to climate change by 2035. More recent papers using current IPCC models have similarly reported snowfall losses in this region because the models do not accurately portray the topography of the Karakoram, Kapnick said.

“The higher resolution allowed us to explore what happens at these higher elevations in a way that hasn’t been able to be done,” Kapnick said. “Something that climate scientists always have to keep in mind is that models are useful for certain types of questions and not necessarily for other types of questions. While the IPCC models can be particularly useful for other parts of the world, you need a higher resolution for this area.”

Jeff Dozier, a professor of snow hydrology, earth system science and remote sensing at the University of California-Santa Barbara, said that the research addresses existing shortcomings in how mountain climates are modeled and predicted, particularly in especially steep and compact ranges. Dozier, who was not involved in the research, conducts some of his research in the Hindu Kush mountains west of the Karakoram.

Crucial information regarding water availability is often lost in computer models, observational data and other tools that typically do not represent ranges such as Karakoram accurately enough, Dozier said. For instance, a severe 2011 drought in Northern Afghanistan was a surprise partly due to erroneous runoff forecasts based on insufficient models and surface data, he said. The high-resolution model Kapnick and her co-authors developed for Karakoram potentially resolves many of the modeling issues related to mountain ranges with similar terrain, he said.

“The Karakoram Anomaly has been a puzzle, and this paper gives a credible explanation,” Dozier said. “Climate in the mountains is obviously affected strongly by the elevation, but most global climate models don’t resolve the topography well enough. So, the higher-resolution model is appropriate. About a billion people worldwide get their water resources from melting snow and many of these billion get their water from High Mountain Asia.”

The researchers used the high-resolution global-climate model GFDL-CM2.5 at the Geophysical Fluid Dynamics Laboratory (GFDL), which is on Princeton’s Forrestal Campus and administered by the National Oceanic and Atmospheric Administration (NOAA). The researchers simulated the global climate — with a focus on the Karakoram — based on observational data from 1861 to 2005, and on the IPCC’s greenhouse-gas projections for 2006-2100, which will be included in its Fifth Assessment Report scheduled for release in November.

The 50-kilometer resolution revealed conditions in the Karakoram on a monthly basis, Kapnick said. It was then that she and her colleagues could observe that the monsoon months in the Karakoram not only lack heavy rainfall but also bring frigid westerly winds that keep conditions in the mountain range cold enough for nearly year-round snowfall.

“There is precipitation during the summer, it just doesn’t dominate the seasonal cycle. This region, even at the same elevation as the rest of the Himalayas, is just colder,” Kapnick said.

“The high-resolution model shows us that things don’t happen perfectly across seasons. You can have statistical variations in one month but not another,” she continued. “This allows us to piece out those significant changes from one month to the next.”

Kapnick, who received her bachelor’s degree in mathematics from Princeton in 2004, worked with Thomas Delworth, a NOAA scientist and Princeton lecturer of geosciences and atmospheric and oceanic sciences; Moestasim Ashfaq, a scientist at the Oak Ridge National Laboratory Climate Change Science Institute; Sergey Malyshev, a climate modeler in Princeton’s Department of Ecology and Evolutionary Biology based at GFDL; and P.C.D. “Chris” Milly, a research hydrologist for the U.S. Geological Survey based at GFDL who received his bachelor’s degree in civil engineering from Princeton in 1978.

While the researchers show that the Karakoram will receive consistent — and perhaps increased — snowfall through 2100, more modeling work is needed to understand how the existing glaciers may change over time as a result of melt, avalanches and other factors, Kapnick said.

“Our work is an important piece to understanding the Karakoram anomaly,” Kapnick said. “But that balance of what’s coming off the glacier versus what’s coming in also matters for understanding how the glacier will change in the future.”

The paper, “Snowfall less sensitive to warming in Karakoram than in Himalayas due to a unique seasonal cycle,” was published online ahead of print Oct. 12 by Nature Geoscience.

New map uncovers thousands of unseen seamounts on ocean floor

This is a gravity model of the North Atlantic; red dots are earthquakes. Quakes are often related to seamounts. -  David Sandwell, SIO

Scientists have created a new map of the world’s seafloor, offering a more vivid picture of the structures that make up the deepest, least-explored parts of the ocean.

The feat was accomplished by accessing two untapped streams of satellite data.

Thousands of previously uncharted mountains rising from the seafloor, called seamounts, have emerged through the map, along with new clues about the formation of the continents.

Combined with existing data and improved remote sensing instruments, the map, described today in the journal Science, gives scientists new tools to investigate ocean spreading centers and little-studied remote ocean basins.

Earthquakes were also mapped. In addition, the researchers discovered that seamounts and earthquakes are often linked. Most seamounts were once active volcanoes, and so are usually found near tectonically active plate boundaries, mid-ocean ridges and subduction zones.

The new map is twice as accurate as the previous version produced nearly 20 years ago, say the researchers, who are affiliated with California’s Scripps Institution of Oceanography (SIO) and other institutions.

“The team has developed and proved a powerful new tool for high-resolution exploration of regional seafloor structure and geophysical processes,” says Don Rice, program director in the National Science Foundation’s Division of Ocean Sciences, which funded the research.

“This capability will allow us to revisit unsolved questions and to pinpoint where to focus future exploratory work.”

Developed using a scientific model that captures gravity measurements of the ocean seafloor, the map extracts data from the European Space Agency’s (ESA) CryoSat-2 satellite.

CryoSat-2 primarily captures polar ice data but also operates continuously over the oceans. Data also came from Jason-1, NASA’s satellite that was redirected to map gravity fields during the last year of its 12-year mission.

“The kinds of things you can see very clearly are the abyssal hills, the most common landform on the planet,” says David Sandwell, lead author of the paper and a geophysicist at SIO.

The paper’s co-authors say that the map provides a window into the tectonics of the deep oceans.

The map also provides a foundation for the upcoming new version of Google’s ocean maps; it will fill large voids between shipboard depth profiles.

Previously unseen features include newly exposed continental connections across South America and Africa and new evidence for seafloor spreading ridges in the Gulf of Mexico. The ridges were active 150 million years ago and are now buried by mile-thick layers of sediment.

“One of the most important uses will be to improve the estimates of seafloor depth in the 80 percent of the oceans that remain uncharted or [where the sea floor] is buried beneath thick sediment,” the authors state.

###

Co-authors of the paper include R. Dietmar Muller of the University of Sydney, Walter Smith of the NOAA Laboratory for Satellite Altimetry, Emmanuel Garcia of SIO and Richard Francis of ESA.

The study also was supported by the U.S. Office of Naval Research, the National Geospatial-Intelligence Agency and ConocoPhillips.

Breakthrough provides picture of underground water

Superman isn’t the only one who can see through solid surfaces. In a development that could revolutionize the management of precious groundwater around the world, Stanford researchers have pioneered the use of satellites to accurately measure levels of water stored hundreds of feet below ground. Their findings were published recently in Water Resources Research.

Groundwater provides 25 to 40 percent of all drinking water worldwide, and is the primary source of freshwater in many arid countries, according to the National Groundwater Association. About 60 percent of all withdrawn groundwater goes to crop irrigation. In the United States, the number is closer to 70 percent. In much of the world, however, underground reservoirs or aquifers are poorly managed and rapidly depleted due to a lack of water-level data. Developing useful groundwater models, availability predictions and water budgets is very challenging.

Study co-author Rosemary Knight, a professor of geophysics and senior fellow, by courtesy, at the Stanford Woods Institute for the Environment, compared groundwater use to a mismanaged bank account: “It’s like me saying I’m going to retire and live off my savings without knowing how much is in the account.”

Lead author Jessica Reeves, a postdoctoral scholar in geophysics, extended Knight’s analogy to the connection among farmers who depend on the same groundwater source. “Imagine your account was connected to someone else’s account, and they were withdrawing from it without your knowing.”

Until now, the only way a water manager could gather data about the state of water tables in a watershed was to drill monitoring wells. The process is time- and resource-intensive, especially for confined aquifers, which are deep reservoirs separated from the ground surface by multiple layers of impermeable clay. Even with monitoring wells, good data is not guaranteed. Much of the data available from monitoring wells across the American West is old and of varying quality and scientific usefulness. Compounding the problem, not all well data is openly shared.

To solve these challenges, Reeves, Knight, Stanford Woods Institute-affiliated geophysics and electrical engineering Professor Howard Zebker, Stanford civil and environmental engineering Professor Peter Kitanidis and Willem Schreüder of Principia Mathematica Inc. looked to the sky.

The basic concept: Satellites that use electromagnetic waves to monitor changes in the elevation of Earth’s surface to within a millimeter could be mined for clues about groundwater. The technology, Interferometric Synthetic Aperture Radar (InSAR), had previously been used primarily to collect data on volcanoes, earthquakes and landslides.

With funding from NASA, the researchers used InSAR to make measurements at 15 locations in Colorado’s San Luis Valley, an important agricultural region and flyway for migrating birds. Based on observed changes in Earth’s surface, the scientists compiled water-level measurements for confined aquifers at three of the sampling locations that matched the data from nearby monitoring wells.
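The article doesn’t spell out the conversion, but one standard way such studies relate surface motion to water levels (offered here purely as an illustrative sketch with assumed numbers, not the study’s published method) is through a confined aquifer’s elastic skeletal storage coefficient, where surface deformation is roughly the storage coefficient times the change in hydraulic head:

```python
# Illustrative sketch only -- not the study's published method or values.
# For elastic deformation of a confined aquifer, a common approximation is
# d = Ske * h, where d is surface deformation, Ske is the (assumed) elastic
# skeletal storage coefficient, and h is the change in hydraulic head.
SKE = 5e-4          # assumed dimensionless storage coefficient
uplift_m = 0.005    # hypothetical 5 mm of seasonal uplift seen by InSAR

head_change_m = uplift_m / SKE
print(f"implied water-level rise: {head_change_m:.0f} m")  # → 10 m
```

With millimeter-level InSAR precision, even small storage coefficients like the one assumed here make water-level swings of meters detectable from orbit, which is what makes measuring groundwater between wells plausible.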

“If we can get this working in between wells, we can measure groundwater levels across vast areas without using lots of on-the-ground monitors,” Reeves said.

The breakthrough holds the potential for giving resource managers in Colorado and elsewhere valuable data as they build models to assess scenarios such as the effect on groundwater from population increases and droughts.

Just as computers and smartphones inevitably get faster, satellite data will only improve. That means more and better data for monitoring and managing groundwater. Eventually, InSAR data could play a vital role in measuring seasonal changes in groundwater supply and help determine levels for sustainable water use.

In the meantime, Knight envisions a Stanford-based, user-friendly online database that consolidates InSAR findings and a range of other current remote sensing data for soil moisture, precipitation and other components of a water budget. “Very few, if any, groundwater managers are tapping into any of the data,” Knight said. With Zebker, postdoctoral fellow Jingyi Chen and colleagues at the University of South Carolina, Knight recently submitted a grant proposal for this concept to NASA.

San Francisco’s big 1906 quake was third of a series on San Andreas Fault

University of Oregon doctoral student Ashley Streig shows a tree stump on which tree-ring dating indicates the tree was cut prior to the earthquake of 1838 on the San Andreas Fault in the Santa Cruz Mountains. -  University of Oregon

Research led by a University of Oregon doctoral student in California’s Santa Cruz Mountains has uncovered geologic evidence that supports historical narratives for two earthquakes in the 68 years prior to San Francisco’s devastating 1906 disaster.

The evidence places the two earthquakes, in 1838 and 1890, on the San Andreas Fault, as theorized by many researchers based on written accounts about damage to Spanish-built missions in the Monterey and San Francisco bay areas. These two quakes, as in 1906, were surface-rupturing events, the researchers concluded.

Continuing work, says San Francisco Bay-area native Ashley R. Streig, will dig deeper into the region’s geological record — layers of sediment along the fault — to determine whether the ensuing seismically quiet years fit a normal pattern of quake frequency along the fault.

Streig is lead author of the study, published in this month’s issue of the Bulletin of the Seismological Society of America. She collaborated on the project with her doctoral adviser Ray Weldon, professor of the UO’s Department of Geological Sciences, and Timothy E. Dawson of the Menlo Park office of the California Geological Survey.

The study was the first to fully map the active fault trace in the Santa Cruz Mountains using a combination of on-the-ground observations and airborne Light Detection and Ranging (LiDAR), a remote sensing technology. The Santa Cruz Mountains run for about 39 miles from south of San Francisco to near San Juan Bautista. Hazel Dell is east of Santa Cruz and north of Watsonville.

“We found the first geologic evidence of surface rupture by what looks like the 1838 and 1890 earthquakes, as well as 1906,” said Streig, whose introduction to major earthquakes came at age 11 during the 1989 Loma Prieta Earthquake on a deep sub-fault of the San Andreas Fault zone. That quake, which disrupted baseball’s World Series, forced her family to camp outside their home.

Unlike the 1906 quake that ruptured 470 km (about 290 mi) of the fault, the 1838 and 1890 quakes ruptured shorter portions of the fault, possibly limited to the Santa Cruz Mountains. “This is the first time we have had good, clear geologic evidence of these historic 19th-century earthquakes,” she said. “It’s important because it tells us that we had three surface ruptures, really closely spaced in time, that all had fairly large displacements of at least half a meter and probably larger.”

The team identified ax-cut wood chips, tree stumps and charcoal fragments from early logging efforts in unexpectedly deep layers of sediment, 1.5 meters (five feet) below the ground, and documented evidence of three earthquakes since logging occurred at the site. The logging story emerged from 16 trenches dug in 2008, 2010 and 2011 along the fault at the Hazel Dell site in the mountain range.

High-resolution radiocarbon dating of tree rings from the wood chips and charcoal confirmed these are post-European deposits, and the geologic earthquake evidence coincides with written accounts describing local earthquake damage, including damage to Spanish missions in 1838 and an 1890 earthquake catalogued in a USGS publication by an astronomer from Lick Observatory.
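The uncalibrated part of a radiocarbon date comes from a simple decay relation: the conventional age is t = −8033 · ln(F), where F is the measured fraction of modern carbon-14 and 8033 years is the Libby mean life. The fraction below is an invented example value; real dates are then calibrated against tree-ring curves to obtain calendar ages, which is what makes young, post-European material distinguishable.

```python
import math

# Illustrative sketch of the conventional radiocarbon age equation.
# The sample fraction used here is an invented example, not study data.

LIBBY_MEAN_LIFE = 8033.0  # years, based on the Libby half-life of 5568 yr

def radiocarbon_age(fraction_modern):
    """Conventional (uncalibrated) radiocarbon age in years BP."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining ~98% of the modern 14C level is only about two
# centuries old, consistent with 19th-century logging debris.
print(round(radiocarbon_age(0.98)))  # 162
```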

Additionally, in 1906, individuals living near the Hazel Dell site reported to geologists that cracks from the 1906 earthquake had appeared just where they had 16 years earlier, in 1890, a quake that Streig and colleagues say was probably centered in the Hazel Dell region. Another displacement of sediment at the Hazel Dell site matched the timeline of the 1906 quake.

The project also allowed the team to conclude that another historically reported quake, in 1865, was not surface rupturing, but it was probably deep and, like the 1989 event, occurred on a sub-fault of the San Andreas Fault zone. Conventional thinking, Streig said, has suggested that the San Andreas Fault always ruptures in a long-reaching fashion similar to the 1906 earthquake. This study, however, points to more regionally confined ruptures as well.

“This all tells us that there are more frequent surface-rupturing earthquakes on this section of the fault than have been previously identified, certainly in the historic period,” Streig said. “This becomes important to earthquake models because it is saying something about the connectivity of all these fault sections — and how they might link up.”

The frequency of the quakes in the Santa Cruz Mountains, she added, must have been a terrifying experience for settlers during the 68-year period.

“This study is the first to show three historic ruptures on the San Andreas Fault outside the special case of Parkfield,” Weldon said, referring to a region in mountains to the south of the Santa Cruz range where six magnitude 6-plus earthquakes occurred between 1857 and 1966. “The earthquakes of 1838 and 1890 were known to be somewhere nearby from shaking, but now we know the San Andreas Fault ruptured three times on the same piece of the fault in less than 100 years.”

More broadly, Weldon said, with multiple paleoseismic sites close together on a major fault, geologists now realize that interpretations gleaned from single-site evidence probably aren’t reliable. “We need to spend more time reproducing or confirming results rather than rushing to the next fault if we are going to get it right,” he said. “Ashley’s combination of historical research, C-14 dating, tree rings, pollen and stratigraphic correlation between sites has allowed us to credibly argue for precision that allows identification of the 1838 and 1890 earthquakes.”

“Researchers at the University of Oregon are using tools and technologies to further our understanding of the dynamic forces that continue to shape our planet and impact its people,” said Kimberly Andrews Espy, vice president for research and innovation and dean of the UO Graduate School. “This research furthers our understanding of the connectivity of the various sections of California’s San Andreas Fault and has the potential to save lives by leading to more accurate earthquake modeling.”

Study to enhance earthquake prediction and mitigation in Pakistani region

This is a sketch map of southeast Asia showing major faults and tectonic blocks, including the Chaman Fault. -  Courtesy of Shuhab Khan

A three-year, $451,000 grant from the United States Agency for International Development to study the Chaman Fault in western Pakistan will help with earthquake prediction and mitigation in this heavily populated region.

The research, part of the Pakistan-U.S. Science and Technology Cooperation Program, will also strengthen and broaden cooperation and linkages between Pakistani scientists and institutions and their counterparts in the U.S. The National Academy of Sciences implements the U.S. side of the program.

Shuhab Khan, associate professor of geology at the University of Houston, will lead the project in the U.S. His counterpart in Pakistan is Abdul Salam Khan of the University of Balochistan.

“The Chaman Fault is a large, active fault around 1,000 kilometers, or 620 miles, long. It crosses back and forth between Afghanistan and Pakistan, ultimately merging with some other faults and going to the Arabian Sea,” Khan said.

The study area is located close to megacities in both countries.

“Seismic activity across this region has caused hundreds of thousands of deaths and catastrophic economic losses,” Khan said. “However, the Chaman Fault is one of the least studied fault systems. Through this research, we will determine how fast the fault is moving and which part is more active.”

The Chaman Fault is the largest strike-slip fault system in Central Asia. It is a little more than half the length of the San Andreas Fault in California.

“In strike-slip faults, the Earth’s crust moves laterally. Earthquakes along these types of faults are shallow and more damaging,” he said. “Rivers can also be displaced and change course with activity related to this type of fault.”

The study team will gather data using remote sensing satellite technology and field measurements made at various sites along the fault.

“In addition to current movement, the techniques will allow us to go back tens of thousands of years to determine which areas have moved and how much,” Khan said.

Field measurement techniques include sampling and analysis of rocks and sand along the fault system.

“Through cosmogenic age dating, we can determine how long rocks along the fault have been exposed at the surface by measuring nuclides produced by cosmic-ray bombardment. Those measurements help us determine how long it took the rocks to move in the area,” Khan said.

Sand buried below the surface will be sampled without being exposed to light. In the lab, measurements using optically stimulated luminescence will reveal how long the sand has been buried.
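The cosmogenic method described above rests on a buildup equation: a freshly exposed rock surface accumulates a nuclide (such as beryllium-10) at a production rate P while it also decays, so the measured concentration N = (P/λ)(1 − e^(−λt)) can be inverted for the exposure time t. The sketch below uses the real 10Be decay constant but an invented concentration and production rate; actual production rates depend on latitude, altitude and shielding.

```python
import math

# Hedged sketch of a cosmogenic (10Be) exposure-age calculation.
# Concentration and production-rate values are invented for illustration.

LAMBDA_BE10 = math.log(2) / 1.387e6  # 10Be decay constant, 1/yr

def exposure_age(concentration, production_rate, decay_const=LAMBDA_BE10):
    """Exposure age (yr) by inverting N = (P / lambda) * (1 - e^(-lambda * t))."""
    return -math.log(1 - concentration * decay_const / production_rate) / decay_const

# 50,000 atoms/g at a production rate of 5 atoms/g/yr gives roughly 10,000 yr
age = exposure_age(50_000, 5.0)
print(f"{age:,.0f} yr")
```

Optically stimulated luminescence works the other way around: burial resets nothing and sunlight resets everything, so the trapped-charge signal measured in the lab grows with time since the sand was last exposed to light.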

Three students from the University of Balochistan will come to the U.S. to learn the field techniques. “We will take them to the San Andreas Fault for training because the locations and faults are similar,” Khan said. “They will return to Pakistan and gather samples from designated areas along the fault.”

The samples will be analyzed at the University of Cincinnati lab of geology professor Lewis Owen, co-investigator on the grant. The research team also includes University of Houston geosciences students. Two undergraduate students will help process the rock samples, and a Ph.D. student will work with the remote sensing data.

“Through the data collection, we will learn more about the movement along this fault in the recent past. That information will help with earthquake prediction and mitigation,” Khan said.

New explanation for slow earthquakes on San Andreas

New Zealand’s geologic hazards agency reported this week that an ongoing, “silent” earthquake that began in January is still going strong. Though it is releasing the energy equivalent of a 7.0 earthquake, New Zealanders can’t feel it because its energy is being released slowly, over a long period of time rather than in a few short seconds.

These so-called “slow slip events” are common at subduction zone faults – where an oceanic plate meets a continental plate and dives beneath it. They also occur on continents along strike-slip faults like California’s San Andreas, where two plates move horizontally in opposite directions. Occurring close to the surface, in the upper 3-5 kilometers (km) of the fault, these slow, silent movements are referred to as “creep events.”

In a study published this week in Nature Geoscience, scientists from Woods Hole Oceanographic Institution (WHOI), McGill University, and GNS Science New Zealand provide a new model for understanding the geological source of silent earthquakes, or “creep events,” along California’s San Andreas fault. The new study shows creep events originate closer to the surface than previously thought, from a much shallower source along the fault.

“The observation that faults creep in different ways at different places and times in the earthquake cycle has been around for 40 years without a mechanical model that can explain this variability,” says WHOI geologist and co-author Jeff McGuire. “Creep is a basic feature of how faults work that we now understand better.”

Fault creep occurs in shallow portions of the fault and is not considered a seismic event. There are two types of creep. In one form, creep occurs as a continuous stable sliding of unlocked portions of the fault, and can account for approximately 25 millimeters of motion along the fault per year. The other type is called a “creep event,” sudden slow movement, lasting only a few hours, and accommodating approximately 3 centimeters of slip per event. Creep events are separated by long intervals of slow continuous creep.
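The figures quoted above imply a simple slip budget: roughly 25 mm per year of steady sliding, plus about 3 cm (30 mm) per episodic creep event. The sketch below just adds the two terms; the event frequency is an invented assumption for illustration, since the article gives no recurrence rate.

```python
# Back-of-the-envelope shallow-creep slip budget using the figures above:
# ~25 mm/yr of continuous creep plus ~30 mm of slip per creep event.
# The events-per-year value is an assumption made for this example.

CONTINUOUS_CREEP_MM_PER_YR = 25.0
SLIP_PER_CREEP_EVENT_MM = 30.0

def total_creep_mm(years, events_per_year):
    """Total shallow fault creep: steady sliding plus episodic creep events."""
    continuous = CONTINUOUS_CREEP_MM_PER_YR * years
    episodic = SLIP_PER_CREEP_EVENT_MM * events_per_year * years
    return continuous + episodic

# Over a decade with a hypothetical one creep event per year:
print(total_creep_mm(10, 1))  # 550.0 (250 mm continuous + 300 mm in events)
```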

“Normal earthquakes happen when the locked portions of the fault rupture under the strain of accumulated stress and the plates move or slip along the fault,” says the study’s lead author, WHOI postdoctoral scholar Matt Wei. “This kind of activity is only a portion of the total fault movement per year. However, a significant fraction of the total slip can be attributed to fault creep.”

Scientists have mapped out the segments of the San Andreas fault that experience these different kinds of creep, and which segments are totally “locked,” experiencing no movement at all until an earthquake rupture. They know the source of earthquakes is a layer of unstable rock at about 5-15 km depth along the fault, but have only recently begun to understand the source of fault creep.

For nearly two decades, geologists have accepted and relied upon a mechanical model to explain the geologic source of fault creep. This model explains that continuous creep is generated in the upper-most “stable” sediment layer of the fault plane and episodic creep events originate in a “conditionally stable” layer of rock sandwiched between the sediment and the unstable layer of rock (the seismogenic zone, where earthquakes originate) below it.
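The “stable,” “conditionally stable” and “unstable” layers in this model correspond to a standard idea from rate-and-state friction: the sign of the difference between two empirical parameters, a − b, determines whether sliding is velocity-strengthening (stable creep) or velocity-weakening (capable of stick-slip). The sketch below shows only that binary criterion; the layer names and parameter values are illustrative, not taken from the study, and the "conditionally stable" case additionally depends on the loading stiffness.

```python
# Minimal sketch of the rate-and-state friction stability criterion behind
# the layered fault model described above. Parameter values are invented.

def classify_layer(a, b):
    """Classify frictional behavior from rate-and-state parameters a and b."""
    if a > b:
        return "stable (velocity-strengthening): continuous creep"
    return "unstable (velocity-weakening): stick-slip / earthquakes"

layers = {
    "shallow sediments": (0.015, 0.010),  # a - b > 0
    "seismogenic zone":  (0.010, 0.015),  # a - b < 0
}
for name, (a, b) in layers.items():
    print(f"{name}: {classify_layer(a, b)}")
```

The paper’s key move, described below, is that reproducing the observed creep events required embedding a thin velocity-weakening (a − b < 0) layer inside the otherwise stable shallow sediments.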

But when Wei and his colleagues tried to use this mechanical model to reproduce the geodetic data after a 1987 earthquake on southern California’s Superstition Hills fault, they found it impossible to match the observations.

“Superstition Hills was a very large earthquake. Immediately following the quake, the U.S. Geological Survey installed creepmeters to measure the post-seismic deformation. The result is a unique data set that shows both afterslip and creep events,” says Wei.

The researchers could only match the real-world data set and on-the-ground observations by embedding an additional unstable layer within the top sediment layer of the model. “This layer may result from fine-scale lithological heterogeneities within the stable zone; frictional behavior varies with lithology, generating the instability,” the authors write. “Our model suggests that the displacement of and interval between creep events are dependent on the thickness, stress, and frictional properties of the shallow, unstable layer.”

There are major strike-slip faults like the San Andreas around the world, but the extent of creep events along those faults is something of a mystery. “Part of the reason is that we don’t have creepmeters along these faults, which are often in sparsely populated areas. It takes money and effort, so a lot of these faults are not covered [with instruments]. We can use remote sensing to know if they are creeping, but we don’t know if it’s from continuous creep or creep events,” says Wei.

Simulating faults to better understand how stress, strain, and earthquakes work is inherently difficult because of the depth at which the important processes happen. Recovering drill cores and installing instruments at significant depths within the earth is very expensive and still relatively rare. “Rarely are the friction tests done on real cores,” says Wei. “Most of the friction tests are done on synthetic cores. Scientists will grind rocks into powder to simulate the fault.”

Decades of these experiments have provided an empirical framework to understand how stress and slip evolve on faults, but geologists are still a long way from having numerical models tailored to the parameters that hold for particular faults in the earth.

McGuire says the new research is an important step in ground-truthing those lab simulations. “This work has shown that the application of the friction laws derived from the lab can accurately describe some first order variations that we observe with geodesy between different faults in the real world,” he says. “This is an important validation of the scaling up of the lab results to large crustal faults.”

For the scientists, this knowledge is a new beginning for further research into fault motion and the events that trigger it. Creep events are important because they are shallow and release stress, but they remain an unknown factor in understanding earthquake behavior. “There’s much we still don’t know. For example, it’s possible that the shallow layer source of creep events could magnify an earthquake as it propagates through these layers,” says Wei.

Additionally, the findings can help understand the slow slip events happening along subduction zones, like the ongoing event in New Zealand. “By validating the friction models with shallow creep events that have very precise data, we can have more confidence in the mechanical implications of the deep subduction zone events,” McGuire says.

The drones of oil

Researcher Aleksandra Sima at Bergen's Centre for Integrated Petroleum Research (CIPR) is part of the Norwegian research team using drones to look for oil. -  Photo: Eivind Senneset/UiB

Geologists have long relied on seismic surveys on the ocean floor, or on setting off dynamite charges from snowmobiles, when looking for oil. But now researchers at the Centre for Integrated Petroleum Research (CIPR), a joint venture between the University of Bergen (UiB) and Uni Research, have found a new preferred method: using drones to map new oil reserves from the air.

- In reality the drones can be viewed as an advanced camera tripod, which helps geologists to map inaccessible land in an efficient manner. The use of drones facilitates our efforts to define the geology and to find oil, says researcher Aleksandra Sima at CIPR about the drone that she and her fellow researchers have just acquired to take aerial shots of rocks.

Virtual fieldwork


Sima is a member of CIPR’s Virtual Outcrop Geology (VOG) group. The group’s main task is to create digital maps in 3D of potential oil fields. Using laser scanners, infrared sensors and digital cameras, the researchers create realistic, virtual models. Every tiny pixel of an image can store information on minerals and rocks.

These high-tech models help the geologists to criss-cross the landscape, not unlike what you will find on Google Earth. This virtual fieldwork enables the researchers to gather information on anything from the type of rock to the thickness of the sedimentation; all with the help of a few mouse clicks on the computer.

- A landscape’s surface often reflects what lies beneath the ground and corresponds with the rocks below the seabed. When we have an overview of the rocks and minerals in one area, it is far easier to make estimates about where to find oil and how the oil flows, says Simon Buckley, senior researcher at CIPR and head of the VOG group.

Quick and affordable

So far, the researchers have used ground-based laser scanners (LIDAR), infrared sensors and cameras to replicate the landscape. But putting instruments on the ground is both time-consuming and limited to lower ground areas.

At higher elevations, or in areas hidden from ground-based sensors, for instance behind rocks or high mountains, the researchers have had to mount the cameras and laser sensors on leased helicopters.

- Using drones is more affordable. All places can be reached quickly and you can shoot in inaccessible areas, Buckley explains.

Pictures shot with the help of a drone complement the images from low-level terrain that the researchers already have in hand. The end result is more precise and complete 3D models.

- The aim is to bring all models together to get the best possible geological map of an area, says Buckley.

The use of drones in the search for oil is similar to techniques used in Switzerland and Germany to look for minerals. The models created by the CIPR researchers can also be used for research on CO2 storage.

- It isn’t hard to collect a point cloud of laser readings and present these. The challenge is to use the data for geological analysis, Buckley points out.

A helicopter in the office

The drone is operated from the ground just like a radio-controlled plane, shooting images of the earth’s surface from the air. The pilot on the ground also operates the camera.

There are plenty of restrictions in place, though, and not just anyone can fly a drone. Norwegian aviation authorities put strict regulations on anyone wanting to use drones for research. Aleksandra Sima has been practising in a flight simulator and has tested mini helicopters in her office.

- The worst thing that can happen is that a drone crashes and hurts people, says Sima, before reassuringly adding:

- But we won’t be flying drones in populated areas.