Hidden movements of the Greenland Ice Sheet and its meltwater revealed

For years NASA has tracked changes in the massive Greenland Ice Sheet. This week scientists using NASA data released the most detailed picture ever of how the ice sheet moves toward the sea and new insights into the hidden plumbing of melt water flowing under the snowy surface.

The results of these studies are expected to improve predictions of the future of the entire Greenland ice sheet and its contribution to sea level rise as researchers revamp their computer models of how the ice sheet reacts to a warming climate.

“With the help of NASA satellite and airborne remote sensing instruments, the Greenland Ice Sheet is finally yielding its secrets,” said Tom Wagner, program scientist for NASA’s cryosphere program in Washington. “These studies represent new leaps in our knowledge of how the ice sheet is losing ice. It turns out the ice sheet is a lot more complex than we ever thought.”

University at Buffalo geophysicist Beata Csatho led an international team that produced the first comprehensive study of how the ice sheet is losing mass based on NASA satellite and airborne data at nearly 100,000 locations across Greenland. The study found that the ice sheet shed about 243 gigatons of ice per year from 2003-09, which agrees with other studies using different techniques. The study was published today in the Proceedings of the National Academy of Sciences.
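To put that number in context, ice-sheet mass loss is often converted to global sea-level equivalent using the standard glaciological rule of thumb that about 362 gigatons of ice raise global mean sea level by one millimeter (the conversion factor is a general approximation, not a figure from this study):

```python
# Convert Greenland's ice loss to global sea-level equivalent.
# ~362 Gt of ice raises global mean sea level by ~1 mm
# (a standard approximation based on the ocean's surface area).
GT_PER_MM_SEA_LEVEL = 362.0

ice_loss_gt_per_year = 243.0  # 2003-09 average reported in the study
slr_mm_per_year = ice_loss_gt_per_year / GT_PER_MM_SEA_LEVEL
print(f"{slr_mm_per_year:.2f} mm of sea-level rise per year")  # ~0.67 mm/yr
```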

The study suggests that current ice sheet modeling is too simplistic to accurately predict the future contribution of the Greenland ice sheet to sea level rise, and that current models may underestimate ice loss in the near future.

The project was a massive undertaking, using satellite and aerial data from NASA’s ICESat spacecraft, which measured the elevation of the ice sheet starting in 2003, and the Operation IceBridge field campaign that has flown annually since 2009. Additional airborne data from 1993-2008, collected by NASA’s Program for Arctic Regional Climate Assessment, were also included to extend the timeline of the study.

Current computer simulations of the Greenland Ice Sheet use the activity of four well-studied glaciers — Jakobshavn, Helheim, Kangerlussuaq and Petermann — to forecast how the entire ice sheet will dump ice into the oceans. The new research shows that activity at these four locations may not be representative of what is happening with glaciers across the ice sheet. In fact, glaciers undergo patterns of thinning and thickening that current climate change simulations fail to address, Csatho says.

As a step toward building better models of sea level rise, the research team divided Greenland’s 242 glaciers into seven major groups based on their behavior from 2003 to 2009.

“Understanding the groupings will help us pick out examples of glaciers that are representative of the whole,” Csatho says. “We can then use data from these representative glaciers in models to provide a more complete picture of what is happening.”
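The article does not describe how the team derived its groupings, but the general idea of classifying glaciers by the similarity of their thinning-and-thickening histories can be sketched with a minimal k-means clustering on synthetic elevation-change series (the data and the two-group setup below are invented for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: each row is one glacier's annual elevation change (m/yr), 2003-09.
# Two synthetic behaviors: steadily thinning vs. roughly stable.
thinning = rng.normal(-1.5, 0.2, size=(20, 7))
stable = rng.normal(0.0, 0.2, size=(20, 7))
glaciers = np.vstack([thinning, stable])

def kmeans(X, k, iters=50, seed=1):
    """Minimal k-means: returns a cluster label for each row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each glacier to its nearest cluster center.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Recompute centers; keep the old center if a cluster goes empty.
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

labels = kmeans(glaciers, k=2)
# The two synthetic behaviors should land in different clusters.
print(labels[:20], labels[20:])
```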

The team also identified areas of rapid shrinkage in southeast Greenland that today’s models don’t acknowledge. This leads Csatho to believe that the ice sheet could lose ice faster in the future than today’s simulations would suggest.

In separate studies presented today at the American Geophysical Union annual meeting in San Francisco, scientists using data from Operation IceBridge found permanent bodies of liquid water in the porous, partially compacted firn layer just below the surface of the ice sheet. Lora Koenig at the National Snow and Ice Data Center in Boulder, Colorado, and Rick Forster at the University of Utah in Salt Lake City, found signatures of near-surface liquid water using ice-penetrating radar.

Across wide areas of Greenland, water can remain liquid, hiding in layers of snow just below the surface, even through cold, harsh winters, researchers are finding. The discoveries by the teams led by Koenig and Forster mean that scientists seeking to understand the future of the Greenland ice sheet need to account for relatively warm liquid water retained in the ice.

Although the total volume of water is small compared to overall melting in Greenland, the presence of liquid water throughout the year could help kick off melt in the spring and summer. “More year-round water means more heat is available to warm the ice,” Koenig said.

Koenig and her colleagues found that sub-surface liquid water is common on the western edges of the Greenland Ice Sheet. At roughly the same time, Forster used similar ground-based radars to find a large aquifer in southeastern Greenland. Together, these studies show that liquid water can persist near the surface around the perimeter of the ice sheet year round.

Another researcher participating in the briefing found that near-surface layers can also contain masses of solid ice that can lead to flooding events. Michael MacFerrin, a scientist at the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder, and colleagues studying radar data from IceBridge and surface-based instruments found near-surface patches of ice known as ice lenses more than 25 miles farther inland than previously recorded.

Ice lenses form when firn collects surface meltwater like a sponge. When these shallow ice layers melt, as happened in July 2012, they can release large amounts of water that can lead to flooding. Warm summers and the resulting increase in surface melt in recent years have likely caused ice lenses to grow thicker and spread farther inland. “This represents a rapid feedback mechanism. If current trends continue, the flooding will get worse,” MacFerrin said.

This animation (from March 2014) portrays the changes occurring in the surface elevation of the Greenland Ice Sheet since 2003 in three drainage areas: the southeast, the northeast and the Jakobshavn regions. In each region, the time advances to show the accumulated change in elevation, 2003-2012.

Downloadable video: http://svs.gsfc.nasa.gov/cgi-bin/details.cgi?aid=4022 – NASA SVS, NASA’s Goddard Space Flight Center

Volcano hazards and the role of westerly wind bursts in El Niño

On June 27, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in a community in the District of Puna. – USGS

On 27 June, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in Pāhoa, a community in the District of Puna, as well as the only highway accessible to this area. Scientists from the U.S. Geological Survey’s Hawaiian Volcano Observatory (HVO) and the Hawai’i County Civil Defense have been monitoring the volcano’s lava flow and communicating with affected residents through public meetings since 24 August. Eos recently spoke with Michael Poland, a geophysicist at HVO and a member of the Eos Editorial Advisory Board, to discuss how he and his colleagues communicated this threat to the public.

Drilling a Small Basaltic Volcano to Reveal Potential Hazards

Drilling into the Rangitoto Island Volcano in the Auckland Volcanic Field in New Zealand offers insight into a small monogenetic volcano, and may improve understanding of future hazards.

From AGU’s journals: El Niño fades without westerly wind bursts

The warm and wet winter of 1997 brought California floods, Florida tornadoes, and an ice storm in the American northeast, prompting climatologists to dub it the El Niño of the century. Earlier this year, climate scientists thought the coming winter might bring similar extremes, as equatorial Pacific Ocean conditions resembled those seen in early 1997. But the signals weakened by summer, and the El Niño predictions were downgraded. Menkes et al. used simulations to examine the differences between the two years.

The El Niño-Southern Oscillation is defined by abnormally warm sea surface temperatures in the eastern Pacific Ocean and weaker than usual trade winds. In a typical year, southeast trade winds push surface water toward the western Pacific “warm pool” – a region essential to Earth’s climate. The trade winds dramatically weaken or even reverse in El Niño years, and the warm pool extends its reach east.

Scientists have struggled to predict El Niño due to irregularities in the shape, amplitude, and timing of the surges of warm water. Previous studies suggested that short-lived westerly wind pulses (i.e. one to two weeks long) could contribute to this irregularity by triggering and sustaining El Niño events.

To understand the vanishing 2014 El Niño, the authors used computer simulations and examined the wind’s role. The researchers find pronounced differences between 1997 and 2014. Both years saw strong westerly wind events between January and March, but those disappeared this year as spring approached. In contrast, the westerly winds persisted through summer in 1997.

In the past, it was thought that westerly wind pulses were three times as likely to form if the warm pool extended east of the dateline. That did not occur this year. The team says their analysis shows that El Niño’s strength might depend on these short-lived and possibly unpredictable pulses.


The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing more than 62,000 members in 144 countries. Join our conversation on Facebook, Twitter, YouTube, and other social media channels.

New hi-tech approach to studying sedimentary basins

A radical new approach to analysing sedimentary basins also harnesses technology in a completely novel way. An international research group, led by the University of Sydney, will use big data sets and exponentially increased computing power to model the interaction between processes on the earth’s surface and deep below it in ‘five dimensions’.

As announced by the Federal Minister for Education today, the University’s School of Geosciences will lead the Basin GENESIS Hub that has received $5.4 million over five years from the Australian Research Council (ARC) and industry partners.

The multitude of resources found in sedimentary basins includes groundwater and energy resources. The space between grains of sand in these basins can also be used to store carbon dioxide.

“This research will be of fundamental importance to both the geo-software industry, used by exploration and mining companies, and to other areas of the energy industry,” said Professor Dietmar Müller, Director of the Hub, from the School of Geosciences.

“The outcomes will be especially important for identifying exploration targets in deep basins in remote regions of Australia. It will create a new ‘exploration geodynamics’ toolbox for industry to improve estimates of what resources might be found in individual basins.”

Sedimentary basins form when sediments eroded from highly elevated regions are transported through river systems and deposited in lowland regions and continental margins. The Sydney Basin is a massive basin filled mostly with river sediments that formed the Hawkesbury sandstone. It is invisible to the Sydney population living above it but has provided building material for many decades.

“Previously the approach to analysing these basins has been based on interpreting geological data and two-dimensional models. We apply infinitely more computing power to enhance our understanding of sedimentary basins as the product of the complex interplay between surface and deep Earth processes,” said Professor Müller.

Associate Professor Rey, a researcher at the School of Geosciences and member of the Hub said, “Our new approach is to understand the formation of sedimentary basins and the changes they undergo, both recently and over millions to hundreds of millions of years, using computer simulations to incorporate information such as the evolution of erosion, sedimentary processes and the deformation of the earth’s crust.”

The researchers will incorporate data from multiple sources to create ‘five-dimensional’ models, combining three-dimensional space with the extra dimensions of time and estimates of uncertainty.
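As a rough illustration of the “five-dimensional” idea (purely hypothetical, not the Hub’s actual data model), one can picture a gridded basin model that stores, for every 3-D location and time step, both an estimated value and its uncertainty:

```python
import numpy as np

# Toy 5-D basin model: 3-D space x time, with a mean estimate and an
# uncertainty (standard deviation) stored for every grid cell.
nx, ny, nz, nt = 8, 8, 4, 10
rng = np.random.default_rng(0)

sediment_thickness = rng.uniform(0, 500, size=(nx, ny, nz, nt))  # metres
uncertainty = 0.1 * sediment_thickness  # assume 10% relative uncertainty

# Query one cell at one time step: a value plus its error bar.
x, y, z, t = 2, 3, 1, 5
print(f"{sediment_thickness[x, y, z, t]:.0f} ± {uncertainty[x, y, z, t]:.0f} m")
```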

The modelling will span scales from entire basins hundreds of kilometres wide to individual sediment grains.

Key geographical areas the research will focus on are the North-West shelf of Australia, Papua New Guinea and the Atlantic Ocean continental margins.

The Hub’s technology builds upon the exponential increase in computational power and the increasing amount of available big data (massive data sets of information). The Hub will harness the capacity of Australia’s most powerful computer, launched in 2013.

Warm US West, cold East: A 4,000-year pattern

University of Utah geochemist Gabe Bowen led a new study, published in Nature Communications, showing that the curvy jet stream pattern that brought mild weather to western North America and intense cold to the eastern states this past winter has become more dominant during the past 4,000 years than it was from 8,000 to 4,000 years ago. The study suggests global warming may aggravate the pattern, meaning such severe winter weather extremes may be worse in the future. – Lee J. Siegel, University of Utah.

Last winter’s curvy jet stream pattern brought mild temperatures to western North America and harsh cold to the East. A University of Utah-led study shows that pattern became more pronounced 4,000 years ago, and suggests it may worsen as Earth’s climate warms.

“If this trend continues, it could contribute to more extreme winter weather events in North America, as experienced this year with warm conditions in California and Alaska and intrusion of cold Arctic air across the eastern USA,” says geochemist Gabe Bowen, senior author of the study.

The study was published online April 16 by the journal Nature Communications.

“A sinuous or curvy winter jet stream means unusual warmth in the West, drought conditions in part of the West, and abnormally cold winters in the East and Southeast,” adds Bowen, an associate professor of geology and geophysics at the University of Utah. “We saw a good example of extreme wintertime climate that largely fit that pattern this past winter,” although in the typical pattern California often is wetter.

Scientists have long projected that the current warming of Earth’s climate, driven by carbon dioxide, methane and other “greenhouse” gases, has already led to increased weather extremes and will continue to do so.

The new study shows the jet stream pattern that brings North American wintertime weather extremes is millennia old – “a longstanding and persistent pattern of climate variability,” Bowen says. Yet it also suggests global warming may enhance the pattern so there will be more frequent or more severe winter weather extremes or both.

“This is one more reason why we may have more winter extremes in North America, as well as something of a model for what those extremes may look like,” Bowen says. Human-caused climate change is reducing equator-to-pole temperature differences; the atmosphere is warming more at the poles than at the equator. Based on what happened in past millennia, that could make a curvy jet stream even more frequent and/or intense than it is now, he says.

Bowen and his co-authors analyzed previously published data on oxygen isotope ratios in lake sediment cores and cave deposits from sites in the eastern and western United States and Canada. Those isotopes were deposited in ancient rainfall and incorporated into calcium carbonate. They reveal jet stream directions during the past 8,000 years, a geological time known as middle and late stages of the Holocene Epoch.

Next, the researchers did computer modeling or simulations of jet stream patterns – both curvy and more direct west to east – to show how changes in those patterns can explain changes in the isotope ratios left by rainfall in the old lake and cave deposits.

They found that the jet stream pattern – known technically as the Pacific North American teleconnection – shifted to a generally more “positive phase” – meaning a curvy jet stream – over a 500-year period starting about 4,000 years ago. In addition to this millennial-scale change in jet stream patterns, they also noted a cycle in which increases in the sun’s intensity every 200 years make the jet stream flatter.

Bowen conducted the study with Zhongfang Liu of Tianjin Normal University in China, Kei Yoshimura of the University of Tokyo, Nikolaus Buenning of the University of Southern California, Camille Risi of the French National Center for Scientific Research, Jeffrey Welker of the University of Alaska at Anchorage, and Fasong Yuan of Cleveland State University.

The study was funded by the National Science Foundation, National Natural Science Foundation of China, Japan Society for the Promotion of Science and a joint program by the society and Japan’s Ministry of Education, Culture, Sports, Science and Technology: the Program for Risk Information on Climate Change.

Sinuous Jet Stream Brings Winter Weather Extremes

The Pacific North American teleconnection, or PNA, “is a pattern of climate variability” with positive and negative phases, Bowen says.

“In periods of positive PNA, the jet stream is very sinuous. As it comes in from Hawaii and the Pacific, it tends to rocket up past British Columbia to the Yukon and Alaska, and then it plunges down over the Canadian plains and into the eastern United States. The main effect in terms of weather is that we tend to have cold winter weather throughout most of the eastern U.S. You have a freight car of arctic air that pushes down there.”

Bowen says that when the jet stream is curvy, “the West tends to have mild, relatively warm winters, and Pacific storms tend to occur farther north. So in Northern California, the Pacific Northwest and parts of western interior, it tends to be relatively dry, but tends to be quite wet and unusually warm in northwest Canada and Alaska.”

This past winter, there were times of a strongly curving jet stream, and times when the Pacific North American teleconnection was in its negative phase, which means “the jet stream is flat, mostly west-to-east oriented,” and sometimes split, Bowen says. In years when the jet stream pattern is more flat than curvy, “we tend to have strong storms in Northern California and Oregon. That moisture makes it into the western interior. The eastern U.S. is not affected by arctic air, so it tends to have milder winter temperatures.”

The jet stream pattern – whether curvy or flat – has its greatest effects in winter and less impact on summer weather, Bowen says. The curvy pattern is enhanced by another climate phenomenon, the El Niño-Southern Oscillation, which sends a pool of warm water eastward to the eastern Pacific and affects climate worldwide.

Traces of Ancient Rains Reveal Which Way the Wind Blew

Over the millennia, oxygen in ancient rain water was incorporated into calcium carbonate deposited in cave and lake sediments. The ratio of rare, heavy oxygen-18 to the common isotope oxygen-16 in the calcium carbonate tells geochemists whether clouds that carried the rain were moving generally north or south during a given time.

Previous research determined the dates and oxygen isotope ratios for sediments in the new study, allowing Bowen and colleagues to use the ratios to tell if the jet stream was curvy or flat at various times during the past 8,000 years.

Bowen says air flowing over the Pacific picks up water from the ocean. As a curvy jet stream carries clouds north toward Alaska, the air cools and some of the water falls out as rain, with greater proportions of heavier oxygen-18 falling, thus raising the oxygen-18-to-16 ratio in rain and certain sediments in western North America. Then the jet stream curves south over the middle of the continent, and the water vapor, already depleted in oxygen-18, falls in the East as rain with lower oxygen-18-to-16 ratios.
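The progressive rain-out Bowen describes is commonly modeled as Rayleigh distillation, in which the isotope ratio of the remaining vapor falls as R = R0 · f^(α−1), where f is the fraction of vapor left and α > 1 is the liquid-vapor fractionation factor. A sketch with illustrative values (the α and starting δ18O below are assumptions, not numbers from the study):

```python
# Rayleigh distillation: as a fraction f of the original vapor remains,
# the isotope ratio of that vapor follows R = R0 * f**(alpha - 1).
# alpha > 1 because heavy oxygen-18 preferentially condenses into rain.
ALPHA = 1.0094  # illustrative liquid-vapor fractionation factor

def delta18O_vapor(f, delta0=-12.0, alpha=ALPHA):
    """Delta-18O (per mil) of the remaining vapor after a rain-out of 1-f."""
    r_ratio = f ** (alpha - 1)  # R / R0
    return (1 + delta0 / 1000.0) * r_ratio * 1000.0 - 1000.0

# The vapor grows progressively more depleted in O-18 as it rains out,
# so rain falling "downstream" (in the East) carries lower 18O/16O ratios.
for f in (1.0, 0.75, 0.5, 0.25):
    print(f"vapor remaining f={f:.2f}  d18O={delta18O_vapor(f):+.1f} per mil")
```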

When the jet stream is flat and moving west-to-east, oxygen-18 in rain is still elevated in the West and depleted in the East, but the difference is much less than when the jet stream is curvy.

By examining oxygen isotope ratios in lake and cave sediments in the West and East, Bowen and colleagues showed that a flatter jet stream pattern prevailed from about 8,000 to 4,000 years ago in North America, but then, over only 500 years, the pattern shifted so that curvy jet streams became more frequent or severe or both. The method can’t distinguish frequency from severity.

The new study is based mainly on isotope ratios at Buckeye Creek Cave, W. Va.; Lake Grinell, N.J.; Oregon Caves National Monument; and Lake Jellybean, Yukon.

Additional data supporting increasing curviness of the jet stream over recent millennia came from seven other sites: Crawford Lake, Ontario; Castor Lake, Wash.; Little Salt Spring, Fla.; Estancia Lake, N.M.; Crevice Lake, Mont.; and Dog and Felker lakes, British Columbia. Some sites provided oxygen isotope data; others showed changes in weather patterns based on tree ring growth or spring deposits.

Simulating the Jet Stream

As a test of what the cave and lake sediments revealed, Bowen’s team did computer simulations of climate using software that takes isotopes into account.

Simulated climate and oxygen isotope patterns for the Middle Holocene resemble the flat jet stream pattern, while simulations of modern conditions resemble the curvy pattern – supporting the switch toward increasing jet stream sinuosity 4,000 years ago.

Why did the trend start then?

“It was a time when seasonality was becoming weaker,” Bowen says. The Northern Hemisphere was closer to the sun during the summer 8,000 years ago than it was 4,000 years ago or is now, due to a 20,000-year cycle in Earth’s orbit. He envisions a tipping point 4,000 years ago when weakening summer sunlight reduced the equator-to-pole temperature difference and, along with an intensifying El Niño climate pattern, pushed the jet stream toward greater curviness.

The Atlantic Ocean dances with the sun and volcanoes

Imagine a ballroom in which two dancers apparently keep in time to their own individual rhythm. The two partners suddenly find themselves moving to the same rhythm and, after a closer look, it is clear to see which one is leading.

It was an image like this that researchers at Aarhus University saw when they compared studies of solar energy release and volcanic activity over the last 450 years with reconstructions of ocean temperature fluctuations during the same period.

The results showed that during approximately the last 250 years – since the period known as the Little Ice Age – there is a clear correlation: the external forces, i.e. the Sun’s energy cycle and the impact of volcanic eruptions, are accompanied by a corresponding temperature fluctuation with a time lag of about five years.
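A lagged relationship like this can be illustrated by sliding one time series against the other and picking the offset that maximizes correlation; the sketch below uses synthetic data with a built-in five-year lag (the real study worked with reconstructed forcing and temperature records, not this toy series):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "external forcing" and an ocean response lagging it by 5 years.
years = 250
forcing = rng.normal(size=years)
lag_true = 5
response = np.roll(forcing, lag_true) + 0.3 * rng.normal(size=years)
response[:lag_true] = 0.0  # np.roll wraps around; zero the wrapped samples

def best_lag(x, y, max_lag=20):
    """Return the lag (y behind x) with the highest Pearson correlation."""
    corrs = [np.corrcoef(x[:len(x) - lag], y[lag:])[0, 1]
             for lag in range(max_lag + 1)]
    return int(np.argmax(corrs))

print(best_lag(forcing, response))  # should recover the ~5-year lag
```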

In the previous two centuries, i.e. during the Little Ice Age, the link was not as strong, and the temperature of the Atlantic Ocean appears to have followed its own rhythm to a greater extent.

The results were recently published in the scientific journal Nature Communications.

In addition to filling in yet another piece of the puzzle associated with understanding the complex interaction of the natural forces that control the climate, the Danish researchers paved the way for linking the two competing interpretations of the origin of the oscillation phenomenon.

Temperature fluctuations discovered around the turn of the millennium

The climate is defined on the basis of data including mean temperature values recorded over a period of thirty years. Northern Europe thus has a warm and humid climate compared with other regions at the same latitudes. This is due to the North Atlantic Drift (often referred to as the Gulf Stream), an ocean current that transports relatively warm water from the south-west part of the North Atlantic to the sea off the coast of Northern Europe.

Around the turn of the millennium, however, climate researchers became aware that the average temperature of the Atlantic Ocean was not entirely stable, but actually fluctuated at the same rate throughout the North Atlantic. This phenomenon is called the Atlantic Multidecadal Oscillation (AMO), which consists of relatively warm periods lasting thirty to forty years being replaced by cool periods of the same duration.

The researchers were able to read small systematic variations in the water temperature in the North Atlantic in measurements taken by ships during the last 140 years.

Although the temperature fluctuations are small – less than 1°C – there is a general consensus among climate researchers that the AMO phenomenon has had a major impact on the climate in the area around the North Atlantic for thousands of years. Until now, however, there has been doubt about what could cause this slow rhythm in the temperature of the Atlantic Ocean.

One model explains the phenomenon as internal variability in the ocean circulation – somewhat like a bathtub sloshing water around in its own rhythm. Another model explains the AMO as being driven by fluctuations in the amount of solar energy received by the Earth, and as being affected by small changes in the energy radiated by the Sun itself and the after-effects of volcanic eruptions. Both these factors are also known as ‘external forces’ that have an impact on the Earth’s radiation balance.

However, there has been considerable scepticism towards the idea that a phenomenon such as an AMO could be driven by external forces at all – a scepticism that the Aarhus researchers now demonstrate as unfounded.

“Our new investigations clearly show that, since the Little Ice Age, there has been a correlation between the known external forces and the temperature fluctuations in the ocean that help control our climate. At the same time, however, the results also show that this can’t be the only driving force behind the AMO, and the explanation must therefore be found in a complex interaction between a number of mechanisms. It should also be pointed out that these fluctuations occur on the basis of evenly increasing ocean temperatures during the last approximately fifty years – an increase connected with global warming,” says Associate Professor Mads Faurschou Knudsen, Department of Geoscience, Aarhus University, who is the main author of the article.

Convincing data from the Earth’s own archives

Researchers have attempted to make computer simulations of the phenomenon ever since the discovery of the AMO, partly to enable a better understanding of the underlying mechanism. However, it is difficult for the computer models to reproduce the actual AMO signal that can be read in the temperature data from the last 140 years.

Associate Professor Knudsen and his colleagues instead combined all available data from the Earth’s own archives, i.e. previous studies of items such as radioactive isotopes and volcanic ash in ice cores. This provides information about solar energy release and volcanic activity during the last 450 years, and the researchers compared the data with reconstructions of the AMO’s temperature rhythm during the same period.

“We’ve only got direct measurements of the Atlantic Ocean temperature for the last 140 years, where it was measured by ships. But how do you measure the water temperature further back in time? Studies of growth rings in trees from the entire North Atlantic region come into the picture here, where ‘good’ and ‘bad’ growth conditions are calibrated to the actual measurements, and the growth rings from trees along the coasts further back in time can therefore act as reserve thermometers,” explains Associate Professor Knudsen.

The results provide a new and very important perspective on the AMO phenomenon because they are based on data and not computer models, which are inherently incomplete. The problem is that the models do not completely describe all the physical correlations and feedbacks in the system, partly because these are not fully understood. And when the models are thus unable to reproduce the actual AMO signal, it is hard to know whether they have captured the essence of the AMO phenomenon.

Impact of the sun and volcanoes

An attempt to simply explain how external forces such as the Sun and volcanoes can control the climate could sound like this: a stronger Sun heats up the ocean, while the ash from volcanic eruptions shields the Sun and cools down the ocean. However, it is hardly as simple as that.

“Fluctuations in ocean temperature have a time lag of about five years in relation to the peaks we can read in the external forces. However, the direct effect of major volcanic eruptions is clearly seen as early as the same year in the mean global atmospheric temperature, i.e. a much shorter delay. The effect we studied is more complex, and it takes time for this effect to spread to the ocean currents,” explains Associate Professor Knudsen.

“An interesting new theory among solar researchers and meteorologists is that the Sun can control climate variations via the very large variations in UV radiation, which are partly seen in connection with changes in sunspot activity during the Sun’s eleven-year cycle. UV radiation heats the stratosphere in particular via increased production of ozone, which can have an impact on wind systems and thereby indirectly on the global ocean currents as well,” says Associate Professor Knudsen. However, he emphasises that researchers have not yet completely understood how a development in the stratosphere can affect the ocean currents on Earth.

Towards a better understanding of the climate

“In our previous study of the climate in the North Atlantic region during the last 8,000 years, we were able to show that the temperature of the Atlantic Ocean was presumably not controlled by the Sun’s activity. Here the temperature fluctuated in its own rhythm for long intervals, with warm and cold periods lasting 25 years. The prevailing pattern was that this climate fluctuation in the ocean was approximately 30–40% faster than the fluctuation we’d previously observed in solar activity, which lasted about ninety years. What we can now see is that the Atlantic Ocean would like to – or possibly even prefer to – dance alone. However, under certain circumstances, the external forces interrupt the ocean’s own rhythm and take over the lead, which has been the case during the last 250 years,” says Associate Professor Bo Holm Jacobsen, Department of Geoscience, Aarhus University, who is the co-author of the article.

“It’ll be interesting to see how long the Atlantic Ocean allows itself to be led in this dance. The scientific challenge partly lies in understanding the overall conditions under which the AMO phenomenon is sensitive to fluctuations in solar activity and volcanic eruptions,” he continues.

“During the last century, the AMO has had a strong bearing on significant weather phenomena such as hurricane frequency and droughts – with considerable economic and human consequences. A better understanding of this phenomenon is therefore an important step for efforts to deal with and mitigate the impact of climate variations,” Associate Professor Knudsen concludes.

Birth of Earth’s continents

New research led by a University of Calgary geophysicist provides strong evidence against continent formation above a hot mantle plume – an environment similar to the one that presently exists beneath the Hawaiian Islands.

The analysis, published this month in Nature Geoscience, indicates that the nuclei of Earth’s continents formed as a byproduct of mountain-building processes, by stacking up slabs of relatively cold oceanic crust. This process created thick, strong ‘keels’ in the Earth’s mantle that supported the overlying crust and enabled continents to form.

The scientific clues leading to this conclusion derived from computer simulations of the slow cooling process of continents, combined with analysis of the distribution of diamonds in the deep Earth.

The Department of Geoscience’s Professor David Eaton developed computer software to enable numerical simulation of the slow diffusive cooling of Earth’s mantle over a time span of billions of years.
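Eaton's actual software is not described here, but the core idea of slow diffusive cooling can be illustrated with a minimal one-dimensional finite-difference sketch. All parameter values below (diffusivity, column depth, temperatures) are assumed round numbers for illustration, not values from the study:

```python
import numpy as np

# Minimal 1D finite-difference sketch of diffusive cooling of a mantle
# column, in the spirit of (but far simpler than) the 3D simulations
# described above. All parameter values are assumed round numbers.

KAPPA = 1e-6            # thermal diffusivity of mantle rock, m^2/s (assumed)
DEPTH = 400e3           # model column depth, m (assumed)
NX = 41                 # grid points
DX = DEPTH / (NX - 1)
YEAR = 3.156e7          # seconds per year
DT = 0.25 * DX**2 / KAPPA   # half the explicit (FTCS) stability limit

def cool(t_surface=0.0, t_mantle=1400.0, t_end_years=2.5e9):
    """Diffuse an initially hot column toward a cold, fixed surface."""
    T = np.full(NX, t_mantle)
    T[0] = t_surface
    for _ in range(int(t_end_years * YEAR / DT)):
        # FTCS update of the interior; both boundaries are held fixed.
        T[1:-1] += KAPPA * DT / DX**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return T

profile = cool()   # temperature vs. depth after 2.5 billion model years
```

Even this toy version shows the essential behaviour: over billions of years a cold, conductive "lid" grows downward from the surface, which is the thermal backdrop for the keels discussed below.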

Working in collaboration with former graduate student Claire Perry, now an assistant professor at the Université du Québec à Montréal, Eaton relied on the geological record of diamonds found in Africa to validate his innovative computer simulations.

“For the first time, we are able to quantify the thermal evolution of a realistic 3D Earth model spanning billions of years from the time continents were formed,” states Perry.

Mantle plumes consist of an upwelling of hot material within Earth’s mantle. Plumes are thought to be the cause of some volcanic centres, especially those that form a linear volcanic chain like Hawaii. Diamonds, which are generally limited to the deepest and oldest parts of the continental mantle, provide a wealth of information on how the host mantle region may have formed.

“Ancient mantle keels are relatively strong, cold and sometimes diamond-bearing material. They are known to extend to depths of 200 kilometres or more beneath the ancient core regions of continents,” explains Professor David Eaton. “These mantle keels resisted tectonic recycling into the deep mantle, allowing the preservation of continents over geological time and providing suitable environments for the development of the terrestrial biosphere.”

His method takes into account important factors such as dwindling contribution of natural radioactivity to the heat budget, and allows for the calculation of other properties that strongly influence mantle evolution, such as bulk density and rheology (mechanical strength).

“Our computer model emerged from a multi-disciplinary approach combining classical physics, mathematics and computer science,” explains Eaton. “By combining those disciplines, we were able to tackle a fundamental geoscientific problem, which may open new doors for future research.”

This work provides significant new scientific insights into the formation and evolution of continents on Earth.

This computer simulation spanning 2.5 billion years of Earth history shows the density difference of the mantle relative to an oceanic reference, starting from a cooler initial state. Density is controlled by mantle composition as well as by the slowly cooling temperature; a keel of low-density material extending to about 260 km depth on the left side (x < 600 km) provides buoyancy that prevents the continent from being subducted (‘recycled’ into the deep Earth). The graph at the top shows a computed elevation model. – David Eaton, University of Calgary.

West Antarctica ice sheet existed 20 million years earlier than previously thought

Adelie penguins walk in file on sea ice in front of US research icebreaker Nathaniel B. Palmer in McMurdo Sound. – John Diebold

The results of research conducted by professors at UC Santa Barbara and colleagues mark the beginning of a new paradigm for our understanding of the history of Earth’s great global ice sheets. The research shows that, contrary to the popularly held scientific view, an ice sheet on West Antarctica existed 20 million years earlier than previously thought.

The findings indicate that ice sheets first grew on the West Antarctic subcontinent at the start of a global transition from warm greenhouse conditions to a cool icehouse climate 34 million years ago. Previous computer simulations were unable to produce the amount of ice that geological records suggest existed at that time because neighboring East Antarctica alone could not support it. The findings were published today in Geophysical Research Letters, a journal of the American Geophysical Union.

Given that more ice grew than could be hosted on East Antarctica alone, some researchers proposed that the missing ice formed in the northern hemisphere, many millions of years before the documented ice growth in that hemisphere, which started about 3 million years ago. But the new research shows it is not necessary to have ice hosted in the northern polar regions at the start of the greenhouse–icehouse transition.

Earlier research published in 2009 and 2012 by the same team showed that the West Antarctic bedrock was much higher in elevation at the time of the global climate transition than it is today, with much of its land above sea level. The belief that West Antarctic elevations had always been low-lying (as they are today) led researchers to ignore the region in past studies. The new research presents compelling evidence that this higher land mass enabled a large ice sheet to be hosted earlier than previously realized, despite a warmer ocean in the past.

“Our new model identifies West Antarctica as the site needed for the accumulation of the extra ice on Earth at that time,” said lead author Douglas S. Wilson, a research geophysicist in UCSB’s Department of Earth Science and Marine Science Institute. “We find that the West Antarctic Ice Sheet first appeared earlier than the previously accepted timing of its initiation sometime in the Miocene, about 14 million years ago. In fact, our model shows it appeared at the same time as the massive East Antarctic Ice Sheet some 20 million years earlier.”

Wilson and his team used a sophisticated numerical ice sheet model to support this view. Using their new bedrock elevation map for the Antarctic continent, the researchers created a computer simulation of the initiation of the Antarctic ice sheets. Unlike previous computer simulations of Antarctic glaciation, this research found the nascent Antarctic ice sheet included substantial ice on the subcontinent of West Antarctica. The modern West Antarctic Ice Sheet contains about 10 percent of the total ice on Antarctica and is similar in scale to the Greenland Ice Sheet.

West Antarctica and Greenland are both major players in scenarios of sea level rise due to global warming because of the sensitivity of the ice sheets on these subcontinents. Recent scientific estimates conclude that global sea level would rise an average of 11 feet should the West Antarctic Ice Sheet melt – on top of the roughly 24 feet that melting of the Greenland Ice Sheet would contribute.

The UCSB researchers computed a range of ice sheets that consider the uncertainty in the topographic reconstructions, all of which show ice growth on East and West Antarctica 34 million years ago. A surprising result is that the total volume of ice on East and West Antarctica at that time could be more than 1.4 times greater than previously realized and was likely larger than the ice sheet on Antarctica today.

“We feel it is important for the public to know that the origins of the West Antarctic Ice Sheet are under increased scrutiny and that scientists are paying close attention to its role in Earth’s climate now and in the past,” concluded co-author Bruce Luyendyk, UCSB professor emeritus in the Department of Earth Science and research professor at the campus’s Earth Research Institute.

Devastating long-distance impact of earthquakes

In 2006 the island of Java, Indonesia, was struck by a devastating earthquake, followed by the onset of a mud eruption to the east that flooded villages over several square kilometers and continues to erupt today. Until now, researchers believed the earthquake was too far from the mud volcano to have triggered the eruption. Geophysicists at the University of Bonn, Germany, and ETH Zurich, Switzerland, used computer simulations to show that such triggering is possible over long distances. The results have been published in “Nature Geoscience.”

On May 27, 2006 the ground of the Indonesian island of Java was shaken by a magnitude 6.3 earthquake. The epicenter was located 25 km southwest of the city of Yogyakarta, and the rupture initiated at a depth of 12 km. The earthquake took thousands of lives, injured tens of thousands and destroyed buildings and homes. 47 hours later, about 250 km from the earthquake hypocenter, a mud volcano formed that came to be known as “Lusi”, short for “Lumpur Sidoarjo”. Hot mud erupted in the vicinity of an oil drilling well, shooting mud up to 50 m into the sky and flooding the area. Scientists expect the mud volcano to be active for many more years.

Eruption of mud volcano has natural cause

Was the eruption of the mud triggered by natural events, or was it man-made by the nearby exploration well? Geophysicists at the University of Bonn, Germany, and at ETH Zürich, Switzerland, investigated this question with numerical wave-propagation experiments. “Many researchers believed that the earthquake epicenter was too far from Lusi to have activated the mud volcano,” says Prof. Dr. Stephen A. Miller from the Department of Geodynamics at the University of Bonn. However, using computer simulations that include the geological features of the Lusi subsurface, Miller’s team concluded that the earthquake was the trigger, despite the long distance.

The overpressured solid mud layer was trapped between layers with different acoustic properties, and this system was shaken by the earthquake and its aftershocks like a bottle of champagne. The key, however, is the reflections provided by the dome-shaped geology underneath Lusi, which focused the seismic waves of the earthquakes like the echo inside a cave. Prof. Stephen Miller explains: “Our simulations show that the dome-shaped structure with different properties focused seismic energy into the mud layer and could very well have liquefied the mud, which then injected into nearby faults.”

Previous studies underestimated the energy of the seismic waves because they considered ground motion only at the surface; the geophysicists at the University of Bonn argue that surface motions were much less intense than those at depth. The dome-like structure “kept” the seismic waves at depth and damped those that reached the surface. “This was actually a lower estimate of the focusing effect, because only one wave cycle was input. The effect increases with each wave cycle because of the reducing acoustic impedance of the pressurizing mud layer.” In response to claims that the reported highest-velocity layer used in the modeling is a measurement artifact, Miller says “that does not change our conclusions, because this effect will occur whenever a layer of low acoustic impedance is sandwiched between high-impedance layers, irrespective of the exact values of the impedances. And the source of the Lusi mud was the inside of the sandwich.”
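The “sandwich” argument can be made concrete with the standard normal-incidence reflection-coefficient formula, R = (Z₂ − Z₁)/(Z₂ + Z₁). The impedance values in this sketch are illustrative assumptions, not measured Lusi properties; the point is that the result depends only on the impedance contrast, not on the exact values:

```python
# Normal-incidence reflection coefficient at an interface between media of
# acoustic impedance z1 (incidence side) and z2: R = (z2 - z1) / (z2 + z1).
# The impedance values below are illustrative assumptions, not measured
# Lusi properties.

def reflection(z1, z2):
    """Amplitude reflection coefficient, incidence from medium 1."""
    return (z2 - z1) / (z2 + z1)

Z_MUD = 0.5e6    # low-impedance, overpressured mud layer (assumed)
Z_ROCK = 7.0e6   # stiffer bounding layers (assumed)

# Seen from inside the mud layer, both bounding interfaces reflect strongly,
# so wave energy reverberates within the layer instead of escaping.
r = reflection(Z_MUD, Z_ROCK)
energy_kept_per_bounce = r ** 2   # fraction of energy reflected back inward

# Only the impedance ratio matters: scaling both values leaves R unchanged,
# which is why the exact impedances are not critical to the conclusion.
assert abs(reflection(1.0e6, 14.0e6) - r) < 1e-12
```

With this (assumed) contrast, roughly three quarters of the wave energy is reflected back into the mud layer at each bounce, which is the trapping-and-reverberation effect Miller describes.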

It has already been proposed that a tectonic fault connects Lusi to a volcanic system 15 km away. “This connection probably supplies the mud volcano with heat and fluids that keep Lusi erupting actively up to today,” explains Miller.

With their publication, the scientists from Bonn and Zürich point out that earthquakes can trigger processes over long distances, and that this focusing effect may apply to other hydrothermal and volcanic systems. Stephen Miller concludes: “Being a geological rarity, the mud volcano may contribute to a better understanding of triggering processes and relationships between seismic and volcanic activity.” Miller adds: “Maybe this work will settle the long-standing controversy and focus instead on helping those affected.” The island of Java is part of the so-called Pacific Ring of Fire, a volcanic belt that surrounds the entire Pacific Ocean. Here, oceanic crust is subducted underneath oceanic and continental tectonic plates, leading to melting of crustal material at depth. The resulting magma rises and feeds numerous volcanoes.

Earthquake acoustics can indicate if a massive tsunami is imminent, Stanford researchers find

On March 11, 2011, a magnitude 9.0 undersea earthquake occurred 43 miles off the shore of Japan. The earthquake generated an unexpectedly massive tsunami that washed over eastern Japan roughly 30 minutes later, killing more than 15,800 people and injuring more than 6,100. More than 2,600 people are still unaccounted for.

Now, computer simulations by Stanford scientists reveal that sound waves in the ocean produced by the earthquake probably reached land tens of minutes before the tsunami. If correctly interpreted, they could have offered a warning that a large tsunami was on the way.

Although various systems can detect undersea earthquakes, they can’t reliably tell which will form a tsunami, or predict the size of the wave. There are ocean-based devices that can sense an oncoming tsunami, but they typically provide only a few minutes of advance warning.

Because the sound from a seismic event will reach land well before the water itself, the researchers suggest that identifying the specific acoustic signature of tsunami-generating earthquakes could lead to a faster-acting warning system for massive tsunamis.

Discovering the signal

The finding was something of a surprise. The earthquake’s epicenter had been traced to the underwater Japan Trench, a subduction zone about 40 miles east of Tohoku, the northeastern region of Japan’s larger island. Based on existing knowledge of earthquakes in this area, seismologists puzzled over why the earthquake rupture propagated from the underground fault all the way up to the seafloor, creating a massive upward thrust that resulted in the tsunami.

Direct observations of the fault were scarce, so Eric Dunham, an assistant professor of geophysics in the School of Earth Sciences, and Jeremy Kozdon, a postdoctoral researcher working with Dunham, began using the cluster of supercomputers at Stanford’s Center for Computational Earth and Environmental Science (CEES) to simulate how the tremors moved through the crust and ocean.

The researchers built a high-resolution model that incorporated the known geologic features of the Japan Trench and used CEES simulations to identify possible earthquake rupture histories compatible with the available data.

Run retrospectively, the models accurately reproduced the seafloor uplift observed in the earthquake, which is directly related to tsunami wave heights, and also simulated the sound waves that propagated within the ocean.

In addition to providing valuable insight into the seismic events as they likely occurred during the 2011 earthquake, the researchers identified the specific fault conditions necessary for ruptures to reach the seafloor and create large tsunamis.

The model also generated acoustic data; an interesting revelation of the simulation was that tsunamigenic surface-breaking ruptures, like the 2011 earthquake, produce higher-amplitude ocean acoustic waves than ruptures that do not break the seafloor.

The model showed how those sound waves would have traveled through the water and indicated that they reached shore 15 to 20 minutes before the tsunami.

“We’ve found that there’s a strong correlation between the amplitude of the sound waves and the tsunami wave heights,” Dunham said. “Sound waves propagate through water 10 times faster than the tsunami waves, so we can have knowledge of what’s happening a hundred miles offshore within minutes of an earthquake occurring. We could know whether a tsunami is coming, how large it will be and when it will arrive.”
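A back-of-envelope calculation shows where that speed advantage comes from. The sound speed and water depth below are assumed round numbers, not values from the study; tsunami speed in the open ocean follows the shallow-water relation √(gh):

```python
import math

# Back-of-envelope travel-time comparison for ocean acoustic waves versus
# the tsunami itself. The sound speed and water depth are assumed round
# numbers (not values from the study); tsunami speed uses the
# shallow-water approximation sqrt(g * h).

C_SOUND = 1500.0   # speed of sound in seawater, m/s (typical value)
DEPTH = 4000.0     # open-ocean depth, m (assumed)
G = 9.81           # gravitational acceleration, m/s^2

def travel_times(distance_m, depth_m=DEPTH):
    """Return (acoustic, tsunami) travel times in seconds."""
    c_tsunami = math.sqrt(G * depth_m)   # ~200 m/s at 4 km depth
    return distance_m / C_SOUND, distance_m / c_tsunami

# An earthquake roughly 100 miles (~160 km) offshore:
t_sound, t_tsunami = travel_times(160e3)
lead_minutes = (t_tsunami - t_sound) / 60.0   # on the order of ten minutes
```

Under these assumptions the acoustic signal arrives within a couple of minutes while the tsunami takes over ten, consistent with the 15-to-20-minute lead times the Stanford simulations found for the 2011 event.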

Worldwide application

The team’s model could apply to tsunami-forming fault zones around the world, though the characteristics of the telltale acoustic signature might vary with the geology of the local environment. The crustal composition and orientation of faults off the coasts of Japan, Alaska, the Pacific Northwest and Chile differ greatly.

“The ideal situation would be to analyze lots of measurements from major events and eventually be able to say, ‘this is the signal’,” said Kozdon, who is now an assistant professor of applied mathematics at the Naval Postgraduate School. “Fortunately, these catastrophic earthquakes don’t happen frequently, but we can input these site-specific characteristics into computer models – such as those made possible with the CEES cluster – in the hopes of identifying acoustic signatures that indicate whether or not an earthquake has generated a large tsunami.”

Dunham and Kozdon pointed out that identifying a tsunami signature doesn’t complete the warning system. Underwater microphones called hydrophones would need to be deployed on the seafloor or on buoys to detect the signal, which would then need to be analyzed to confirm a threat, both of which could be costly. Policymakers would also need to work with scientists to settle on the degree of certainty needed before pulling the alarm.

If these points can be worked out, though, the technique could help provide precious minutes for an evacuation.

The study is detailed in the current issue of the Bulletin of the Seismological Society of America.

Supercomputer Unleashes Virtual 9.0 Megaquake in Pacific Northwest

Scientists used a supercomputer-driven ‘virtual earthquake’ to explore likely ground shaking in a magnitude 9.0 megathrust earthquake in the Pacific Northwest. Peak ground velocities are displayed in yellow and red. The legend represents speed in meters per second (m/s) with red equaling 2.3 m/s. Although the largest ground motions occur offshore near the fault and decrease eastward, sedimentary basins lying beneath some cities amplify the shaking in Seattle, Tacoma, Olympia, and Vancouver, increasing the risk of damage. – Credit: Kim Olsen, SDSU

On January 26, 1700, at about 9 p.m. local time, the Juan de Fuca plate beneath the ocean in the Pacific Northwest suddenly moved, slipping some 60 feet eastward beneath the North American plate in a monster quake of approximately magnitude 9, setting in motion large tsunamis that struck the coast of North America and traveled to the shores of Japan.

Since then, the earth beneath the region – which includes the cities of Vancouver, Seattle and Portland – has been relatively quiet. But scientists believe that earthquakes with magnitudes greater than 8, so-called “megathrust events,” occur along this fault on average every 400 to 500 years.

To help prepare for the next megathrust earthquake, a team of researchers led by seismologist Kim Olsen of San Diego State University (SDSU) used a supercomputer-powered “virtual earthquake” program to calculate for the first time realistic three-dimensional simulations that describe the possible impacts of megathrust quakes on the Pacific Northwest region. Also participating in the study were researchers from the San Diego Supercomputer Center at UC San Diego and the U.S. Geological Survey.

What the scientists learned from this simulation is not reassuring, as reported in the Journal of Seismology, particularly for residents of downtown Seattle.

With a rupture scenario beginning in the north and propagating toward the south along the 600-mile-long Cascadia Subduction Zone, the ground moved about 1 ½ feet per second in Seattle; nearly 6 inches per second in Tacoma, Olympia and Vancouver; and 3 inches per second in Portland, Oregon. Additional simulations, especially of earthquakes that begin in the southern part of the rupture zone, suggest that the ground motion under some conditions can be up to twice as large.

“We also found that these high ground velocities were accompanied by significant low-frequency shaking, like what you feel in a roller coaster, that lasted as long as five minutes – and that’s a long time,” said Olsen.

The long-duration shaking, combined with high ground velocities, raises the possibility that such an earthquake could inflict major damage on metropolitan areas – especially on high-rise buildings in downtown Seattle. Compounding the risks, Seattle, Tacoma, and Olympia – like Los Angeles to the south – sit on top of sediment-filled geological basins that are prone to greatly amplifying the waves generated by major earthquakes.

“One thing these studies will hopefully do is to raise awareness of the possibility of megathrust earthquakes happening at any given time in the Pacific Northwest,” said Olsen. “Because these events will tend to occur several hundred kilometers from major cities, the study also implies that the region could benefit from an early warning system that can allow time for protective actions before the brunt of the shaking starts.” Depending on how far the earthquake is from a city, early warning systems could give from a few seconds to a few tens of seconds to implement measures, such as automatically stopping trains and elevators.

Added Olsen, “The information from these simulations can also play a role in research into the hazards posed by large tsunamis, which can originate from megathrust earthquakes like the 2004 Sumatra–Andaman earthquake in Indonesia.” One of the largest earthquakes ever recorded, the magnitude 9.2 Sumatra–Andaman event was felt as far away as Bangladesh, India, and Malaysia, and triggered devastating tsunamis that killed more than 200,000 people.

In addition to increasing scientific understanding of these massive earthquakes, the results of the simulations can also be used to guide emergency planners, to improve building codes, and help engineers design safer structures — potentially saving lives and property in this region of some 9 million people.

Even with the large supercomputing and data resources at SDSC, creating “virtual earthquakes” is a daunting task. The computations to prepare initial conditions were carried out on SDSC’s DataStar supercomputer, and the resulting information was then transferred for the main simulations to the center’s Blue Gene Data supercomputer via SDSC’s advanced virtual file system, GPFS-WAN, which makes data seamlessly available on different – sometimes distant – supercomputers.

Coordinating the simulations required a complex choreography of moving information into and out of the supercomputer as Olsen’s sophisticated “Anelastic Wave Model” simulation code was running. Completing just one of several simulations, running on 2,000 supercomputer processors, required some 80,000 processor hours – equal to running one program continuously on your PC for more than 9 years!
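The arithmetic behind that comparison is simple to check:

```python
# Sanity check of the processor-hour figures quoted above: 80,000
# processor-hours spread across 2,000 processors, versus one machine.

PROC_HOURS = 80_000
N_PROCS = 2_000

wall_clock_hours = PROC_HOURS / N_PROCS      # 40 hours on the cluster
single_pc_years = PROC_HOURS / (24 * 365)    # just over 9 years on one PC
```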

“To solve the new challenges that arise when researchers need to run their codes at the largest scales, and data sets grow to great size, we worked closely with the earthquake scientists through several years of code optimization and modifications,” said SDSC computational scientist Yifeng Cui, who contributed numerous refinements to allow the computer model to “scale up” to capture a magnitude 9 earthquake over such a vast area.

In order to run the simulations, the scientists must recreate in their model the components that encompass all the important aspects of the earthquake. One component is an accurate representation of the earth’s subsurface layering, and how its structure will bend, reflect, and change the size and direction of the traveling earthquake waves. Co-author William Stephenson of the USGS worked with Olsen and Andreas Geisselmeyer, from Ulm University in Germany, to create the first unified “velocity model” of the layering for this entire region, extending from British Columbia to Northern California.

Another component is a model of the earthquake source from the slipping of the Juan de Fuca plate underneath the North American plate. Making use of the extensive measurements of the massive 2004 Sumatra–Andaman earthquake in Indonesia, the scientists developed a model of the earthquake source for similar megathrust earthquakes in the Pacific Northwest.

The sheer physical size of the region in the study was also challenging. The scientists included in their virtual model an immense slab of the earth more than 650 miles long by 340 miles wide by 30 miles deep — more than 7 million cubic miles — and used a computer mesh spacing of 250 meters to divide the volume into some 2 billion cubes. This mesh size allows the simulations to model frequencies up to 0.5 Hertz, which especially affect tall buildings.
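Those figures can be checked with a rough plausibility calculation. The metre conversion is exact; the minimum wave speed and the five-points-per-wavelength rule of thumb are assumptions, not values from the study:

```python
# Rough plausibility check of the mesh figures quoted above. The metre
# conversion is exact; the minimum wave speed and the points-per-wavelength
# rule of thumb below are assumptions.

MILE = 1609.344          # metres per mile
DX = 250.0               # grid spacing, m

def cell_count(lx_miles, ly_miles, lz_miles, dx=DX):
    """Number of dx-sized cubes in an lx x ly x lz (miles) volume."""
    n = 1
    for length_miles in (lx_miles, ly_miles, lz_miles):
        n *= round(length_miles * MILE / dx)
    return n

cells = cell_count(650, 340, 30)    # on the order of 2 billion cubes

# Highest frequency resolvable with ~5 grid points per minimum wavelength:
POINTS_PER_WAVELENGTH = 5
V_MIN = 625.0                       # slowest wave speed modeled, m/s (assumed)
f_max = V_MIN / (POINTS_PER_WAVELENGTH * DX)   # = 0.5 Hz
```

Under these assumptions the 250 m mesh indeed yields roughly two billion cells and an upper frequency limit of 0.5 Hz, consistent with the numbers quoted in the text.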

“One of the strengths of an earthquake simulation model is that it lets us run scenarios of different earthquakes to explore how they may affect ground motion,” said Olsen. Because the accumulated stresses or “slip deficit” can be released in either one large event or several smaller events, the scientists ran scenarios for earthquakes of different sizes.

“We found that the magnitude 9 scenarios generate peak ground velocities five to 10 times larger than those from the smaller magnitude 8.5 quakes.”

The researchers are planning to conduct additional simulations to explore the range of impacts that depend on where the earthquake starts, the direction of travel of the rupture along the fault, and other factors that can vary.

This research was supported by the National Science Foundation, the U.S. Geological Survey, the Southern California Earthquake Center, and computing time on an NSF supercomputer at SDSC.