Antarctica: Heat comes from the deep

The Antarctic ice sheet is a giant water reservoir. The ice cap on the southern continent is on average 2,100 meters thick and contains about 70 percent of the world’s fresh water. If this ice mass were to melt completely, it could raise the global sea level by 60 meters. Therefore, scientists carefully observe changes in the Antarctic. In the renowned international journal Science, researchers from Germany, the UK, the US and Japan are now publishing data according to which water temperatures, in particular in the shallow shelf seas of West Antarctica, are rising. “There are many large glaciers in the area. The elevated temperatures have accelerated the melting and sliding of these glaciers in recent decades, and there are no indications that this trend is changing,” says the lead author of the study, Dr. Sunke Schmidtko from GEOMAR Helmholtz Centre for Ocean Research Kiel.

For their study, he and his colleagues from the University of East Anglia, the California Institute of Technology and the University of Hokkaido (Japan) evaluated all oceanographic data from the waters around Antarctica from 1960 to 2014 that were available in public databases. These data show that five decades ago, the water masses in the West Antarctic shelf seas were already warmer than in other parts of Antarctica, for example, in the Weddell Sea. However, the temperature difference is not constant. Since 1960, the temperatures in the West Antarctic Amundsen Sea and the Bellingshausen Sea have been rising. “Based on the data, we were able to see that this shelf process is induced from the open ocean,” says Dr. Schmidtko.

Around Antarctica, at greater depths along the continental slope, water masses with temperatures of 0.5 to 1.5°C (33-35°F) predominate. These temperatures are very warm for Antarctic conditions. “These waters have warmed in West Antarctica over the past 50 years. And they are significantly shallower than 50 years ago,” says Schmidtko. Especially in the Amundsen Sea and the Bellingshausen Sea, they now increasingly spill onto the shelf and warm it.

“These are the regions in which accelerated glacial melting has been observed for some time. We show that oceanographic changes over the past 50 years have probably caused this melting. If the water continues to warm, the increased penetration of warmer water masses onto the shelf will likely further accelerate this process, with an impact on the rate of global sea level rise,” explains Professor Karen Heywood from the University of East Anglia.

The scientists also draw attention to the shoaling of warm water masses in the southwestern Weddell Sea. Here, very cold temperatures (below minus 1.5°C or 29°F) prevail on the shelf, and large-scale melting of shelf ice has not yet been observed. If the shoaling of warm water masses continues, major environmental changes with dramatic consequences are expected for the Filchner and Ronne ice shelves as well. For the first time, glaciers outside West Antarctica could experience enhanced melting from below.

To what extent the diverse biology of the Southern Ocean is influenced by the observed changes is not fully understood. The shelf areas include spawning grounds for Antarctic krill, a shrimp-like crustacean widespread in the Southern Ocean that plays a key role in the Antarctic food chain. Research has shown that spawning cycles could change in warmer conditions. A final assessment of the impact has not yet been made.

The exact reasons for the increased warming and the shoaling of warm water masses have not yet been fully resolved. “We suspect that they are related to large-scale variations in wind systems over the southern hemisphere. But which processes specifically play a role must be evaluated in more detail,” says Dr. Schmidtko.

Volcano hazards and the role of westerly wind bursts in El Niño

On June 27, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in a community in the District of Puna. – USGS

On 27 June, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in Pāhoa, a community in the District of Puna, as well as the only highway accessible to this area. Scientists from the U.S. Geological Survey’s Hawaiian Volcano Observatory (HVO) and the Hawai’i County Civil Defense have been monitoring the volcano’s lava flow and communicating with affected residents through public meetings since 24 August. Eos recently spoke with Michael Poland, a geophysicist at HVO and a member of the Eos Editorial Advisory Board, to discuss how he and his colleagues communicated this threat to the public.

Drilling a Small Basaltic Volcano to Reveal Potential Hazards


Drilling into the Rangitoto Island Volcano in the Auckland Volcanic Field in New Zealand offers insight into a small monogenetic volcano, and may improve understanding of future hazards.

From AGU’s journals: El Niño fades without westerly wind bursts

The warm and wet winter of 1997 brought California floods, Florida tornadoes, and an ice storm in the American northeast, prompting climatologists to dub it the El Niño of the century. Earlier this year, climate scientists thought the coming winter might bring similar extremes, as equatorial Pacific Ocean conditions resembled those seen in early 1997. But the signals weakened by summer, and the El Niño predictions were downgraded. Menkes et al. used simulations to examine the differences between the two years.

El Niño, the warm phase of the El Niño-Southern Oscillation, is defined by abnormally warm sea surface temperatures in the eastern Pacific Ocean and weaker-than-usual trade winds. In a typical year, southeast trade winds push surface water toward the western Pacific “warm pool,” a region essential to Earth’s climate. The trade winds dramatically weaken or even reverse in El Niño years, and the warm pool extends its reach east.

Scientists have struggled to predict El Niño due to irregularities in the shape, amplitude, and timing of the surges of warm water. Previous studies suggested that short-lived westerly wind pulses (i.e. one to two weeks long) could contribute to this irregularity by triggering and sustaining El Niño events.

To understand the vanishing 2014 El Niño, the authors used computer simulations to examine the wind’s role. They found pronounced differences between 1997 and 2014. Both years saw strong westerly wind events between January and March, but those disappeared in 2014 as spring approached. In contrast, the westerly winds persisted through the summer of 1997.

In the past, it was thought that westerly wind pulses were three times as likely to form if the warm pool extended east of the dateline. That did not occur this year. The team says their analysis shows that El Niño’s strength might depend on these short-lived and possibly unpredictable pulses.

###

The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing more than 62,000 members in 144 countries. Join our conversation on Facebook, Twitter, YouTube, and other social media channels.

Felling pine trees to study their wind resistance

Forestry experts from the French Institute for Agricultural Research (INRA), together with technicians from NEIKER-Tecnalia and the Chartered Provincial Council of Bizkaia, felled radiata pine specimens of different ages in order to determine their resistance to gales and to observe the force the wind needs to exert to blow down these trees in the particular conditions of the Basque Country.

This experiment is of great interest to forest managers and will help them manage their woodlands better and incorporate wind as a variable in decisions such as the distribution of plantations or the most propitious moment for felling trees.

Professionals from the forestry sector, including timber growers, foresters, forestry technicians and researchers, gathered to witness the simulation from close quarters. The trees were felled with steel cables that acted as the wind force and were fitted with sensors to measure the force needed to bring the trees down. Each radiata pine had also been fitted with three tilt meters that recorded the degree of tilt according to the force exerted on the tree. That way it was possible to determine the resistance of the roots and the strength of the trunk, two essential parameters for establishing a tree’s capacity to withstand the thrust of the wind.
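The press release does not give the formulas behind these measurements, but in tree-pulling tests of this kind the cable tension and tilt readings are usually combined into a turning moment at the stem base. The following is a minimal sketch of that step, assuming a winched cable anchored at a known height on the stem; the numbers at the end are made up for illustration, not values from the Basque trials.

```python
import math

def turning_moment(cable_force_n: float, anchor_height_m: float,
                   cable_angle_deg: float, tilt_deg: float) -> float:
    """Approximate turning moment (N·m) applied at the stem base by a winched cable.

    cable_force_n   -- tension recorded by the load cell on the cable
    anchor_height_m -- height of the cable attachment on the stem
    cable_angle_deg -- angle of the cable below horizontal
    tilt_deg        -- stem tilt recorded by the inclinometers

    Only the horizontal component of the pull contributes to overturning,
    and the effective lever arm shortens as the stem leans over.
    """
    horizontal_force = cable_force_n * math.cos(math.radians(cable_angle_deg))
    lever_arm = anchor_height_m * math.cos(math.radians(tilt_deg))
    return horizontal_force * lever_arm

# Hypothetical reading: 20 kN of tension, cable attached 10 m up the stem,
# pulled 15 degrees below horizontal, with the stem leaning 5 degrees.
print(f"{turning_moment(20_000, 10.0, 15.0, 5.0):,.0f} N·m")
```

The maximum moment reached before the root plate fails (or the stem breaks) is the resistance figure that later feeds into damage models such as ForestGALES.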

The experiment carried out this morning is part of the seminar ‘FORRISK: Wind damage risk in forests’, which took place in the Bizkaia Aretoa in Bilbao and was organised by NEIKER-Tecnalia in collaboration with the Chartered Provincial Council of Bizkaia, HAZI and the Atlantic Regional Office of EFI (European Forest Institute). The seminar is part of the European project “FORRISK: Network for innovation in silviculture and integrated systems for forest risk management”. This initiative has been co-funded by the ERDF and by the Sub-Ministry for Agriculture, Fisheries and Food Policy of the Government of the Basque Autonomous Community. The seminar took place in Bilbao because of the city’s status as European Forest City 2014.

The seminar was used to present the detailed map of the characteristics of the wind in the Basque Country, which timber growers and forestry managers can now avail themselves of. The map has been produced by researchers at INRA, the French Institute for Agricultural Research, who have used information from the 57 meteorological stations equipped with anemometers in the network of the Basque Meteorological Authority, Euskalmet.

A tool for estimating wind damage

Those attending the seminar also had the chance to get to know the ForestGALES computing tool that allows managers to estimate the probability of wind damage in forests. ForestGALES was originally created for Britain and has been adapted to the characteristics of the Basque geography by INRA, NEIKER-Tecnalia and HAZI technicians. This innovative application is of great use in specifying concrete actions (for example: spacing, silvicultural interventions like clearing or thinning) bearing in mind the probability of wind damage on each plot.

To get the most out of this tool, it is necessary to know the resistance of the roots and strength of the trunks of the relevant species, as well as the characteristics of the wind where the trees are growing. So today’s simulation and the Basque wind map are two fundamental components for developing the ForestGALES model.
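ForestGALES itself is not described in detail in the article, but the logic it outlines, combining a stand's critical resistance with the local wind climate to estimate the probability of damage, can be illustrated with a toy calculation. The critical wind speed and the Weibull parameters below are placeholders chosen for illustration, not values from the Basque study.

```python
import math

def prob_damage_per_year(critical_wind_kmh: float,
                         weibull_scale_kmh: float,
                         weibull_shape: float) -> float:
    """Toy estimate of the chance that the site's annual maximum gust,
    assumed to follow a Weibull distribution, exceeds the wind speed at
    which the tree is expected to overturn or break."""
    return math.exp(-((critical_wind_kmh / weibull_scale_kmh) ** weibull_shape))

# Placeholder inputs: a stand that fails at about 110 km/h on a site whose
# annual maximum gust has a Weibull scale of 80 km/h and shape of 2.
print(f"Annual damage probability ≈ {prob_damage_per_year(110, 80, 2.0):.2f}")
```

In such a scheme, the critical wind speed would come from tree characteristics (including the root resistance measured in pulling tests like today's), and the wind parameters from a map such as the one presented at the seminar.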

Increase in extreme winds owing to climate change

Cyclones like Klaus (2009) and Xynthia (2010) brought down over 200,000 cubic metres of timber as they passed through the Basque Country, owing to gusts of wind in excess of 228 kilometres per hour. Predictions indicate that the frequency of extreme phenomena like these is set to increase owing to climate change. So the forestry sector needs information and tools that will enable it to tackle the risks resulting from the wind.

New study shows 3 abrupt pulses of CO2 during last deglaciation

A new study shows that the rise of atmospheric carbon dioxide that contributed to the end of the last ice age more than 10,000 years ago did not occur gradually but was characterized by three “pulses” in which CO2 rose abruptly.

Scientists are not sure what caused these abrupt increases, during which CO2 levels rose about 10-15 parts per million – or about 5 percent per episode – over a period of 1-2 centuries. It was likely a combination of factors, they say, including ocean circulation, changing wind patterns, and terrestrial processes.

The finding is important, however, because it casts new light on the mechanisms that take the Earth in and out of ice age regimes. Results of the study, which was funded by the National Science Foundation, appear this week in the journal Nature.

“We used to think that naturally occurring changes in carbon dioxide took place relatively slowly over the 10,000 years it took to move out of the last ice age,” said Shaun Marcott, lead author on the article who conducted his study as a post-doctoral researcher at Oregon State University. “This abrupt, centennial-scale variability of CO2 appears to be a fundamental part of the global carbon cycle.”

Some previous research has hinted at the possibility that spikes in atmospheric carbon dioxide may have accelerated the last deglaciation, but that hypothesis had not been resolved, the researchers say. The key to the new finding is the analysis of an ice core from the West Antarctic that provided the scientists with an unprecedented glimpse into the past.

Scientists studying past climate have been hampered by the limitations of previous ice cores. Cores from Greenland, for example, provide unique records of rapid climate events going back 120,000 years – but high concentrations of impurities don’t allow researchers to accurately determine atmospheric carbon dioxide records. Antarctic ice cores have fewer impurities, but generally have had lower “temporal resolution,” providing less detailed information about atmospheric CO2.

However, a new core from West Antarctica, drilled to a depth of 3,405 meters in 2011 and spanning the last 68,000 years, has “extraordinary detail,” said Oregon State paleoclimatologist Edward Brook, a co-author on the Nature study and an internationally recognized ice core expert. Because the area where the core was taken gets high annual snowfall, he said, the new ice core provides one of the most detailed records of atmospheric CO2.

“It is a remarkable ice core and it clearly shows distinct pulses of carbon dioxide increase that can be very reliably dated,” Brook said. “These are some of the fastest natural changes in CO2 we have observed, and were probably big enough on their own to impact the Earth’s climate.

“The abrupt events did not end the ice age by themselves,” Brook added. “That might be jumping the gun a bit. But it is fair to say that the natural carbon cycle can change a lot faster than was previously thought – and we don’t know all of the mechanisms that caused that rapid change.”

The researchers say that the increase in atmospheric CO2 from the peak of the last ice age to complete deglaciation was about 80 parts per million, taking place over 10,000 years. Thus, the finding that 30-45 ppm of the increase happened in just a few centuries was significant.
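The significance of those pulses is easiest to see as a rate comparison using only the figures quoted above: an 80 ppm rise spread over roughly 10,000 years versus pulses of 10-15 ppm lasting one to two centuries. A quick back-of-the-envelope check, taking the pulse size and duration at rough midpoints:

```python
# Figures quoted in the article; pulse values taken at rough midpoints.
total_rise_ppm, total_span_yr = 80, 10_000
pulse_rise_ppm, pulse_span_yr = 12.5, 150         # ~10-15 ppm over 1-2 centuries

background_rate = total_rise_ppm / total_span_yr  # ppm per year, full deglaciation
pulse_rate = pulse_rise_ppm / pulse_span_yr       # ppm per year, during a pulse

print(f"average deglacial rate : {background_rate:.4f} ppm/yr")
print(f"rate during a pulse    : {pulse_rate:.4f} ppm/yr")
print(f"a pulse is roughly {pulse_rate / background_rate:.0f}x the average rate")
# Three such pulses together account for the 30-45 ppm mentioned above.
```

In other words, roughly 40-55 percent of the total deglacial CO2 rise arrived at around ten times the average pace.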

The overall rise of atmospheric carbon dioxide during the last deglaciation was thought to have been triggered by the release of CO2 from the deep ocean – especially the Southern Ocean. However, the researchers say that no obvious ocean mechanism is known that would trigger rises of 10-15 ppm over a time span as short as one to two centuries.

“The oceans are simply not thought to respond that fast,” Brook said. “Either the cause of these pulses is at least part terrestrial, or there is some mechanism in the ocean system we don’t yet know about.”

One reason the researchers are reluctant to pin the end of the last ice age solely on CO2 increases is that other processes were taking place, according to Marcott, who recently joined the faculty of the University of Wisconsin-Madison.

“At the same time CO2 was increasing, the rate of methane in the atmosphere was also increasing at the same or a slightly higher rate,” Marcott said. “We also know that during at least two of these pulses, the Atlantic Meridional Overturning Circulation changed as well. Changes in the ocean circulation would have affected CO2 – and indirectly methane, by impacting global rainfall patterns.”

“The Earth is a big coupled system,” he added, “and there are many pieces to the puzzle. The discovery of these strong, rapid pulses of CO2 is an important piece.”

Offshore islands amplify, rather than dissipate, a tsunami’s power

This model shows the impact of coastal islands on a tsunami’s height. – Courtesy of Jose Borrero/eCoast/USC

A long-held belief that offshore islands protect the mainland from tsunamis turns out to be the exact opposite of the truth, according to a new study.

Common wisdom — from Southern California to the South Pacific — for coastal residents and scientists alike has long been that offshore islands would create a buffer that blocked the power of a tsunami. In fact, computer modeling of tsunamis striking a wide variety of different offshore island geometries yielded no situation in which the mainland behind them fared better.

Instead, islands focused the energy of the tsunami, increasing flooding on the mainland by up to 70 percent.

“This is where many fishing villages are located, behind offshore islands, in the belief that they will be protected from wind waves. Even Southern California residents believe that the Channel Islands and Catalina will protect them,” said Costas Synolakis of the USC Viterbi School of Engineering, a member of the multinational team that conducted the research.

The research was inspired by a field survey of the impact of the 2010 tsunami on the Mentawai Islands off Sumatra. The survey data showed that villages located in the shadow of small offshore islets suffered some of the strongest tsunami impacts, worse than villages located along open coasts.

Subsequent computer modeling by Jose Borrero, adjunct assistant research professor at the USC Viterbi Tsunami Research Center, showed that the offshore islands had actually contributed to — not diminished — the tsunami’s impact.

Synolakis then teamed up with researchers Emile Contal and Nicolas Vayatis of Ecoles Normales de Cachan in Paris, and with Themistoklis S. Stefanakis and Frederic Dias, who both have joint appointments at Ecoles Normales de Cachan and University College Dublin, to determine whether that was a one-of-a-kind situation or the norm.

Their study, of which Dias was the corresponding author, was published in Proceedings of the Royal Society A on Nov. 5.

The team designed a computer model that took into consideration various island slopes, beach slopes, water depths, distance between the island and the beach, and wavelength of the incoming tsunami.

“Even a casual analysis of these factors would have required hundreds of thousands of computations, each of which could take up to half a day,” Synolakis said. “So instead, we used machine learning.”

Machine learning is a mathematical process that makes it easier to identify the maximum values of interdependent processes with multiple parameters by allowing the computer to “learn” from previous results.

The computer starts to understand how various tweaks to the parameters affect the overall outcome and finds the best answer more quickly. As a result, findings that traditionally could have required hundreds of thousands of model runs to uncover were reached with only 200 runs.
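The article does not spell out which machine-learning method the team used, so the sketch below should be read as one common way to run this kind of search: Gaussian-process active learning in the style of Bayesian optimization, with a cheap stand-in function in place of the half-day tsunami simulation. The toy "amplification" function, the normalized parameters and the scikit-learn estimator are all illustrative assumptions, not details from the study.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def run_simulation(x):
    """Stand-in for one half-day tsunami run: returns a fake 'amplification'
    for x = (island slope, beach slope, depth, island-beach distance,
    wavelength), each rescaled to [0, 1]. Purely illustrative."""
    return float(np.exp(-8.0 * np.sum((x - 0.6) ** 2)))

# Start from a handful of random parameter combinations...
X = rng.uniform(size=(10, 5))
y = np.array([run_simulation(x) for x in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

# ...then keep running the "simulation" wherever the surrogate model thinks
# the amplification could be largest (mean plus uncertainty), so that the
# full search costs on the order of 200 runs instead of hundreds of thousands.
for _ in range(190):
    gp.fit(X, y)
    candidates = rng.uniform(size=(2000, 5))
    mean, std = gp.predict(candidates, return_std=True)
    best = candidates[np.argmax(mean + 1.5 * std)]   # upper-confidence bound
    X = np.vstack([X, best])
    y = np.append(y, run_simulation(best))

print("largest amplification found:", y.max().round(3))
print("at (normalized) parameters:", X[np.argmax(y)].round(2))
```

Each real evaluation would be a full tsunami simulation rather than the toy function, which is exactly where the savings over a brute-force parameter sweep come from.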

“This work is applicable to some of our tsunami study sites in New Zealand,” said Borrero, who is producing tsunami hazard maps for regions of the New Zealand coast. “The northeast coast of New Zealand has many small islands offshore, similar to those in Indonesia, and our modeling suggests that this results in areas of enhanced tsunami heights.”

“Substantial public education efforts are needed to help better explain to coastal residents tsunami hazards, and whenever they need to be extra cautious and responsive with evacuations during actual emergencies,” Synolakis said.

###

The research was funded by EDSP of ENS-Cachan; the Cultural Service of the French Embassy in Dublin; the ERC; SFI; University College Dublin; and the EU FP7 program ASTARTE. The study can be found online at http://rspa.royalsocietypublishing.org/content/470/2172/20140575.
