Is the ice at the South Pole melting?

Changes in the mass of the ice covering Antarctica are a critical factor in the global climate system. Scientists at the GFZ German Research Centre for Geosciences have now found that the year-to-year mass variations in West Antarctica are mainly attributable to fluctuations in precipitation, which are significantly controlled by the El Niño climate phenomenon. They examined GFZ data from the German-American satellite mission GRACE (Gravity Recovery and Climate Experiment). The investigation showed significant regional differences along the western coast of Antarctica.

Two areas in Antarctica are of particular interest because of their potential sensitivity to global climate change: the Antarctic Peninsula, which is currently experiencing a warming exceeding the global mean and the disappearance of large ice shelf areas, and the Amundsen Sector of West Antarctica, where the largest flow rates and mass losses of the Antarctic Ice Sheet currently occur. For some glaciers the ice thickness is decreasing rapidly, and glaciers and ice streams are retreating notably into the interior. Contributing about 0.3 millimeters per year, both regions account for a considerable share of the current global sea-level rise of about three millimeters per year.

In the study, the mass balance of both regions was reevaluated from gravity data of the satellite mission GRACE, and the resulting estimates were lower than those of conventional mass-balance methods. “With the GRACE time series, it was possible for the first time to observe how the large-scale ice mass in the two areas varies from year to year due to fluctuations in precipitation,” said GFZ scientist Ingo Sasgen. It has long been known that the Pacific El Niño climate phenomenon and snowfall in Antarctica are linked. The complement to the warm El Niño phase, the cold phase known as La Niña, also affects the Antarctic climate: “The cooler La Niña years lead to a strong low-pressure area over the Amundsen Sea, which favors heavy precipitation along the Antarctic Peninsula – the ice mass is increasing there. In contrast, the Amundsen area is dominated by dry air from the interior during this time. El Niño years with their warm phase lead to precisely the opposite pattern: reduced precipitation and mass loss in the Antarctic Peninsula, and a mass gain in the Amundsen Sector,” explains Professor Maik Thomas, head of the section “Earth System Modelling” at the German Research Centre for Geosciences (Helmholtz Association).

Recording the entire Antarctic ice mass and its variations is a central task in climate research and still raises many unanswered questions. In principle, the study shows that the continuous gravity data of the GRACE satellite mission contain another important medium-term climate signal.

Phosphorus identified as the missing link in evolution of animals

A University of Alberta geomicrobiologist and his PhD student are part of a research team that has identified phosphorus as the mystery ingredient that pushed oxygen levels in the oceans high enough to establish the first animals on Earth 750 million years ago.

By examining ancient-ocean sediments, Kurt Konhauser, student Stefan Lalonde and other colleagues discovered that as the last glaciation to encircle Earth receded, it left behind glacial debris containing phosphorus that washed into the oceans. Phosphorus is an essential nutrient that promoted the growth of cyanobacteria, or blue-green algae, whose metabolic byproduct is oxygen. The new, higher oxygen levels in the ocean reached a threshold favourable for animals to evolve.

Konhauser’s past research into ancient phosphorus levels in a unique suite of rocks called banded iron formations led him and his colleagues at the University of California Riverside to their current findings.

In 2007, Konhauser and his U of A team published research in the journal Science showing that, contrary to the then-accepted theory that phosphorus was scarce throughout much of Earth’s history, it was in fact plentiful.

“Now in 2010 we showed that phosphorus levels actually peaked between 750 and 635 million years ago at the very same time that oxygen levels increased, allowing complex life forms to emerge,” says Lalonde. “That establishes our link between phosphorus and the evolution of animals.”

A speed gun for the Earth’s insides

Researchers at the University of Bristol reveal today in the journal Nature that they have developed a seismological ‘speed gun’ for the inside of the Earth. Using this technique they will be able to measure the way the Earth’s deep interior slowly moves around. This mantle motion is what controls the location of our continents and oceans, and where the tectonic plates collide to shake the surface we live on.

For 2,900 km (1,800 miles) beneath our feet, the Earth is made of the rocky mantle. Although solid, it is so hot that it can flow like putty over millions of years. It is heated from below, so that it circulates like water on a stove. While geophysicists know something about how the material moves by the time it reaches the top of the mantle, what goes on at the bottom is still a puzzle. However, researchers need to know both to predict how the Earth’s surface – our home – will behave.

Andy Nowacki, at the School of Earth Sciences at Bristol University, explained: “The only way to measure the inside of the Earth at such huge depths is with seismic waves. When a large earthquake occurs and waves travel through the Earth, they are affected in different ways, and we can examine their properties to work out what is happening thousands of miles beneath our feet, a region where we can never go. This study focuses on a mysterious layer where the mantle meets the core, a sphere of iron at the center of the Earth 7,000 km (4400 miles) across. This part just above the core has curious properties which we can measure using seismic waves that pass through it.”

This enigmatic part of the Earth is known as D″ (pronounced ‘dee-double-prime’). Dr James Wookey said: “We believe that D″ is made from crystals which line up in a certain orientation when the mantle flows. We can measure how they line up, and in this study we do this for one part of the world – North and Central America. In the future our method can then be used to see which direction the mantle is moving everywhere.”

Professor Mike Kendall added: “This part of the Earth is incredibly important. The lowermost mantle is where two colossal, churning engines – the mantle and the core – meet and interact. The core is moving very quickly and creates our magnetic field which protects us from the Sun’s rays. The mantle above is sluggish, but drives the motion of the plates on the Earth’s surface, which build mountains, feed volcanoes and cause earthquakes. Measuring the flow in the lowermost mantle is vital to understanding the long term evolution of the Earth.”

Purdue-led research team finds Haiti quake caused by unknown fault

Eric Calais, a Purdue professor of geophysics, shows GPS equipment used to track movements of the Earth's surface as small as one millimeter. These tiny movements show the build up of stress that could lead to an earthquake and are used in evaluating the potential threat to an area. -  Purdue News Service file photo/David Umberger

Researchers found that a previously unmapped fault was responsible for the devastating Jan. 12 earthquake in Haiti and that the originally blamed fault remains ready to produce a large earthquake.

Eric Calais, a Purdue University professor of earth and atmospheric sciences, led the team that was the first on the ground in Haiti after the magnitude 7.0 earthquake, which killed more than 200,000 people and left 1.5 million homeless.

The team determined the earthquake’s origin to be a previously unmapped fault, which they named the Léogâne fault. The newly discovered fault runs almost parallel to the Enriquillo fault, which was originally thought to be the source of the earthquake, he said.

“This means that the Enriquillo fault is still capable of producing large earthquakes and that Haiti has to adapt to this seismic hazard,” said Calais, who in September was appointed science adviser for the United Nations Development Program in Haiti. “The fault system is more complex than we originally thought, and we don’t yet know how the January earthquake impacted the other faults. Preliminary measurements indicate that the Enriquillo fault did not release any accumulated seismic energy and, therefore, remains a significant threat for Haiti, and Port-au-Prince in particular. We need to investigate the fault system further to be able to determine where the next earthquakes might occur and how large they could be.”

The shifting of the Earth’s crust after a major earthquake can add to or reduce stresses building up in nearby faults and can apply pressures that effectively stop or release other earthquakes. Because of this, the earthquake along the Léogâne fault may have delayed or advanced the timing for the next earthquake on the Enriquillo fault, he said.

“For practical purposes, speculating on when the next earthquake might happen is not an effective strategy,” Calais said. “We rather need to focus attention, energy and funds on proactive measures to help the country adapt to earthquake hazards and, eventually, reduce economic losses and save lives. Our finding raises many important scientific questions and we are working to find the answers, but we already know that the earthquake threat in Haiti is inexorable. The reconstruction process that is now starting in Haiti is an opportunity to build better, of course, but also to develop an effective prevention and mitigation strategy for the future.”

The team analyzed data they recorded before the Jan. 12 earthquake and new measurements taken after the event. Their work is detailed in a paper that will be published in the November issue of Nature Geoscience.

Andrew Freed, paper co-author and a Purdue professor of earth and atmospheric sciences, said the absence of any surface rupture was the first clue that the earthquake did not happen along the Enriquillo fault.

“It was a big surprise that we couldn’t find a surface rupture anywhere,” Freed said. “We did find other physical changes that we expected after an earthquake of that magnitude, but in entirely the wrong location to have come from the Enriquillo fault.”

For instance, the team found that the epicenter area rose by a little more than half a meter and that the earthquake caused contraction of the Earth’s crust, the opposite of what would be expected from the Enriquillo fault, he said.

The team used global positioning system equipment and radar interferometry to measure how the ground moved during the earthquake, which provides insight into what is happening as much as 20 kilometers below the surface. The team then used a computer model to determine what characteristics the source of the earthquake must have in order to produce the observed changes.

Through this work, the team discovered the previously unmapped Léogâne fault, which is located just to the north of the Enriquillo fault and dips at a 60-degree angle to the north. The fault is a blind thrust, meaning one side of the fault is being thrust over the other, but the fault does not reach the surface.

About 30 kilometers of the fault shifted during the January earthquake, and the sides of the fault moved by as much as five meters relative to each other below the Earth’s surface. The full length of the fault is not known, Freed said.
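As a rough consistency check, the reported rupture dimensions can be plugged into the standard seismic-moment relation (moment = rigidity × rupture area × average slip). The rupture width and average slip below are illustrative assumptions, since the article gives only the 30-kilometer length and the up-to-5-meter peak slip:

```python
# Back-of-envelope check that a ~30 km rupture with metre-scale slip is
# consistent with a magnitude-7.0 event. Width (15 km) and average slip
# (2 m) are assumed values, not figures from the study.
import math

RIGIDITY = 3.0e10        # Pa, typical crustal shear modulus
length = 30_000.0        # m, rupture length reported for the fault
width = 15_000.0         # m, assumed down-dip rupture width
avg_slip = 2.0           # m, assumed average slip (peak slip was ~5 m)

moment = RIGIDITY * length * width * avg_slip    # seismic moment, N*m
mw = (2.0 / 3.0) * math.log10(moment) - 6.07     # moment magnitude

print(f"M0 = {moment:.2e} N*m, Mw = {mw:.1f}")   # Mw ~ 6.9
```

With these assumed values the sketch lands close to the observed magnitude 7.0, which is why a buried 30-kilometer rupture is a plausible source.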

“Only portions of a fault are affected during any given earthquake, and the length of the portion affected is relative to the magnitude of the event,” Freed said. “Because this is a blind fault, we don’t have some of the clues at the surface, like scars from past ruptures, that show where the fault runs. On the Enriquillo fault you can almost walk the line of the fault because scars from many past events reveal the fault below. That isn’t the case with the Léogâne fault.”

The team plans to continue to take measurements of the postseismic processes that allow them to understand changing stresses within the Earth’s crust over time that could help point to areas where seismic hazard is increasing. In addition they plan to create models to better understand the fault systems, their behavior and why they exist at these particular locations, Freed said.

In addition to Freed, co-authors include Glen Mattioli of the University of Arkansas; Falk Amelung, Sang-Hoon Hong and Timothy Dixon of the University of Miami; Sigurjón Jónsson of the King Abdullah University of Science and Technology in Saudi Arabia; Pamela Jansma of the University of Texas at Arlington; Claude Prépetit of the Bureau of Mines in Haiti; and Roberte Momplaisir of the State University of Haiti.

Calais has studied the Enriquillo and Septentrional faults on the island of Hispaniola, which includes Haiti and the Dominican Republic, since 1989. His research team has been measuring the buildup of energy along these faults using global positioning system technology for 10 years. The team first reported the risk for a major earthquake there in 2008.

‘Fracking’ mobilizes uranium in Marcellus shale

UB Professor Tracy Bank and her colleagues have found that hydraulic fracturing or 'fracking' of Marcellus shale causes naturally occurring uranium to be released, raising additional environmental concerns. -  UB/Douglas Levere

Scientific and political disputes over drilling Marcellus shale for natural gas have focused primarily on the environmental effects of pumping millions of gallons of water and chemicals deep underground to blast through rocks to release the natural gas.

But University at Buffalo researchers have now found that that process – called hydraulic fracturing or “fracking” – also causes uranium that is naturally trapped inside Marcellus shale to be released, raising additional environmental concerns.

The research will be presented at the annual meeting of the Geological Society of America in Denver on Nov. 2.

Marcellus shale is a massive rock formation that stretches from New York through Pennsylvania, Ohio and West Virginia, and which is often described as the nation’s largest source of natural gas.

“Marcellus shale naturally traps metals such as uranium at levels higher than usually found naturally, but lower than manmade contamination levels,” says Tracy Bank, PhD, assistant professor of geology in UB’s College of Arts and Sciences and lead researcher. “My question was, if they start drilling and pumping millions of gallons of water into these underground rocks, will that force the uranium into the soluble phase and mobilize it? Will uranium then show up in groundwater?”

To find out, Bank and her colleagues at UB scanned the surfaces of Marcellus shale samples from Western New York and Pennsylvania. Using sensitive chemical instruments, they created a chemical map of the surfaces to determine the precise location in the shale of the hydrocarbons, the organic compounds containing natural gas.

“We found that the uranium and the hydrocarbons are in the same physical space,” says Bank. “We found that they are not just physically — but also chemically — bound.

“That led me to believe that uranium in solution could be more of an issue because the process of drilling to extract the hydrocarbons could start mobilizing the metals as well, forcing them into the soluble phase and causing them to move around.”

When Bank and her colleagues reacted samples in the lab with surrogate drilling fluids, they found that the uranium was indeed being solubilized.

In addition, she says, when the millions of gallons of water used in hydraulic fracturing come back to the surface, it could contain uranium contaminants, potentially polluting streams and other ecosystems and generating hazardous waste.

The research required the use of very sophisticated methods of analysis, including one called Time-of-Flight Secondary Ion Mass Spectrometry, or ToF-SIMS, in the laboratory of Joseph A. Gardella Jr., Larkin Professor of Chemistry at UB.

The UB research is the first to map samples using this technique, which identified the precise location of the uranium.

“Even though uranium is not a radioactive risk at these levels, it is still a toxic, deadly metal,” Bank concludes. “We need a fundamental understanding of how uranium exists in shale. The more we understand about how it exists, the better we can predict how it will react to ‘fracking.’”

Scientists pioneer wireless sensors to explore little known glacier phenomenon

This is Dr. Kirk Martinez with the wireless sensor basestation on the Skalafellsjökull glacier in Iceland. -  University of Southampton

Researchers at the University of Southampton are pioneering the use of wireless sensors to study a little-known phenomenon that affects the movement of glaciers.

Professor Jane Hart from the School of Geography and Dr Kirk Martinez of the School of Electronics and Computer Science (ECS) have been awarded a Leverhulme Trust Research Project Grant to study glacier ‘stick-slip’ motion as it affects the Skalafellsjökull glacier in Iceland.

According to the team, scientists know surprisingly little about ‘stick-slip’ motion, the term given to the events that cause ice-sheet movement and occur in the normal course of glacier sliding.

“Due to the logistical problems of studying glaciers and the subglacial environment, we know very little about the process,” said Professor Hart. “Until recently, it was assumed that glaciers flowed slowly and continuously, but there is a growing body of evidence that glacier movement can be episodic and can be modeled in a similar way to earthquakes as stick-slip motion.”

To measure the ‘stick’ phase, the researchers plan to use an innovative wireless multisensory probe, which they developed to use on Glacsweb, a project which deployed the world’s first wireless probe to measure in-situ processes at the base of a glacier in Briksdalsbreen, Norway.

They plan to use GPS and accelerometers on the glacier surface to measure the ‘slip’ phase.

“This research is significant because it uses the most recent technological advances in wireless sensor network research to understand a fundamental property of glacier dynamics,” Dr Martinez added. “Environmental Sensor Networks provide a unique way of studying glacial motion and the associated responses of the ice and till. The Glacsweb system is the only glacial wireless system in use today and serves as an ideal platform to investigate new scientific problems.”

The project will continue for three years and the data collected will be sent back daily to a server in the UK via the mobile phone network, and published on the web for other researchers.

The sound of the underground! New acoustic early warning system for landslide prediction

A new type of sound sensor system has been developed to predict the likelihood of a landslide.

Thought to be the first system of its kind in the world, it works by measuring and analyzing the acoustic behavior of soil to establish when a landslide is imminent so preventative action can be taken.

Noise created by movement under the surface builds to a crescendo as the slope becomes unstable and so gauging the increased rate of generated sound enables accurate prediction of a catastrophic soil collapse.

The technique has been developed by researchers at Loughborough University, in collaboration with the British Geological Survey, through two projects funded by the Engineering and Physical Sciences Research Council (EPSRC).

The detection system consists of a network of sensors buried across the hillside or embankment that presents a risk of collapse. The sensors, acting as microphones in the subsoil, record the acoustic activity of the soil across the slope and each transmits a signal to a central computer for analysis.

Noise rates, created by inter-particle friction, are proportional to rates of soil movement and so increased acoustic emissions mean a slope is closer to failure. Once a certain noise rate is recorded, the system can send a warning, via a text message, to the authorities responsible for safety in the area. An early warning allows them to evacuate an area, close transport routes that cross the slope or carry out works to stabilize the soil.
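The threshold logic described above can be sketched in a few lines. The trigger level, counting window and function name here are hypothetical, since the article does not give the system's calibration:

```python
# Minimal sketch of the warning logic: acoustic-emission rate serves as a
# proxy for soil displacement rate, and an alert fires once the rate
# crosses a preset level. All numbers are illustrative.

ALERT_RATE = 500.0   # counts per hour; hypothetical trigger level

def check_slope(emission_counts, window_hours):
    """Return (rate, alert) for one sensor's counts over a time window."""
    rate = emission_counts / window_hours
    return rate, rate >= ALERT_RATE

# Quiet slope vs. accelerating movement
print(check_slope(120, 1.0))    # (120.0, False)
print(check_slope(800, 1.0))    # (800.0, True) -> send warning text
```

In the deployed system each buried sensor would stream its counts to the central computer, which runs a check like this per sensor before issuing the text-message warning.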

Neil Dixon, professor of geotechnical engineering at Loughborough University and principal investigator on the project, explains how the system – thought to be a global first – works. “In just the same way as bending a stick creates cracking noises that build up until it snaps, so the movement of soil before a landslide creates increasing rates of noise,” said Professor Dixon.

“This has been known since the 1960s, but what we have been able to do that is new is capture and process this information so as to quantify the link between noise and soil displacement rates as it happens, in real time – and hence provide an early warning,” he added.

The system is now being developed further to produce low cost, self-contained sensors that do not require a central computer. This work, which is being carried out under the second project funded by EPSRC, is focused on manufacture of very low cost sensors with integrated visual and/or audible alarms, for use in developing countries. Ongoing work includes field trials, market research and planning commercial exploitation of the technology.

“The development of low cost independent acoustic slope sensors has only become possible in very recent times due to the availability of microprocessors that are fast, small and cheap enough for this task,” says Dixon.

As well as the life-saving implications for countries prone to disastrous landslides, the technique can also be used in monitoring the condition of potentially unstable slopes built to support transport infrastructure, such as rail and road embankments, in developed countries such as the UK.

Current development work is being funded through Loughborough University’s knowledge transfer account, a fund supplied by EPSRC to help commercial exploitation of inventions arising from its research projects. A commercially available Alarms sensor is expected to be launched in the next two years.

Calming earthquake fears in the Midwest

When people in the Midwest say they fear a big earthquake is going to hit their hometown soon, Northwestern University geologist Seth Stein, the author of the new book “Disaster Deferred: How New Science Is Changing Our View of Earthquake Hazards in the Midwest,” tries to reassure them.

There’s little scientific evidence for this fear, according to Stein, the William Deering Professor of Earth and Planetary Sciences in the Weinberg College of Arts and Sciences at Northwestern.

Apocalyptic predictions of an earthquake in the New Madrid seismic zone persist. In 1990, a widely touted prediction said a big quake would hit the area, and a media circus ensued. The prediction proved false but highlighted the fear and hype surrounding the idea of a big Midwestern earthquake.

As the 200th anniversary of the big earthquakes that occurred in the area of New Madrid, Mo., approaches, talk of catastrophe is rising again.

“It’s said that the 1811 and 1812 earthquakes were the biggest in U.S. history, which isn’t true,” Stein said. “Or that they rang church bells in Boston, which isn’t true. And that another huge earthquake is on the way, which there’s no reason to believe.”

In the 1990s, Stein and other researchers conducted routine measurements of earthquake-related activity in New Madrid, with a high-tech version of the GPS technology used in cars and cell phones. Because big earthquakes had happened there about 500 years apart in the past, they expected to see the ground deforming as it stored up energy for another big earthquake. Instead, they found nothing.

The researchers were amazed. “We put markers in the ground and later measured their positions to an accuracy of a millimeter and found that the ground wasn’t moving — so there’s no sign that a big earthquake is on the way,” he said. “Now we’ve got this whole new way of thinking about earthquakes in the middle of our continent.”
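The force of this null result can be illustrated with a back-of-envelope slip budget. The numbers below are illustrative assumptions, not figures from Stein's study:

```python
# If large New Madrid quakes really repeated every ~500 years with metre-
# scale slip, the ground between events would have to accumulate motion at
# a rate far above GPS detection limits. Slip per event and the detection
# limit here are assumed, round numbers.
slip_per_event_mm = 5000.0   # assumed coseismic slip (~5 m), in mm
recurrence_years = 500.0     # past inter-event time cited in the article

required_rate = slip_per_event_mm / recurrence_years   # mm/yr of loading
gps_detection_limit = 1.0                              # mm/yr, approximate

print(f"required loading rate: {required_rate:.0f} mm/yr")
assert required_rate > gps_detection_limit   # motion should have been visible
```

Seeing essentially no motion at millimeter precision is therefore hard to reconcile with a short recurrence interval, which is the crux of the book's argument.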

The findings detailed in “Disaster Deferred” (Columbia University Press, October 2010) come from more than 20 years of research about the New Madrid seismic zone. The book describes Stein’s scientific adventures that found no sign that big earthquakes will hit the New Madrid area in the next several hundred or even thousands of years.

Stein spoke with Erin White, broadcast editor at Northwestern, about the book.

Why was it important to write this book?

Widely circulated reports say a huge, disastrous earthquake is coming to the Midwest. You hear terrifying predictions about thousands of dead people, hundreds of billions of dollars of damage and other terrible stuff. These predictions are very vague about when the earthquake is coming but claim it’s soon enough that we have to start expensive preparations now to make buildings as strong as in California, where large earthquakes are much more common.

We, of course, can’t say there will never be another New Madrid earthquake like the ones in 1811 and 1812, but there’s no sign of one coming. The next could be thousands of years or tens of thousands of years in the future.

Talk about your surprise at the findings.

The most exciting surprise for a scientist is when a result comes out opposite what you expect. It shows that the way you’d been thinking has to be changed. It’s like opening a door. In this case, it showed that the faults at New Madrid were acting very differently than we expected — they switch on and off.

Now you have this whole new picture of earthquakes that you didn’t have before.

We now understand a lot more about earthquakes in the middle of continents. Continents have lots of faults spread over a huge area. For short times, some will be active and produce large earthquakes. Geologically, that’s a few thousand years. Then, they’ll be essentially dead — producing at most small earthquakes — for many thousands of years. Eventually, they or another fault will switch on. What we’ve learned is important for understanding the earthquake hazard in the Midwest but also for what it tells us about how earthquakes in continents work.

How do you explain the small earthquakes taking place in the Midwest?

Earthquake physics shows that many of those small earthquakes are aftershocks of the big earthquakes 200 years ago. They don’t show that a big one is coming.

What would you tell someone in the Midwest who’s worried about earthquakes?

Enjoy the New Madrid bicentennial but don’t worry too much about earthquakes. They’re an interesting science question but not a serious danger. Make plans for your community carefully, using your experience that in the Midwest earthquakes aren’t a big problem. Decide between spending billions of dollars making buildings as strong as in California or using a less expensive standard and using the money for other needs. Consider whether more good would come from hiring teachers or putting lots of steel in schools that are very unlikely to be seriously shaken.

Why does “Disaster Deferred” focus on how ideas are changing?

When we try to interest young people in science careers, we shouldn’t present science as cut and dried, just facts. The book talks about the process of doing science and how we try to find out how the world works.

Why are some people disappointed when you say that a big earthquake isn’t on the way?

People like to be a little scared. We like Halloween, we like riding roller coasters, we like horror movies. We like the idea of danger, as long as it’s not too big.

That’s why we respond to the disaster stories that come along every few years. Remember, we had Y2K, and the world was going to end. And then we had swine flu, and the world was going to end. Then we were all supposed to go out and get duct tape and tape up our houses against biological terrorism.

These stories are “disaster chic,” as one news guy said. These disasters generally don’t happen, but they make a good story.


Measuring changes in rock

The capture and storage of carbon dioxide in deep geologic formations, a strategy for minimizing the impacts of greenhouse gases on global warming, may currently be technologically feasible. But one key question that must be answered is the ability of subsurface materials to maintain their integrity in the presence of supercritical carbon dioxide – a fluid state, reached above the gas’s critical temperature and pressure, with properties of both a liquid and a gas.

A research team at the Pacific Northwest National Laboratory has developed tools in EMSL, a national user facility at PNNL, to study the effects of supercritical CO2 on minerals commonly found in potential storage sites. They are presenting their results today at the AVS 57th International Symposium & Exhibition, which takes place this week at the Albuquerque Convention Center in New Mexico.

“The mechanisms of surface interactions with carbon dioxide under these conditions are unknown,” says Scott Lea of PNNL. “We need to know if the carbon dioxide can dry out the clay minerals, creating cracks or have other interactions that could create pores in the rock.”

Because carbon dioxide will be stored at pressures many times greater than atmospheric pressure, the integrity of the rock must be assured.

The same temperature and pressure conditions create a challenge for researchers trying to observe changes in rock samples as they occur. The PNNL group will present a high-pressure atomic force microscope (AFM) head that can integrate with existing commercial systems. The new AFM is designed to handle pressures up to 1500 psi. The presentation will show that the AFM head is capable of operating at temperatures and pressures required to maintain carbon dioxide in a supercritical state and that the noise levels are low enough to observe the atomic scale topographic changes due to chemical reactions that may occur between mineral substances and supercritical CO2.
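A quick check against standard reference data for CO2 (critical point near 31.1 °C and 73.8 bar) confirms that a 1500 psi capability comfortably exceeds the critical pressure:

```python
# Verify that the AFM's 1500 psi rating exceeds CO2's critical pressure.
# Critical-point values are standard reference data for carbon dioxide.
PSI_PER_BAR = 14.5038

co2_critical_temp_c = 31.1                       # deg C
co2_critical_pressure_psi = 73.8 * PSI_PER_BAR   # ~1070 psi

afm_max_pressure_psi = 1500.0
print(f"CO2 critical pressure: {co2_critical_pressure_psi:.0f} psi")
assert afm_max_pressure_psi > co2_critical_pressure_psi
```

So the instrument has roughly 400 psi of headroom above the critical pressure, enough to hold the sample chamber safely within the supercritical regime.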

Measuring sea-level rise in the Falklands

San Carlos Water, one of many inlets on East Falkland. The islands are heavily indented by sounds and fjords

Sea levels around the Falkland Islands in the South Atlantic have risen since the mid-nineteenth century, and the rate of sea-level rise has accelerated over recent decades, according to newly published research. The findings are as expected under global warming and consistent with observations elsewhere around the globe.

“We have been fortunate in being able to compare modern sea-level measurements obtained from tide gauges and from satellite radar altimeters with historical measurements made at Port Louis in the Falkland Islands in 1842,” explained researcher Prof. Philip Woodworth of the National Oceanography Centre.

In 1839, distinguished naval officer and polar explorer James Clark Ross (1800–1862) set off on an expedition to the Southern Ocean with two ships, HMS Erebus and HMS Terror. In April 1842, he stopped at Port Louis, primarily to make magnetic field and other measurements, but also to make repairs to his ships which had been badly damaged in the Drake Passage. Having set up a winter base, he took the opportunity to make careful measurements of sea level relative to two benchmarks cut into the cliffs and marked with brass plaques.

These marks remain in good condition to this day. This fact, along with the apparent good quality of Ross’s data, has allowed Woodworth’s team to compare the sea-level records from 1842 with measurements taken at Port Louis using modern instruments in 1981, 1984 and 2009. They also used information from nearby Port Stanley, where a permanent tide gauge was operated in the 1960s and 1970s and where NOC has had an operational gauge since 1992.

After correction for air pressure effects and vertical land movement due to geological processes, the researchers find that sea levels rose by an average of around 0.75 millimeters a year between 1842 and the early 1980s. They point out that this figure is similar to previous estimates for the long-term rate of sea-level rise at Port Arthur in Tasmania, measurements with which Ross was also associated, and at other locations in the Northern and Southern Hemispheres.

However, they also find evidence that the rate of sea-level rise has accelerated over recent decades. Specifically, they estimate that sea levels around the Falkland Islands have risen by an average of around 2.5 millimeters a year since 1992, a figure consistent with measurements made by satellite radar altimeters over the same period.
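The two quoted rates imply rough cumulative totals; treating “the early 1980s” as 1982 and using 2009 as the last measurement year are assumptions made only for the arithmetic:

```python
# Rough totals implied by the two sea-level-rise rates quoted above.
# Period end-points (1982, 2009) are approximations for illustration.
early_rate = 0.75          # mm/yr, 1842 to early 1980s
recent_rate = 2.5          # mm/yr, since 1992

rise_1842_1982 = early_rate * (1982 - 1842)      # ~105 mm over 140 years
rise_1992_2009 = recent_rate * (2009 - 1992)     # ~42.5 mm over 17 years

print(f"1842-1982: ~{rise_1842_1982:.0f} mm; "
      f"1992-2009: ~{rise_1992_2009:.1f} mm")
```

In other words, the recent period has produced nearly half as much rise in about an eighth of the time, which is what the authors mean by acceleration.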

Longer-term data from the Falklands, and from many other locations, are needed to establish whether the apparent acceleration in sea-level rise is due to increased global warming, or the result of some kind of decadal fluctuation.

“The benchmarks left by James Clark Ross on the cliffs of Port Louis will facilitate future studies of sea-level change – just as Ross intended,” said Woodworth.