Public review begins of world’s first standard for geologic storage of CO2

The draft of the world’s first standard for geologic storage of carbon dioxide is now available for public review.

“We’re very proud to provide the link for academics, individuals, researchers and scientists to the world’s first standard for geologic storage of carbon dioxide on both our website and on our Twitter feed, @IPAC_CO2,” said Carmen Dybwad, Chief Executive Officer of IPAC-CO2.

Feedback can be provided online, on a clause-by-clause basis, through the CSA Standards public review system.

“It’s a very thorough, professional and measurable way to obtain feedback,” Dybwad said.

CSA Standards, a leading developer of standards, codes and personnel certification programs since 1919, and the International Performance Assessment Centre for Geologic Storage of Carbon Dioxide (IPAC-CO2) began work on the new standard on June 16, 2010.

On November 24, a Technical Committee (TC) comprising almost three dozen experts from Canada and the United States began reviewing the seed document IPAC-CO2 had prepared to form the basis of the standard.

Rick Chalaturnyk, a geotechnical engineering professor and holder of the Foundation CMB Endowed Chair in Reservoir Geomechanics at the University of Alberta in Edmonton, is the chair of the TC.

Sara Forbes, who leads the CCS work at the World Resources Institute (WRI) in Washington, D.C., is the vice-chair of the TC.

“The public review period ends on Dec. 27, so we encourage people to log into the system using the ‘ipac-co2’ affiliation to share their concerns, insights and opportunities for improvement,” Dybwad said. “All of the information gathered during the public review period will be considered before a final draft is written.”

Upon completion, the new standard will provide essential guidelines for regulators, industry and others around the world involved with scientific and commercial CCS projects.

The new standard will be submitted to the Standards Council of Canada and to ANSI in the United States for bilateral recognition, making it the world’s first formally recognized CCS standard in this area.

The new standard will also provide the basis for development of an international standard by the International Organization for Standardization (ISO).

Watching the birth of an iceberg

In October 2011, NASA’s Operation IceBridge discovered a major rift in the Pine Island Glacier in western Antarctica. This crack, which extends at least 18 miles and is 50 meters deep, could produce an iceberg more than 800 square kilometers in size. IceBridge scientists returned soon after to make the first-ever detailed airborne measurements of a major iceberg calving in progress. – Credit: NASA/Goddard

After discovering an emerging crack that cuts across the floating ice shelf of Pine Island Glacier in Antarctica, NASA’s Operation IceBridge has flown a follow-up mission and made the first-ever detailed airborne measurements of a major iceberg calving in progress.

NASA’s Operation IceBridge, the largest airborne survey of Earth’s polar ice ever flown, is in the midst of its third field campaign from Punta Arenas, Chile. The six-year mission will yield an unprecedented three-dimensional view of Arctic and Antarctic ice sheets, ice shelves and sea ice.

Pine Island Glacier last calved a significant iceberg in 2001, and some scientists have speculated recently that it was primed to calve again. But until an Oct. 14 IceBridge flight of NASA’s DC-8, no one had seen any evidence of the ice shelf beginning to break apart. Since then, a more detailed look back at satellite imagery seems to show the first signs of the crack in early October.

While Pine Island has scientists’ attention because it is both big and unstable – scientists call it the largest source of uncertainty in global sea level rise projections – the calving underway now is part of a natural process for a glacier that terminates in open water. Gravity pulls the ice in the glacier westward along Antarctica’s Hudson Mountains toward the Amundsen Sea. A floating tongue of ice reaches out 30 miles into the Amundsen beyond the grounding line, the below-sea-level point where the ice shelf locks onto the continental bedrock. As ice pushes toward the sea from the interior, inevitably the ice shelf will crack and send a large iceberg free.

“We are actually now witnessing how it happens and it’s very exciting for us,” said IceBridge project scientist Michael Studinger, Goddard Space Flight Center, Greenbelt, Md. “It’s part of a natural process but it’s pretty exciting to be here and actually observe it while it happens. To my knowledge, no one has flown a lidar instrument over an actively developing rift such as this.”

A primary goal of Operation IceBridge is to put the same instruments over the exact same flight lines and satellite tracks, year after year, to gather meaningful and accurate data of how ice sheets and glaciers are changing over time. But discovering a developing rift in one of the most significant science targets in the world of glaciology offered a brief change in agenda for the Oct. 26 flight, if only for a 30-minute diversion from the day’s prescribed flight lines.

The IceBridge team observed the rift running across the ice shelf for about 18 miles. The lidar instrument on the DC-8, the Airborne Topographic Mapper, measured the rift’s shoulders at about 820 feet (250 meters) apart at the widest point, although the rift stretched about 260 feet wide along most of the crack. The deepest points measured from the ice shelf surface ranged from 165 to 195 feet (50 to 60 meters). When the iceberg breaks free, it will cover about 340 square miles (880 square kilometers) of surface area. Radar measurements suggested the ice shelf in the region of the rift is about 1,640 feet (500 meters) thick, with only about 160 feet of that floating above water and the rest submerged. It is likely that once the iceberg floats away, the leading edge of the ice shelf will have receded farther than at any time since its location was first recorded in the 1940s.

Veteran DC-8 pilot Bill Brockett first flew the day’s designed mission, crisscrossing the flow of the glacier near the grounding line to gather data on its elevation, topography and thickness. When it came time to investigate the crack, Brockett flew across it before turning to fly along the rift by sight. The ATM makes its precision topography maps with a laser that scans through 360 degrees 20 times per second while firing 3,000 laser pulses per second. When flying at an altitude of 3,000 feet, as during this flight, it measures a swath of the surface about 1,500 feet wide. Because the crack measured more than 800 feet wide in places, it was important for Brockett to hold a tight line over the crevasse.
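The quoted scan figures pin down the sampling geometry. As a rough sketch, the arithmetic looks like this (the 14-degree scan half-angle below is an assumption chosen to reproduce the ~1,500-foot swath, not a published instrument specification):

```python
import math

# Illustrative arithmetic from the figures quoted in the article.
scan_rate_hz = 20        # full 360-degree scans per second
pulse_rate_hz = 3000     # laser pulses per second
altitude_ft = 3000       # flight altitude above the surface

# Samples laid down per scanner revolution.
pulses_per_scan = pulse_rate_hz / scan_rate_hz   # 150

# A conical scan with half-angle theta sweeps a swath of 2 * h * tan(theta).
half_angle_deg = 14.0    # assumed, to match the stated ~1,500 ft swath
swath_ft = 2 * altitude_ft * math.tan(math.radians(half_angle_deg))

print(f"pulses per scan: {pulses_per_scan:.0f}")
print(f"swath width: {swath_ft:.0f} ft")
```

At 150 pulses per revolution, each scan places a surface sample roughly every 2.4 degrees of azimuth, which is why a narrow, steady flight line over the rift mattered.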

“The pilots did a really nice job of keeping the aircraft and our ATM scan swath pretty much centered over the rift as we flew from one end to the other,” said Jim Yungel, who leads the ATM team out of NASA’s Wallops Flight Facility in Virginia. “It was a real challenge to be told: ‘We’re going to attempt to fly along it, and let’s see if your lidar systems can map that crack and can map the bottom of the crack.’

“And it was a lot of fun on a personal level to see if something that you built over the years can actually do a job like that. So, yeah, I enjoyed it. I really enjoyed seeing the results being produced.”

While the ATM provided the most detailed measurements of the topography of the rift, other instruments onboard the DC-8 also captured unique aspects of it. The Digital Mapping System, a nadir-view camera, gathered high-definition close-ups of the craggy split. On the flight perpendicular to the crack, the MCoRDS radar also measured its depth and the thickness of the ice shelf in that region.

Catching the rift in action required a bit of luck, but is also testimony to the science benefit of consistent, repeated trips and the flexibility of a manned mission in the field.

“A lot of times when you’re in science, you don’t get a chance to catch the big stories as they happen because you’re not there at the right place at the right time,” said John Sonntag, Instrument Team Lead for Operation IceBridge, based at Goddard Space Flight Center. “But this time we were.”


Lessons from the Christchurch, New Zealand earthquake

Details of an earthquake that rocked the largest city in the South Island of New Zealand in February 2011 may transform the way scientists assess the potential threat of fault lines that run through urban centers.

According to a series of new papers published today in Seismological Research Letters (SRL), scientists were surprised at the impact of the earthquake, which registered a relatively moderate magnitude 6.2. The in-depth review of the earthquake that killed more than 180 people and left thousands of homes uninhabitable in Christchurch represents an approach that the authors say should be applied to all earthquakes retrospectively.

“The March 2011 Japan earthquake and tsunami overshadowed the Christchurch earthquake, which was absolutely devastating in its own right,” said Jonathan M. Lees, editor-in-chief of SRL and professor of geosciences at the University of North Carolina at Chapel Hill.

“Compared to the earthquake that destroyed much of Haiti, the scale of disaster in Christchurch may seem small,” said Lees. “Christchurch, however, was constructed using much better technology and engineering practices, raising a very sobering alarm to other major, high density western urban centers.”

The Christchurch earthquake ruptured a previously unmapped fault, surprising many with strong ground motion far greater than previously observed or expected from a magnitude 6.2 seismic event.

The SRL special issue features 19 original technical papers that cover different aspects of the 2011 Christchurch earthquake, including seismological, geodetic, geological and engineering perspectives.

Erol Kalkan, a research structural engineer and manager of the National Strong Motion Network with the U.S. Geological Survey and guest editor of the issue, says the issue serves as “a stand-alone reference” on the Christchurch earthquake and is an example of what should be done for every major earthquake. The first eight papers of this issue focus on earthquake source modeling, fault stress variation and aftershock sequence.

“This earthquake was remarkable on several counts,” said Kalkan. “The ground motion was much larger than previously recorded, the high intensity of shaking was greater than expected, particularly for a moderate size earthquake, and the liquefaction-induced damage was extensive and severe within the Central Business District (CBD) of Christchurch.”

The earthquake was reported to be felt across the South Island and the lower and central North Island. The Christchurch earthquake was especially meaningful, say the authors, because it followed a larger quake that produced less damage and no deaths.

The Feb. 22 earthquake was the strongest seismic event in a series of aftershocks following the magnitude 7.1 Darfield, New Zealand quake on Sept. 4, 2010. Both the Darfield and Christchurch earthquakes ruptured previously unmapped faults, but the corresponding damage was quite different, offering seismologists and engineers a unique opportunity to understand why the Christchurch earthquake proved so devastating.

In this issue, eight papers focus on the observed structural and geotechnical damages associated with the strong ground motion shaking, comparing differing levels of soil liquefaction and the corresponding structural performance of buildings, lifeline structures and engineering systems. The authors collectively provide a detailed catalogue of damage to levees, bridges and multi-story buildings, including stark contrasts in damage due to differing levels of liquefaction.

Much of Christchurch was formerly swampland, beach dune sand, estuaries and lagoons that were drained as the area was settled. Consequently, large areas beneath the city and its environs are characterized by loose silt, sand and gravel. Widespread liquefaction-induced damage within the CBD required 1000 buildings to be demolished, as detailed in a paper by Cubrinovski, et al.

Three papers concentrate on recorded strong ground motions and their engineering implications. “Many urban areas are built over soft sediments and in valleys or over basins, for example the San Francisco Bay Area and the Los Angeles metropolitan area,” Kalkan said. “These are urban areas that sit atop geological features that may exaggerate or amplify ground motion, just as Christchurch experienced. The question is how to apply or account for such significant, higher-than-expected ground motions, as seen in Christchurch, when evaluating the design of existing and new structures.”

The Christchurch earthquake will have a long-lasting, significant impact on engineering practice, says Kalkan, leading to profound changes in New Zealand’s building code and in the understanding of amplified ground motion.

EARTH: Return of the dust bowl: Geoscientists predict a dry, dusty future for the American West

Haboobs, giant dust storms, walloped Arizona last summer – some close to 2 kilometers high and 160 kilometers wide – knocking out electricity, creating traffic jams and grounding airplanes. Even old-timers say they can’t remember anything quite like this year’s aerial assaults. Meanwhile, Texas is experiencing one of the worst droughts in recent history, with almost 90 percent of the state at the most extreme level of drought. Arizona, California, Colorado, New Mexico, Oklahoma, Utah and other states are also experiencing drought conditions. The worry is that this might just be the start of a trend, as EARTH reports in the November issue: Over the next couple of decades, researchers say, the American West will transition to an environment that may make the 1930s Dust Bowl seem mild and brief.

The problem, researchers told EARTH in “Return of the Dust Bowl,” is that rising temperatures will contribute directly and indirectly to putting more dust in the air. Persistent droughts, increasingly violent and variable weather patterns, urban and suburban development, and even off-road recreational vehicle use then compound the problem. So, is the West doomed? Or is there any reason to believe that this forecast may not come true?

Read more about the havoc that dust has been wreaking in the West in the November issue of EARTH now available digitally. Also featured are other stories, such as how one geoscientist took a road trip to find out if the U.S. is ready for vehicles running on natural gas; how subducting seamounts are not to blame for producing megathrust earthquakes like the Japan quake last March; and, you won’t want to miss the news story about La Niña formation, as NOAA just announced that the pattern has formed again in the Pacific and will affect us this winter.

Geologists find ponds not the cause of arsenic poisoning in India’s groundwater

The source of arsenic in India’s groundwater continues to elude scientists more than a decade after the toxin was discovered in the water supply of the Bengal delta in India. But a recent study by a Kansas State University geologist and graduate student, together with Tulane University researchers, has added a twist and deepened the mystery.

Arsenic is a naturally occurring trace element, and it causes skin lesions, respiratory failure and cancer when present in high concentrations in drinking water. The environmental crisis began after high concentrations of the element were detected in the groundwater of the Bengal Basin, an area inhabited by more than 60 million residents. This has caused a water shortage, illness and death in the region, leaving residents unable to use the water even for ordinary tasks like washing dishes or ablution.

“It’s an awful situation,” said Saugata Datta, a Kansas State University assistant professor of geology. “This is one of the worst mass poisoning cases in the history of mankind.”

Though no definitive arsenic source has been determined, many geologists have claimed that recent man-made ponds in the region are a major contributor: heavy rainfall and erosion have concentrated large amounts of arsenic-bearing organic material in the ponds, and from there the pond water and organic material seep into the groundwater.

Datta and colleagues recently completed a study looking at the ponds. Their findings, “Perennial ponds are not an important source of water or dissolved organic matter to groundwaters with high arsenic concentration in West Bengal, India,” were published in Geophysical Research Letters in late October, and the work also appeared in the journal Nature.

“Our study suggests that ponds are not contributing a substantial amount of water or this old organic matter to the groundwaters in the shallow aquifer in this region,” Datta said. “These very high arsenic levels are actually coming from something else, possibly from within the organic matter contained in these Holocene sedimentary basins.”

Datta, along with Tulane University colleague Karen Johannesson, the study’s other lead investigator, came to this conclusion after modeling the transport of the ponds’ organic matter through the meters of sand and clay to the aquifers below. Because the organic matter binds readily to mineral surfaces, its movement is retarded: it sorbs onto the aquifer sediments and migrates far more slowly than the groundwater itself.

“Characteristically the organic matter is very sticky and likes to glom onto mineral surfaces,” Datta said. “So it takes much longer for the organic matter to move the same distance along a groundwater flow path than it does through just the water itself.”
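The “sticky” behavior Datta describes is what solute-transport models capture with a retardation factor R: a sorbing solute travels at the groundwater velocity divided by R. A toy calculation, with every value hypothetical rather than taken from the paper, shows how sorption stretches a decades-long transit into millennia:

```python
# Toy illustration of retarded transport. A solute that sorbs to mineral
# surfaces moves at the groundwater velocity divided by a retardation
# factor R > 1. All numbers are hypothetical, chosen only to show the
# scale of the effect.

groundwater_velocity_m_per_yr = 1.0   # hypothetical seepage velocity
retardation_factor = 100.0            # hypothetical R for "sticky" organic matter
depth_m = 30.0                        # depth of the arsenic peak cited in the article

water_travel_time = depth_m / groundwater_velocity_m_per_yr
solute_travel_time = water_travel_time * retardation_factor

print(f"water reaches {depth_m:.0f} m in ~{water_travel_time:.0f} years")
print(f"sorbing organic matter needs ~{solute_travel_time:.0f} years")
```

With these assumed values, water reaching 30 m in a few decades is consistent with the ~50-year-old groundwater the team measured, while the sorbing organic matter would need millennia, which is the crux of the argument against the ponds.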

According to their model, it would take the organic matter thousands of years to reach roughly 30 meters down into the aquifers of the Bengal delta, the depth at which the arsenic peak is observed.

“These high arsenic waters at the 30 meter depth are approximately 50 years old,” Datta said. “Since the ponds that supply the organic matter have been around for thousands of years, the current ponds would not be the source of this organic matter.”

The team created their model based on stable isotope data at Kansas State University’s Stable Isotope Spectrometry Laboratory. The lab is operated by Troy Ocheltree, a biology research assistant who co-authored the study.

In the near future, Datta, geology graduate student Sankar Manalikada Sasidharan and geology undergraduate student Sophia Ford will travel to the region to collect groundwater and aquifer sediment samples for an extensive study covering various valleys and ponds. In addition to arsenic, the team will also monitor for high concentrations of manganese, as scientists are finding that the two metals often appear together.

“The work that we’ve started to look into this source-release mechanism in the Bengal delta is still far from finished,” Datta said. “The mystery still remains. We just added a little bit more to it.”

Mapping the formation of an underwater volcano

The first image was taken in 1998 within the Spanish Exclusive Economic Zone in the area of El Hierro Island; the second has been taken now and shows the new volcano and its lava tongue descending along the path of the old underwater valley. – IEO/MICINN

On Oct. 9 an underwater volcano started to emerge in waters off El Hierro Island in the Canaries, Spain. Researchers at the Spanish Institute of Oceanography (IEO, Ministry of Science and Innovation) needed only 15 days to map its formation in high resolution. The volcanic cone has reached a height of 100 m, and a lava tongue flows down its side, even though activity has slowed in the past few days.

“This is probably the first time that such a young underwater volcano has been mapped in such high resolution,” explains Juan Acosta, head of the IEO campaign set to study the volcanic cone that emerged this month near El Hierro island in the Canaries.

On Oct. 9, scientists at Spain’s National Geographic Institute (Spanish Ministry of Development) detected the initial seismic movements that signalled the birth of the underwater volcano. By Oct. 24, scientists on board the IEO’s ship Ramón Margalef had already completed the bathymetry (mapping of the sea bed) with unprecedented precision.

The ship carries a cutting-edge sensor system that can resolve sea-bed features smaller than 10 meters. The bathymetry was obtained in two days by tracing parallel scans.

In 1998, within the framework of the Spanish Exclusive Economic Zone programme, researchers from the IEO and Spain’s Marine Hydrographic Institute (Spanish Ministry of Defence) mapped the same area aboard the oceanographic ship Hespérides. Using a geographic information system, those images have now been superimposed onto the new ones, confirming the birth of the volcano.

Acosta says that “it is spectacular to see how what was once an underwater valley is now a volcanic cone with its descending lava tongue.”

The base of the volcano lies at a depth of 300 m. The cone is 100 m high, with a base diameter of 700 m and a crater width of 120 m. Its volume is around 0.012 km3; the lava tongue slowly filling the adjacent valley amounts to a further 0.07 km3.
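The reported cone dimensions can be cross-checked against the volume formula for an ideal cone, which lands close to the quoted 0.012 km3:

```python
import math

# Cross-check of the reported cone volume: an ideal cone of base diameter
# 700 m and height 100 m has V = (1/3) * pi * r^2 * h.
base_radius_m = 700 / 2
height_m = 100

volume_m3 = math.pi * base_radius_m ** 2 * height_m / 3
volume_km3 = volume_m3 / 1e9

print(f"ideal cone volume: {volume_km3:.4f} km^3")  # ~0.0128 km^3
```

The idealized figure of roughly 0.0128 km3 agrees well with the reported ~0.012 km3, suggesting the cone is indeed close to conical.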

Scientists have also created graphs of the gas plumes rising steadily from the main crater and the surrounding cracks. However, no official assessment of the volcano’s possible development and risks has yet been issued. The team’s mission is to provide data to those in charge of the Special Civil Protection Plan for Emergency Volcanic Risk in the Canary Islands (PEVOLCA) as a way of aiding them in the decision-making process.

Named Bimbache after the first settlers of El Hierro Island, the scientific campaign is currently entering its second phase under the direction of IEO researcher Francisco Sánchez.

Until Oct. 31, photographs and videos of the volcanic cone will be taken with an array of high-resolution cameras towed by the remote observation submarine Liropus. A third stage is then expected to get underway, involving analysis of the currents and the physicochemical properties of the water column surrounding the new volcano.

Prehistoric greenhouse data from ocean floor could predict earth’s future, study finds

Kenneth MacLeod, MU professor of geological sciences, says changes in ocean circulation patterns 70 million years ago could help scientists understand the consequences of modern increases in greenhouse gases. – MU News Bureau

New research from the University of Missouri indicates that Atlantic Ocean temperatures during the greenhouse climate of the Late Cretaceous Epoch were influenced by circulation in the deep ocean. These changes in circulation patterns 70 million years ago could help scientists understand the consequences of modern increases in greenhouse gases.

“We are examining ocean conditions from several past greenhouse climate intervals so that we can understand better the interactions among the atmosphere, the oceans, the biosphere, and climate,” said Kenneth MacLeod, professor of geological sciences in the College of Arts and Science. “The Late Cretaceous Epoch is a textbook example of a greenhouse climate on earth, and we have evidence that a northern water mass expanded southwards while the climate was cooling. At the same time, a warm, salty water mass that had been present throughout the greenhouse interval disappeared from the tropical Atlantic.”

The study found that at the end of the Late Cretaceous greenhouse interval, water sinking around Greenland was replaced by surface water flowing north from the South Atlantic. This change caused the North Atlantic to warm while the rest of the globe cooled. The change started about five million years before the asteroid impact that ended the Cretaceous Period.

To track circulation patterns, the researchers focused on neodymium, an element that is taken up by fish teeth and bones when a fish dies and falls to the ocean floor. MacLeod said the ratio of two isotopes of neodymium acts as a natural tracking system for water masses. In the area where a water mass forms, the water takes on a neodymium ratio like that in rocks on nearby land. As the water moves through the ocean, though, that ratio changes little. Because the fish take up the neodymium from water at the seafloor, the ratio in the fish fossils reflects the values in the area where the water sank into the deep ocean. Looking at changes through time and at many sites allowed the scientists to track water mass movements.
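The ratio MacLeod refers to is conventionally reported in epsilon-Nd notation: the deviation of a sample’s 143Nd/144Nd ratio from the CHUR reference value, expressed in parts per ten thousand. A minimal sketch (the sample ratio below is hypothetical, chosen to resemble a continental-crust-influenced signature):

```python
# Epsilon-Nd: the deviation of a sample's 143Nd/144Nd ratio from the
# CHUR (Chondritic Uniform Reservoir) reference value, in parts per
# ten thousand. The sample ratio passed in below is hypothetical.

CHUR_143ND_144ND = 0.512638

def epsilon_nd(sample_ratio: float) -> float:
    """Epsilon-Nd notation for a measured 143Nd/144Nd ratio."""
    return (sample_ratio / CHUR_143ND_144ND - 1) * 1e4

# A low ratio gives a strongly negative epsilon-Nd, the kind of value
# associated with water that formed near old continental crust.
print(f"{epsilon_nd(0.51200):+.1f}")
```

Different source regions imprint distinct epsilon-Nd values, which is what lets fossil fish teeth record where a deep water mass originally sank.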

While high atmospheric levels of carbon dioxide caused Late Cretaceous warmth, MacLeod notes that ocean circulation influenced how that warmth was distributed around the globe. Further, ocean circulation patterns changed significantly as the climate warmed and cooled.

“Understanding the degree to which climate influences circulation and vice versa is important today because carbon dioxide levels are rapidly approaching levels most recently seen during ancient greenhouse times,” said MacLeod. “In just a few decades, humans are causing changes in the composition of the atmosphere that are as large as the changes that took millions of years to occur during geological climate cycles.”

Global warming target to stay below 2 degrees requires more action this decade

Climate scientists say the world’s target to stay below a global warming of 2 degrees, set at the United Nations conferences in Copenhagen in 2009 and Cancún in 2010, will require decisive action this decade.

A comprehensive review of 193 emission scenarios from the scientific literature to date has been published in Nature Climate Change by University of Melbourne and international scientists.

This study found the target of 44 billion tons of carbon dioxide equivalent emissions (GtCO2eq) by 2020 is a feasible milestone and an economically optimal approach for countries to meet the internationally agreed 2 degree target.

Dr Malte Meinshausen from the University of Melbourne’s School of Earth Sciences, a senior author on the study, said the world is currently at 48 GtCO2eq and, as the research suggests, reversing the growing emissions trend this decade is vital.

The study analysed feasible emissions scenarios, which included a mix of mitigation actions ranging from energy efficiency to carbon free technologies such as solar photovoltaic, wind and biomass.

“Our study revealed there are many emissions scenarios that are economically and technologically feasible pathways to a 2 degree target, but that for countries to get closer to this target they need to honour the higher end of their pledges,” he said.

Using a risk-based climate model developed by Dr Meinshausen, an international team of scientists led by Joeri Rogelj from ETH Zurich, Switzerland, analyzed how global greenhouse gas emissions in 2020 can be managed with a long-term 2 degree target.

By analyzing the emissions scenarios in the climate model, the researchers generated probabilistic projections of atmospheric CO2 concentration and global temperature for the next hundred years, and determined which scenarios offered the best possible chance of meeting the global target of 2 degrees and moving to a zero-carbon economy in the latter half of the century.
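The study relied on a full carbon-cycle and climate model, but the idea of a probabilistic projection can be illustrated with a toy Monte Carlo over an assumed climate-sensitivity distribution (every parameter below is an assumption for the sketch, not a value from the paper):

```python
import math
import random

# Toy Monte Carlo in the spirit of the probabilistic projections described
# above. Equilibrium warming for a CO2 concentration C is approximated as
# dT = S * log2(C / 280), with climate sensitivity S drawn from an assumed
# lognormal distribution centred near 3 degrees C. Purely illustrative.

random.seed(42)

def warming(co2_ppm: float, sensitivity: float) -> float:
    """Equilibrium warming relative to the pre-industrial 280 ppm baseline."""
    return sensitivity * math.log2(co2_ppm / 280.0)

def prob_below_2c(co2_ppm: float, n: int = 100_000) -> float:
    """Fraction of sampled sensitivities keeping warming under 2 degrees C."""
    below = 0
    for _ in range(n):
        s = random.lognormvariate(math.log(3.0), 0.3)  # assumed spread
        if warming(co2_ppm, s) < 2.0:
            below += 1
    return below / n

for ppm in (400, 450, 500):
    print(f"{ppm} ppm: P(warming < 2 C) ~ {prob_below_2c(ppm):.2f}")
```

Even this crude sketch shows the key property of such projections: the answer is a probability that falls as the stabilization concentration rises, not a single deterministic temperature.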

“As long as we keep emitting carbon dioxide, the climate will continue to warm. There is no way around a zero carbon economy sooner or later if we want to stay below 2 degrees,” Dr Meinshausen said.

A previous United Nations Emissions Gap report, published in 2010, which summarised all comparable emissions pledges by industrialized and developing countries, found that 2020 emissions would still rise well beyond 50 GtCO2eq.

By specifying the level of 44 GtCO2eq, today’s study suggests that countries’ current pledges made at Copenhagen and Cancun are insufficient to meet the economically optimal milestone by 2020 to reach the 2 degree target.

In Australia, the Federal Government recently announced an emissions trading scheme to reduce the country’s emissions by 5% to 25% below 2000 levels. Targeting the 500 top polluters is the cornerstone of the policy’s plan to achieve the 5% target.

“Our study confirms that only by moving to the more ambitious end of the pledges, 25% in the case of Australia, the world would be getting closer to being on track to the 44 GtCO2eq, 2 degree milestone,” he said.

“If the international community is serious about avoiding dangerous climate change, countries seem ill-advised to continue increasing emissions, as they have done in the last ten years, which ultimately will lead to disastrous consequences later on,” he said.

“We can anticipate Australia will be one of the countries hardest hit by climate change, given recent years of droughts and floods. This is consistent with projections that we are going to expect more of these kinds of extreme conditions in the coming decades,” he added.

“By our calculations, the world needs to do more this decade, as otherwise the 2 degree target to avert serious effects of climate change, is slipping out of reach,” he said.