When continents formed

The continental crust is the principal record of conditions on the Earth for the last 4.4 billion years. Its formation modified the composition of the mantle and the atmosphere, it supports life, and it remains a sink for carbon dioxide through weathering and erosion. The continental crust therefore has had a key role in the evolution of the Earth, and yet the timing of its generation remains the topic of considerable debate.

It is widely believed that the juvenile continental crust has grown from the depleted upper mantle. One common way to assess when new crust was formed is to determine the radiogenic isotope composition of any crustal sample, and to compare its isotope signature with that of the depleted mantle. In other words, radiogenic isotopes can be used to calculate ‘model ages’ of crust formation, which represent the time since a crustal sample was separated from its mantle source.
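The model-age idea can be sketched with the classic depleted-mantle calculation in the Sm-Nd isotope system (the study itself works with average new crust rather than depleted mantle as the reference, and the numbers below are illustrative textbook values, not data from the paper):

```python
import math

# Decay constant of 147Sm, per year (standard value).
LAMBDA_147SM = 6.54e-12

def nd_model_age(nd_sample, sm_nd_sample, nd_dm=0.513151, sm_nd_dm=0.2137):
    """Years since a crustal sample separated from its mantle source.

    nd_sample       -- measured 143Nd/144Nd of the sample
    sm_nd_sample    -- measured 147Sm/144Nd of the sample
    nd_dm, sm_nd_dm -- assumed present-day depleted-mantle ratios
    """
    return (1.0 / LAMBDA_147SM) * math.log(
        1.0 + (nd_sample - nd_dm) / (sm_nd_sample - sm_nd_dm)
    )

# Illustrative sample values give a model age of roughly 1.7 billion years.
age_ga = nd_model_age(0.51200, 0.11) / 1e9
```

Swapping the depleted-mantle reference ratios for those of average new continental crust, as the paper proposes, shifts every model age younger by the same logic.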

The concept of ‘model age’ has been widely used in crustal evolution studies for the last three decades. However, it is increasingly clear that using the isotope composition of the depleted mantle as the reference for calculating model ages of continental crust generation can lead to incomplete interpretations.

In a paper published today in Science, Dr Bruno Dhuime of Bristol’s School of Earth Sciences and colleagues describe a new methodology for the calculation of model ages, based on the isotope composition of the average new continental crust.

Dr Dhuime said: “Ages calculated this way are significantly younger than model ages calculated from the isotope composition of the depleted mantle. New ages obtained are more consistent with the geological record, which opens new perspectives in crustal evolution studies based on radiogenic isotopes.”

Earth’s hot past could be prologue to future climate

The magnitude of climate change during Earth’s deep past suggests that future temperatures may eventually rise far more than projected if society continues its pace of emitting greenhouse gases, a new analysis concludes. The study, by National Center for Atmospheric Research (NCAR) scientist Jeffrey Kiehl, will appear as a “Perspectives” piece in this week’s issue of the journal Science.

Building on recent research, the study examines the relationship between global temperatures and high levels of carbon dioxide in the atmosphere tens of millions of years ago. It warns that, if carbon dioxide emissions continue at their current rate through the end of this century, atmospheric concentrations of the greenhouse gas will reach levels that existed about 30 million to 100 million years ago, when global temperatures averaged about 29 degrees Fahrenheit (16 degrees Celsius) above pre-industrial levels.

Kiehl said that global temperatures may gradually rise over centuries or millennia in response to the carbon dioxide. The elevated levels of carbon dioxide may remain in the atmosphere for tens of thousands of years, according to recent computer model studies of geochemical processes that the study cites.

The study also indicates that the planet’s climate system, over long periods of time, may be at least twice as sensitive to carbon dioxide as currently projected by computer models, which have generally focused on shorter-term warming trends. This is largely because even sophisticated computer models have not yet been able to incorporate critical processes, such as the loss of ice sheets, that take place over centuries or millennia and amplify the initial warming effects of carbon dioxide.

“If we don’t start seriously working toward a reduction of carbon emissions, we are putting our planet on a trajectory that the human species has never experienced,” says Kiehl, a climate scientist who specializes in studying global climate in Earth’s geologic past. “We will have committed human civilization to living in a different world for multiple generations.”

The Perspectives article pulls together several recent studies that look at various aspects of the climate system, while adding a mathematical approach by Kiehl to estimate average global temperatures in the distant past. Its analysis of the climate system’s response to elevated levels of carbon dioxide is supported by previous studies that Kiehl cites. The work was funded by the National Science Foundation, NCAR’s sponsor.

Learning from Earth’s past

Kiehl focused on a fundamental question: when was the last time Earth’s atmosphere contained as much carbon dioxide as it may by the end of this century?

If society continues on its current pace of increasing the burning of fossil fuels, atmospheric levels of carbon dioxide are expected to reach about 900 to 1,000 parts per million by the end of this century. That compares with current levels of about 390 parts per million, and pre-industrial levels of about 280 parts per million.

Since carbon dioxide is a greenhouse gas that traps heat in Earth’s atmosphere, it is critical for regulating Earth’s climate. Without carbon dioxide, the planet would freeze over. But as atmospheric levels of the gas rise, which has happened at times in the geologic past, global temperatures increase dramatically and additional greenhouse gases, such as water vapor and methane, enter the atmosphere through processes related to evaporation and thawing. This leads to further heating.

Kiehl drew on recently published research that, by analyzing molecular structures in fossilized organic materials, showed that carbon dioxide levels likely reached 900 to 1,000 parts per million about 35 million years ago.

At that time, temperatures worldwide were substantially warmer than at present, especially in polar regions, even though the Sun’s energy output was slightly weaker. The high levels of carbon dioxide in the ancient atmosphere kept the tropics at about 9-18 degrees F (5-10 degrees C) above present-day temperatures. The polar regions were some 27-36 degrees F (15-20 degrees C) above present-day temperatures.

Kiehl applied mathematical formulas to calculate that Earth’s average annual temperature 30 to 40 million years ago was about 88 degrees F (31 degrees C), substantially higher than the pre-industrial average temperature of about 59 degrees F (15 degrees C).

Twice the heat?

The study also found that carbon dioxide may have at least twice the effect on global temperatures that computer models of global climate currently project.

The world’s leading computer models generally project that a doubling of carbon dioxide in the atmosphere would have a heating impact in the range of 0.5 to 1.0 degrees Celsius per watt per square meter. (The unit is a measure of the sensitivity of Earth’s climate to changes in greenhouse gases.) However, the published data show that the comparable impact of carbon dioxide 35 million years ago amounted to about 2 degrees Celsius per watt per square meter.
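The arithmetic behind these sensitivity figures can be sketched with the widely used simplified expression for CO₂ radiative forcing (Myhre et al., 1998), which is an assumption brought in here for illustration rather than a formula from Kiehl’s paper:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate CO2 radiative forcing in W/m^2 relative to a baseline."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling CO2 from the pre-industrial 280 ppm yields ~3.7 W/m^2 of forcing.
delta_f = co2_forcing(560.0)

# Warming = sensitivity (deg C per W/m^2) times forcing.
model_warming = 0.75 * delta_f  # mid-range of the 0.5-1.0 model sensitivity
paleo_warming = 2.0 * delta_f   # the ~2 deg C per W/m^2 paleoclimate estimate
```

With the same forcing, the paleoclimate-derived sensitivity implies more than twice the eventual warming the models project, which is the crux of the article.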

Computer models successfully capture the short-term effects of increasing carbon dioxide in the atmosphere. But the record from Earth’s geologic past also encompasses longer-term effects, which accounts for the discrepancy in findings. The eventual melting of ice sheets, for example, leads to additional heating because exposed dark surfaces of land or water absorb more heat than ice sheets.

“This analysis shows that on longer time scales our planet may be much more sensitive to greenhouse gases than we thought,” Kiehl says.

Climate scientists are currently adding more sophisticated depictions of ice sheets and other factors to computer models. As these improvements come on line, Kiehl believes that the computer models and the paleoclimate record will be in closer agreement, showing that the impacts of carbon dioxide on climate over time will likely be far more substantial than recent research has indicated.

Because carbon dioxide is being pumped into the atmosphere at a rate that has never been experienced, Kiehl could not estimate how long it would take for the planet to fully heat up. However, a rapid warm-up would make it especially difficult for societies and ecosystems to adapt, he says.

If emissions continue on their current trajectory, “the human species and global ecosystems will be placed in a climate state never before experienced in human history,” the paper states.

Earth: Finding new oil and gas frontiers

Where to next in the search for oil and gas? EARTH examines several possible new frontiers – including the Arctic, the Falkland Islands, the Levant, Trinidad and Tobago and Sudan – where oil and gas exploration are starting to take hold. One of those places, Sudan, is in the news for other reasons: South Sudan voted yesterday on whether to secede from North Sudan.

But given that South Sudan holds more than 70 percent of Sudan’s 5 billion to 6 billion barrels of proven reserves, a lot in this election hinges on oil. If South Sudan does secede, how will both sides agree to a new oil profit-sharing agreement? What will it mean for both sides’ economies? EARTH examines what role oil will play in this international affair, as well as looking at how development in other new frontiers will affect the oil and gas marketplace.

Learn more about this eye-opening subject in the February feature “Finding New Oil and Gas Frontiers,” and read other analytical stories on topics such as determining dinosaur origins, tracing nuclear weapons using bomb debris, and reconsidering the economic implications of climate change, also in the February issue.

These stories and many more can be found in the February issue of EARTH, now available digitally (http://www.earthmagazine.org/digital/) or in print on your local newsstands.

Engineers lead national effort to save lives and buildings during earthquakes

Benson Shing, a UC San Diego structural engineering professor, is leading a national effort to keep masonry buildings such as schools, hotels and apartments earthquake safe. -  UC San Diego

Several major earthquakes around the world over the last few years have caused significant damage and loss of life. In many of these quakes, buildings collapsed because of poor construction quality. Here in the United States, engineers are working to ensure that buildings perform far better than those that have collapsed in recent earthquakes.

For example, earthquake engineers from UC San Diego, the University of Texas at Austin and Washington State University are joining efforts to make buildings such as hotels, schools, apartments and hospitals safer. To do this, the researchers will put a three-story reinforced masonry structure with shear wall systems through a series of rigorous simulated earthquakes beginning Jan. 10. The three-story structure represents a basic, repetitive part of common apartment or hotel buildings made of reinforced, bearing-wall masonry, and this will be the first time a structure of this type has been tested at this scale on a shake table.

The series of two-week tests will be performed at the UC San Diego Englekirk Structural Engineering Center, home of the world’s largest outdoor shake table. The engineers will model the simulated shakes on historic earthquakes such as the 1994 Northridge earthquake, which measured magnitude 6.7, and expect to simulate earthquakes up to magnitude 7.0 and perhaps above during the tests.

The project is funded mainly by a $1.5 million grant from the National Institute of Standards and Technology through the American Recovery and Reinvestment Act program. The shake table tests are supported by the National Science Foundation through the Network for Earthquake Engineering Simulation (NEES) Program.
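For a sense of scale, the standard Gutenberg-Richter energy relation (seismic energy grows roughly as 10 to the power 1.5 times magnitude, a textbook rule rather than anything from the project itself) shows why the step from magnitude 6.7 to 7.0 matters:

```python
def energy_ratio(m_larger, m_smaller):
    """Ratio of seismic energy released between two magnitudes.

    Energy scales approximately as 10**(1.5 * M) (Gutenberg-Richter).
    """
    return 10.0 ** (1.5 * (m_larger - m_smaller))

# A magnitude 7.0 quake releases ~2.8 times the energy of Northridge (6.7).
ratio = energy_ratio(7.0, 6.7)
```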

Reinforced masonry construction is common across the country. While many people believe that earthquakes mainly happen in California, there is also significant seismic risk in the Midwest and Eastern United States, said Benson Shing, a UC San Diego structural engineering professor who is leading the project.

“We have low probability, high consequence events in those parts of the country, so the performance of reinforced masonry structures in earthquakes is very important,” Shing said.

Shing said these types of structures demonstrated good performance in the 1994 Northridge Earthquake. The structure being tested has been designed according to the latest building code requirements. In theory, Shing said, it should perform even better than those built before Northridge. However, neither current nor pre-Northridge designs have been tested in an extraordinarily strong earthquake or with a large scale shake table test like this one.

“The building design code has been changing over the years,” Shing said. “We want the structure to have a low probability of collapse under an extreme earthquake event. The shake table tests will provide good data to see if we could achieve such a goal. The data will also enhance our confidence on analytical tools, which can be used for future evaluation studies of buildings with different design details and configurations.

“If we find that there is some room for improvement we will recommend improved design details for these structures. Of course, if we find problems with existing structures we will think about how we could improve the performance of these structures by using some retrofit techniques,” he added.

In the second phase of the project, Shing and his colleagues will test, in early 2012, a two-story, low-rise masonry structure with smaller window openings.

“This type of building is difficult to analyze so it presents a major challenge in design,” Shing said. “You can’t reliably assess the performance of these structures with analytical methods normally used by engineers. Hopefully we can use our data to develop better design methodologies and analytical tools.”

While life safety is high on the list for protecting these structures from severe earthquake damage or collapse, economics is also an important consideration. Shing said it’s critical to minimize the life cycle cost of the buildings by avoiding costly repairs after major earthquakes.

“Civil structures are very different from airplane structures,” Shing said. “We don’t have a standard prototype to work with. Every structure is unique and different; the design of this type of structure is an art. How it performs in an earthquake is sometimes very difficult to predict. Testing the structure at full scale under realistic conditions is a very rare opportunity. So that’s why we are very excited about this type of research.”



Mountain glacier melt to contribute 12 centimeters to world sea-level increases by 2100

Melt off from small mountain glaciers and ice caps will contribute about 12 centimetres to world sea-level increases by 2100, according to UBC research published this week in Nature Geoscience.

The largest contributors to projected global sea-level increases are glaciers in Arctic Canada, Alaska and land-bound glaciers in the Antarctic. Glaciers in the European Alps, New Zealand, the Caucasus, Western Canada and the Western United States–though small absolute contributors to global sea-level increases–are projected to lose more than 50 per cent of their current ice volume.

The study modelled volume loss and melt off from 120,000 mountain glaciers and ice caps, and is one of the first to provide detailed projections by region. Currently, melt from smaller mountain glaciers and ice caps is responsible for a disproportionately large portion of sea level increases, even though they contain less than one per cent of all water on Earth bound in glacier ice.
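Converting a glacier volume loss into centimetres of sea-level rise is a simple ratio of meltwater volume to ocean area. The sketch below uses rounded standard values for ocean area and densities, and the 48,000 km³ input is an illustrative back-solve from the headline 12 cm figure, not a number taken from the paper:

```python
OCEAN_AREA_M2 = 3.62e14  # approximate global ocean surface area, m^2
RHO_ICE = 900.0          # density of glacier ice, kg/m^3
RHO_WATER = 1000.0       # density of water, kg/m^3

def sea_level_equivalent_cm(ice_volume_km3):
    """Centimetres of global sea-level rise from a given ice-volume loss."""
    meltwater_m3 = ice_volume_km3 * 1e9 * RHO_ICE / RHO_WATER
    return meltwater_m3 / OCEAN_AREA_M2 * 100.0

# Roughly 48,000 km^3 of ice corresponds to the study's ~12 cm by 2100.
cm = sea_level_equivalent_cm(48000)
```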

“There is a lot of focus on the large ice sheets but very few global scale studies quantifying how much melt to expect from these smaller glaciers that make up about 40 percent of the entire sea-level rise that we observe right now,” says Valentina Radic, a postdoctoral researcher with the Department of Earth and Ocean Sciences and lead author of the study.

Increases in sea levels caused by the melting of the Greenland and Antarctic ice sheets, and the thermal expansion of water, are excluded from the results.

Radic and colleague Regine Hock at the University of Alaska, Fairbanks, modeled future glacier melt based on temperature and precipitation projections from 10 global climate models used by the Intergovernmental Panel on Climate Change.

“While the overall sea level increase projections in our study are on par with IPCC studies, our results are more detailed and regionally resolved,” says Radic. “This allows us to get a better picture of projected regional ice volume change and potential impacts on local water supplies, and changes in glacier size distribution.”

Global projections of sea level rises from mountain glacier and ice cap melt from the IPCC range between seven and 17 centimetres by 2100. Radic’s projections are only slightly higher, in the range of seven to 18 centimetres.

Radic’s projections don’t include glacier calving–the production of icebergs. Calving of tide-water glaciers may account for 30 per cent to 40 per cent of their total mass loss.

“Incorporating calving into the models of glacier mass changes on regional and global scale is still a challenge and a major task for future work,” says Radic.

However, the new projections include detailed estimates of melt off from small glaciers surrounding the Greenland and Antarctic ice sheets, which have so far been excluded from, or only roughly estimated in, global assessments.

Lessons learned from oil rig disaster

The Deepwater Horizon oil rig explosion in the Gulf of Mexico on April 20 triggered one of history’s biggest oil spills at sea. Eleven people lost their lives in the accident. Here the beaches are being cleared. (Photo: BP)

When interviewed by the BBC, the now-retired BP boss Tony Hayward admitted that his company’s response to the Deepwater Horizon rig accident in the Gulf of Mexico had been insufficient. Could the company have been better prepared for what turned out to be one of the biggest oil disasters in history?

“We were making it up day to day,” Hayward said of BP’s rescue plan. Together with the chairman of the board, Carl-Henric Svanberg, he was held responsible for 11 dead and 17 injured workers. According to the New York Times, five million barrels of oil leaked into the ocean off the coast of Louisiana between April and August 2010.

A lack of safety procedures was identified by the oil spill investigation commission, set up by President Barack Obama, as a determining factor behind the disaster. The three companies involved in the accident – BP, Transocean and Halliburton – were all accused of having cut corners in order to complete the well. At the time of the blow-out, this job was five weeks behind schedule. Five survivors talked to CNN about a corporate culture in which safety warnings were routinely ignored.

“Major accidents such as the Deepwater Horizon disaster in the Gulf of Mexico could also happen in the North Sea,” says Preben Lindøe, professor of societal safety and security at the University of Stavanger, Norway.

“But strong, organizational barriers between the oil industry, trade unions and the Petroleum Safety Authority Norway reduce the risk,” he adds.

Together with his colleague, associate professor Ole Andreas Engen, he is part of the Robust Regulation in the Petroleum Sector team. The four-year research project, funded by the Norwegian Research Council, also involves the independent research group Sintef and the University of Oslo, in addition to legal expertise affiliated with Boston University.

Different practice

The researchers compare oil industry regulation in the USA, Great Britain and Norway.

“There are hardly any unions in the Gulf of Mexico. Tripartite collaboration, as it is practiced on the Norwegian continental shelf, is therefore impossible,” says Lindøe.

The US regulator, Minerals Management Service, carries out inspections based on a fairly meticulous body of rules. Inspectors are transported to offshore installations, equipped with long and detailed check lists.

By comparison, Norwegian regulation is based on internal control. The authorities thereby rely on the companies administering their safety work themselves. While the Norwegian model is based on trust – built up over time – and the sharing of experience and information, the situation in the US is almost the opposite, according to Lindøe.

“The reason this model has succeeded in Norway, is because the parties have been able to fill the concept of internal control with substance. Both employers and unions are involved in developing industrial standards and good practice which can be adhered to,” he says.

Close shave in the North Sea

But recent near-accidents in Norway may potentially have become disasters. In May 2010, Norwegian oil major Statoil had problems during drilling operations at its Gullfaks field in the North Sea. While drilling a well from the Gullfaks C installation, gas entered the well and reached the platform deck. According to the company’s investigation report, only luck prevented the incident from becoming a much more complicated subsea blowout.

The Petroleum Safety Authority shared this conclusion, and pointed out that the incident could easily have evolved into a disaster. It issued four enforcement notices to Statoil, and the company was reported to the police by environmental group Bellona. International media compared the Gullfaks incident to the Deepwater Horizon accident.

“The public’s attention is triggered by such incidents, and we are made aware of society’s unpredictability. When perceived threats are referred to by the media, societal safety is pushed up on the agenda. The attention paid to this subject varies, which lies in its nature. When safety work succeeds, its success is proved by the non-occurrence of serious incidents. When nothing happens, we may become less attentive and sloppier in adhering to routines and procedures,” says Ole Andreas Engen.

“When attention fades, accidents happen more easily, and are followed by increased awareness. Societal safety is thus a perpetual Sisyphus effort. It is a big challenge for all organizations to maintain a high level of safety awareness over time,” he says.

Robust regulation

The researchers point to another example: a gas leak at the North Sea field Snorre in 2004, when an accident equivalent to Deepwater Horizon was only a spark away. But in spite of a number of near-accidents, Norwegian regulation is still more robust than that of the US.

The petroleum industry in Norway has gone through several critical phases in its history. Gradually, the parties involved have learned to trust each other. A robust system like this is able to withstand a blow. This is not the case in the USA, where the authorities have a much more difficult task in monitoring regulations. There are strict requirements for new regulations to undergo cost-benefit analyses, which must be submitted to the President’s office, the researchers explain.

Moreover, the regulation of safety and the work environment is divided between two governmental agencies. The US Coast Guard is the controlling authority of personnel safety on offshore platforms, says Lindøe.

“Workers don’t enjoy the same legitimacy with regard to their role in safety work as they do in Norway,” he adds.

According to Lindøe and Engen, it is common practice in the US to look for scapegoats, and pin the blame for accidents on them, instead of changing the systems. In Norway, the parties are more likely to come together to find out how systems and routines may have contributed to an employee making a mistake. The researchers sum up the lessons learned after the Gulf of Mexico disaster:

“The Deepwater Horizon accident has uncovered some evident weaknesses within US safety regulation. The Government being restrained from intervening directly with the industry is one of them. To the Norwegian industry, this accident and the near-accident on Gullfaks C, should serve as reminders of the importance of maintaining the foundation pillars of the Norwegian safety management system: Effective and well qualified authorities, and clear guidelines for cooperation and trust between the parties,” Lindøe concludes.

Scientists find methane gas concentrations have returned to near-normal levels

UCSB graduate student Stephanie Mendes, left, and postdoctoral researcher Molly Redmond are sampling water. -  Texas A&M University and NOAA

Calling the results “extremely surprising,” researchers from the University of California, Santa Barbara and Texas A&M University report that methane gas concentrations in the Gulf of Mexico have returned to near normal levels only months after a massive release occurred following the Deepwater Horizon oil rig explosion.

Findings from the research study, led by oceanographers John Kessler of Texas A&M and David Valentine of UCSB, were published today in Science Express, in advance of their publication in the journal Science. The findings show that Mother Nature quickly saw to the removal of more than 200,000 metric tons of dissolved methane through the action of bacteria blooms that completely consumed the immense gas plumes the team had identified in mid-June. At that time, the team reported finding methane gas in amounts 100,000 times above normal levels. But, about 120 days after the initial spill, they could find only normal concentrations of methane and clear evidence of complete methane respiration.

“What we observed in June was a horizon of deep water laden with methane and other hydrocarbon gases,” Valentine said. “When we returned in September and October and tracked these waters, we found the gases were gone. In their place were residual methane-eating bacteria, and a 1 million ton deficit in dissolved oxygen that we attribute to respiration of methane by these bacteria.”
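The reported numbers are roughly consistent with simple stoichiometry. A back-of-envelope sketch, assuming complete aerobic respiration of methane (CH4 + 2 O2 → CO2 + 2 H2O), gives an oxygen demand of the same order as the observed deficit; the remainder is plausibly explained by bacteria also oxidizing other hydrocarbon gases and building biomass:

```python
M_CH4 = 16.0  # molar mass of methane, g/mol
M_O2 = 32.0   # molar mass of oxygen, g/mol

def o2_demand_tonnes(methane_tonnes):
    """Oxygen consumed by fully respiring a given mass of methane.

    Stoichiometry: CH4 + 2 O2 -> CO2 + 2 H2O (2 mol O2 per mol CH4).
    """
    moles_ch4 = methane_tonnes * 1e6 / M_CH4
    return 2.0 * moles_ch4 * M_O2 / 1e6

# 200,000 tonnes of methane demands ~800,000 tonnes of oxygen,
# the same order as the ~1-million-ton deficit the team measured.
o2_tonnes = o2_demand_tonnes(200_000)
```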

Kessler added: “Based on our measurements from earlier in the summer and previous other measurements of methane respiration rates around the world, it appeared that (Deepwater Horizon) methane would be present in the Gulf for years to come. Instead, the methane respiration rates increased to levels higher than have ever been recorded, ultimately consuming it and prohibiting its release to the atmosphere.”

While the scientists’ research documents the changing conditions of the Gulf waters, it also sheds some light on how the planet functions naturally.

“This tragedy enabled an impossible experiment,” Valentine said, “one that allowed us to track the fate of a massive methane release in the deep ocean, as has occurred naturally throughout Earth’s history.”

Kessler noted: “We were glad to have the opportunity to lend our expertise to study this oil spill. But also we tried to make a little good come from this disaster and use it to learn something about how the planet functions naturally. The seafloor stores large quantities of methane, a potent greenhouse gas, which has been suspected to be released naturally, modulating global climate. What the Deepwater Horizon incident has taught us is that releases of methane with similar characteristics will not have the capacity to influence climate.”

The Deepwater Horizon offshore drilling platform exploded on April 20, 2010, about 40 miles off the Louisiana coast. The blast killed 11 workers and injured 17 others. Oil was gushing from the site at the rate of 62,000 barrels per day, eventually spilling an estimated 170 million gallons of oil into the Gulf. The leak was capped on July 15, and the well was permanently sealed on Sept. 19.

The research team collected thousands of water samples at 207 locations covering an area of about 36,000 square miles. The researchers based their conclusions on measurements of dissolved methane concentrations, dissolved oxygen concentrations, methane oxidation rates, and microbial community structure.

Freshwater methane release changes greenhouse gas equation

An international team of scientists has released data indicating that greenhouse gas uptake by continents is less than previously thought because of methane emissions from freshwater areas.

John Downing, an Iowa State University professor in the ecology, evolution and organismal biology department, is part of an international team that concluded that methane release from inland waters is higher than previous estimates.

The study, published in the journal Science, indicates that methane gas release from freshwater areas changes the net absorption of greenhouse gases by natural continental environments, such as forests, by at least 25 percent. Past analyses of carbon and greenhouse gas exchanges on continents failed to account for the methane gas that is naturally released from lakes and running water.

Downing, a laboratory limnologist at Iowa State, has also conducted research measuring the amount of carbon sequestered in lake and pond sediment. This new study gives scientists a better understanding of the balance between carbon sequestration and greenhouse gas releases from fresh water bodies.

“Methane is a greenhouse gas that is more potent than carbon dioxide in the global change scenario,” Downing said. “The bottom line is that we have uncovered an important accounting error in the global carbon budget. Acre for acre, lakes, ponds, rivers and streams are many times more active in carbon processing than seas or land surfaces, so they need to be included in global carbon budgets.”
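The accounting works by converting methane fluxes into CO2-equivalents via a global warming potential (GWP). The sketch below uses the commonly cited 100-year GWP of 25 for methane, and both the 100 Tg/yr freshwater flux and the 10,000 Tg CO2/yr continental sink are illustrative assumptions chosen only to show how a 25 per cent offset arises, not figures from the study:

```python
GWP_CH4_100YR = 25.0  # commonly cited 100-year warming potential of methane

def co2_equivalent_tg(methane_tg):
    """Convert a methane flux (Tg/yr) to a CO2-equivalent flux (Tg/yr)."""
    return methane_tg * GWP_CH4_100YR

# An assumed 100 Tg/yr of freshwater methane, set against an assumed
# 10,000 Tg CO2/yr net continental uptake, offsets a quarter of the sink.
offset_fraction = co2_equivalent_tg(100.0) / 10_000.0
```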

Methane emissions from lakes and running water occur naturally, but have been difficult to assess. David Bastviken, principal author and professor in the department of water and environmental studies, at Linköping University in Sweden, said small methane emissions from the surfaces of water bodies occur continuously.

“Greater emissions occur suddenly and with irregular timing, when methane bubbles from the sediment reach the atmosphere, and such fluxes have been difficult to measure,” Bastviken said.

The greenhouse effect is caused by human emission of gases that act like a blanket and trap heat inside the Earth’s atmosphere, according to the Intergovernmental Panel on Climate Change. Some ecosystems, such as forests, can absorb and store greenhouse gases. The balance between emissions and uptake determines how climate will change. The role of freshwater environments has been unclear in previous budgets, Downing said.

Sulphur proves important in the formation of gold mines

Collaborating with an international research team, an economic geologist from The University of Western Ontario has discovered how gold-rich magma is produced, unveiling an all-important step in the formation of gold mines.

The findings were published in the December issue of Nature Geoscience.

Robert Linnen, the Robert Hodder Chair in Economic Geology in Western’s Department of Earth Sciences, conducts research near Kirkland Lake, Ontario, and says the results of the study could lead to a breakthrough in choosing geographic targets for gold exploration and making exploration more successful.

Noble metals such as gold are transported by magma from deep within the Earth’s mantle to the shallow crust, where they form deposits. Through a series of experiments, Linnen and his colleagues from the University of Hannover (Germany), the University of Potsdam (Germany) and Laurentian University found that gold-rich magma can be generated in mantle that also contains high amounts of sulphur.

“Sulphur wasn’t recognized as being that important, but we found it actually enhances gold solubility and solubility is a very important step in forming a gold deposit,” explains Linnen. “In some cases, we were detecting eight times the amount of gold if sulphur was also present.”

Citing the World Gold Council, Linnen says the best estimates available suggest the total volume of gold mined up to the end of 2009 was approximately 165,600 tonnes. Approximately 65 per cent of that total has been mined since 1950.
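The article's two figures imply a striking split between modern and pre-1950 mining. A quick check of that arithmetic, using only the numbers quoted above:

```python
# Arithmetic check on the figures quoted in the article.
total_mined_tonnes = 165_600   # cumulative gold mined to end of 2009 (World Gold Council estimate)
since_1950_share = 0.65        # fraction mined since 1950, per the article

since_1950 = total_mined_tonnes * since_1950_share
before_1950 = total_mined_tonnes - since_1950
print(f"Mined since 1950: ~{since_1950:,.0f} t; in all prior history: ~{before_1950:,.0f} t")
```

So roughly 107,600 tonnes have been mined in the last six decades, versus about 58,000 tonnes in all of human history before that.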

“All the easy stuff has been found,” offers Linnen. “So when you project to the future, we’re going to have to come up with different ways, different technologies and different philosophies for finding more resources because the demand for resources is ever-increasing.”

Researchers developing shale gas reservoir simulator

University of Oklahoma researchers are developing a new simulator for shale gas reservoirs that will provide oil and gas companies with an essential tool for managing production and choosing drilling locations to lower costs and increase production.

OU professors Richard Sigal, Faruk Civan and Deepak Devegowda, Mewbourne College of Earth and Energy, are the first to systematically tackle this challenge. The project is supported with $1,053,778 from the Research Partnership to Secure Energy for America plus an additional $250,000 in matching funds from a consortium of six oil and gas producing companies.

Natural gas produces fewer greenhouse gas emissions and is less polluting than other fossil fuels. In addition, gas produced from shale gas reservoirs can have a positive impact on the U.S. economy by replacing coal used for electrical generation and by reducing natural gas and oil imports in some applications.

“Simulators for conventional reservoirs are not suited for shale gas reservoirs,” says Sigal. “An example of this is the deposition of frac water used to force the gas from the reservoir. In a shale gas reservoir, massive hydraulic fracturing opens up the reservoir so the gas can flow. This involves pumping a large amount of water into the reservoir. In conventional reservoirs all this water is produced back, but in shale gas reservoirs only a small percentage of the water is produced.”

According to Sigal, “Current commercial simulators do not successfully predict the amount of water produced. Researchers need to model the deposition of this water to better understand the reservoir and address concerns about the effects this water can have on shallow aquifers. One goal of the simulator project is to determine and provide the capacity to model frac water deposition.”

“Predicting long-term gas production with history matching requires more accurate physics and geology,” states Sigal. “Using a new $2 million microscope at OU to see the detailed porosity of the rocks, Professor Carl H. Sondergeld and his collaborators have found two kinds of pore space in the rocks. Besides the inorganic pore space where we expect to find gas, they discovered pores the size of nanometers in the organic portion of the rock. This discovery needs to be incorporated into the simulator design.”

OU researchers recognize that the physics of fluid flow and storage are very different in the inorganic and organic portions of shale gas reservoirs. These reservoirs also contain both natural and induced fracture systems, each with different properties. OU researchers will develop a quad porosity model to take these differences into account.
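The article does not spell out how the four domains would be represented, but the idea of a quad porosity model can be sketched as four storage/flow domains with distinct properties. The names and property values below are purely illustrative assumptions, not the OU team's actual parameters:

```python
from dataclasses import dataclass

# Hypothetical sketch of the four domains a "quad porosity" model would
# distinguish; all property values here are illustrative only.
@dataclass
class PorosityDomain:
    name: str
    porosity: float         # fraction of bulk rock volume
    permeability_md: float  # millidarcies

domains = [
    PorosityDomain("organic matrix (nanopores, adsorbed gas)", 0.030, 1e-6),
    PorosityDomain("inorganic matrix",                         0.040, 1e-4),
    PorosityDomain("natural fractures",                        0.002, 1e-1),
    PorosityDomain("induced (hydraulic) fractures",            0.001, 1e2),
]

total_porosity = sum(d.porosity for d in domains)
print(f"total porosity: {total_porosity:.3f}")
for d in domains:
    print(f"  {d.name}: phi={d.porosity}, k={d.permeability_md} mD")
```

The point of the structure is that storage is dominated by the two matrix domains while flow capacity is dominated by the fractures, so each domain needs its own flow physics.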

There are three basic issues with the physics of these nanoporous systems. First, the standard equations used to describe gas transport break down in the small pores in the organic material, where a significant portion of the hydrocarbon gas is stored. Researchers studying artificial nanomaterials have developed new gas transport equations that need to be adapted to the complicated pore spaces of shale gas reservoirs.
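The standard criterion for when Darcy-type transport equations break down is the Knudsen number, the ratio of the gas mean free path to the pore size. A minimal sketch, with hypothetical reservoir conditions (the methane kinetic diameter and the regime boundaries are standard kinetic-theory values; the pore size, temperature and pressure are assumptions):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def knudsen_number(pore_diameter_m, temp_k, pressure_pa, molecule_diameter_m):
    """Kn = mean free path / pore diameter, with the kinetic-theory mean free path."""
    mean_free_path = K_B * temp_k / (math.sqrt(2) * math.pi
                                     * molecule_diameter_m**2 * pressure_pa)
    return mean_free_path / pore_diameter_m

def flow_regime(kn):
    """Conventional Knudsen-number regime boundaries."""
    if kn < 1e-3:
        return "continuum (Darcy flow valid)"
    if kn < 0.1:
        return "slip flow"
    if kn < 10.0:
        return "transition flow"
    return "free-molecular flow"

# Hypothetical case: methane (~0.38 nm kinetic diameter) in a 4 nm
# organic pore at 350 K and 20 MPa.
kn = knudsen_number(4e-9, 350.0, 20e6, 0.38e-9)
print(f"Kn = {kn:.3f} -> {flow_regime(kn)}")
```

Even at high reservoir pressure, nanometer-scale organic pores land outside the continuum regime, which is why the standard transport equations must be replaced or corrected.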

Second, in standard simulators, an assumption known as instantaneous capillary equilibrium provides the relationship between the gas and water pressure. In shale gas reservoirs, equilibrium cannot be maintained because of differences in the transport rates for water and gas, so the standard equations must be modified. Finally, the very large capillary forces caused by the very small pore size require a different treatment of relative permeability, which controls the relative transport of the water and gas.
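The scale of those capillary forces follows directly from the Young-Laplace relation, Pc = 2*gamma*cos(theta)/r. A hedged comparison of a nanometer-scale shale pore against a conventional-reservoir pore, using an approximate ambient gas-water surface tension and an assumed zero contact angle:

```python
import math

def capillary_pressure_pa(surface_tension_n_per_m, contact_angle_deg, pore_radius_m):
    """Young-Laplace capillary entry pressure for a cylindrical pore."""
    return (2.0 * surface_tension_n_per_m
            * math.cos(math.radians(contact_angle_deg)) / pore_radius_m)

GAMMA = 0.072  # gas-water surface tension, N/m (approximate, ambient conditions)

pc_shale = capillary_pressure_pa(GAMMA, 0.0, 2e-9)    # hypothetical 2 nm shale pore
pc_conv  = capillary_pressure_pa(GAMMA, 0.0, 10e-6)   # hypothetical 10 micron conventional pore
print(f"shale pore: {pc_shale/1e6:.0f} MPa vs conventional pore: {pc_conv/1e6:.4f} MPa")
```

Under these assumptions the capillary pressure in the nanopore is thousands of times larger than in the conventional pore, which is why the usual relative-permeability treatment no longer applies.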

“This is a three-year project to develop the new simulator starting with the fundamentals,” Sigal remarks. “We have already developed a 1-D model. The next step will be to build a simple 3-D testbed system. At first, we will test this model against models run on commercial simulators.”

“Next, we will build modules that incorporate the individual modifications needed for conventional simulators to correctly model shale gas reservoirs,” Sigal comments. “These modules will be available for adoption by industry for use in existing company or commercial simulators. Finally, we will use the modified simulators to history match production from existing reservoirs. Our commercial sponsors will provide data for this.”