Burrowing animals may have been key to stabilizing Earth’s oxygen

This image depicts a 530-million-year-old fossil of burrow activity in sediment. – Martin Brasier, University of Oxford

Evolution of the first burrowing animals may have played a major role in stabilizing the Earth’s oxygen reservoir, according to a new study in Nature Geoscience.

Around 540 million years ago, the first burrowing animals evolved. When these worms began to mix up the ocean floor’s sediments (a process known as bioturbation), their activity came to significantly influence the ocean’s phosphorus cycle and as a result, the amount of oxygen in Earth’s atmosphere.

“Our research is an attempt to place the spread of animal life in the context of wider biogeochemical cycles, and we conclude that animal activity had a decreasing impact on the global oxygen reservoir and introduced a stabilizing effect on the connection between the oxygen and phosphorus cycles”, says lead author Dr. Richard Boyle from the Nordic Center for Earth Evolution (NordCEE) at the University of Southern Denmark.

The computer modelling study by Dr. Richard Boyle and colleagues from Denmark, Germany, China and the UK, published in Nature Geoscience, links data from the fossil record to well-established connections between the phosphorus and oxygen cycles.

Marine organic carbon burial is a source of oxygen to the atmosphere, and its rate is proportional to the amount of phosphate in the oceans. This means that (over geologic timescales) anything that decreases the size of the ocean phosphate reservoir also decreases oxygen. The study focuses on one such removal process, burial of phosphorus in the organic matter in ocean sediments.

The authors hypothesize the following sequence of events: Around 540 million years ago, the evolution of the first burrowing animals significantly increased the extent to which oxygenated waters came into contact with ocean sediments. Exposure to oxygenated conditions caused the bacteria that inhabit such sediments to store phosphate in their cells (something that is observed in modern-day experiments). This caused an increase in phosphorus burial in sediments that had been mixed up by burrowing animals, which in turn triggered decreases in marine phosphate concentrations, productivity, organic carbon burial and, ultimately, oxygen. Because the oxygen decrease was initiated by something that requires oxygen (i.e., the activity of burrowing animals), a net negative feedback loop was created.
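The logic of this feedback can be sketched as a toy two-box model. The linear rate laws and all parameter values below are illustrative assumptions for the purpose of showing the stabilizing behaviour, not quantities from the Nature Geoscience study:

```python
# Toy box model of the bioturbation feedback: more oxygen -> more
# bioturbation -> more phosphorus burial -> less phosphate -> less
# organic carbon burial -> less oxygen. All coefficients are
# illustrative, not values from the study.

def step(oxygen, phosphate, dt=0.1):
    """Advance the toy oxygen-phosphate system by one Euler step."""
    p_input = 0.05                      # phosphate supply from weathering
    p_burial = 0.05 * oxygen * phosphate  # burial rises with oxygenation
    o2_source = 0.1 * phosphate         # organic carbon burial (O2 source)
    o2_sink = 0.1 * oxygen              # e.g. oxidative weathering
    phosphate += dt * (p_input - p_burial)
    oxygen += dt * (o2_source - o2_sink)
    return oxygen, phosphate

o2, po4 = 2.0, 1.0                      # start away from equilibrium
for _ in range(2000):
    o2, po4 = step(o2, po4)

# The negative feedback pulls both reservoirs back toward the model's
# steady state (o2 = po4 = 1 with these coefficients).
print(round(o2, 3), round(po4, 3))
```

Starting from an oxygen excess, the model relaxes back to its equilibrium rather than running away, which is the stabilizing effect the authors describe.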

Boyle states: “It has long been appreciated that organic phosphorus burial is greater from the kind of well oxygenated, well-mixed sediments that animals inhabit, than from poorly mixed, low oxygen ‘laminated’ sediments. The key argument we make in this paper is that this difference is directly attributable to bioturbation. This means that (1) animals are directly involved in an oxygen-regulating cycle or feedback loop that has previously been overlooked, and (2) we can directly test the idea (despite the uncertainties associated with looking so far back in time) by looking for a decrease in ocean oxygenation in conjunction with the spread of bioturbation. My colleague, Dr Tais Dahl from the University of Copenhagen, compiled data on ocean metals with oxygen-sensitive burial patterns, which does indeed suggest such an oxygen decrease as bioturbation began – confirming the conclusions of the modelling. It is our hope that wider consideration of this feedback loop and the timing of its onset will improve our understanding of the extent to which Earth’s atmosphere-ocean oxygen reservoir is regulated.”

Co-author Professor Tim Lenton of the University of Exeter adds: “We already think this cycle was key to helping stabilise atmospheric oxygen during the Phanerozoic (the last 542 million years) – and that oxygen stability is a good thing for the evolution of plants and animals. What is new in this study is it attributes the oxygen stabilisation to biology – the presence or absence of animals stirring up the ocean sediments.”

Earlier this year, researchers from the Nordic Center for Earth Evolution showed that early animals may have needed surprisingly little oxygen to grow, supporting the theory that rising oxygen levels were not crucial for animal life to evolve on Earth.

Dust in the wind drove iron fertilization during ice age

Nitrogen is a critical building block for marine algae, yet the plankton in the Southern Ocean north of Antarctica leave much of it unused partly because they lack another needed nutrient, iron. The late John Martin hypothesized that dust-borne iron carried to the region by winds during ice ages may have fertilized the marine algae, allowing more of the Southern Ocean nitrogen to be used for growth and thus drawing CO2 into the ocean.
To confirm Martin’s hypothesis, the researchers measured isotopes of nitrogen in a sediment sample collected from a site that lies within the path of the winds that deposit iron-laden dust in the Subantarctic zone of the Southern Ocean (labeled ODP Site 1090). They found that the ratios of the types of nitrogen in the sample coincided with the predictions of Martin’s hypothesis. The colors indicate simulated ice-age dust deposition from low to high (blue to red). The black contour lines show the concentrations of nitrate (a form of nitrogen) in modern surface waters. – Image courtesy of Alfredo Martínez-García of ETH Zurich and Science/American Association for the Advancement of Science

Researchers from Princeton University and the Swiss Federal Institute of Technology in Zurich have confirmed that during the last ice age iron fertilization caused plankton to thrive in a region of the Southern Ocean.

The study published in Science confirms a longstanding hypothesis that wind-borne dust carried iron to the region of the globe north of Antarctica, driving plankton growth and eventually leading to the removal of carbon dioxide from the atmosphere.

Plankton remove the greenhouse gas carbon dioxide (CO2) from the atmosphere during growth and transfer it to the deep ocean when their remains sink to the bottom. Iron fertilization has previously been suggested as a possible cause of the lower CO2 levels that occur during ice ages. These decreases in atmospheric CO2 are believed to have “amplified” the ice ages, making them much colder, with some scientists believing that there would have been no ice ages at all without the CO2 depletion.

Iron fertilization has also been suggested as one way to draw down the rising levels of CO2 associated with the burning of fossil fuels. Improved understanding of the drivers of ocean carbon storage could lead to better predictions of how the rise in manmade carbon dioxide will affect climate in the coming years.

The role of iron in storing carbon dioxide during ice ages was first proposed in 1990 by the late John Martin, an oceanographer at Moss Landing Marine Laboratories in California who made the landmark discovery that iron limits plankton growth in large regions of the modern ocean.

Based on evidence that there was more dust in the atmosphere during the ice ages, Martin hypothesized that this increased dust supply to the Southern Ocean allowed plankton to grow more rapidly, sending more of their biomass into the deep ocean and removing CO2 from the atmosphere. Martin focused on the Southern Ocean because its surface waters contain the nutrients nitrogen and phosphorus in abundance, allowing plankton to be fertilized by iron without running low on these necessary nutrients.

The research confirms Martin’s hypothesis, said Daniel Sigman, Princeton’s Dusenbury Professor of Geological and Geophysical Sciences, and a co-leader of the study. “I was an undergraduate when Martin published his ‘ice age iron hypothesis,’” he said. “I remember being captivated by it, as was everyone else at the time. But I also remember thinking that Martin would have to be the luckiest person in the world to pose such a simple, beautiful explanation for the ice age CO2 paradox and then turn out to be right about it.”

Previous efforts to test Martin’s hypothesis established a strong correlation of cold climate, high dust and productivity in the Subantarctic region, a band of ocean encircling the globe between roughly 40 and 50 degrees south latitude that lies in the path of the winds that blow off South America, South Africa and Australia. However, it was not clear whether the productivity was due to iron fertilization or the northward shift of a zone of naturally occurring productivity that today lies to the south of the Subantarctic. This uncertainty was made more acute by the finding that ice age productivity was lower in the Antarctic Ocean, which lies south of the Subantarctic region.

To settle the matter, the research groups of Sigman at Princeton and Gerald Haug and Tim Eglinton at ETH Zurich teamed up to use a new method developed at Princeton. They analyzed fossils found in deep-sea sediment, deposited during the last ice age in the Subantarctic region, with the goal of reconstructing past changes in the nitrogen concentration of surface waters and combining the results with side-by-side measurements of dust-borne iron and productivity. If the dust-borne iron fertilization hypothesis were correct, then nitrogen would have been more completely consumed by the plankton, leading to lower residual nitrogen concentrations in the surface waters. In contrast, if the productivity increases were in response to a northward shift in ocean conditions, then nitrogen concentrations would have risen.

The researchers measured the ratio of nitrogen isotopes, which have the same number of protons but differing numbers of neutrons, that were preserved within the carbonate shells of a group of marine microfossils called foraminifera. The investigators found that nitrogen concentrations indeed declined during the cold periods when iron deposition and productivity rose, in a manner consistent with the dust-borne iron fertilization theory. Ocean models as well as the strong correlation of the sediment core changes with the known changes in atmospheric CO2 suggest that this iron fertilization of Southern Ocean plankton can explain roughly half of the CO2 decline during peak ice ages.
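The link between nitrate consumption and the nitrogen-isotope signal can be illustrated with a textbook Rayleigh fractionation calculation. The initial δ15N and the isotope effect used here are generic illustrative values, not the study's measurements:

```python
import math

# Rayleigh-fractionation sketch of the logic behind the nitrogen test.
# delta0 (initial nitrate d15N, permil) and eps (isotope effect, permil)
# are illustrative textbook-scale values, not the study's numbers.

def d15n_accumulated_product(f, delta0=5.0, eps=5.0):
    """d15N of the total organic matter produced once a fraction
    (1 - f) of the surface nitrate pool has been consumed."""
    if f <= 0.0:
        return delta0  # complete consumption: mass balance is restored
    return delta0 + eps * f * math.log(f) / (1.0 - f)

low_use = d15n_accumulated_product(0.8)   # little nitrate consumed
high_use = d15n_accumulated_product(0.2)  # iron-fertilized: most consumed

# More complete nitrate consumption leaves the exported organic matter
# (and hence the foraminifera-bound nitrogen) isotopically heavier.
print(round(low_use, 2), round(high_use, 2))
```

Because plankton preferentially take up the lighter isotope, more complete consumption of the nitrate pool drives the recorded δ15N upward, which is why the isotope ratios in the fossils can stand in for past surface-water nitrogen concentrations.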

Although Martin had proposed that purposeful iron addition to the Southern Ocean could reduce the rise in atmospheric CO2, Sigman noted that the amount of CO2 removed through iron fertilization is likely to be minor compared to the amount of CO2 that humans are now pushing into the atmosphere.

“The dramatic fertilization that we observed during ice ages should have caused a decline in atmospheric CO2 over hundreds of years, which was important for climate changes over ice age cycles,” Sigman said. “But for humans to duplicate it today would require unprecedented engineering of the global environment, and it would still only compensate for less than 20 years of fossil fuel burning.”

Edward Brook, a paleoclimatologist at Oregon State University who was not involved in the research, said, “This group has been doing a lot of important work in this area for quite a while and this is an important advance. It will be interesting to see if the patterns they see in this one spot are consistent with variations in other places relevant to global changes in carbon dioxide.”

Maize and bacteria: A 1-2 punch knocks copper out of stamp sand

Maize plants grown in stamp sand inoculated with bacteria, left, were considerably more robust than those grown in stamp sand alone, right. Research led by Michigan Technological University’s Ramakrishna Wusirika could lead to new remediation techniques for soils contaminated by copper and other heavy metals. – Ramakrishna Wusirika

Scientists have known for years that together, bacteria and plants can remediate contaminated sites. Ramakrishna Wusirika, of Michigan Technological University, has determined that how you add bacteria to the mix can make a big difference.

He has also shed light on the biochemical pathways that allow plants and bacteria to clean up some of the worst soils on the planet while increasing their fertility.

Wusirika, an associate professor of biological sciences, first collected stamp sands near the village of Gay, in Michigan’s Upper Peninsula. For decades, copper mining companies crushed copper ore and dumped the remnants, an estimated 500 million tons of stamp sand, throughout the region. Almost nothing grows on these manmade deserts, which are laced with high concentrations of copper, arsenic and other plant-unfriendly chemicals.

Then, Wusirika and his team planted maize in the stamp sand, incorporating bacteria in four different ways:

  • mixing it in the stamp sand before planting seed;

  • coating seed with bacteria and planting it;

  • germinating seeds and planting them in soil to which bacteria were added; and

  • the conventional method, immersing the roots of maize seedlings in bacteria and planting them in stamp sand.

After 45 days, the team uprooted the plants and measured their dry weight. All maize grown with bacteria was significantly more vigorous (from two to five times larger) than the maize grown in stamp sand alone. The biggest were those planted as seedlings or as germinated seeds.

However, when the researchers analyzed the dried maize, they made a surprising discovery: the seed-planted maize took up far more copper as a percentage of dry weight. In other words, the smaller plants pulled more copper, ounce per ounce, out of the stamp sands than the bigger ones.
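The trade-off above comes down to simple arithmetic: total copper removed per plant is dry biomass times copper concentration. The figures below are made-up round numbers chosen only to illustrate how a smaller plant with a higher concentration can out-perform a larger one; they are not the study's measurements:

```python
# Back-of-the-envelope comparison of total copper removed per plant.
# Biomass and concentration figures are hypothetical round numbers,
# not data from Wusirika's experiments.

plants = {
    # name: (dry biomass in g, copper in mg per g dry weight)
    "seed-planted (smaller, higher Cu)": (10.0, 0.60),
    "seedling-planted (larger, lower Cu)": (25.0, 0.20),
}

for name, (biomass_g, cu_mg_per_g) in plants.items():
    total_cu_mg = biomass_g * cu_mg_per_g
    print(f"{name}: {total_cu_mg:.1f} mg Cu removed")

# With these numbers the smaller, seed-planted maize still removes more
# copper in total (6.0 mg vs 5.0 mg): its higher concentration outweighs
# its lower biomass.
```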

That has implications for land managers trying to remediate contaminated sites, or even for farmers working with marginal soils, Wusirika said. The usual technique, applying bacteria to seedlings’ roots before transplanting, works fine in the lab but would be impractical for large-scale projects. This could open the door to simple, practical remediation of copper-contaminated soils.

But the mere fact that all the plants grown with bacteria did so well also piqued his curiosity. “When we saw this, we wondered what the bacteria were doing to the soil,” Wusirika said. “Based on our research, it looks like they are improving enzyme activity and increasing soil fertility,” in part by freeing up phosphorus that had been locked in the rock.

The bacteria are also changing copper into a form that the plants can take up. “With bacteria, the exchangeable copper is increased three times,” he said. “There’s still a lot of copper that’s not available, but it is moving in the right direction.”

By analyzing metabolic compounds, the team was able to show that the bacteria enhance photosynthesis and help the plants make growth hormones. Bacteria also appear to affect the amount of phenolics produced by the maize. Phenolics are antioxidants similar to those in grapes and red wine.

Compared to plants grown in normal soil without bacteria, plants grown in stamp sand alone showed a five-fold increase in phenolics. However, phenolics in plants grown in stamp sand with bacteria showed a lesser increase.

“Growing in stamp sand is very stressful for plants, and they respond by increasing their antioxidant production,” Wusirika said. “Adding the metal-resistant bacteria enables the plants to cope with stress better, resulting in reduced levels of phenolics.”

“There’s still a lot to understand here,” he added. “We’d like to do a study on stamp sands in the field, and we’d also like to work with plants besides maize. We think this work has applications in organic agriculture as well as remediation.”

Maize and bacteria: A 1-2 punch knocks copper out of stamp sand

Maize plants grown in stamp sand inoculated with bacteria, left, were considerably more robust than those grown in stamp sand alone, right. Research led by Michigan Technological University's Ramakrishna Wusirika could lead to new remediation techniques for soils contaminated by copper and other heavy metals. -  Ramakrishna Wusirika
Maize plants grown in stamp sand inoculated with bacteria, left, were considerably more robust than those grown in stamp sand alone, right. Research led by Michigan Technological University’s Ramakrishna Wusirika could lead to new remediation techniques for soils contaminated by copper and other heavy metals. – Ramakrishna Wusirika

Scientists have known for years that together, bacteria and plants can remediate contaminated sites. Ramakrishna Wusirika, of Michigan Technological University, has determined that how you add bacteria to the mix can make a big difference.

He has also shed light on the biochemical pathways that allow plants and bacteria to clean up some of the worst soils on the planet while increasing their fertility.

Wusirika, an associate professor of biological sciences, first collected stamp sands near the village of Gay, in Michigan’s Upper Peninsula. For decades, copper mining companies crushed copper ore and dumped the remnants-an estimated 500 million tons of stamp sand-throughout the region. Almost nothing grows on these manmade deserts, which are laced with high concentrations of copper, arsenic and other plant-unfriendly chemicals.

Then, Wusirika and his team planted maize in the stamp sand, incorporating bacteria in four different ways:

  • mixing it in the stamp sand before planting seed;

  • coating seed with bacteria and planting it;

  • germinating seeds and planting them in soil to which bacteria were added; and

  • the conventional method, immersing the roots of maize seedlings in bacteria and planting them in stamp sand.

After 45 days, the team uprooted the plants and measured their dry weight. All maize grown with bacteria was significantly more vigorous-from two to five times larger-than the maize grown in stamp sand alone. The biggest were those planted as seedlings or as germinated seeds.

However, when the researchers analyzed the dried maize, they made a surprising discovery: the seed-planted maize took up far more copper as a percentage of dry weight. In other words, the smaller plants pulled more copper, ounce per ounce, out of the stamp sands than the bigger ones.
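The distinction between copper concentration and total copper extracted matters here: a smaller plant with a higher tissue concentration may still remove less (or more) copper overall than a larger, less copper-rich plant. A minimal sketch of that arithmetic, using invented numbers rather than the study's measurements:

```python
# Hypothetical illustration only: total copper removed by one plant
# depends on both tissue concentration and biomass. All values below
# are invented, not data from Wusirika's experiments.

def copper_removed_mg(dry_weight_g, cu_ppm):
    """Total copper extracted by one plant, in milligrams.

    cu_ppm is tissue copper concentration in mg Cu per kg dry weight
    (parts per million by mass).
    """
    return dry_weight_g / 1000 * cu_ppm  # g -> kg, then kg * (mg/kg) = mg

# A small, copper-rich seed-planted plant versus a large,
# less copper-rich seedling-planted plant (invented numbers):
seed_planted = copper_removed_mg(dry_weight_g=5.0, cu_ppm=400)      # 2.0 mg
seedling_planted = copper_removed_mg(dry_weight_g=20.0, cu_ppm=150)  # 3.0 mg
```

Under these invented numbers the bigger plant still removes more total copper despite its lower concentration, which is why land managers would weigh both uptake efficiency and achievable biomass when choosing a planting method.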

That has implications for land managers trying to remediate contaminated sites, or even for farmers working with marginal soils, Wusirika said. The usual technique, applying bacteria to seedlings’ roots before transplanting, works fine in the lab but would be impractical for large-scale projects. This could open the door to simple, practical remediation of copper-contaminated soils.

But the mere fact that all the plants grown with bacteria did so well also piqued his curiosity. “When we saw this, we wondered what the bacteria were doing to the soil,” Wusirika said. “Based on our research, it looks like they are improving enzyme activity and increasing soil fertility,” in part by freeing up phosphorus that had been locked in the rock.

The bacteria are also changing copper into a form that the plants can take up. “With bacteria, the exchangeable copper is increased three times,” he said. “There’s still a lot of copper that’s not available, but it is moving in the right direction.”

By analyzing metabolic compounds, the team was able to show that the bacteria enhance photosynthesis and help the plants make growth hormones. Bacteria also appear to affect the amount of phenolics produced by the maize. Phenolics are antioxidants similar to those in grapes and red wine.

Compared to plants grown in normal soil without bacteria, plants grown in stamp sand alone showed a five-fold increase in phenolics. However, phenolics in plants grown in stamp sand with bacteria showed a lesser increase.

“Growing in stamp sand is very stressful for plants, and they respond by increasing their antioxidant production,” Wusirika said. “Adding the metal-resistant bacteria enables the plants to cope with stress better, resulting in reduced levels of phenolics.”

“There’s still a lot to understand here,” he added. “We’d like to do a study on stamp sands in the field, and we’d also like to work with plants besides maize. We think this work has applications in organic agriculture as well as remediation.”

USF researchers: Life-producing phosphorus carried to Earth by meteorites

This is Matthew Pasek, University of South Florida. -  USF/Aimee Blodgett

Scientists may not know for certain whether life exists in outer space, but new research from a team of scientists led by a University of South Florida astrobiologist now shows that one key element that produced life on Earth was carried here on meteorites.

In an article published in the new edition of the Proceedings of the National Academy of Sciences, USF Assistant Professor of Geology Matthew Pasek and researchers from the University of Washington and the Edinburgh Centre for Carbon Innovation revealed new findings that explain how the reactive phosphorus that was an essential component for creating the earliest life forms came to Earth.

The scientists found that during the Hadean and Archean eons – the first of the four principal eons of the Earth’s earliest history – the heavy bombardment of meteorites provided reactive phosphorus that, when released in water, could be incorporated into prebiotic molecules. The scientists documented the phosphorus in early Archean limestone, showing it was abundant some 3.5 billion years ago.

The scientists concluded that the meteorites delivered phosphorus in minerals that are not seen on the surface of the earth, and these minerals corroded in water to release phosphorus in a form seen only on the early earth.

The discovery answers one of the key questions for scientists trying to unlock the processes that gave rise to early life forms: Why don’t we see new life forms today?

“Meteorite phosphorus may have been a fuel that provided the energy and phosphorus necessary for the onset of life,” said Pasek, who studies the chemical composition of space and how it might have contributed to the origins of life. “If this meteoritic phosphorus is added to simple organic compounds, it can generate phosphorus biomolecules identical to those seen in life today.”

Pasek said the research provides a plausible answer: The conditions under which life arose on the earth billions of years ago are no longer present today.

“The present research shows that this is indeed the case: Phosphorus chemistry on the early earth was substantially different billions of years ago than it is today,” he added.

The research team reached their conclusion after examining core samples from Australia, Zimbabwe, West Virginia, Wyoming and Avon Park, Florida.

Previous research had shown that before the emergence of the modern DNA-RNA-protein life that is known today, the earliest biological forms evolved from RNA alone. What has stumped scientists, however, was understanding how those early RNA-based life forms synthesized environmental phosphorus, which in its current form is relatively insoluble and unreactive.

Meteorites would have provided reactive phosphorus in the form of the iron-nickel phosphide mineral schreibersite, which in water released soluble and reactive phosphite. Phosphite is the salt scientists believe could have been incorporated into prebiotic molecules.

Of all of the samples analyzed, only the oldest, the Coonterunah carbonate samples from the early Archean of Australia, showed the presence of phosphite. Other natural sources of phosphite include lightning strikes, geothermal fluids and possibly microbial activity under extremely anaerobic conditions, but no other terrestrial sources of phosphite have been identified, and none could have produced the quantities of phosphite needed to be dissolved in early Earth oceans that gave rise to life, the researchers concluded.

The scientists said meteorite phosphite would have been abundant enough to adjust the chemistry of the oceans, with its chemical signature later becoming trapped in marine carbonate where it was preserved.

It is still possible, the researchers noted, that other natural sources of phosphite could be identified, such as in hydrothermal systems. While that might reduce the total meteoritic mass necessary to provide enough phosphite, the researchers said more work would need to be done to determine the exact contribution of separate sources to what they are certain was an essential ingredient to early life.

Phosphorus identified as the missing link in evolution of animals

A University of Alberta geomicrobiologist and his PhD student are part of a research team that has identified phosphorus as the mystery ingredient that pushed oxygen levels in the oceans high enough to establish the first animals on Earth 750 million years ago.

By examining ancient-ocean sediments, Kurt Konhauser, student Stefan Lalonde and other colleagues discovered that as the last glacier to encircle Earth receded, it left behind glacial debris containing phosphorus that washed into the oceans. Phosphorus is an essential nutrient that promoted the growth of cyanobacteria, or blue-green algae, whose metabolic byproduct is oxygen. The new, higher oxygen levels in the ocean reached a threshold favourable for animals to evolve.
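The link from phosphorus supply to oxygen can be sketched with the canonical Redfield ratio of marine plankton (C:N:P of roughly 106:16:1): each mole of phosphate reaching the ocean can support on the order of 106 moles of fixed organic carbon, and each mole of that carbon buried rather than respired leaves roughly one mole of O2 behind. This is a back-of-envelope sketch of that standard stoichiometry, not a calculation from the study itself:

```python
# Back-of-envelope sketch (not from the study): the Redfield ratio links
# phosphate supply to potential O2 production, in the idealized limit
# where all the supported organic carbon is buried instead of respired.

REDFIELD_C_PER_P = 106  # canonical marine plankton C:P molar ratio (106:16:1)

def max_o2_yield_mol(phosphate_mol):
    """Upper-bound moles of O2 left in the atmosphere per mole of
    phosphate delivered, assuming complete burial of the organic
    carbon that the phosphate supports (~1 mol O2 per mol C buried)."""
    return phosphate_mol * REDFIELD_C_PER_P

# One mole of glacially delivered phosphate could, in this idealized
# limit, account for roughly 106 moles of net O2 production.
```

Real burial efficiencies are far below 100 percent, so actual O2 yields per mole of phosphorus would be much smaller, but the proportionality is what makes a phosphorus pulse a plausible driver of an oxygen rise.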

Konhauser’s past research into ancient phosphorus levels in a unique suite of rocks called banded iron formations led him and his colleagues at the University of California Riverside to their current findings.

In 2007, Konhauser and his U of A team published research in the journal Science showing that, contrary to the then-accepted theory that phosphorus was scarce throughout much of Earth’s history, it was in fact plentiful.

“Now in 2010 we showed that phosphorus levels actually peaked between 750 and 635 million years ago at the very same time that oxygen levels increased, allowing complex life forms to emerge,” says Lalonde. “That establishes our link between phosphorus and the evolution of animals.”