No laughing matter: Nitrous oxide rose at end of last ice age

Researchers measured increases in atmospheric nitrous oxide concentrations about 16,000 to 10,000 years ago using ice from Taylor Glacier in Antarctica. – Adrian Schilt

Nitrous oxide (N2O) is an important greenhouse gas that receives far less attention than carbon dioxide or methane, but a new study confirms that atmospheric levels of N2O rose significantly as the Earth emerged from the last ice age, and identifies the cause.

An international team of scientists analyzed air extracted from bubbles enclosed in ancient polar ice from Taylor Glacier in Antarctica, allowing for the reconstruction of the past atmospheric composition. The analysis documented a 30 percent increase in atmospheric nitrous oxide concentrations from 16,000 years ago to 10,000 years ago. This rise in N2O was caused by changes in environmental conditions in the ocean and on land, scientists say, and contributed to the warming at the end of the ice age and the melting of large ice sheets that then existed.

The findings add an important new element to studies of how Earth may respond to a warming climate in the future. Results of the study, which was funded by the U.S. National Science Foundation and the Swiss National Science Foundation, are being published this week in the journal Nature.

“We found that marine and terrestrial sources contributed about equally to the overall increase of nitrous oxide concentrations and generally evolved in parallel at the end of the last ice age,” said lead author Adrian Schilt, who did much of the work as a post-doctoral researcher at Oregon State University. Schilt then continued to work on the study at the Oeschger Centre for Climate Change Research at the University of Bern in Switzerland.

“The end of the last ice age represents a partial analog to modern warming and allows us to study the response of natural nitrous oxide emissions to changing environmental conditions,” Schilt added. “This will allow us to better understand what might happen in the future.”

Nitrous oxide is perhaps best known as laughing gas, but it is also produced by microbes on land and in the ocean in processes that occur naturally but can be enhanced by human activity. Marine nitrous oxide production is closely linked to low-oxygen conditions in the upper ocean, and global warming is predicted to intensify the low-oxygen zones in many of the world’s ocean basins. N2O also destroys ozone in the stratosphere.

“Warming makes terrestrial microbes produce more nitrous oxide,” noted co-author Edward Brook, an Oregon State paleoclimatologist whose research team included Schilt. “Greenhouse gases go up and down over time, and we’d like to know more about why that happens and how it affects climate.”

Nitrous oxide is among the most difficult greenhouse gases to study in attempting to reconstruct the Earth’s climate history through ice core analysis. The specific technique that the Oregon State research team used requires large samples of pristine ice that date back to the desired time of study – in this case, between about 16,000 and 10,000 years ago.

The unusual way in which Taylor Glacier is configured allowed the scientists to extract ice samples from the surface of the glacier instead of drilling deep in the polar ice cap because older ice is transported upward near the glacier margins, said Brook, a professor in Oregon State’s College of Earth, Ocean, and Atmospheric Sciences.

The scientists were able to discern the contributions of marine and terrestrial nitrous oxide through analysis of isotopic ratios, which fingerprint the different sources of N2O in the atmosphere.
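The partitioning logic rests on a standard two-endmember isotope mass balance. A minimal sketch, with the caveat that the endmember values below are illustrative assumptions, not figures from the study:

```python
# Two-endmember isotope mass balance: estimate marine vs. terrestrial
# fractions of N2O from a measured bulk isotope ratio.
# The delta-15N endmember values used here are hypothetical placeholders,
# NOT values reported in the study.

def source_fractions(delta_measured, delta_marine, delta_land):
    """Solve f_marine * d_marine + (1 - f_marine) * d_land = d_measured."""
    f_marine = (delta_measured - delta_land) / (delta_marine - delta_land)
    return f_marine, 1.0 - f_marine

# Hypothetical endmembers (per mil), with marine N2O isotopically heavier:
f_sea, f_land = source_fractions(delta_measured=0.0,
                                 delta_marine=5.0,
                                 delta_land=-5.0)
print(f_sea, f_land)  # 0.5 0.5 -> roughly equal contributions
```

A measured ratio midway between the two endmembers implies the half-and-half split the study reports; real analyses use several isotopic signatures at once, but the arithmetic is of this form.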

“The scientific community knew roughly what the N2O concentration trends were prior to this study,” Brook said, “but these findings confirm that and provide more exact details about changes in sources. As nitrous oxide in the atmosphere continues to increase – along with carbon dioxide and methane – we now will be able to more accurately assess where those contributions are coming from and the rate of the increase.”

Atmospheric N2O was roughly 200 parts per billion at the peak of the ice age about 20,000 years ago, then rose to 260 ppb by 10,000 years ago. As of 2014, atmospheric N2O was measured at about 327 ppb, an increase attributed primarily to agricultural influences.
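The percentages implied by these concentrations check out with simple arithmetic:

```python
# Percentage changes implied by the concentrations quoted in the article.
glacial = 200.0         # ppb, at the peak of the ice age (~20,000 years ago)
early_holocene = 260.0  # ppb, by ~10,000 years ago
modern = 327.0          # ppb, as measured in 2014

deglacial_rise = (early_holocene - glacial) / glacial * 100
modern_rise = (modern - early_holocene) / early_holocene * 100

print(round(deglacial_rise))  # 30 -> the ~30 percent deglacial increase
print(round(modern_rise))     # 26 -> a further ~26 percent rise to 2014
```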

Although the N2O increase at the end of the last ice age was almost equally attributable to marine and terrestrial sources, the scientists say, there were some differences.

“Our data showed that terrestrial emissions changed faster than marine emissions, which was highlighted by a fast increase of emissions on land that preceded the increase in marine emissions,” Schilt pointed out. “It appears to be a direct response to a rapid temperature change between 15,000 and 14,000 years ago.”

That finding underscores the complexity of analyzing how Earth responds to changing conditions, Brook said: such analyses must account for marine and terrestrial influences, natural variability, the influence of different greenhouse gases, and a host of other factors.

“Natural sources of N2O are predicted to increase in the future, and this study will help us test predictions on how the Earth will respond,” Brook said.


Technology-dependent emissions of gas extraction in the US

The KIT measurement instrument on board of a minivan directly measures atmospheric emissions on site with a high temporal resolution. – Photo: F. Geiger/KIT

Not all boreholes are the same. Scientists of the Karlsruhe Institute of Technology (KIT) used mobile measurement equipment to analyze gaseous compounds emitted by the extraction of oil and natural gas in the USA. For the first time, organic pollutants emitted during a fracking process were measured at high temporal resolution. The highest values measured exceeded typical mean values in urban air by a factor of one thousand, as reported in the journal Atmospheric Chemistry and Physics (DOI: 10.5194/acp-14-10977-2014).

The KIT researchers, together with US institutes, studied the emission of trace gases by oil and gas fields in the USA (Utah and Colorado), analyzing background concentrations as well as the waste gas plumes of individual extraction plants and fracking facilities. The air quality measurements, lasting several weeks, took place under the “Uintah Basin Winter Ozone Study” coordinated by the National Oceanic and Atmospheric Administration (NOAA).

The KIT measurements focused on health-damaging aromatic hydrocarbons in air, such as carcinogenic benzene. Maximum concentrations were determined in the waste gas plumes of boreholes. Some extraction plants emitted up to about a hundred times more benzene than others. The highest values, of some milligrams of benzene per cubic meter of air, were measured downwind of an open fracking facility, where returning drilling fluid is stored in open tanks and basins. Much lower values were found at oil and gas extraction plants with closed production processes. In Germany, benzene concentration at the workplace is subject to strict limits: the Federal Emission Control Ordinance sets an annual benzene limit of five micrograms per cubic meter for the protection of human health; the values now measured at the open fracking facility in the US exceed that limit by a factor of about one thousand. The researchers published the measured results in the journal Atmospheric Chemistry and Physics (ACP).
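The factor-of-one-thousand comparison follows directly from the units. In this sketch the plume value is a hypothetical stand-in for "some milligrams per cubic meter," not a reported measurement:

```python
# Unit comparison: German annual benzene limit vs. a plume-level value.
# The plume value below is hypothetical, standing in for the article's
# "some milligrams of benzene per cubic meter."
limit_ug_m3 = 5.0                   # limit: 5 micrograms per cubic meter
plume_mg_m3 = 5.0                   # assumed plume value, milligrams per m^3
plume_ug_m3 = plume_mg_m3 * 1000.0  # convert milligrams -> micrograms

factor = plume_ug_m3 / limit_ug_m3
print(factor)  # 1000.0 -> about a factor of one thousand over the limit
```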

“Characteristic emissions of trace gases are encountered everywhere; they are symptomatic of oil and gas extraction. But the values measured for different technologies differ considerably,” explains Felix Geiger of the Institute of Meteorology and Climate Research (IMK) of KIT, one of the first authors of the study. By means of closed collection tanks and so-called vapor capture systems, for instance, the gases released during operation can be collected and reduced significantly.

“The gas fields in the sparsely populated areas of North America are a good showcase for estimating the range of impacts of different extraction and fracking technologies,” explains Professor Johannes Orphal, Head of IMK. “In densely populated Germany, framework conditions are much stricter and much more attention is paid to reducing and monitoring emissions.”

Fracking is increasingly discussed as a technology to extract fossil resources from unconventional deposits. Hydraulic fracturing of suitable shale layers opens up the fossil fuels stored there and makes them accessible for economically efficient use. For this purpose, boreholes are drilled into the rock formations and subjected to high pressure using large amounts of water and auxiliary materials such as sand, cement, and chemicals. The oil or gas can then flow to the surface through the opened microstructures in the rock. Typically, the return flow of the aqueous fracking liquid with the dissolved oil and gas constituents lasts several days before the production phase proper of purer oil or natural gas begins. This return flow is collected and reused until it finally has to be disposed of.

Air pollution mainly depends on how this return flow is treated at the extraction plant, and currently practiced fracking technologies differ considerably in this respect. For the first time, the resulting local atmospheric emissions were studied at high temporal resolution, so that emissions can be assigned directly to the different sections of an extraction plant. For the measurements, a newly developed, compact, and highly sensitive KIT instrument, a so-called proton transfer reaction mass spectrometer (PTR-MS), was installed on board a minivan and driven to within a few tens of meters of the different extraction points. In this way, the waste gas plumes of individual extraction sources and fracking processes were studied in detail.

Warneke, C., Geiger, F., Edwards, P. M., Dube, W., Pétron, G., Kofler, J., Zahn, A., Brown, S. S., Graus, M., Gilman, J. B., Lerner, B. M., Peischl, J., Ryerson, T. B., de Gouw, J. A., and Roberts, J. M.: Volatile organic compound emissions from the oil and natural gas industry in the Uintah Basin, Utah: oil and gas well pad emissions compared to ambient air composition, Atmos. Chem. Phys., 14, 10977-10988, doi:10.5194/acp-14-10977-2014, 2014.

ASU, IBM move ultrafast, low-cost DNA sequencing technology a step closer to reality

Led by ASU Regents’ professor Stuart Lindsay, a team of scientists from Arizona State University’s Biodesign Institute and IBM’s T.J. Watson Research Center have developed a prototype DNA reader that could make whole genome profiling an everyday practice in medicine. – Biodesign Institute at Arizona State University

A team of scientists from Arizona State University’s Biodesign Institute and IBM’s T.J. Watson Research Center has developed a prototype DNA reader that could make whole genome profiling an everyday practice in medicine.

“Our goal is to put cheap, simple and powerful DNA and protein diagnostic devices into every single doctor’s office,” said Stuart Lindsay, an ASU physics professor and director of Biodesign’s Center for Single Molecule Biophysics. Such technology could help usher in the age of personalized medicine, where information from an individual’s complete DNA and protein profiles could be used to design treatments specific to their individual makeup.

Such game-changing technology is needed to make whole genome sequencing routine. The current hurdle is to do so for less than $1,000, a price point at which insurance companies are more likely to provide reimbursement.

In their latest research breakthrough, the team fashioned a tiny DNA reading device thousands of times smaller than the width of a single human hair.

The device is sensitive enough to distinguish the individual chemical bases of DNA (known by their abbreviated letters of A, C, T or G) when they are pumped past the reading head.

Proof of concept was demonstrated using solutions of the individual DNA bases, which gave clear signals sensitive enough to detect tiny amounts of DNA (nanomolar concentrations), better even than today’s state-of-the-art, so-called next-generation DNA sequencing technology.

Making the solid-state device is like making a sandwich, only with ultra-high-tech semiconductor tools used to slice and stack the atomic-scale layers like meats and cheeses on a butcher shop’s block. The secret is to slice and stack the layers just so, turning the chemical information of the DNA into a change in the electrical signal.

First, they made a “sandwich” composed of two metal electrodes separated by a two-nanometer thick insulating layer (a single nanometer is 10,000 times smaller than a human hair), made by using a semiconductor technology called atomic layer deposition.

Then a hole is cut through the sandwich: DNA bases inside the hole are read as they pass the gap between the metal layers.

“The technology we’ve developed might just be the first big step in building a single-molecule sequencing device based on ordinary computer chip technology,” said Lindsay.

“Previous attempts to make tunnel junctions for reading DNA had one electrode facing another across a small gap between the electrodes, and the gaps had to be adjusted by hand. This made it impossible to use computer chip manufacturing methods to make devices,” said Lindsay.

“Our approach of defining the gap using a thin layer of dielectric (insulating) material between the electrodes and exposing this gap by drilling a hole through the layers is much easier,” he said. “What is more, the recognition tunneling technology we have developed allows us to make a relatively large gap (of two nanometers) compared to the much smaller gaps required previously for tunnel current read-out (which were less than a single nanometer wide). The ability to use larger gaps for tunneling makes the manufacture of the device much easier and gives DNA molecules room to pass the electrodes.”
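A rough way to see why the gap width matters: tunneling current falls off exponentially with electrode separation, so widening the gap to two nanometers sacrifices raw current in exchange for manufacturability and room for the molecule. The decay constant in this sketch is an assumed, order-of-magnitude value, not one from the paper:

```python
import math

# Simplified tunneling model: I ~ I0 * exp(-beta * d), where d is the
# electrode gap and beta depends on the tunnel barrier height.
# beta = 10 per nanometer is an assumed, illustrative value only.
def relative_current(gap_nm, beta_per_nm=10.0):
    return math.exp(-beta_per_nm * gap_nm)

# Compare a sub-nanometer gap (0.8 nm) against the 2 nm gap used here:
ratio = relative_current(0.8) / relative_current(2.0)
print(round(ratio))  # 162755 -> the smaller gap passes vastly more current
```

Under this toy model the narrow gap carries on the order of 10^5 times more current, which is why reading across a 2 nm gap needs the sensitive "recognition tunneling" chemistry the team developed.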

Specifically, when a current is passed through the nanopore, as the DNA passes through, it causes a spike in the current unique to each chemical base (A, C, T or G) within the DNA molecule. A few more modifications are made to polish and finish the device manufacturing.
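Base-calling from such signals amounts to matching each current spike against per-base signatures. In this toy sketch the amplitudes are entirely hypothetical, chosen only to illustrate the matching step:

```python
# Toy base caller: assign each current spike to the nearest signature
# amplitude. The per-base amplitudes (arbitrary units) are hypothetical,
# not measured values from the device.
SIGNATURES = {"A": 10.0, "C": 20.0, "T": 30.0, "G": 40.0}

def call_base(spike):
    """Return the base whose signature amplitude is closest to the spike."""
    return min(SIGNATURES, key=lambda base: abs(SIGNATURES[base] - spike))

spikes = [11.2, 29.0, 38.5, 21.1]
sequence = "".join(call_base(s) for s in spikes)
print(sequence)  # ATGC
```

Real recognition-tunneling signals are noisy and overlapping, so the published approach relies on more than a single amplitude per base, but the classification idea is the same.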

The team encountered considerable device-to-device variation, so calibration will be needed to make the technology more robust. And the final big step – of reducing the diameter of the hole through the device to that of a single DNA molecule – has yet to be taken.

But overall, the research team has developed a scalable manufacturing process to make a device that can work reliably for hours at a time, identifying each of the DNA chemical bases as they flow through the two-nanometer gap.

The research team is also working on modifying the technique to read other single molecules, which could be used in an important technology for drug development.

The latest developments could also bring in big business for ASU. Lindsay, dubbed a “serial entrepreneur” by the media, has a new spinout venture, Recognition Analytix, that hopes to follow the success of Molecular Imaging Corp., a similar instrument company he co-founded in 1993 and sold to Agilent Technologies in 2005.

Geologists discover ancient buried canyon in South Tibet

This photo shows the Yarlung Tsangpo Valley close to the Tsangpo Gorge, where it is rather narrow and underlain by only about 250 meters of sediments. The mountains in the upper left corner belong to the Namche Barwa massif. Previously, scientists had suspected that the debris deposited by a glacier in the foreground was responsible for the formation of the steep Tsangpo Gorge; the new discoveries falsify this hypothesis. – Ping Wang

A team of researchers from Caltech and the China Earthquake Administration has discovered an ancient, deep canyon buried along the Yarlung Tsangpo River in south Tibet, north of the eastern end of the Himalayas. The geologists say that the ancient canyon, thousands of feet deep in places, effectively rules out a popular model used to explain how the massive and picturesque gorges of the Himalayas became so steep, so fast.

“I was extremely surprised when my colleagues, Jing Liu-Zeng and Dirk Scherler, showed me the evidence for this canyon in southern Tibet,” says Jean-Philippe Avouac, the Earle C. Anthony Professor of Geology at Caltech. “When I first saw the data, I said, ‘Wow!’ It was amazing to see that the river once cut quite deeply into the Tibetan Plateau because it does not today. That was a big discovery, in my opinion.”

Geologists like Avouac and his colleagues, who are interested in tectonics (the study of the Earth’s surface and the way it changes), can use tools such as GPS and seismology to study crustal deformation that is taking place today. But if they are interested in studying changes that occurred millions of years ago, such tools are not useful because the activity has already happened. In those cases, rivers become a main source of information because they leave behind geomorphic signatures that geologists can interrogate to learn about the way those rivers once interacted with the land, helping them to pin down when the land changed and by how much, for example.

“In tectonics, we are always trying to use rivers to say something about uplift,” Avouac says. “In this case, we used a paleocanyon that was carved by a river. It’s a nice example where by recovering the geometry of the bottom of the canyon, we were able to say how much the range has moved up and when it started moving.”

The team reports its findings in the current issue of Science.

Last year, civil engineers from the China Earthquake Administration collected cores by drilling into the valley floor at five locations along the Yarlung Tsangpo River. Shortly after, former Caltech graduate student Jing Liu-Zeng, who now works for that administration, returned to Caltech as a visiting associate and shared the core data with Avouac and Dirk Scherler, then a postdoc in Avouac’s group. Scherler had previously worked in the far western Himalayas, where the Indus River has cut deeply into the Tibetan Plateau, and immediately recognized that the new data suggested the presence of a paleocanyon.

Liu-Zeng and Scherler analyzed the core data and found that, down to a depth of 800 meters or so, several locations contained sedimentary conglomerates (rounded gravel and larger rocks cemented together) that are associated with flowing rivers; below that, the record clearly indicated bedrock. This suggested that the river once carved deeply into the plateau.

To establish when the river switched from incising bedrock to depositing sediments, they measured two isotopes, beryllium-10 and aluminum-26, in the lowest sediment layer. The isotopes are produced when rocks and sediment are exposed to cosmic rays at the surface and decay at different rates once buried, and so allowed the geologists to determine that the paleocanyon started to fill with sediment about 2.5 million years ago.
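The burial-dating logic behind the two-isotope measurement can be made concrete. The sketch below is a simplified version of the method (the study's actual analysis accounts for production rates, post-burial production and measurement uncertainties): at the surface, cosmic rays produce 26Al and 10Be at a roughly constant ratio, and once the sediment is buried both isotopes decay, 26Al faster, so the measured ratio fixes the burial time. The surface production ratio of 6.75 is a commonly assumed value, not a number from this paper.

```python
import math

HALF_LIFE_BE10 = 1.387e6  # years
HALF_LIFE_AL26 = 0.705e6  # years
SURFACE_RATIO = 6.75      # assumed 26Al/10Be production ratio at the surface

def burial_age(measured_ratio):
    """Burial age in years inferred from a measured 26Al/10Be ratio.

    After burial the ratio decays as R(t) = R0 * exp(-(lam_al - lam_be) * t),
    so t = ln(R0 / R) / (lam_al - lam_be).
    """
    lam_be = math.log(2) / HALF_LIFE_BE10
    lam_al = math.log(2) / HALF_LIFE_AL26
    return math.log(SURFACE_RATIO / measured_ratio) / (lam_al - lam_be)

# A measured ratio of ~2.0 corresponds to roughly 2.5 million years of burial,
# consistent with the age reported for the paleocanyon fill.
print(round(burial_age(2.0) / 1e6, 2))  # ~2.5 Myr
```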

The researchers’ reconstruction of the former valley floor showed that the slope of the river once increased gradually from the Gangetic Plain to the Tibetan Plateau, with no sudden changes, or knickpoints. Today, the river, like most others in the area, has a steep knickpoint where it meets the Himalayas, at a place known as the Namche Barwa massif. There, the uplift of the mountains is extremely rapid (on the order of 1 centimeter per year, whereas in other areas 5 millimeters per year is more typical) and the river drops by 2 kilometers in elevation as it flows through the famous Tsangpo Gorge, known by some as the Yarlung Tsangpo Grand Canyon because it is so deep and long.

Combining the depth and age of the paleocanyon with the geometry of the valley, the geologists surmised that the river existed in this location prior to about 3 million years ago, but at that time, it was not affected by the Himalayas. However, as the Indian and Eurasian plates continued to collide and the mountain range pushed northward, it began impinging on the river. Suddenly, about 2.5 million years ago, a rapidly uplifting section of the mountain range got in the river’s way, damming it, and the canyon subsequently filled with sediment.

“This is the time when the Namche Barwa massif started to rise, and the gorge developed,” says Scherler, one of two lead authors on the paper and now at the GFZ German Research Center for Geosciences in Potsdam, Germany.

That picture of the river and the Tibetan Plateau, which involves the river incising deeply into the plateau millions of years ago, differs considerably from the conventional geologic picture. Typically, geologists believe that when rivers start to incise into a plateau, they eat at the edges, slowly making their way into the plateau over time. However, the rivers flowing across the Himalayas all have strong knickpoints and have not incised much at all into the Tibetan Plateau. Therefore, the thought has been that the rapid uplift of the Himalayas has pushed the rivers back, effectively pinning them, so that they have not been able to make their way into the plateau. But that explanation does not work with the newly discovered paleocanyon.

The team’s new findings also rule out a model that has been around for about 15 years, called tectonic aneurysm, which suggests that the rapid uplift seen at the Namche Barwa massif was triggered by intense river incision. In this model, a river cuts down through the Earth’s crust so fast that it causes the crust to heat up, making a nearby mountain range weaker and facilitating uplift.

The model is popular among geologists, and indeed Avouac himself published a modeling paper in 1996 that showed the viability of the mechanism. “But now we have discovered that the river was able to cut into the plateau way before the uplift happened,” Avouac says, “and this shows that the tectonic aneurysm model was actually not at work here. The rapid uplift is not a response to river incision.”

###

The other lead author on the paper, “Tectonic control of the Yarlung Tsangpo Gorge, revealed by a 2.5 Myr old buried canyon in Southern Tibet,” is Ping Wang of the State Key Laboratory of Earthquake Dynamics, in Beijing, China. Additional authors include Jürgen Mey, of the University of Potsdam, in Germany; and Yunda Zhang and Dingguo Shi of the Chengdu Engineering Corporation, in China. The work was supported by the National Natural Science Foundation of China, the State Key Laboratory for Earthquake Dynamics, and the Alexander von Humboldt Foundation.


Climate capers of the past 600,000 years

The researchers remove samples from a core segment taken from Lake Van at the MARUM Center for Marine Environmental Sciences in Bremen, where all of the cores from the PALEOVAN project are stored. – Photo: Nadine Pickarski/Uni Bonn

If you want to see into the future, you have to understand the past. An international consortium of researchers under the auspices of the University of Bonn has drilled into deposits on the bed of Lake Van (Eastern Turkey) that provide unique insights into the last 600,000 years. The samples reveal that the climate has done its fair share of mischief-making in the past, and that there have been numerous earthquakes and volcanic eruptions. The results of the drilling project also provide a basis for assessing how dangerous natural hazards are for today’s population. The scientists have now published their findings in a number of articles in a special edition of the highly regarded journal Quaternary Science Reviews.

In the sediments of Lake Van, the lighter-colored, lime-containing summer layers are clearly distinguishable from the darker, clay-rich winter layers; together these annual pairs are called varves. In 2010, an international consortium of researchers, working from a floating platform, drilled a 220-meter-deep sediment profile from the lake floor at a water depth of 360 meters and analyzed the varves. The samples they recovered are a unique scientific treasure because the climate conditions, earthquakes and volcanic eruptions of the past 600,000 years can be read in outstanding quality from the cores.

The team of scientists under the auspices of the University of Bonn has analyzed some 5,000 samples in total. “The results show that the climate over the past hundred thousand years has been a roller coaster. Within just a few decades, the climate could tip from an ice age into a warm period,” says Dr. Thomas Litt of the University of Bonn’s Steinmann Institute, spokesman for the PALEOVAN international consortium of researchers. Unbroken continental climate archives from the ice age that encompass several hundred thousand years are extremely rare on a global scale. “There has never before in all of the Middle East and Central Asia been a continental drilling operation going so far back into the past,” says Dr. Litt. In the northern hemisphere, climate data from ice cores drilled in Greenland encompass the last 120,000 years. The Lake Van project closes a gap in the scientific climate record.

The sediments reveal six cycles of cold and warm periods


Scientists found evidence for a total of six cycles of warm and cold periods in the sediments of Lake Van. The University of Bonn paleoecologist and his colleagues analyzed the pollen preserved in the sediments. Under a microscope they were able to determine which plants around the eastern Anatolian lake the pollen came from. “Pollen is amazingly durable and is preserved over very long periods when protected in the sediments,” Dr. Litt explained. Insight into the age of the individual layers was gleaned through radiometric age measurements, which use the decay of radioactive elements as a geologic clock. Based on the type of pollen and the age, the scientists were able to determine when oak forests typical of warm periods grew around Lake Van and when ice-age steppe made up of grasses, mugwort and goosefoot surrounded the lake.
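The "geologic clock" idea behind radiometric dating reduces to a single exponential. The sketch below is generic (the isotope system and numbers are illustrative, not those used by the PALEOVAN team): a parent isotope decays as N(t) = N0 · exp(−λt), so the fraction remaining fixes the age of a layer.

```python
import math

def age_from_fraction(remaining_fraction, half_life_years):
    """Age of a sample given the fraction of the parent isotope remaining.

    N(t) = N0 * exp(-lam * t) with lam = ln(2) / half_life,
    so t = ln(N0 / N) / lam.
    """
    decay_const = math.log(2) / half_life_years
    return math.log(1.0 / remaining_fraction) / decay_const

# Illustrative example: if 75% of a parent isotope with a 600,000-year
# half-life remains, the layer is about 249,000 years old.
print(round(age_from_fraction(0.75, 600_000)))
```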

Once they had determined the composition of the vegetation and the ecological requirements of the plants, the scientists could reconstruct with a high degree of accuracy the temperature and amount of rainfall during different epochs. These analyses enable the team of researchers to read the varves of Lake Van like thousands of pages of an archive. With these data, the team was able to demonstrate that fluctuations in climate were due in large part to periodic changes in the Earth’s orbital parameters and the corresponding changes in solar insolation. However, the influence of North Atlantic currents was also evident. “The analysis of the Lake Van sediments has presented us with an image of how an ecosystem reacts to abrupt changes in climate. This fundamental data will help us to develop potential scenarios of future climate effects,” says Dr. Litt.

Risks of earthquakes and volcanic eruptions in the region of Van

Such risk assessments can also be made for other natural forces. “Deposits of volcanic ash with thicknesses of up to 10 meters in the Lake Van sediments show us that approximately 270,000 years ago there was a massive eruption,” the University of Bonn paleoecologist said. The team identified some 300 different volcanic events in its drillings. Statistically, that corresponds to one explosive volcanic eruption in the region every 2,000 years. Deformations in the sediment layers show that the area is subject to frequent, strong earthquakes. “The area around Lake Van is very densely populated. The data from the core samples show that volcanic activity and earthquakes present a relatively high risk for the region,” Dr. Litt says. According to media reports, a magnitude-7.2 earthquake in Van province in 2011 claimed the lives of more than 500 people and injured more than 2,500.
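The recurrence statistic above is a simple back-of-envelope calculation, and it can be pushed one step further. Treating eruptions as a Poisson process, a simplifying assumption not made explicit in the study, gives the probability of at least one eruption within a given time window:

```python
import math

# ~300 explosive events over ~600,000 years => one eruption per 2,000 years.
EVENTS = 300
RECORD_YEARS = 600_000
mean_interval = RECORD_YEARS / EVENTS  # 2,000 years per eruption

def prob_at_least_one(window_years):
    """Probability of at least one eruption in a window, assuming a
    Poisson process: 1 - exp(-window / mean_interval)."""
    return 1.0 - math.exp(-window_years / mean_interval)

print(round(mean_interval))              # 2000
print(round(prob_at_least_one(100), 3))  # ~0.049 chance in the next century
```

Even with one eruption per 2,000 years on average, the chance of an eruption in any given century is only about 5 percent; the hazard comes from the severity of such events, not their frequency.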

Publication: “Results from the PALEOVAN drilling project: A 600,000 year long continental archive in the Near East”, Quaternary Science Reviews, Volume 104, online publication: (http://dx.doi.org/10.1016/j.quascirev.2014.09.026)
