Clues to one of Earth’s oldest craters revealed

The Sudbury Basin, located in Ontario, Canada, is one of the largest known impact craters on Earth, as well as one of the oldest, having formed more than 1.8 billion years ago. Researchers who took samples from the site and subjected them to a detailed geochemical analysis say the crater may have been created by a comet striking the area.

“Our analysis revealed a chondritic platinum group element signature within the crater’s fallback deposits; however, the distribution of these elements within the impact structure and other constraints suggest that the impactor was a comet. Thus, it seems that a comet with a chondritic refractory component may have created the world-famous Sudbury basin,” said Joe Petrus, lead author of the Terra Nova paper.

Groundwater warming up in sync

For their study, the researchers were able to draw on uninterrupted long-term temperature measurements of groundwater flows around the cities of Cologne and Karlsruhe, where the operators of the local waterworks have been measuring the temperature of the groundwater, which is largely uninfluenced by humans, for forty years. Such records are rare and valuable for researchers. “For us, the data was a godsend,” stresses Peter Bayer, a senior assistant at ETH Zurich’s Geological Institute. Even with intensive searching, the team would not have been able to find a comparable series of measurements. Evidently, it is of little interest, or too costly, for waterworks to measure groundwater temperatures systematically over long periods. “Or the data isn’t digitised and is only archived on paper,” the hydrogeologist suspects.

Damped image of atmospheric warming

Based on the readings, the researchers were able to demonstrate that the groundwater is not just warming up; it also echoes the warming stages observed in the atmosphere. “Global warming is reflected directly in the groundwater, albeit damped and with a certain time lag,” says Bayer, summarising the project’s main result. The researchers published their study in the journal Hydrology and Earth System Sciences.

The data also reveal that groundwater close to the surface, down to a depth of around sixty metres, has warmed to a statistically significant degree over the last forty years. This warming follows the pattern of the local and regional climate, which in turn mirrors that of global warming.
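
The damping and time lag are what heat conduction into the ground would produce. As a rough illustration (a textbook result, not a formula from the study), a periodic surface temperature signal with angular frequency \omega that propagates downward by conduction alone decays and lags with depth z as

    T(z,t) \;=\; T_0 + \Delta T \, e^{-z/d}\,\sin\!\big(\omega t - z/d\big),
    \qquad d \;=\; \sqrt{2\kappa/\omega},

where T_0 is the mean temperature, \Delta T the surface amplitude and \kappa the thermal diffusivity of the ground. The amplitude shrinks by the factor e^{-z/d} and the signal arrives with a delay of z/(\omega d); groundwater flow adds advective transport on top of this, but the qualitative picture of a damped, delayed copy of the surface signal is the same.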

The groundwater reveals how the atmosphere has made several temperature leaps at irregular intervals. These “regime shifts” can also be observed in the global climate, as the researchers write in their study. Bayer was surprised at how quickly the groundwater responded to climate change.

Heat exchange with the subsoil


The earth’s atmosphere has warmed by an average of 0.13 degrees Celsius per decade over the last fifty years. This warming does not stop at the subsoil, either, as other climate scientists have demonstrated in the last two decades with boreholes all over the world. However, those studies tended to consider only soils that did not contain any water or where there was no groundwater flow.

Researchers from Eawag and ETH Zurich had already shown in a study published three years ago that groundwater has not escaped climate change, but that work concerned only “artificial” groundwater: to replenish it, river water is allowed to seep into the ground in certain areas, so the temperature profile of the resulting groundwater simply matches that of the river water.

The new study, however, examines groundwater that has barely been influenced by humans. According to Bayer, it is plausible that natural groundwater flows are also warming in the course of climate change. “The difference in temperature between the atmosphere and the subsoil balances out naturally.” The energy transfer takes place via thermal conduction and groundwater flow, which together act much like a heat exchanger, allowing the transported heat to spread through the subsoil and level out.
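
As a simplified illustration of this mechanism (a standard formulation, not an equation from the study, and the symbols here are illustrative), heat transport in an aquifer combines conduction with advection by the flowing groundwater:

    \frac{\partial T}{\partial t} \;=\; \kappa_e\,\nabla^2 T \;-\; \frac{1}{R}\,\mathbf{v}\cdot\nabla T,

where T is the water temperature, \kappa_e the effective thermal diffusivity of the saturated sediment, \mathbf{v} the groundwater flow velocity and R a thermal retardation factor describing the heat exchange between the water and the solid matrix. The first term is the conductive “levelling out”, the second the heat carried along by the flow.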

Consequences difficult to gauge

The consequences of these findings, however, are difficult to gauge. The warmer temperatures might influence subterranean ecosystems on the one hand and groundwater-dependent biospheres on the other, which include cold areas in flowing waters where the groundwater discharges. For cryophilic organisms such as certain fish, groundwater warming could have negative consequences.

Higher groundwater temperatures also influence the water’s chemical composition, especially the chemical equilibria of nitrate or carbonate. After all, chemical reactions usually take place more quickly at higher temperatures. Bacterial activity might also increase at rising water temperatures. If the groundwater becomes warmer, undesirable bacteria such as gastro-intestinal disease pathogens might multiply more effectively. However, the scientists can also imagine positive effects. “The groundwater’s excess heat could be used geothermally for instance,” adds Kathrin Menberg, the first author of the study.
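
As background for both effects (textbook relations, not results from the study), the shift of chemical equilibria and the speeding-up of reactions with temperature can be sketched with the van ’t Hoff and Arrhenius equations,

    \frac{d\,\ln K}{dT} \;=\; \frac{\Delta H^{\circ}}{R\,T^{2}}
    \qquad\text{and}\qquad
    k \;=\; A\,e^{-E_a/(RT)},

where K is an equilibrium constant (for instance of the carbonate system), k a reaction rate constant, \Delta H^{\circ} the reaction enthalpy, E_a the activation energy, R the gas constant and T the absolute temperature. Both expressions show how equilibria shift and reactions, including microbial ones, speed up as the water warms.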

Volcano hazards and the role of westerly wind bursts in El Niño

On June 27, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in a community in the District of Puna. – USGS

On 27 June, lava from Kīlauea, an active volcano on the island of Hawai’i, began flowing to the northeast, threatening the residents in Pāhoa, a community in the District of Puna, as well as the only highway accessible to this area. Scientists from the U.S. Geological Survey’s Hawaiian Volcano Observatory (HVO) and the Hawai’i County Civil Defense have been monitoring the volcano’s lava flow and communicating with affected residents through public meetings since 24 August. Eos recently spoke with Michael Poland, a geophysicist at HVO and a member of the Eos Editorial Advisory Board, to discuss how he and his colleagues communicated this threat to the public.

Drilling a Small Basaltic Volcano to Reveal Potential Hazards


Drilling into the Rangitoto Island Volcano in the Auckland Volcanic Field in New Zealand offers insight into a small monogenetic volcano, and may improve understanding of future hazards.

From AGU’s journals: El Niño fades without westerly wind bursts

The warm and wet winter of 1997 brought California floods, Florida tornadoes, and an ice storm in the American northeast, prompting climatologists to dub it the El Niño of the century. Earlier this year, climate scientists thought the coming winter might bring similar extremes, as equatorial Pacific Ocean conditions resembled those seen in early 1997. But the signals weakened by summer, and the El Niño predictions were downgraded. Menkes et al. used simulations to examine the differences between the two years.

The El Niño–Southern Oscillation is defined by abnormally warm sea surface temperatures in the eastern Pacific Ocean and weaker-than-usual trade winds. In a typical year, southeast trade winds push surface water toward the western Pacific “warm pool,” a region essential to Earth’s climate. In El Niño years the trade winds dramatically weaken or even reverse, and the warm pool extends its reach eastward.

Scientists have struggled to predict El Niño because of irregularities in the shape, amplitude, and timing of the surges of warm water. Previous studies suggested that short-lived westerly wind pulses, lasting one to two weeks, could contribute to this irregularity by triggering and sustaining El Niño events.
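
To illustrate what such a pulse criterion might look like in practice, the sketch below flags westerly wind bursts in a daily zonal-wind anomaly series as westerly anomalies sustained for at least a week; it is illustrative only, and the function name, threshold and duration are assumptions rather than the criteria used by Menkes et al.

    import numpy as np

    def find_westerly_wind_bursts(u_anom, threshold=5.0, min_days=7):
        """Return (start, end) index pairs of westerly wind bursts.

        u_anom    : 1-D array of daily zonal-wind anomalies (m/s, westerly positive).
        threshold : anomaly that must be exceeded (illustrative value).
        min_days  : minimum duration in days (illustrative value).
        """
        above = u_anom > threshold
        bursts, start = [], None
        for i, flag in enumerate(above):
            if flag and start is None:
                start = i                      # a candidate burst begins
            elif not flag and start is not None:
                if i - start >= min_days:      # long enough to count as a burst
                    bursts.append((start, i - 1))
                start = None
        if start is not None and len(above) - start >= min_days:
            bursts.append((start, len(above) - 1))
        return bursts

    # Synthetic example: a ten-day westerly pulse embedded in weaker noise.
    rng = np.random.default_rng(0)
    u = rng.normal(0.0, 2.0, 90)
    u[30:40] += 12.0
    print(find_westerly_wind_bursts(u))        # typically [(30, 39)]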

To understand the vanishing 2014 El Niño, the authors used computer simulations to examine the wind’s role and found pronounced differences between 1997 and 2014. Both years saw strong westerly wind events between January and March, but in 2014 those events disappeared as spring approached, whereas in 1997 the westerly winds persisted through summer.

In the past, it was thought that westerly wind pulses were three times as likely to form if the warm pool extended east of the dateline. That did not occur this year. The team says their analysis shows that El Niño’s strength might depend on these short-lived and possibly unpredictable pulses.

The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing more than 62,000 members in 144 countries. Join our conversation on Facebook, Twitter, YouTube, and other social media channels.

Some plants regenerate by duplicating their DNA

Animal biology professor Ken Paige (left) and postdoctoral fellow Daniel Scholes found that a plant’s ability to duplicate its genome within individual cells influences its ability to regenerate. – L. Brian Stauffer

When munched by grazing animals (or mauled by scientists in the lab), some herbaceous plants overcompensate – producing more plant matter and becoming more fertile than they otherwise would. Scientists say they now know how these plants accomplish this feat of regeneration.

They report their findings in the journal Molecular Ecology.

Their study is the first to show that a plant’s ability to dramatically rebound after being cut down relies on a process called genome duplication, in which individual cells make multiple copies of all of their genetic content.

Genome duplication is not new to science; researchers have known about the phenomenon for decades. But few have pondered its purpose, said University of Illinois animal biology professor Ken Paige, who conducted the study with postdoctoral researcher Daniel Scholes.

“Most herbaceous plants – 90 percent – duplicate their genomes,” Paige said. “We wanted to know what this process was for.”

In a 2011 study, Paige and Scholes demonstrated that plants that engage in rampant genome duplication also rebound more vigorously after being damaged. The researchers suspected that genome duplication was giving the plants the boost they needed to overcome adversity.

That study and the new one focused on Arabidopsis thaliana, a plant in the mustard family that often is used as a laboratory subject. Some Arabidopsis plants engage in genome duplication and others don’t. Those that do can accumulate dozens of copies of all of their chromosomes in individual cells.

In the new study, Scholes crossed Arabidopsis plants that had the ability to duplicate their genomes with those that lacked this ability. If the relationship between DNA duplication and regeneration was mere happenstance, the association between the two should disappear in their offspring, Scholes said.

“But the association persisted in the offspring,” he said. “That’s the first line of evidence that these two traits seem to be influencing each other.”

To further test the hypothesis, Scholes experimentally enhanced an Arabidopsis plant’s ability to duplicate its genome. He chose a line that lacked that ability and that also experienced a major reduction in fertility after being grazed.

As expected, the altered plant gained the ability to vigorously rebound after being damaged, the researchers reported.

“We were able to completely mitigate the otherwise detrimental effects of damage,” Scholes said. “There was no difference in fertility between damaged and undamaged plants.”

Genome duplication enlarges cells and provides more copies of individual genes, likely increasing the production of key proteins and other molecules that drive cell growth, Scholes said. Future studies will test these ideas, he said.

The National Science Foundation and U. of I. Research Board funded this research.

Re-learning how to read a genome

New research has revealed that the initial steps of reading DNA are actually remarkably similar at both the genes that encode proteins (here, on the right) and regulatory elements (on the left). The main differences seem to occur after this initial step. Gene messages are long and stable enough to ensure that genes become proteins, whereas regulatory messages are short and unstable, and are rapidly ‘cleaned up’ by the cell. – Adam Siepel, Cold Spring Harbor Laboratory

There are roughly 20,000 genes and thousands of other regulatory “elements” stored within the three billion letters of the human genome. Genes encode information that is used to create proteins, while other genomic elements help regulate the activation of genes, among other tasks. Somehow all of this coded information within our DNA needs to be read by complex molecular machinery and transcribed into messages that can be used by our cells.

Usually, reading a gene is thought to be a lot like reading a sentence. The reading machinery is guided to the start of the gene by various sequences in the DNA – the equivalent of a capital letter – and proceeds from left to right, DNA letter by DNA letter, until it reaches a sequence that forms a punctuation mark at the end. The capital letter and punctuation marks that tell the cell where, when, and how to read a gene are known as regulatory elements.

But scientists have recently discovered that genes aren’t the only messages read by the cell. In fact, many regulatory elements themselves are also read and transcribed into messages, the equivalent of pronouncing the words “capital letter,” “comma,” or “period.” Even more surprising, genes are read bi-directionally from so-called “start sites” – in effect, generating messages in both forward and backward directions.

With all these messages, how does the cell know which one encodes the information needed to make a protein? Is there something different about the reading process at genes and regulatory elements that helps avoid confusion? New research, published today in Nature Genetics, has revealed that the initial steps of the reading process itself are actually remarkably similar at both genes and regulatory elements. The main differences seem to occur after this initial step, in the length and stability of the messages. Gene messages are long and stable enough to ensure that genes become proteins, whereas regulatory messages are short and unstable, and are rapidly “cleaned up” by the cell.

To make the distinction, the team, which was co-led by CSHL Professor Adam Siepel and Cornell University Professor John Lis, looked for differences between the initial reading processes at genes and a set of regulatory elements called enhancers. “We took advantage of highly sensitive experimental techniques developed in the Lis lab to measure newly made messages in the cell,” says Siepel. “It’s like having a new, more powerful microscope for observing the process of transcription as it occurs in living cells.”

Remarkably, the team found that the reading patterns for enhancer and gene messages are highly similar in many respects, sharing a common architecture. “Our data suggests that the same basic reading process is happening at genes and these non-genic regulatory elements,” explains Siepel. “This points to a unified model for how DNA transcription is initiated throughout the genome.”

Working together, the biochemists from Lis’s laboratory and the computer jockeys from Siepel’s group carefully compared the patterns at enhancers and genes, combining their own data with vast public data sets from the NIH’s Encyclopedia of DNA Elements (ENCODE) project. “By many different measures, we found that the patterns of transcription initiation are essentially the same at enhancers and genes,” says Siepel. “Most RNA messages are rapidly targeted for destruction, but the messages at genes that are read in the right direction – those destined to be a protein – are spared from destruction.” The team was able to devise a model to mathematically explain the difference between stable and unstable transcripts, offering insight into what defines a gene. According to Siepel, “Our analysis shows that the ‘code’ for stability is, in large part, written in the DNA, at enhancers and genes alike.”

This work has important implications for the evolutionary origins of new genes, according to Siepel. “Because DNA is read in both directions from any start site, every one of these sites has the potential to generate two protein-coding genes with just a few subtle changes. The genome is full of potential new genes.”

This work was supported by the National Institutes of Health.

“Analysis of transcription start sites from nascent RNA identifies a unified architecture of initiation regions at mammalian promoters and enhancers” appears online in Nature Genetics on November 10, 2014. The authors are Leighton Core, André Martins, Charles Danko, Colin Waters, Adam Siepel, and John Lis. The paper can be obtained online at http://dx.doi.org/10.1038/ng.3142

About Cold Spring Harbor Laboratory

Founded in 1890, Cold Spring Harbor Laboratory (CSHL) has shaped contemporary biomedical research and education with programs in cancer, neuroscience, plant biology and quantitative biology. CSHL is ranked number one in the world by Thomson Reuters for the impact of its research in molecular biology and genetics. The Laboratory has been home to eight Nobel Prize winners. Today, CSHL’s multidisciplinary scientific community is more than 600 researchers and technicians strong and its Meetings & Courses program hosts more than 12,000 scientists from around the world each year to its Long Island campus and its China center. For more information, visit http://www.cshl.edu.

Good vibrations give electrons excitations that rock an insulator to go metallic

Vanadium atoms (blue) have unusually large thermal vibrations that stabilize the metallic state of a vanadium dioxide crystal. Red depicts oxygen atoms. – ORNL

For more than 50 years, scientists have debated what turns particular oxide insulators, in which electrons barely move, into metals, in which electrons flow freely. Some scientists sided with Nobel Prize-winning physicist Nevill Mott in thinking direct interactions between electrons were the key. Others believed, as did physicist Rudolf Peierls, that atomic vibrations and distortions trumped all. Now, a team led by the Department of Energy’s Oak Ridge National Laboratory has made an important advancement in understanding a classic transition-metal oxide, vanadium dioxide, by quantifying the thermodynamic forces driving the transformation. The results are published in the Nov. 10 advance online issue of Nature.

“We proved that phonons–the vibrations of the atoms–provide the driving force that stabilizes the metal phase when the material is heated,” said John Budai, who co-led the study with Jiawang Hong, a colleague in ORNL’s Materials Science and Technology Division.

Hong added, “This insight into how lattice vibrations can control phase stability in transition-metal oxides is needed to improve the performance of many multifunctional materials, including colossal magnetoresistors, superconductors and ferroelectrics.”

Today vanadium dioxide improves recording and storage media, strengthens structural alloys, and colors synthetic jewels. Tomorrow it may find its way into nanoscale actuators for switches, optical shutters that turn opaque on satellites to thwart intruding signals, and field-effect transistors to manipulate electronics in semiconductors and spintronics in devices that manipulate magnetic spin.

The next application we see may be energy-efficient “smart windows” coated with vanadium dioxide peppered with an impurity to control the transmission of heat and light. On cool days, windows would be transparent insulators that let in heat. On warm days, they would turn shiny and reflect the outside heat.

Complete thermodynamics


Materials are stabilized by a competition between internal energy and entropy (a measure of disorder that increases with temperature). While Mott and Peierls focused on energy, the ORNL-led team focused on the entropy.

Before the ORNL-led experiments, scientists knew the total amount of heat absorbed during vanadium dioxide’s transition from insulator to metal. But they didn’t know how much entropy was due to electrons and how much was due to atomic vibrations.
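
As background (a standard thermodynamic relation rather than a formula quoted from the paper), which phase is stable is decided by the free energy of each phase, and the entropy splits into exactly the two contributions the team set out to separate:

    F \;=\; U - T\,S,
    \qquad S \;=\; S_{\text{electron}} + S_{\text{phonon}},

where U is the internal energy and T the temperature. The phase with the lower F is the stable one; because the -TS term grows with temperature, a phase with large phonon entropy can become stable on heating even if its internal energy is higher, which is the scenario the measurements quantify for metallic vanadium dioxide.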

“This is the first complete description of thermodynamic forces controlling this archetypical metal-insulator transition,” said Budai.

The team’s current accomplishment was made possible by a novel combination of X-ray and neutron scattering tools, developed within the decade, that enabled lattice dynamics measurements and a calculation technique that Olle Hellman of Linköping University in Sweden recently developed to capture anharmonicity (a measure of nonlinearity in bond forces between atoms). It’s especially important that the calculations, performed by Hong, agree well with experiments because they can now be used to make new predictions for other materials.

The ORNL team came up with the idea to measure “incoherent” neutron scattering (each atom scatters independently) at ORNL’s Spallation Neutron Source (SNS) to determine the phonon spectra at many temperatures, and to measure coherent inelastic and diffuse X-ray scattering at Argonne National Laboratory’s Advanced Photon Source (APS) to probe collective vibrations in pristine crystals. Neutron measurements were enabled by the SNS’s large neutron flux, and X-ray measurements benefited from the high resolution made possible by the APS’s brightness. SNS and APS are DOE Office of Science User Facilities.

Among ORNL collaborators, Robert McQueeney made preliminary X-ray measurements and Lynn Boatner grew crystals for the experiment. Eliot Specht mapped phonon dispersions with diffuse X-ray scattering. Michael Manley and Olivier Delaire determined the phonon spectra using inelastic neutron scattering. Postdoctoral researcher Chen Li helped make experimental measurements and provided neutron expertise. Douglas Abernathy provided expertise with experimental beam lines, as did Argonne’s Ayman Said, Bogdan Leu and Jonathan Tischler.

Their measurements revealed that phonons with unusually large atomic vibrations and strong anharmonicity are responsible for about two-thirds of the total heat that each atom transfers during the lattice’s transition to a metallic phase.

“The entropy of the lattice vibrations competes against and overcomes the electronic energy, and that’s why the metallic phase is stabilized at high temperatures in vanadium dioxide,” Budai summed up. “Using comprehensive measurements and new calculations, we’re the first to close this gap and present convincing arguments for the dominant influence of low-energy, strongly anharmonic phonons.”

Atomic underpinnings


The findings reveal that the vanadium-dioxide lattice is anharmonic in the metal state. Think of atoms connected by bonds in a lattice as masses connected by springs. Pull on a mass and let go; it bounces. If the force is proportional to the distance a mass is pulled, the interaction is harmonic. Vanadium dioxide’s anharmonicity greatly complicates the way the lattice wiggles upon heating.

“A material that only had harmonic connections between atoms would have no thermal expansion; if you heat it up, it would stay the same size,” said Budai. Most materials, it turns out, are somewhat anharmonic. Metals, for example, expand when heated.
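
To make the spring analogy concrete (a textbook sketch; the coefficients g and h are illustrative, not values from the paper), a harmonic bond has a purely quadratic potential, while an anharmonic bond adds higher-order terms:

    V_{\text{harm}}(x) \;=\; \tfrac{1}{2}\,k\,x^{2},
    \qquad
    V_{\text{anharm}}(x) \;=\; \tfrac{1}{2}\,k\,x^{2} \;-\; g\,x^{3} \;+\; h\,x^{4} \;+\;\cdots

In the harmonic case the restoring force is exactly proportional to the displacement, so the average atomic position never shifts and the material would not expand on heating. The cubic and higher terms make the potential asymmetric, letting the mean position, and with it the lattice, change with temperature and giving the vibrations the strongly anharmonic character measured in metallic vanadium dioxide.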

When heated to 340 kelvin (just above room temperature), vanadium dioxide turns from insulator to metal. Below 340 K, its lowest-energy lattice configuration is akin to a leaning cardboard box. Above 340 K, where entropy due to phonon vibrations dominates, its preferred state has all bond angles at 90 degrees. The phase change is fully reversible, so cooling a metal below the transition temperature reverts it to an insulator, and heating it past this point turns it metallic.

In metallic vanadium dioxide, each vanadium atom has one electron that is free to roam. In contrast, in insulating vanadium dioxide, that electron gets trapped in a chemical bond that forms vanadium dimers. “For understanding the atomic mechanisms, we needed theory,” Budai said.

That’s where Hong, a theorist at ORNL’s Center for Accelerating Materials Modeling, made critical contributions with quantum molecular dynamics calculations. He ran large-scale simulations at the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility at Lawrence Berkeley National Laboratory, using 1 million computing-core hours to simulate the lattice dynamics of metal and insulator phases of vanadium dioxide. All three types of experiments agreed well with Hong’s simulations. In addition, his calculation further reveals how phonon and electron contributions compete in the different phases.

Predicting new materials


“The theory not only provides us deep understanding of the experimental observations and reveals fundamental principles behind them,” said Hong, “but also gives us predictive modeling, which will accelerate fundamental and technological innovation by giving efficient strategies to design new materials with remarkable properties.”

Many other materials besides vanadium dioxide show a metal-to-insulator transition; however, the detailed role of lattice vibrations in controlling phase stability remains largely unknown. In future studies of other transition metal oxides, the researchers will continue to investigate the impact of anharmonic phonons on physical properties such as electrical conductivity and thermal transport. This fundamental research will help guide the development of improved energy-efficient materials.