The North Pacific, a global backup generator for past climate change

The left panel shows the glacial conveyor belt flow 21,000 years ago.
The right panel shows a reorganized conveyor belt flow 17,500-15,000 years ago with deep-water sinking in the North Pacific. – IPRC/SOEST

Toward the end of the last ice age, a major reorganization took place in the current system of the North Pacific with far-reaching implications for climate, according to a new study published in the July 9, 2010, issue of Science by an international team of scientists from Japan, Hawaii, and Belgium.

Earth’s climate is regulated largely by the world ocean’s density-driven circulation, which brings warm surface water to the polar regions and transports cold water away from there at depth. As poleward flowing salty waters cool in the North Atlantic, they become so heavy that they sink. This sinking acts as a pump for the ocean’s conveyor belt circulation.

It is by now well established that there have been times in the past when the North Atlantic branch of the conveyor belt circulation was shut down by melting ice sheets: so much fresh glacial meltwater was released that the sinking of cold water in the Nordic Seas stopped, and the Northern Hemisphere was plunged into a deep freeze. The last such collapse took place toward the end of the last ice age, from around 17,500 to 15,000 years ago, during the first stage of what scientists call the Mystery Interval.

About that time, the North Pacific branch of the conveyor belt changed drastically, according to this study in Science. “The reconstructed changes in the North Pacific current system may have buffered the global impacts of the collapsed circulation in the Atlantic and possibly prevented further cooling of the Northern Hemisphere,” says Axel Timmermann at the International Pacific Research Center, University of Hawaii at Manoa, and corresponding author of the paper.

“Around 17,000 years ago, the North Pacific surface waters grew saltier, and the resulting higher density there caused massive sinking. Newly formed icy deep water spilled out of the subarctic North Pacific at depths of 2000-3000 meters, merging into a southward flowing deep western boundary current. A warm, strong poleward current, moreover, formed at the surface. It released much heat into the atmosphere and supplied water for the Pacific deep overturning circulation,” explains Yusuke Okazaki of the Japan Agency for Marine-Earth Science and Technology and lead author on the paper.

The deep overturning circulation in the Pacific may have also stirred up old carbon-rich deep waters, contributing to the increase in atmospheric CO2 concentration during the last glacial termination. “This could have catalyzed further warming and accelerated the glacial meltdown,” says Laurie Menviel, also at the International Pacific Research Center and a co-author on this study.

The observational evidence for these circulation changes comes from analysis of radiocarbon data taken from 30 sediment cores at various locations in the North Pacific. A comparison of the concentrations of radioactively decaying carbon in marine organisms (foraminifera) living at the surface and ocean bottom in various regions of the North Pacific Ocean yields information about the ages of water masses over this time period. From this data, the scientists could reconstruct and draw a map of the altered circulation.
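The arithmetic behind such benthic-planktonic comparisons can be sketched in a few lines. The sketch below is illustrative only, not the authors' processing chain: it uses the conventional Libby mean life of 8,033 years, and the input radiocarbon fractions are hypothetical values for a single core depth.

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; conventional radiocarbon ages use the Libby half-life (5,568 yr)

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional 14C age (years) from a sample's 14C fraction relative to the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

def ventilation_age(planktonic_fm: float, benthic_fm: float) -> float:
    """Benthic-minus-planktonic age difference: roughly how long the deep water
    at the core site had been out of contact with the atmosphere."""
    return radiocarbon_age(benthic_fm) - radiocarbon_age(planktonic_fm)

# Hypothetical paired measurements from one sediment-core depth interval:
bp_age = ventilation_age(planktonic_fm=0.20, benthic_fm=0.17)
print(round(bp_age))  # a smaller B-P age implies more recently ventilated (younger) deep water
```

A drop in the benthic-planktonic age difference at a site across the 17,500-15,000-year window would be consistent with newly sinking surface water reaching that depth, which is the kind of signal such a multi-core compilation can reveal.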

To complement these observational analyses, the authors used a computer model that simulates the interactions among the ocean basins, sea ice, the atmosphere, land vegetation, and the global marine-carbon cycle. This “earth system model” was run under conditions that mimicked the catastrophic meltwater discharge from the retreating ice sheets 17,500-15,000 years ago, which disrupted the heat engine in the North Atlantic.

The computer simulation pointed to the same reorganization of the North Pacific overturning circulation as the sediment core data. And both suggest that during this period, the North Pacific Ocean served as a kind of global backup generator to partly offset the global effects of plunging temperatures in the North Atlantic.

“An ultimate test for the proposed mechanisms would be a sediment-core transect through Kamchatka Strait. It would show changes in the water mass ages and flow rates in what would have been a bottleneck for the southward flowing deep currents in the Pacific during the early Mystery Interval,” concludes Timmermann. “In the meantime our findings caution against the Atlantic-centric view of abrupt climate change that has prevailed amongst climate scientists for the last 20 years. They highlight the complicated adjustments happening in the global ocean during these periods of climate change, in which the North Pacific was definitely a player to be considered.”

New findings indicate sediment composition affected the strength of Sumatran earthquake

Researchers found differences between the rocks in the regions where the 2004 and 2005 Sumatran earthquakes occurred. In the southern part of the region where the 2004 earthquake occurred, the earthquake rupture began closer to shore and did not reach as far seaward. Comparatively, the 2005 earthquake ruptured farther seaward beneath a thick wedge of compacted sedimentary rocks. These differences help explain why the 2004 earthquake and ensuing tsunami were more severe than subsequent events in 2005. – Credit: Nicolle Rager Fuller, National Science Foundation

Sumatra experiences frequent seismic activity because it is located near the boundary of two of Earth’s tectonic plates. Earthquakes occur at ‘subduction zones,’ such as the one west of Indonesia, when one tectonic plate is forced under another, or ‘subducts.’ Instead of sliding across one another smoothly, the plates stick, and energy builds up until they finally slip, or ‘rupture,’ releasing that stored energy as an earthquake.

These earthquakes can generate tsunamis when the seafloor moves up or down rapidly. But why do some earthquakes, such as the 2004 Sumatra “Boxing Day Tsunami”, create large hazards, while others do not?

Three months after the catastrophic December 2004 earthquake and tsunami events, another strong, albeit smaller, quake occurred immediately to the south, but this earthquake triggered only a local tsunami.

“Many people wondered why the 2004 quake was so large,” said Sean Gulick, a geophysicist from the University of Texas at Austin. “Perhaps a more interesting question is: why wasn’t it larger? Why did the rupture occur as two events instead of one large one?”

With support from the National Science Foundation (NSF), Gulick joined an international research team to try to figure out why there were two quakes, and what made them so different. Working aboard the research vessel Sonne, the scientists used seismic instruments to study layers of sediment beneath the seafloor with sound waves.

The researchers found that the fault surface where the two tectonic plates meet, called a décollement, has different properties in the two earthquake rupture regions. In the southern part of the 2004 area, the décollement imaging results show a bright reflection, while the décollement in the 2005 area does not. This difference in the images suggests changes in the composition of the rocks, or of the fault itself, between the rupture areas. These characteristics may partially explain why the areas did not rupture together and may also contribute to differences in the tsunamis produced by the two events.

Scientists believe this difference in composition, combined with several other factors, resulted in the fault slipping over a much wider part of the margin and farther seaward in the 2004 event. They suspect that because more of the fault slipped, more of the seafloor moved and more water was displaced, resulting in a larger tsunami.

Based on comparisons with similar studies of other subduction zones around the world, the team believes the region of the 2004 Sumatra earthquake is very unusual, and that tsunami hazards may be particularly high in this area.

Heat waves could be commonplace in the US by 2039, Stanford study finds

By 2039, most of the US could experience at least four seasons as intense as the hottest season ever recorded from 1951-1999, according to Stanford University climate scientists. In most of Utah, Colorado, Arizona and New Mexico, the number of extremely hot seasons could be as high as seven. – Noah Diffenbaugh, Stanford University

Exceptionally long heat waves and other hot events could become commonplace in the United States in the next 30 years, according to a new study by Stanford University climate scientists.

“Using a large suite of climate model experiments, we see a clear emergence of much more intense, hot conditions in the U.S. within the next three decades,” said Noah Diffenbaugh, an assistant professor of environmental Earth system science at Stanford and the lead author of the study.

Writing in the journal Geophysical Research Letters (GRL), Diffenbaugh concluded that hot temperature extremes could become frequent events in the U.S. by 2039, posing serious risks to agriculture and human health.

“In the next 30 years, we could see an increase in heat waves like the one now occurring in the eastern United States or the kind that swept across Europe in 2003 that caused tens of thousands of fatalities,” said Diffenbaugh, a center fellow at Stanford’s Woods Institute for the Environment. “Those kinds of severe heat events also put enormous stress on major crops like corn, soybean, cotton and wine grapes, causing a significant reduction in yields.”

The GRL study took two years to complete and is co-authored by Moetasim Ashfaq, a former Stanford postdoctoral fellow now at the Oak Ridge National Laboratory. The study comes on the heels of a recent NASA report, which concluded that the previous decade, January 2000 to December 2009, was the warmest on record.

2-degree threshold

In the study, Diffenbaugh and Ashfaq used two dozen climate models to project what could happen in the U.S. if increased carbon dioxide emissions raised the Earth’s temperature by 1.8 degrees Fahrenheit (1 degree Celsius) between 2010 and 2039, a likely scenario according to the Intergovernmental Panel on Climate Change.

In that scenario, the mean global temperature in 30 years would be about 3.6 degrees F (2 degrees C) hotter than in the preindustrial era of the 1850s. Many climate scientists and policymakers have targeted a 2-degree C temperature increase as the maximum threshold beyond which the planet is likely to experience serious environmental damage. For example, in the 2009 Copenhagen Climate Accord, the United States and more than 100 other countries agreed to consider action to reduce greenhouse gas emissions “so as to hold the increase in global temperature below 2 degrees Celsius.”

But that target may be too high to avoid dangerous climate change, Diffenbaugh said, noting that millions of Americans could see a sharp rise in the number of extreme temperature events before 2039, when the 2-degree threshold is expected to be reached.

“Our results suggest that limiting global warming to 2 degrees Celsius above preindustrial conditions may not be sufficient to avoid serious increases in severely hot conditions,” Diffenbaugh said.

Record heat

For the GRL study, the researchers analyzed temperature data for the continental U.S. from 1951-1999. Their goal was to determine the longest heat waves and hottest seasons on record in the second half of the 20th century.

Those results were fed into an ensemble of climate forecasting models, including the high-resolution RegCM3, which is capable of simulating daily temperatures across small sections of the U.S.

“This was an unprecedented experiment,” Diffenbaugh said. “With the high-resolution climate model, we can analyze geographic quadrants that are only 15.5 miles (25 kilometers) to a side. No one has ever completed this kind of climate analysis at such a high resolution.”

The results were surprising. According to the climate models, an intense heat wave – equal to the longest on record from 1951 to 1999 – is likely to occur as many as five times between 2020 and 2029 over areas of the western and central United States.

The 2030s are projected to be even hotter. “Occurrence of the longest historical heat wave further intensifies in the 2030-2039 period, including greater than five occurrences per decade over much of the western U.S. and greater than three exceedences per decade over much of the eastern U.S.,” the authors wrote.
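As a concrete illustration of the kind of statistic being counted, the sketch below tallies how many years in a decade contain a hot spell at least as long as a historical record. It is not the paper's RegCM3 analysis: the heat-wave definition (consecutive days above a fixed threshold), the 40-degree threshold, and the synthetic temperature data are all stand-ins.

```python
import random

def longest_hot_run(daily_tmax, threshold):
    """Length of the longest consecutive run of days with maximum temperature above the threshold."""
    best = run = 0
    for t in daily_tmax:
        run = run + 1 if t > threshold else 0
        best = max(best, run)
    return best

def decadal_exceedances(decade, record_length, threshold):
    """Number of years in a decade whose longest hot spell matches or exceeds the historical record."""
    return sum(longest_hot_run(year, threshold) >= record_length for year in decade)

# Synthetic stand-in data: 10 years of 365 daily maxima (deg C)
random.seed(0)
years = [[30 + random.gauss(0, 8) for _ in range(365)] for _ in range(10)]
record = max(longest_hot_run(y, 40.0) for y in years)  # stand-in for the 1951-1999 record
print(decadal_exceedances(years, record, 40.0))  # at least 1 here, since the record comes from these years
```

In the study's framing, the record length comes from the 1951-1999 observations and the exceedance count comes from the model-projected decades, so counts above zero mean the historical record heat wave recurs in the projection.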

Seasonal records

The Stanford team also forecast a dramatic spike in extreme seasonal temperatures during the current decade. Temperatures equaling the hottest season on record from 1951 to 1999 could occur four times between now and 2019 over much of the U.S., according to the researchers.

The 2020s and 2030s could be even hotter, particularly in the American West. From 2030 to 2039, most areas of Utah, Colorado, Arizona and New Mexico could endure at least seven seasons as intense as the hottest season ever recorded between 1951 and 1999, the researchers concluded.
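The seasonal statistic works the same way, but on seasonal means rather than heat-wave lengths. The sketch below is again only illustrative, with hypothetical 92-day summer series; the 1-degree offset between the baseline and the projected decade mirrors the warming scenario described above.

```python
import random

def seasonal_mean(daily_temps):
    """Mean temperature over one season's daily values."""
    return sum(daily_temps) / len(daily_temps)

def hot_season_count(decade, record_mean):
    """Seasons in a decade whose mean temperature matches or exceeds the hottest historical season."""
    return sum(seasonal_mean(season) >= record_mean for season in decade)

random.seed(1)
# Hypothetical June-August (92-day) series: a 49-year baseline and one projected decade 1 deg C warmer
baseline = [[25 + random.gauss(0, 1.5) for _ in range(92)] for _ in range(49)]
future = [[26 + random.gauss(0, 1.5) for _ in range(92)] for _ in range(10)]
record = max(seasonal_mean(s) for s in baseline)
print(hot_season_count(future, record))
```

Because a seasonal mean averages over roughly 90 days, its year-to-year spread is much smaller than that of daily extremes, which is one statistical reason a modest mean warming can make record-hot seasons commonplace.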

“Frankly, I was expecting that we’d see large temperature increases later this century with higher greenhouse gas levels and global warming,” Diffenbaugh said. “I did not expect to see anything this large within the next three decades. This was definitely a surprise.”

The researchers also determined that the hottest daily temperatures of the year from 1980 to 1999 are likely to occur at least twice as often across much of the U.S. during the decade of the 2030s.

“By the decade of the 2030s, we see persistent, drier conditions over most of the U.S.,” Diffenbaugh said. “Not only will the atmosphere heat up from more greenhouse gases, but we also expect changes in the precipitation and soil moisture that are very similar to what we see in hot, dry periods historically. In our results for the U.S., these conditions amplify the effects of rising greenhouse gas concentrations.”

Besides harming human health and agriculture, these hot, dry conditions could lead to more droughts and wildfires in the near future, he said. And many of these climate change impacts could occur within the next two decades – years before the planet is likely to reach the 2-degree C threshold targeted by some governments and climate experts, he added.

“It’s up to the policymakers to decide the most appropriate action,” Diffenbaugh said. “But our results suggest that limiting global warming to 2 degrees C does not guarantee that there won’t be damaging impacts from climate change.”