Pacific tsunami threat greater than expected

The potential for a huge Pacific Ocean tsunami on the West Coast of America may be greater than previously thought, according to a new study of geological evidence along the Gulf of Alaska coast.

The new research suggests that future tsunamis could reach a scale far beyond that suffered in the tsunami generated by the great 1964 Alaskan earthquake. Official figures put the number of deaths caused by the earthquake at around 130: 114 in Alaska and 16 in Oregon and California. The tsunami killed 35 people directly and caused extensive damage in Alaska, British Columbia, and the US Pacific region.

The 1964 Alaskan earthquake – the second-largest ever recorded, with a magnitude of 9.2 – triggered a series of massive waves with run-up heights of as much as 12.7 metres in the Alaskan Gulf region and 52 metres from the Shoup Bay submarine slide in Valdez Arm.

The study suggests that rupture of an even larger area than the 1964 rupture zone could create an even bigger tsunami. Warning systems are in place on the west coast of North America but the findings suggest a need for a review of evacuation plans in the region.

The research team from Durham University in the UK, the University of Utah and Plafker Geohazard Consultants, gauged the extent of earthquakes over the last 2,000 years by studying subsoil samples and sediment sequences at sites along the Alaskan coast. The team radiocarbon-dated peat layers and sediments, and analysed the distribution of mud, sand and peat within them. The results suggest that earthquakes in the region may rupture even larger segments of the coast and sea floor than was previously thought.
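For context, conventional radiocarbon ages follow from simple exponential decay. This is the standard textbook formula using the Libby half-life of 5,568 years, included here only for orientation; the study's own calibration details are not given:

$$t = -\frac{T_{1/2}}{\ln 2}\,\ln\frac{N}{N_0} \approx -8033\,\ln\frac{N}{N_0}\ \text{years}$$

A peat sample retaining about 78 per cent of its original carbon-14 would thus date to roughly 8033 × 0.248 ≈ 2,000 years, the full span of the record examined here.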

The study, published in the academic journal Quaternary Science Reviews and funded by the National Science Foundation, NASA, and the US Geological Survey, shows that the potential impact in terms of tsunami generation could be significantly greater if both the 800-km-long 1964 segment and the 250-km-long adjacent Yakataga segment to the east were to rupture simultaneously.
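As a back-of-the-envelope illustration of what a joint rupture means for magnitude (standard moment-magnitude scaling, not a calculation from the study): seismic moment scales with rupture area times average slip, and moment magnitude is

$$M_w = \tfrac{2}{3}\left(\log_{10} M_0 - 9.1\right).$$

If a combined 1,050-km rupture replaced the 800-km 1964 rupture with similar width and slip, the moment would grow by a factor of about 1050/800 ≈ 1.3, raising the magnitude by only (2/3)·log10(1.3) ≈ 0.08 – roughly magnitude 9.3 against 9.2. The tsunami hazard, however, depends strongly on where the sea floor is displaced, which is why the shallow Yakataga segment matters more than the modest magnitude increase alone suggests.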

Lead author Professor Ian Shennan, from Durham University’s Geography Department, said: “Our radiocarbon-dated samples suggest that previous earthquakes were fifteen per cent bigger in terms of the area affected than the 1964 event. This historical evidence of widespread, simultaneous plate rupturing within the Alaskan region has significant implications for the tsunami potential of the Gulf of Alaska and the Pacific region as a whole.

“Peat layers provide a clear picture of what’s happened to the Earth. Our data indicate that two major earthquakes have struck Alaska in the last 1,500 years and our findings show that a bigger earthquake and a more destructive tsunami than the 1964 event are possible in the future. The region has been hit by large single event earthquakes and tsunamis before, and our evidence indicates that multiple and more extensive ruptures can happen.”

Tsunamis can be created by the rapid displacement of water when the sea floor lifts and/or falls due to crustal movements that accompany very large earthquakes. The shallow nature of the sea floor off the coast of Alaska could increase the destructive potential of a tsunami wave in the Pacific.
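As a rough guide to the physics (standard shallow-water wave relations, not figures from the study): in water of depth $h$ a tsunami travels at $c=\sqrt{gh}$, and as the depth shrinks its height grows according to Green's law, $H \propto h^{-1/4}$. For example,

$$c=\sqrt{9.8\times4000}\approx 198\ \text{m s}^{-1}\approx 710\ \text{km h}^{-1},\qquad \frac{H_{10\,\text{m}}}{H_{4000\,\text{m}}}=\left(\frac{4000}{10}\right)^{1/4}\approx 4.5,$$

so a wave that is modest in the deep ocean can more than quadruple in height over a shallow shelf, before any additional run-up at the coast.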

Earthquake behaviour is difficult to predict in this region, which is a transition zone between two of the world’s most active plate boundary faults: the Fairweather Fault and the Aleutian subduction zone. In 1899 and 1979, large earthquakes occurred in the region but did not trigger a tsunami because the rupturing was localized beneath the land instead of the sea floor.

Prof Ron Bruhn from the University of Utah said: “If the larger earthquake that is suggested by our work hits the region, the size of the potential tsunami could be significantly larger than in 1964 because a multi-rupture quake would displace the shallow continental shelf of the Yakutat microplate.

“In the case of a multi-rupture event, the energy imparted to the tsunami will be larger but spread out over a longer strike distance. Except for the small communities at the tsunami source in Alaska, the longer length will have more of an effect on areas farther from the source such as southeastern Alaska, British Columbia, and the US west coast from Washington to California.”

Warning systems have been in place on the US western seaboard and Hawaii since the 1946 Aleutian Islands tsunami. Improvements were made following the 2004 earthquake under the Indian Ocean that triggered the most deadly tsunami in recorded history, killing more than 230,000 people.

Prof Shennan said: “Earthquakes can hit at any time of the day or night, and that’s a big challenge for emergency planners. A tsunami in this region could cause damage and threaten life from Alaska to California and beyond; in 1964 the effects of the tsunami waves were felt as far away as southern California and were recorded on tide gauges throughout the Pacific Ocean.”

Dr George Plafker from Plafker Geohazard Consultants said: “A large scale earthquake will not necessarily create a large wave. Tsunami height is a function of bathymetry, and the amount of slip and dip of the faults that take up the displacement, and all these factors can vary greatly along the strike.

“Tsunamis will occur in the future. There are issues in warning and evacuating large numbers of people in coastal communities quickly and safely. The US has excellent warning systems in place but awareness is vital.”

Early initiation of Arctic sea-ice formation

Significant sea-ice formation occurred in the Arctic earlier than previously thought, concludes a study published this week in Nature. “The results are also especially exciting because they suggest that sea ice formed in the Arctic before it did in Antarctica, which goes against scientific expectation,” says scientific team member Dr Richard Pearce of the University of Southampton’s School of Ocean and Earth Science based at the National Oceanography Centre, Southampton (NOCS).

The international collaborative research team led by Dr Catherine Stickley and Professor Nalân Koç of the University of Tromsø and the Norwegian Polar Institute (Tromsø) analysed oceanic sediment cores collected from the Lomonosov ridge in the central Arctic by Integrated Ocean Drilling Program Expedition 302 (‘ACEX’). Previous analyses of cores drilled in this region revealed ice-rafted debris dating back to the middle Eocene epoch, prompting suggestions that ice appeared in the Arctic about 46 million years ago. But records of ice-rafted debris do not differentiate sea ice from glacial (continental) ice, which is important because sea ice influences climate by directly affecting ocean-atmosphere exchanges, whereas land-based ice affects sea level and consequently ocean acidity.

Instead of focusing solely on ice-rafted debris, Stickley and her colleagues also garnered information about ancient climate by analysing the fossilised remains of tiny single-celled plants called diatoms in the sediment cores. Today, different living diatom species are adapted to particular environmental conditions. Assuming that this was also true in the past – for which there is ample evidence – the presence of particular diatom species in sediment cores is diagnostic of the conditions prevailing at the time.

Coincident with ice-rafted debris in the cores, the researchers found high abundances of delicately silicified diatoms belonging to the genus Synedropsis. “We were astonished by this”, said team member Richard Pearce of NOCS, who imaged the samples using a scanning electron microscope: “Weakly silicified diatoms are preserved only under exceptional circumstances, so to find fossilised Synedropsis species so well preserved and in such abundance is truly remarkable.” In fact, the ACEX Synedropsis species represent the earliest known fossil record of sea-ice diatoms.

The researchers attribute the presence of Synedropsis fossils in these sediments to the presence of sea ice, and silica-enriched waters that favour their preservation. They propose that, like Synedropsis species found in polar regions today, the ACEX species were also sea-ice specialists uniquely adapted for surviving the lengthy polar darkness and freezing temperatures. “These diatoms provide the most compelling evidence for ancient sea ice, as they rely on this medium for their survival,” said Catherine Stickley. Moreover, their analysis of quartz grain textural characteristics further supports sea ice as the dominant transporter of ice-rafted debris at this time.

“It is likely that sea ice formed in autumn and winter and melted in spring and summer, as seasonal sea ice does today,” they say. Synedropsis species probably over-wintered within the sea ice and then bloomed there in the spring when there was enough sunlight. They would have been released into stratified surface waters as the ice melted, rapidly sinking to the sea bottom as aggregates, leaving other diatom species to dominate summer production. And, indeed, these seasonal changes can be discerned in the sediment cores.

The researchers conclude from their analysis, which covers a two-million-year period, that episodic sea-ice formation in marginal shelf areas of the Arctic started around 47.5 million years ago, about a million years earlier than previous estimates based on ice-rafted debris evidence alone. This appears to have been followed half a million years later by the onset of seasonal sea-ice formation in offshore areas of the central Arctic, which in turn came about 24 million years before major ice-sheet expansion in the region.

The findings have potentially important implications for climate. Spring sea ice and summer cloud formation would have reduced oceanic heat loss to the atmosphere and increased the amount of solar radiation reflected back out into space. “A stable sea-ice regime also suggests the possibility of concomitant glacial ice,” say the researchers, and indeed they find some evidence for the presence of small isolated glaciers at the time.

Furthermore, their data indicate that sea ice formed in the Arctic before it did in Antarctica. Atmospheric levels of the greenhouse gas carbon dioxide were declining in the middle Eocene, one of the reasons postulated in causing the Earth to cool. However, the new findings imply that the threshold for sea-ice formation was first crossed in the Arctic, which, say the authors, is “a hypothesis opposite to that modelled for glacial ice, whereby Antarctica is shown to glaciate much earlier (that is, at higher levels of carbon dioxide) than circum-Arctic continents.”

Arctic sea ice images derived from classified data should be made public

Hundreds of images derived from classified data that could be used to better understand rapid loss and transformation of Arctic sea ice should be immediately released and disseminated to the scientific research community, says a new report from the National Research Council. The committee that wrote the report emphasized that these Arctic images show detailed melting and freezing processes and also provide information at scales, locations, and time periods that are important for studying effects of climate change on sea ice and habitat — data that are not available elsewhere.

“To prepare for a possibly ice-free Arctic and its subsequent effects on the environment, economy, and national security, it is critical to have accurate projections of changes over the next several decades,” said committee chair Stephanie Pfirman, professor and chair of the department of environmental science at Barnard College, New York City. “Forecasts of regional sea-ice conditions can help officials plan for and adapt to the impact of climate change and minimize environmental risks.”

Projections of future Arctic ice cover are hampered by poor understanding of sea-ice physical processes because few observations exist at appropriate times and scales. Readily available satellite images are too coarse to capture the details, the report says. In addition, collecting ground-based data by maintaining manned drifting stations is challenging due to rapidly changing environmental conditions and the weak platform of ice, and collecting data from observational aircraft flights is difficult and expensive.

“At a time when there is concern that Earth observation systems are decreasing and aging, releasing these images would be a step toward continuing the flow of critical information to the scientific community,” said Ralph J. Cicerone, president of the National Academy of Sciences. “We hope that these images are the first of many that could help scientists learn how the changing climate could impact the environment and our society.”

During the 1990s, a program was started in which scientists recommended collection and archival of high-resolution classified imagery from intelligence sources at environmentally sensitive locations around the globe, with the eventual goal of declassifying and releasing the images to the broader scientific community for research purposes. In 1999, scientists requested that images of sea ice at four locations in the Arctic basin be collected during the summer months; two additional locations were added in 2005. Data have been collected at these sites during the summer months until the present day.

In later years of the program, images called Literal Imagery Derived Products (LIDPs) were produced from the classified data at a resolution deemed suitable for unclassified release. To date, several hundred unclassified LIDPs have been produced from the images collected at the six Arctic sites. If these sea-ice LIDPs are publicly released and disseminated to the Arctic research community, scientists could use them in conjunction with available commercial and civilian satellite data to provide new insight into critical physical processes and how these processes are represented in climate models, the committee said.

Some of the specific processes that could be explored from the images include the relationship of snow to ice-surface topography, the initiation and development of meltwater ponds in summer, and the relationship of stress and strain and how they are reflected in the pattern of cracks and other features in the ice. Moreover, the report says that the 2007 and 2008 images would greatly enhance the benefits and value of a broad range of intensive ground-based observations carried out during the Fourth International Polar Year (March 2007-March 2009). The summer 2007 sea-ice coverage minimum was a record-breaking low — more than 20 percent below the previous low in 2005 and nearly 40 percent below the 1979-2000 average minimum. Such a dramatic loss of sea ice could be investigated in more detail using the high-resolution imagery.

To make the fullest use of the LIDP dataset in scientific research, the committee recommended that the release include thumbnail copies of the images, exact information on the location of the images, calibration information, the time of acquisition, and information on the pointing angle.
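To make the recommendation concrete, the released metadata could be organized as a simple record per image. The sketch below is hypothetical – the report recommends these fields but specifies no schema, and all names and values here are invented:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical per-image metadata record covering the fields the committee
# recommends releasing alongside each LIDP (names invented for illustration).
@dataclass
class LIDPRecord:
    image_id: str              # identifier of the derived product
    thumbnail_path: str        # thumbnail copy of the image
    latitude_deg: float        # exact image-centre location
    longitude_deg: float
    acquired_at: datetime      # time of acquisition
    pointing_angle_deg: float  # sensor pointing angle off nadir
    calibration_ref: str       # pointer to calibration information

record = LIDPRecord(
    image_id="arctic-site1-20070715",
    thumbnail_path="thumbs/site1_20070715.png",
    latitude_deg=76.0,
    longitude_deg=-150.0,
    acquired_at=datetime(2007, 7, 15, 21, 30, tzinfo=timezone.utc),
    pointing_angle_deg=12.5,
    calibration_ref="cal/site1_2007.txt",
)
print(record.image_id, record.acquired_at.isoformat())
```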

Scientists say that microbial mats built 3.4-billion-year-old stromatolites

This is a rare paleosurface view of what conical stromatolites would have looked like if you snorkeled in the shallows of the reef. – Abigail Allwood

Stromatolites are dome- or column-like sedimentary rock structures that are formed in shallow water, layer by layer, over long periods of geologic time. Now, researchers from the California Institute of Technology (Caltech) and the Jet Propulsion Laboratory (JPL) have provided evidence that some of the most ancient stromatolites on our planet were built with the help of communities of equally ancient microorganisms, a finding that “adds unexpected depth to our understanding of the earliest record of life on Earth,” notes JPL astrobiologist Abigail Allwood, a visitor in geology at Caltech.

Their research, published in a recent issue of the Proceedings of the National Academy of Sciences (PNAS), might also provide a new avenue for exploration in the search for signs of life on Mars.

“Stromatolites grow by accreting sediment in shallow water,” says John Grotzinger, the Fletcher Jones Professor of Geology at Caltech. “They get molded into these wave forms and, over time, the waves turn into discrete columns that propagate upward, like little knobs sticking up.”

Geologists have long known that the large majority of the relatively young stromatolites they study – those half a billion years old or so – have a biological origin; they’re formed with the help of layers of microbes that grow in a thin film on the seafloor.

How? The microbes’ surface is coated in a mucilaginous substance to which sediment particles rolling past get stuck. “It has a strong flypaper effect,” says Grotzinger. In addition, the microbes sprout a tangle of filaments that almost seem to grab the particles as they move along.

“The end result,” says Grotzinger, “is that wherever the mat is, sediment gets trapped.”

Thus it has become accepted that a dark band in a young stromatolite is indicative of organic material, he adds. “It’s matter left behind where there once was a mat.”

But when you look back 3.45 billion years, to the early Archean period of geologic history, things aren’t quite so simple.

“Because stromatolites from this period of time have been around longer, more geologic processing has happened,” Grotzinger says. Pushed deeper toward the center of Earth as time went by, these stromatolites were exposed to increasing, unrelenting heat. This is a problem when it comes to examining the stromatolites’ potential biological beginnings, he explains, because heat degrades organic matter. “The hydrocarbons are driven off,” he says. “What’s left behind is a residue of nothing but carbon.”

This is why there has been an ongoing debate among geologists as to whether the carbon found in these ancient rocks is diagnostic of life.

Proving the existence of life in younger rocks is fairly simple: all you have to do is extract the organic matter and show that it came from the microorganisms. But there’s no such cut-and-dried method for analyzing the older stromatolites. “When the rocks are old and have been heated up and beaten up,” says Grotzinger, “all you have to look at is their texture and morphology.”

Which is exactly what Allwood and Grotzinger did with samples gathered at the Strelley Pool stromatolite formation in Western Australia. The samples, says Grotzinger, were “incredibly well preserved.” Dark lines of what was potentially organic matter were “clearly associated with the lamination, just like we see in younger rocks. That sort of relationship would be hard to explain without a biological mechanism.”

“We already knew from our earlier work that we had an assemblage of stromatolites that was most plausibly interpreted as a microbial reef built by Early Archean microorganisms,” adds Allwood, “but direct evidence of actual microorganisms was lacking in these ancient, altered rocks. There were no microfossils, no organic material, not even any of the microtextural hallmarks typically associated with microbially mediated sedimentary rocks.”

So Allwood set about trying to find other types of evidence to test the biological hypothesis. To do so, she looked at what she calls the “microscale textures and fabrics in the rocks, patterns of textural variation through the stromatolites and – importantly – organic layers that looked like actual fossilized organic remnants of microbial mats within the stromatolites.”

What she saw were “discrete, matlike layers of organic material that contoured the stromatolites from edge to edge, following steep slopes and continuing along low areas without thickening.” She also found pieces of microbial mat incorporated into storm deposits, which disproved the idea that the organic material had been introduced into the rock more recently, rather than being laid down with the original sediment. “In addition,” Allwood notes, “Raman spectroscopy showed that the organics had been ‘cooked’ to the same burial temperature as the host rock, again indicating the organics are not young contaminants.”

Allwood says she, Grotzinger, and their team have collected enough evidence that it’s no longer any “great leap” to accept these stromatolites as biological in origin. “I think the more we dig at these stromatolites, the more evidence we’ll find of Early Archean life and the nature of Earth’s early ecosystems,” she says.

That’s no small feat, since it’s been difficult to prove that life existed at all that far back in the geologic record. “Recently there has been increasing but still indirect evidence suggesting life existed back then, but direct evidence of microorganisms, at the microscale, remained elusive due to poor preservation of the rocks,” Allwood notes. “I think most people probably thought that these Early Archean rocks were too poorly preserved to yield such information.”

The implications of the findings don’t stop at life on Earth.

“One of my motivations for understanding stromatolites,” Allwood says, “is the knowledge that if microbial communities once flourished on Mars, of all the traces they might leave in the rock record for us to discover, stromatolites and microbial reefs are arguably the most easily preserved and readily detected. Moreover, they’re particularly likely to form in evaporative, mineral-precipitating settings such as those that have been identified on Mars. But to be able to interpret stromatolitic structures, we need a much more detailed understanding of how they form.”

‘Motion picture’ of past warming paves way for snapshots of future climate change

By accurately modeling Earth’s last major global warming – and answering pressing questions about its causes – scientists led by a University of Wisconsin-Madison climatologist are unraveling the intricacies of the kind of abrupt climate shifts that may occur in the future.

“We want to know what will happen in the future, especially if the climate will change abruptly,” says Zhengyu Liu, a UW-Madison professor of atmospheric and oceanic sciences and director of the Center for Climatic Research in the Nelson Institute for Environmental Studies. “The problem is, you don’t know if your model is right for this kind of change. The important thing is validating your model.”

To do so, Liu and his colleagues run their model back in time and match the results of the climate simulation with the physical evidence of past climate.

Starting with the last glacial maximum about 21,000 years ago, Liu’s team simulated atmospheric and oceanic conditions through what scientists call the Bølling-Allerød warming, the Earth’s last major temperature hike, which occurred about 14,500 years ago. The simulation fell into close agreement with conditions – temperatures, sea levels and glacial coverage – reconstructed from fossil and geologic records.

“It’s our most serious attempt to simulate this last major global warming event, and it’s a validation of the model itself, as well,” Liu says.

The results of the new climate modeling experiments are presented today (July 17) in the journal Science.

The group’s simulations were executed on “Phoenix” and “Jaguar,” a pair of Cray supercomputers at Oak Ridge National Laboratory in Oak Ridge, Tenn., and helped pin down the contributions of three environmental factors as drivers of the Bølling-Allerød warming: an increase in atmospheric carbon dioxide, the jump-start of stalled heat-moving ocean currents and a large buildup of subsurface heat in the ocean while those currents were dormant.

The climate dominoes began to fall during that period after glaciers reached their maximum coverage, blanketing most of North America, Liu explains. As glaciers melted, massive quantities of water poured into the North Atlantic, lowering the ocean salinity that helps power a major convection current that acts like a conveyor belt to carry warm tropical surface water north and cooler, heavier subsurface water south.

As a result, according to the model, ocean circulation stopped. Without warm tropical water streaming north, the North Atlantic cooled and heat backed up in southern waters. Glacial melt subsequently slowed or stopped as well, eventually allowing the overturning current to restart – now with a much larger reserve of heat to haul north.
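The freshwater feedback described here can be caricatured with a classic Stommel-style two-box model – a minimal sketch for intuition only, with invented parameter values, not the comprehensive climate model used in the study:

```python
# Stommel-type two-box sketch of the overturning circulation. q is the
# overturning strength, set by the density contrast between a warm
# low-latitude box and a cold high-latitude box; F is the freshwater flux
# from glacial melt, which freshens the cold box and weakens q.
ALPHA_T = 2.0   # fixed thermal contribution to the density contrast
BETA = 1.0      # haline (salinity) contribution
K = 1.0         # hydraulic constant

def steady_overturning(F, S0=0.5, dt=0.01, steps=200_000):
    """Integrate dS/dt = F - |q|*S with q = K*(ALPHA_T - BETA*S)."""
    S = S0
    for _ in range(steps):
        q = K * (ALPHA_T - BETA * S)
        S += dt * (F - abs(q) * S)
    return K * (ALPHA_T - BETA * S)

for F in (0.2, 0.8, 1.4):   # ramping up the meltwater input
    print(f"freshwater flux {F:.1f} -> overturning strength {steady_overturning(F):+.2f}")
```

Raising F weakens and finally collapses the modelled overturning (the printed strength drops from +1.9 through +1.4 to a negative value), mirroring the shutdown-then-restart sequence described above.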

“All that stored heat is released like a volcano, and poured out over decades,” Liu explains. “That warmed up Greenland and melted (arctic) sea ice.”

The model showed a 15-degree Celsius increase in average temperatures in Greenland and a 5-meter increase in sea level over just a few centuries, findings that squared neatly with the climate of the period as represented in the physical record.

“Being able to successfully simulate thousands of years of past climate for the first time with a comprehensive climate model is a major scientific achievement,” notes Bette Otto-Bliesner, an atmospheric scientist and climate modeler at the National Center for Atmospheric Research (NCAR) and co-author of the Science report. “This is an important step toward better understanding how the world’s climate could change abruptly over the coming centuries with increasing melting of the ice caps.”

The rate of ice melt during the Bølling-Allerød warming is still at issue, but its consequences are not, Liu says. The modelers simulated both a slow decrease in melt and a sudden end to melt run-off. In both cases, the result was a 15-degree warming.

“That happened in the past,” Liu says. “The question is, in the future, if you have a global warming and Greenland melts, will it happen again?”

Time – both actual and computing – will tell. In 2008, the group simulated about one-third of the last 21,000 years. With another 4 million processor hours to go, the simulations being conducted by the Wisconsin group will eventually run up to the present and 200 years into the future.

Traditional climate modeling approaches were limited by computer time and capabilities, Liu explains.

“They did slides, like snapshots,” Liu says. “You simulate 100 years, and then you run another 100 years, but those centuries may be 2,000 years apart (in the model). To look at abrupt change, there is no shortcut.”

Using the interactions between land, water, atmosphere and ice in the Community Climate System Model developed at NCAR, the researchers have been able to create a much more detailed and closely spaced book of snapshots, “giving us more of a motion picture of the climate” over millennia, Liu said.

He stressed the importance of drawing together specialists in computing, oceanography, atmospheric science and glaciology – including John Kutzbach, a UW-Madison climate modeler, and UW-Madison doctoral student Feng He, responsible for modeling the glacial melt. All were key to attaining the detail necessary in recreating historical climate conditions, Liu says.

“All this data, it’s from chemical proxies and bugs in the sediment,” Liu said. “You really need a very interdisciplinary team: people on deep ocean, people on geology, people who know those bugs. It is a huge – and very successful – collaboration.”

Airborne expedition chases Arctic sea ice questions

CU-Boulder and NASA are teaming up this summer on a series of unmanned aircraft flights to study the receding Arctic sea ice and to better understand its life cycle and the long-term stability of the Arctic ice cover. – Image courtesy James Maslanik, University of Colorado

A small NASA aircraft completed its first successful science flight Thursday in partnership with the University of Colorado at Boulder as part of an expedition to study the receding Arctic sea ice and improve understanding of its life cycle and the long-term stability of the Arctic ice cover. The mission continues through July 24.

NASA’s Characterization of Arctic Sea Ice Experiment, known as CASIE, began a series of unmanned aircraft system flights in coordination with satellites. Working with CU-Boulder and its research partners, NASA is using the remotely piloted aircraft to image thick, old slabs of ice as they drift from the Arctic Ocean south through the Fram Strait — which lies between Greenland and Svalbard, Norway — and into the North Atlantic Ocean.

NASA’s Science Instrumentation Evaluation Remote Research Aircraft, or SIERRA, will weave a pattern over open ocean and sea ice to map and measure ice conditions below cloud cover to as low as 300 feet.

“Our project is attempting to answer some of the most basic questions regarding the most fundamental changes in sea-ice cover in recent years,” said CU-Boulder Research Professor James Maslanik of the aerospace engineering sciences department and principal investigator for the NASA mission. “Our analysis of satellite data shows that in 2009 the amount of older ice is just 12 percent of what it was in 1988 — a decline of 74 percent. The oldest ice types now cover only 2 percent of the Arctic Ocean as compared to 20 percent in the 1980s.”

SIERRA, laden with scientific instruments, travels long distances at low altitudes, flying below the clouds. The aircraft has high maneuverability and slow flight speed. SIERRA’s relatively large payload, approximately 100 pounds, combined with a significant range of 500 miles and a small, 20-foot wingspan makes it the ideal aircraft for the expedition.

The mission is conducted from the Ny-Alesund research base in the Svalbard archipelago, Norway, located near the northeastern tip of Greenland. Mission planners are using satellite data to direct flights of the aircraft.

“We demonstrated the utility of small- to medium-class unmanned aircraft systems for gathering science data in remote, harsh environments during the CASIE mission,” said Matt Fladeland, CASIE project and SIERRA manager at NASA’s Ames Research Center in Moffett Field, Calif.

The aircraft observations will be complemented by large-scale views from NASA satellites of many different features of the Arctic ice. The Moderate Resolution Imaging Spectroradiometer aboard NASA’s Aqua satellite will be used to identify the ice edge location, ice features of interest and cloud cover. Other sensors, such as the Advanced Microwave Scanning Radiometer-Earth Observing System on Aqua and the Quick Scatterometer satellite, can penetrate cloud cover and analyze the physical properties of ice.

By using multiple types of satellite data, in conjunction with high-resolution aircraft products, more can be learned about ice conditions than is possible by using one or two data analysis methods.

NASA’s CASIE mission supports a larger NASA-funded research effort titled “Sea Ice Roughness as an Indicator of Fundamental Changes in the Arctic Ice Cover: Observations, Monitoring, and Relationships to Environmental Factors.” The project also supports the goals of the International Polar Year, a major international scientific research effort involving many NASA research efforts to study large-scale environmental changes in Earth’s polar regions.

Solar cycle linked to global climate, drives events similar to El Nino, La Nina

Sunrise over the ocean (©UCAR, photo by Carlye Calvin.)

Establishing a key link between the solar cycle and global climate, new research led by the National Center for Atmospheric Research (NCAR) shows that maximum solar activity and its aftermath have impacts on Earth that resemble La Nina and El Nino events in the tropical Pacific Ocean. The research may pave the way toward better predictions of temperature and precipitation patterns at certain times during the Sun’s cycle, which lasts approximately 11 years.

The total energy reaching Earth from the Sun varies by only 0.1 percent across the solar cycle. Scientists have sought for decades to link these ups and downs to natural weather and climate variations and distinguish their subtle effects from the larger pattern of human-caused global warming.
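To put that 0.1 percent in perspective (a rough textbook estimate, not a figure from the paper): the globally averaged sunlight absorbed by Earth is about

$$\frac{S_0(1-\alpha)}{4} \approx \frac{1361\times0.7}{4} \approx 238\ \text{W m}^{-2},$$

taking total solar irradiance $S_0 \approx 1361$ W m⁻² and planetary albedo $\alpha \approx 0.3$. The solar cycle therefore swings the absorbed energy by only about $0.001 \times 238 \approx 0.24$ W m⁻² – far smaller than the roughly 3.7 W m⁻² forcing from a doubling of carbon dioxide, which is why any climate response must work through amplifying mechanisms like the one described below.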

Building on previous work, NCAR researchers used computer models of global climate and more than a century of ocean temperature data to answer longstanding questions about the connection between solar activity and global climate. Changes in greenhouse gases were also included in the model, but the main focus of the study is to examine the role of solar variability in climate change.

The research, published this month in the Journal of Climate, was funded by the National Science Foundation, NCAR’s sponsor, and by the Department of Energy.

“We have fleshed out the effects of a new mechanism to understand what happens in the tropical Pacific when there is a maximum of solar activity,” says NCAR scientist Gerald Meehl, the lead author. “When the Sun’s output peaks, it has far-ranging and often subtle impacts on tropical precipitation and on weather systems around much of the world.”

The new paper, along with an earlier one by Meehl and colleagues, shows that as the Sun reaches maximum activity, it heats cloud-free parts of the Pacific Ocean enough to increase evaporation, intensify tropical rainfall and the trade winds, and cool the eastern tropical Pacific. The result of this chain of events is similar to a La Nina event, although the cooling of about 1-2 degrees Fahrenheit is focused further east and is only about half as strong as for a typical La Nina.

Over the following year or two, the La Nina-like pattern triggered by the solar maximum tends to evolve into an El Nino-like pattern, as slow-moving currents replace the cool water over the eastern tropical Pacific with warmer-than-usual water. Again, the ocean response is only about half as strong as with El Nino.

True La Nina and El Nino events are associated with changes in the temperatures of surface waters of the eastern Pacific Ocean. They can affect weather patterns worldwide.

The new paper does not analyze the weather impacts of the solar-driven events. But Meehl and his co-author, Julie Arblaster of both NCAR and the Australian Bureau of Meteorology, found that the solar-driven La Nina tends to cause relatively warm and dry conditions across parts of western North America. More research will be needed to determine the additional impacts of these events on weather across the world.

“Building on our understanding of the solar cycle, we may be able to connect its influences with weather probabilities in a way that can feed into longer-term predictions, a decade at a time,” Meehl says.

An elusive puzzle


Scientists have known for years that long-term solar variations affect certain weather patterns, including droughts and regional temperatures. But establishing a physical connection between the decadal solar cycle and global climate patterns has proven elusive. One reason is that only in recent years have computer models been able to realistically simulate the processes underlying tropical Pacific warming and cooling associated with El Nino and La Nina. With those models now in hand, scientists can reproduce the last century’s solar behavior and see how it affects the Pacific.

To tease out these sometimes subtle connections between the Sun and Earth, Meehl and his colleagues analyzed sea surface temperatures from 1890 to 2006. They then used two computer models based at NCAR to simulate the response of the oceans to changes in solar output.

They found that, as the Sun’s output reaches a peak, the small amount of extra sunshine over several years causes a slight increase in local atmospheric heating, especially across parts of the tropical and subtropical Pacific where Sun-blocking clouds are normally scarce. That small amount of extra heat leads to more evaporation, producing extra water vapor. In turn, the moisture is carried by trade winds to the normally rainy areas of the western tropical Pacific, fueling heavier rains.

As this climatic loop intensifies, the trade winds strengthen. That keeps the eastern Pacific even cooler and drier than usual, producing La Nina-like conditions.

Although this Pacific pattern is produced by the solar maximum, the authors found that its switch to an El Nino-like state is likely triggered by the same kind of processes that normally lead from La Nina to El Nino. The transition starts when changes in the strength of the trade winds produce slow-moving off-equatorial pulses known as Rossby waves in the upper ocean, which take about a year to travel back west across the Pacific.

The energy then reflects from the western boundary of the tropical Pacific and ricochets eastward along the equator, deepening the upper layer of water and warming the ocean surface. As a result, the Pacific experiences an El Nino-like event about two years after solar maximum. The event settles down after about a year, and the system returns to a neutral state.
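This delayed, sign-reversing feedback is the essence of the textbook “delayed oscillator” picture of ENSO. The sketch below uses the Suarez–Schopf (1988) form in nondimensional units, with invented parameter values; it is an illustration of the wave-delay idea, not the NCAR model:

```python
import numpy as np

# Delayed-oscillator caricature: T is the east-Pacific temperature anomaly.
# The delayed term represents off-equatorial Rossby waves that travel west,
# reflect at the boundary, and return to oppose the original anomaly about
# one transit-plus-adjustment time later.
dt, delay, alpha = 0.01, 2.0, 0.75
n_delay = int(delay / dt)

T = np.full(12_000, -0.3)   # start in a cool, La Nina-like state
for i in range(n_delay, len(T) - 1):
    dT = T[i] - T[i] ** 3 - alpha * T[i - n_delay]   # dT/dt = T - T^3 - a*T(t-d)
    T[i + 1] = T[i] + dt * dT

print(f"coolest anomaly {T.min():+.2f}, warmest anomaly {T.max():+.2f}")
```

With the delayed feedback switched on, the initially cool state swings into a warm, El Nino-like excursion and the system settles into an alternating cycle, qualitatively like the sequence the article describes.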

“El Nino and La Nina seem to have their own separate mechanisms,” says Meehl, “but the solar maximum can come along and tilt the probabilities toward a weak La Nina. If the system were heading toward a La Nina anyway,” he adds, “it would presumably be a larger one.”

Researchers to participate in seismic test of 7-story building

Rensselaer Associate Professor Michael Symans and incoming Dean of Engineering David Rosowsky are among the team of researchers who will converge in Japan next week to perform the largest earthquake simulation ever attempted on a wooden structure. The multi-university team has placed this seven-story building on the world’s largest shake table, and will expose it to the force of an earthquake that hits only once every 2,500 years. – Colorado State University

A destructive earthquake will strike a lone, wooden condominium in Japan next week, and Rensselaer Polytechnic Institute Professor Michael Symans will be on site to watch it happen.

Symans is among the team of researchers who will converge in the Japanese city of Miki to perform the largest earthquake simulation ever attempted on a wooden structure. The multi-university team, led by Colorado State University, has placed a seven-story building – loaded with sensing equipment and video cameras – on a massive shake table, and will expose the building to the force of an earthquake that hits once every 2,500 years.

The experiment will be Webcast live on Tuesday, July 14 at 11 a.m. EDT at www.nsf.gov/neeswood, and should yield critical data and insight on how to make wooden structures stronger and better able to withstand major earthquakes.

“Right now, wood can’t compete with steel and concrete as building materials for mid-rise buildings, partly because we don’t have a good understanding of how taller wood-framed structures will perform in a strong earthquake,” said Symans, associate professor in Rensselaer’s Department of Civil and Environmental Engineering. “With this shaking table test, we’ll be collecting data that will help us to further the development of design approaches for such structures, which is one of the major goals of the project.”

The 1994 magnitude 6.7 earthquake in Northridge, Calif., and the 1995 magnitude 6.9 earthquake in Kobe, Japan, clearly demonstrated the seismic vulnerability of wood-framed construction, Symans said. The shake table experiment will offer researchers a chance to better understand how wood reacts in an earthquake, he said, and the resulting data could lead to the advancement of engineering techniques for mitigating earthquake damage.

As the ground shakes, the energy that goes into a building needs to flow somewhere, Symans said. Typically, a large portion of this energy is spent moving – and damaging – the building. There are proven engineering techniques for absorbing or displacing some of this energy in order to minimize damage, but the technology for doing so has not yet been thoroughly evaluated for wooden structures. Next week’s shake should produce sufficient data to allow the research team to develop accurate computer models of mid-rise wood buildings, which can subsequently be used to advance and validate some of these seismic protection techniques.

As one example, Symans is working on the application of seismic damping systems for wooden buildings. These systems, which can be installed inside the walls of most wooden buildings, include metal bracing and dampers filled with viscous fluid. A portion of the energy generated by the earthquake is spent shaking the fluid back and forth in the dampers, which in turn reduces the energy available to damage the wall or building structure. Recently completed shaking table tests at Rensselaer on wooden walls outfitted with such a damping system have demonstrated the viability of such an approach to mitigating damage in wooden buildings.
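A toy calculation illustrates the payoff. The sketch below integrates a single-degree-of-freedom wall under a resonant pulse of shaking, once at a nominal 2 percent inherent damping and once at 15 percent as if fluid dampers were fitted; every number is invented for illustration and none comes from the Rensselaer tests:

```python
import numpy as np

# One-storey, single-degree-of-freedom sketch: m*x'' + c*x' + k*x = -m*ag(t).
m = 1000.0            # tributary mass, kg
k = 2.5e5             # lateral stiffness of the wall, N/m
wn = np.sqrt(k / m)   # natural frequency, rad/s (~2.5 Hz)

def peak_drift(zeta, dt=0.001, t_end=20.0):
    """Return the peak lateral displacement for damping ratio zeta."""
    c = 2.0 * zeta * m * wn            # equivalent viscous damping
    x = v = peak = 0.0
    for i in range(int(t_end / dt)):
        t = i * dt
        ag = 3.0 * np.sin(wn * t) if t < 5.0 else 0.0   # 5 s resonant pulse, m/s^2
        a = (-m * ag - c * v - k * x) / m
        v += a * dt                    # semi-implicit Euler step
        x += v * dt
        peak = max(peak, abs(x))
    return peak

print(f"peak drift, 2% inherent damping:    {peak_drift(0.02) * 1000:.0f} mm")
print(f"peak drift, 15% with fluid dampers: {peak_drift(0.15) * 1000:.0f} mm")
```

Even in this crude model the added damping cuts the peak drift several-fold – the same qualitative effect the wall-scale tests at Rensselaer demonstrated.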

“The system allows a significant portion of the wood-frame displacement to be transferred to the dampers where the energy can be harmlessly dissipated,” Symans said. “With dampers in place, we have a better ability to predict how a structure will react to and perform during an earthquake.”

In the 1994 Northridge earthquake, all but one of the 25 fatalities caused by building damage occurred in wooden buildings, and at least half of the $40 billion in property damage was attributed to wood buildings. The quake left nearly 50,000 housing units uninhabitable, most of them wood-framed buildings. The advancement of seismic protection systems could help to save lives and prevent or limit damage in similar future earthquakes, Symans said. This is particularly important considering that most residential structures in the United States, even in seismically active areas, have wooden frames.

The Miki shake is the capstone experiment of the four-year NEESWood project, which receives its primary support from the U.S. National Science Foundation Network for Earthquake Engineering Simulation (NEES) Program. NEESWood is led by Colorado State University, in collaboration with Rensselaer, the University at Buffalo, the University of Delaware, and Texas A&M University. One intended end result of NEESWood is the development of new tools, software, and best practices that result in building code revisions and allow engineers and architects to design wooden structures which can better withstand earthquakes.

The seven-story structure has been built with new seismic design methods informed by NEESWood research for mid-rise wood frame construction. The tests in Miki, to be performed at the Hyogo Earthquake Engineering Research Center, home of the world’s largest seismic shaking table, will be used to evaluate the performance of the building and, in turn, the new design methods.

David Rosowsky, who will join Rensselaer in August as the new dean of engineering, is also a co-investigator of the NEESWood project and will attend the shake in Miki next week.

“NEESWood aims to develop a new seismic design philosophy that will provide the necessary mechanisms to safely increase the height of wood-frame structures in active seismic zones of the United States, as well as mitigate earthquake damage to low-rise wood-frame structures. When this challenge is successfully met, mid-rise wood-frame construction will be an economic option in seismic regions in the United States and around the world,” said Rosowsky, currently the head of the Department of Civil Engineering at Texas A&M.

“It’s exciting for Rensselaer to be a part of the international team participating in the NEESWood project. This project has already brought tremendous visibility to the School of Engineering at Rensselaer which, with its geotechnical centrifuge facility, already is a part of the NEES network of world-class laboratories for earthquake engineering,” Rosowsky said.

Tremors on southern San Andreas Fault may mean increased earthquake risk

Parkfield is at the northern end of a locked segment of the San Andreas Fault (SAF) that, in 1857, ruptured south from Monarch Peak (MP) in the great 7.8 magnitude Ft. Tejon quake. As a result of nearby earthquakes in 2003 and 2004, tremors developed under Cholame and Monarch Peak. The black dots pinpoint 1250 well-located tremors. The square boxes are 30 kilometers (19 miles) on a side. Color contours give regional shear-stress change at 20 km depth from the Parkfield earthquake (green segment) along the SAF. The thrust-type San Simeon earthquake rupture is represented by the gray rectangle and line with triangles labeled SS. The currently locked Cholame segment is about 63 km long (solid portion of the arrow) and is believed capable of rupturing on its own in a magnitude 7 earthquake. The gray lines within the Cholame box bound the west quadrant, where quasiperiodic episodes predominate. (Robert Nadeau/UC Berkeley, courtesy Science magazine)

Increases in mysterious underground tremors observed in several active earthquake fault zones around the world could signal a build-up of stress at locked segments of the faults and presumably an increased likelihood of a major quake, according to a new University of California, Berkeley, study.

Seismologist Robert M. Nadeau and graduate student Aurélie Guilhem of UC Berkeley draw these conclusions from a study of tremors along a heavily instrumented segment of the San Andreas Fault near Parkfield, Calif. The research is reported in the July 10 issue of Science.

They found that after the 6.5-magnitude San Simeon quake in 2003 and the 6.0-magnitude Parkfield quake in 2004, underground stress increased at the end of a locked segment of the San Andreas Fault near Cholame, Calif., at the same time as tremors became more frequent. The tremors have continued to this day at a rate significantly higher than the rate before the two quakes.

The researchers conclude that the increased rate of tremors may indicate that stress is accumulating more rapidly than in the past along this segment of the San Andreas Fault, which is at risk of breaking as it did in 1857 to produce the great 7.8 magnitude Fort Tejon earthquake. Strong quakes have also occurred just to the northwest along the Parkfield segment of the San Andreas about every 20 to 30 years.

“We’ve shown that earthquakes can stimulate tremors next to a locked zone, but we don’t yet have evidence that this tells us anything about future quakes,” Nadeau said. “But if earthquakes trigger tremors, the pressure that stimulates tremors may also stimulate earthquakes.”

While earthquakes are brief events originating, typically, no deeper than 15 kilometers (10 miles) underground in California, tremors are an ongoing, low-level rumbling from perhaps 15 to 30 kilometers (10-20 miles) below the surface. They are common near volcanoes as a result of underground fluid movement, but were a surprise when discovered in 2002 at a subduction zone in Japan, a region where a piece of ocean floor is sliding under a continent.

Tremors were subsequently detected at the Cascadia subduction zone in Washington, Oregon and British Columbia, where several Pacific Ocean plates dive under the North American continental plate. In 2005, Nadeau identified mysterious “noise” detected by the Parkfield borehole seismometers as tremor activity, and has focused on them ever since. Unlike the Japanese and Cascadia tremor sites, however, the Parkfield area is a strike/slip fault, where the Pacific plate is moving horizontally against the North American plate.

“The Parkfield tremors are smaller versions of the Cascadia and Japanese tremors,” Nadeau said. “Most last between three and 21 minutes, while some Cascadia tremors go on for days.”

Because in nearly all known instances the tremors originate from the edge of a locked zone – a segment of a fault that hasn’t moved in years and is at high risk of a major earthquake – seismologists have thought that increases in their activity may forewarn of stress build-up just before an earthquake.

The new report strengthens that association, Nadeau said.

For the new study, Nadeau and Guilhem pinpointed the location of nearly 2,200 tremors recorded between 2001 and 2009 by borehole seismometers implanted along the San Andreas Fault as part of UC Berkeley’s High-Resolution Seismic Network. During this period, two nearby earthquakes occurred: one in San Simeon, 60 kilometers from Parkfield, on Dec. 22, 2003, and one in Parkfield on the San Andreas Fault on Sept. 28, 2004.

Before the San Simeon quake, tremor activity was low beneath the Parkfield and Cholame segments of the San Andreas Fault, but it doubled in frequency afterward and was six times more frequent after the Parkfield quake. Most of the activity occurred along a 25-kilometer (16-mile) segment of the San Andreas Fault south of Parkfield, around the town of Cholame. Fewer than 10 percent of the tremors occurred at an equal distance above Parkfield, near Monarch Peak. While Cholame is at the northern end of a long-locked and hazardous segment of the San Andreas Fault, Monarch Peak is not. However, Nadeau noted, Monarch Peak is an area of relative complexity on the San Andreas Fault and also ruptured in 1857 in the Fort Tejon 7.8 earthquake.

The tremor activity remains about twice as high today as before the San Simeon quake, while periodic peaks of activity have emerged that started to repeat about every 50 days and are now repeating about every 100-110 days.

“What’s surprising is that the activity has not gone down to its old level,” Nadeau said. The continued activity is worrisome because of the history of major quakes along this segment of the fault, and the long-ago Fort Tejon quake, which ruptured southward from Monarch Peak along 350 kilometers (220 miles) of the San Andreas Fault.

A flurry of pre-tremors was detected a few days before the Parkfield quake, which makes Nadeau hopeful of seeing similar tremors preceding future quakes.

He noted that the source of tremors is still somewhat of a mystery. Some scientists think fluids moving underground generate the tremors, just as movement of underground magma, water and gas causes volcanic tremors. Nadeau leans more toward an alternative theory, that non-volcanic tremors are generated in a deep region of hot soft rock, somewhat like Silly Putty, that, except for a few hard rocks embedded like peanut brittle, normally flows without generating earthquakes. The fracturing of the brittle inclusions, however, may be generating swarms of many small quakes that combine into a faint rumble.

“If tremors are composed of a lot of little earthquakes, each should have a primary and secondary wave just like large quakes,” but they would overlap and produce a rumble, said Guilhem.
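Guilhem's point can be made quantitative with the standard S-minus-P location formula (textbook crustal velocities, not values from the paper): with P waves at roughly $V_P \approx 6$ km/s and S waves at $V_S \approx 3.5$ km/s, the source distance is

$$d \approx \Delta t_{S-P}\,\frac{V_P V_S}{V_P - V_S} \approx 8.4\,\Delta t_{S-P}\ \text{km},$$

so a tremor source 20 km deep directly beneath a station would show an S−P separation of about 2.4 seconds. With many small events overlapping, those paired arrivals smear into the continuous rumble the researchers describe.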

The stimulation of tremors by shear (tearing) stress rather than by compressional (opening and closing) stress is more consistent with deformation in the fault zone than with underground fluid movement, Nadeau said. The researchers’ mapping of the underground tremors also shows that the tremors are not restricted to the plane of the fault, suggesting that faults spread out as they dive into the deeper crust.

Whatever their cause, tremors “are not relieving a lot of stress or making the fault less hazardous, they just indicate changes in stress next to locked faults,” said Nadeau.

Seismologists around the world are searching for tremors along other fault systems, Guilhem noted, although tremors can be hard to detect because of noise from oceans as well as from civilization. Brief tremor activity has been observed on a few faults, triggered by huge quakes far away, and these may be areas to focus on. Tremors were triggered on Northern California’s Calaveras Fault by Alaska’s Denali quake of 2002, Nadeau said.

Arctic climate under greenhouse conditions in the Late Cretaceous

Fossil diatom algae of Cretaceous age from the Alpha Ridge of the Arctic Ocean

New evidence for ice-free summers with intermittent winter sea ice in the Arctic Ocean during the Late Cretaceous – a period of greenhouse conditions – gives a glimpse of how the Arctic is likely to respond to future global warming.

Records of past environmental change in the Arctic should help predict its future behaviour. The Late Cretaceous, the period between 100 and 65 million years ago leading up to the extinction of the dinosaurs, is crucial in this regard because levels of carbon dioxide (CO2) were high, driving greenhouse conditions. But scientists have disagreed about the climate at this time, with some arguing for low Arctic late Cretaceous winter temperatures (when sunlight is absent during the Polar night) as against more recent suggestions of a somewhat milder 15°C mean annual temperature.

Writing in Nature, Dr Andrew Davies and Professor Alan Kemp of the University of Southampton’s School of Ocean and Earth Science based at the National Oceanography Centre, Southampton, along with Dr Jennifer Pike of Cardiff University take this debate a step forward by presenting the first seasonally resolved Cretaceous sedimentary record from the Alpha Ridge of the Arctic Ocean.

The scientists analysed the remains of diatoms – tiny free-floating plant-like organisms – preserved in late Cretaceous marine sediments. In modern oceans, diatoms play a dominant role in the ‘biological carbon pump’ by which carbon dioxide is drawn down from the atmosphere through photosynthesis and a proportion of it exported to the deep ocean. Unfortunately, the role of diatoms in the Cretaceous oceans has until now been unclear, in part because they are often poorly preserved in sediments.

But the researchers struck lucky. “With remarkable serendipity,” they explain, “successive US and Canadian expeditions that occupied floating ice islands above the Alpha Ridge of the Arctic Ocean recovered cores containing shallowly buried upper Cretaceous diatom ooze with superbly preserved diatoms.” This has allowed them to conduct a detailed study of the diatom fossils using sophisticated electron microscopy techniques. In the modern ocean, scientists use floating sediment traps to collect and study settling material. The electron microscope techniques pioneered by Professor Kemp’s group at Southampton have unlocked a ‘palaeo-sediment trap’, revealing information about Late Cretaceous environmental conditions.

They find that the most informative sediment core samples display a regular alternation of microscopically thin layers composed of two distinctly different diatom assemblages, reflecting seasonal changes. Their analysis clearly demonstrates that seasonal blooming of diatoms was not related to the upwelling of nutrients, as has been previously suggested. Rather, production occurred within a stratified water column, indicative of ice-free summers. These summer blooms comprised specially adapted species resembling those of the modern North Pacific Subtropical Gyre, or preserved in relatively recent organically rich Mediterranean sediments called ‘sapropels’.

The sheer number of diatoms found in the Late Cretaceous sediment cores indicates exceptional abundances equalling modern values for the most productive areas of the Southern Ocean. “This Cretaceous production, dominated by diatoms adapted to stratified conditions of the polar summer may also be a pointer to future trends in the modern ocean,” say the researchers: “With increasing CO2 levels and global warming giving rise to increased ocean stratification, this style of (marine biological) production may become of increasing importance.”

However, thin accumulations of land-derived (terrigenous) sediment within the diatom ooze are consistent with the presence of intermittent sea ice in the winter, a finding that supports “a wide body of evidence for low Arctic late Cretaceous winter temperatures rather than recent suggestions of a 15°C mean annual temperature at this time.” The size distribution of clay and sand grains in the sediment points to the formation of sea ice in shallow coastal seas during autumn storms, but the absence of larger drop-stones suggests that the winters, although cold, were not cold enough to support thick glacial ice or large areas of anchored ice.

Commenting on the findings, Professor Kemp said: “Although seasonally-resolved records are rarely preserved, our research shows that they can provide a unique window into past Earth system behaviour on timescales immediately comparable and relevant to those of modern concern.”