La Niña Conditions Strengthen, Expected To Continue





The ongoing La Niña event started in the third quarter of 2007 and has already influenced climate patterns during the last six months across many parts of the globe, including in the Equatorial Pacific, across the Indian Ocean, Asia, Africa and the Americas. (Credit: NOAA/National Weather Service)

The current La Niña event, characterized by a cooling of the sea surface in the central and eastern Equatorial Pacific, has strengthened slightly in recent months and is expected to continue through the first quarter of 2008, with a likelihood of persisting through to the middle of the year.



The ongoing La Niña event started in the third quarter of 2007 and has already influenced climate patterns during the last six months across many parts of the globe, including in the Equatorial Pacific, across the Indian Ocean, Asia, Africa and the Americas.



During the last three months, La Niña conditions have become slightly stronger. Sea surface temperatures are now about 1.5 to 2 degrees Celsius colder than average over large parts of the central and eastern Equatorial Pacific. This La Niña sits in the middle of the range of historically recorded events, but the slight further cooling in recent months will likely place it on the stronger side of that middle range.



During a La Niña event, sea surface temperatures in the central and eastern Equatorial Pacific become cooler than normal. Such cooling has important effects on the global weather, particularly rainfall. While sea surface temperatures cool in the central and eastern Equatorial Pacific, those in the west remain warmer. This is associated with increases in the frequency of heavy rain and thunderstorms in surrounding regions.


In contrast to La Niña, the El Niño phenomenon is characterized by substantially warmer than average sea surface temperatures in the central and eastern Equatorial Pacific.



These temperature changes in the Equatorial Pacific related to La Niña and El Niño are strongly linked to major global climate fluctuations and, once initiated, can last for 12 months or more.



Most interpretations of existing climatological data suggest that La Niña conditions are likely to persist through the second quarter of 2008 and, with lower confidence, into the early part of the third quarter.



Seasonal forecasts beyond the third quarter of 2008 are not considered to contain useful information at this stage on the continuation of La Niña or the development of El Niño.



It is rare for a La Niña event to persist for two years or more, as occurred from early 1998 to early 2000. Whether the current La Niña will continue for such a period will remain unclear for some months, but will be closely monitored. Long-term statistics suggest that in the latter part of 2008, neutral conditions (neither La Niña nor El Niño, with no significant cooling or warming of Equatorial Pacific sea surface temperatures) are more likely to prevail.

On the hunt for ‘black gold’


The surprise discovery of university-owned rights to oil and natural gas in southern Alberta is leading to first-hand lessons in the energy sector for students and researchers who have begun exploring the potential of the reserves using some of the latest technology in exploration geology.



“This is a great treasure hunt that is going to provide real-world experience that might even result in a new source of revenue for the university,” said Department of Geoscience professor Rob Stewart. “We are in the remarkable position of being able to do a lot of the exploration work ourselves, which is a wonderful way for everybody to learn.”



Land title searches conducted last summer uncovered U of C’s ownership of mineral rights on two sections of ranch land near the community of Spring Coulee, south of Lethbridge. That prompted Stewart to organize a seismic survey last month to assess the resources, as well as test the university’s new seismic vibrator truck and other high-tech equipment.


“The data we acquired is a treasure trove of information that students are analyzing in class,” Stewart said. “There is a producing oil well nearby, so we too could have some black gold!”



The university is partnering with local seismic firms to explore the area and plans to seek other industrial partners as the project unfolds.



Department of Geoscience head Dave Eaton said a Geoscience field school will likely take place at the site this summer, and the project is expected to expand to include students from the Schulich School of Engineering, environmental science and the Haskayne School of Business.



“I think this is a real motivating opportunity for students, who can be involved in every step of the oil and gas development process,” Eaton said. “It also allows them to take part in an experiential learning opportunity that could bring economic benefit to the university.”

Past greenhouse warming events provide clues to what the future may hold





 James Zachos (foreground) inspects a sediment core drilled from the ocean floor. Photo courtesy of J. Zachos.

If carbon dioxide emissions from the burning of fossil fuels continue on a “business-as-usual” trajectory, humans will have added about 5 trillion metric tons of carbon to the atmosphere by the year 2400. A similarly massive release of carbon accompanied an extreme period of global warming 55 million years ago known as the Paleocene-Eocene Thermal Maximum (PETM).



Scientists studying the PETM are piecing together an increasingly detailed picture of its causes and consequences. Their findings describe what may be the best analog in the geologic record for the global changes likely to result from continued carbon dioxide emissions from human activities, according to James Zachos, professor of Earth and planetary sciences at the University of California, Santa Cruz.



“All the evidence points to a massive release of carbon at the PETM, and if you compare it with the projections for anthropogenic carbon emissions, it’s roughly the same amount of carbon,” Zachos said. “The difference is the rate at which it was released; we’re on track to do in a few hundred years what may have taken a few thousand years back then.”
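As a rough back-of-the-envelope illustration of that rate difference, take the roughly 5 trillion metric tons (5,000 Gt) of carbon cited above and assume, purely for round numbers, a release window of about 400 years today versus about 4,000 years at the PETM:

$$ r_{\text{modern}} \approx \frac{5000\ \text{Gt C}}{400\ \text{yr}} \approx 12\ \text{Gt C/yr}, \qquad r_{\text{PETM}} \approx \frac{5000\ \text{Gt C}}{4000\ \text{yr}} \approx 1.2\ \text{Gt C/yr}, $$

that is, a release roughly ten times faster today, consistent with Zachos’s point that the rate, not the total, distinguishes the two events.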



Zachos and his collaborators have been studying marine sediments deposited on the deep ocean floor during the PETM and recovered in sediment cores by the Integrated Ocean Drilling Program. He will discuss their findings, which reveal drastic changes in ocean chemistry during the PETM, in a presentation at the annual meeting of the American Association for the Advancement of Science (AAAS) in Boston on Friday, February 15. His talk is part of a symposium entitled “Ocean Acidification and Carbon-Climate Connections: Lessons from the Geologic Past.”



The ocean has the capacity to absorb huge amounts of carbon dioxide from the atmosphere. But as carbon dioxide dissolves in the ocean, it makes the water more acidic. That, in turn, could make life more difficult for corals and other marine organisms that build shells and skeletons out of calcium carbonate.



Technically, the “acidification” is a lowering of the pH of ocean water, moving it closer to the acidic range of the pH scale, although it remains slightly alkaline. Lowering the pH affects the chemical equilibrium of the ocean with respect to calcium carbonate, reducing the concentration of carbonate ions and making it harder for organisms to build and maintain structures of calcium carbonate. Corals and some other marine organisms use a form of calcium carbonate called aragonite, which dissolves first, while many others build shells of a more resistant form called calcite.
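In outline, the carbonate chemistry behind this runs as follows (a standard textbook summary, not a result from the study itself):

$$ \mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}, \qquad \mathrm{H^+ + CO_3^{2-} \rightleftharpoons HCO_3^-}. $$

Dissolved CO2 adds hydrogen ions, and those ions consume carbonate ions. A carbonate shell is stable only while the saturation state $\Omega = [\mathrm{Ca^{2+}}][\mathrm{CO_3^{2-}}]/K_{sp}$ remains above 1, and because aragonite has a higher solubility product $K_{sp}$ than calcite, aragonite builders such as corals reach undersaturation first.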



“As the carbonate concentration starts to decrease, it becomes harder for some organisms to build their shells. They have to use more energy, and eventually it’s impossible; in laboratory experiments, they precipitate some shell during the day, and overnight it dissolves,” Zachos said. “If you lower the carbonate concentration enough, corals and eventually even calcite shells start to dissolve.”



The effect of ocean acidification on the chemistry of calcium carbonate is reflected in the sediment cores from the PETM. Marine sediments are typically rich in calcium carbonate from the shells of marine organisms that sink to the seafloor after they die. Sediments deposited at the start of the PETM, however, show an abrupt transition from carbonate-rich ooze to a dark-red clay layer in which the carbonate shells are completely gone.



Ocean acidification starts at the surface, where carbon dioxide is absorbed from the atmosphere, and spreads to the deep sea as surface waters mix with deeper layers. The calcium carbonate in marine sediments on the seafloor provides a buffer, neutralizing the increased acidity as the shells dissolve and enabling the ocean to absorb more carbon dioxide. But the mixing time required to bring acidified surface waters into the deep sea is long: 500 to 1,000 years, according to Zachos.


“We are adding all this carbon dioxide in less than one mixing cycle. That’s important for how the ocean buffers itself, and it means the carbonate concentration in surface waters will get low enough to affect corals and other organisms, assuming emissions continue on the current trajectory,” he said.



In a recent article in Nature (January 17, 2008), Zachos and coauthors Gerald Dickens of Rice University and Richard Zeebe of the University of Hawaii provided an overview of the PETM and other episodes of greenhouse warming in the past 65 million years. These “natural experiments” can help scientists understand the complex interactions that link the carbon cycle and the climate.



Christina Ravelo, a professor of ocean sciences at UCSC and co-organizer of the symposium at which Zachos will speak, said climate records preserved in seafloor sediments provide a valuable test for the climate models scientists use to predict the future consequences of greenhouse gas emissions.



“There are no exact analogs in the past for what is happening now, but we can use past climates to test the models and improve them,” Ravelo said. “The ocean drilling program is the only way to get really good records of these past warm periods.”



Current climate models tend to have difficulty replicating the features of warm periods in the past, such as the PETM, she said. “Even though the models do a great job of simulating the climate over the past 150 years, the future probably holds many climatic surprises. As you run the models farther into the future, the uncertainties become greater.”



A particular concern over the long run is the potential for positive feedback that could amplify the initial warming caused by carbon dioxide emissions. For example, one possible cause of the PETM is the decomposition of methane deposits on the seafloor, which could have been triggered by an initial warming. Methane hydrates are frozen deposits found in the deep ocean near continental margins. Methane released from the deposits would react with oxygen to form carbon dioxide. Both compounds are potent greenhouse gases.



“We have some new evidence that there was a lag between the initial warming and the main carbon excursion of the PETM,” said Zachos, who is a coauthor of a paper describing these findings in the December 20/27, 2007, issue of Nature. “It’s consistent with the notion of a positive feedback, with an initial warming causing the hydrates to decompose,” he said.



Although this raises the possibility that the current global warming trend might trigger a similar release of methane from the ocean floor, that would not happen any time soon. It would take several centuries for the warming to reach the deeper parts of the ocean where the methane hydrate deposits are, Zachos said.



“By slowing the rate of carbon emissions and warming, we may be able to avoid triggering a strong, uncontrolled positive feedback,” he said.

Scientists Reveal First-Ever Global Map of Total Human Effects on Oceans





Global Impact Map

More than 40 percent of the world’s oceans are heavily affected by human activities, and few if any areas remain untouched, according to the first global-scale study of human influence on marine ecosystems. By overlaying maps of 17 different activities such as fishing, climate change, and pollution, the researchers have produced a composite map of the toll that humans have exacted on the seas.



The work, published in the Feb. 15 issue of Science and presented at a press conference Thursday, February 14 at 1 pm EST at the American Association for the Advancement of Science (AAAS) meeting in Boston, MA, was conducted at the National Center for Ecological Analysis and Synthesis (NCEAS) at UC Santa Barbara. It involved 19 scientists from a broad range of universities, NGOs, and government agencies.



The study synthesized global data on human impacts to marine ecosystems such as coral reefs, seagrass beds, continental shelves, and the deep ocean. Past studies have focused largely on single activities or single ecosystems in isolation, and rarely at the global scale. In this study the scientists were able to look at the summed influence of human activities across the entire ocean.



“This project allows us to finally start to see the big picture of how humans are affecting the oceans,” said lead author Ben Halpern, assistant research scientist at NCEAS. “Our results show that when these and other individual impacts are summed up, the big picture looks much worse than I imagine most people expected. It was certainly a surprise to me.”



“This research is a critically needed synthesis of the impact of human activity on ocean ecosystems,” said David Garrison, biological oceanography program director at NSF. “The effort is likely to be a model for assessing these impacts at local and regional scales.”



“Clearly we can no longer just focus on fishing or coastal wetland loss or pollution as if they are separate effects,” said Andrew Rosenberg, a professor of natural resources at the University of New Hampshire who was not involved with the study. “These human impacts overlap in space and time, and in far too many cases the magnitude is frighteningly high. The message for policy makers seems clear to me: conservation action that cuts across the whole set of human impacts is needed now in many places around the globe.”


The study reports that the most heavily affected waters in the world include large areas of the North Sea, the South and East China Seas, the Caribbean Sea, the east coast of North America, the Mediterranean Sea, the Red Sea, the Persian Gulf, the Bering Sea, and several regions in the western Pacific. The least affected areas are largely near the poles.



“Unfortunately, as polar ice sheets disappear with warming global climate and human activities spread into these areas, there is a great risk of rapid degradation of these relatively pristine ecosystems,” said Carrie Kappel, a principal investigator on the project and a post-doctoral researcher at NCEAS.



Importantly, human influence on the ocean varies dramatically across various ecosystems. The most heavily affected areas include coral reefs, seagrass beds, mangroves, rocky reefs and shelves, and seamounts. The least impacted ecosystems are soft-bottom areas and open-ocean surface waters.



“There is definitely room for hope,” added Halpern. “With targeted efforts to protect the chunks of the ocean that remain relatively pristine, we have a good chance of preserving these areas in good condition.”



The research involved a four-step process. First, the scientists developed techniques to quantify and compare how different human activities affect each marine ecosystem. For example, fertilizer runoff has been shown to have a large effect on coral reefs but a much smaller one on kelp forests. Second, the researchers gathered and processed global data on the distributions of marine ecosystems and human influences. Third, the researchers combined data from the first and second steps to determine “human impact scores” for each location in the world. Finally, using global estimates of the condition of marine ecosystems from previous studies, the researchers were able to ground-truth their impact scores.
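To make the third step concrete, here is a minimal sketch in Python of the kind of vulnerability-weighted overlay it describes; the grids, activity layers and weights below are invented for illustration and are not the study’s actual data or coefficients.

    import numpy as np

    # Hypothetical example: three normalized human-activity intensity layers
    # on a shared 2x2 ocean grid (values 0-1, all invented).
    fishing   = np.array([[0.2, 0.8], [0.5, 0.1]])
    shipping  = np.array([[0.9, 0.4], [0.3, 0.0]])
    pollution = np.array([[0.1, 0.6], [0.7, 0.2]])

    # Vulnerability weights: how strongly each activity affects the ecosystem
    # in these cells (e.g., fertilizer runoff hits coral reefs harder than
    # kelp forests, as the article notes). Numbers are placeholders.
    weights = {"fishing": 1.0, "shipping": 0.4, "pollution": 0.8}

    # The cumulative "human impact score" for each cell is the
    # vulnerability-weighted sum over all activity layers.
    impact = (weights["fishing"] * fishing
              + weights["shipping"] * shipping
              + weights["pollution"] * pollution)

    print(impact)  # higher scores mark more heavily affected cells

Summing weighted layers cell by cell in this fashion is what allows separate maps of fishing, climate change and pollution to collapse into the single composite map described above.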



Despite all this effort, the authors acknowledge that their maps are still incomplete, because many human activities are poorly studied or lack good data. “Our hope is that as more data become available, the maps will be refined and updated,” said Fio Micheli, a principal investigator on the project and assistant professor at Stanford University. “But this will almost certainly create a more dire picture.”



This study provides critical information for evaluating where certain activities can continue with little effect on the oceans, and where other activities might need to be stopped or moved to less sensitive areas. As management and conservation of the oceans turns toward marine protected areas (MPAs), ecosystem-based management (EBM) and ocean zoning to manage human influence, such information will prove invaluable to managers and policy makers.



“Conservation and management groups have to decide where, when, and what to spend their resources on,” said Kimberly Selkoe, a principal investigator on the project and a post-doctoral researcher at the University of Hawaii. “Whether one is interested in protecting ocean wilderness, assessing which human activities have the greatest impact, or prioritizing which ecosystem types need management intervention, our results provide a strong framework for doing so.”



“My hope is that our results serve as a wake-up call to better manage and protect our oceans rather than a reason to give up,” Halpern said. “Humans will always use the oceans for recreation, extraction of resources, and for commercial activity such as shipping. This is a good thing. Our goal, and really our necessity, is to do this in a sustainable way so that our oceans remain in a healthy state and continue to provide us the resources we need and want.”

Satellite map will aid in managing Alberta’s vital resources





This comprehensive snapshot of Alberta’s land mass will be updated in 2010.

Every nook and cranny of the province, from its forests to its foothills, has for the first time been recorded on a unique map by the University of Alberta.



The project, led by professor Arturo Sanchez-Azofeifa of the U of A Department of Earth and Atmospheric Sciences, was nine years in the making and marks the first time the entire province’s land cover has been mapped in satellite imagery.



The project will be vital to helping the province face growing challenges of integrated resource management, said Sanchez-Azofeifa.



“This is the first time we can see the whole picture of where we live,” he said. “The map records natural land sites, including coniferous and deciduous forests, wetlands, crop lands, grasslands, commercial and industrial parks, major roads, clearcuts, even burn sites.”


“It’s a comprehensive snapshot of Alberta’s land mass that will help government, researchers and resource managers support the development of sustainable land-use policies in years to come.”



The map can be used in many ways, from helping the province pinpoint the impact of forest fires, to plotting water, oil and gas management, to helping forestry companies plan woodlot management, all with a bird’s-eye view. The map will also be available to researchers at the U of A and other institutions, Sanchez-Azofeifa said.



Sanchez-Azofeifa and his research co-ordinator, Mei Mei Chong, created the map under the auspices of the U of A’s Centre for Earth Observation Sciences, and were recently awarded the Canadian Forest Service Merit Award for their work. The Alberta project is also part of a nation-wide map assembled by Natural Resources Canada’s Earth Observation for Sustainable Development of Forests project.



The high-definition imagery on the brightly colour-coded map is 1.5 gigabytes in size, and can provide details as specific as pipeline locations. The map is based on images from NASA’s Landsat 7 satellite, circa 2000, and will be updated in 2010, but in the meantime provides the most comprehensive imagery available for the many different characteristics of Alberta’s land mass.



To confirm the map’s imagery, experts visited about 3,000 field sites across the province over the past nine years, building an inventory of Alberta’s vegetation.

Carbon study could help reduce harmful emissions


Earth scientists at The University of Manchester have found that carbon dioxide has been naturally stored for more than a million years in several gas fields in the Colorado Plateau and Rocky Mountains of the United States.



Researchers say lessons learned from these natural gas fields will help to find sites suitable for injecting CO2 captured from power station chimneys.



Academics have been investigating five natural CO2 gas fields from the southwest United States, as they are examples of natural CO2 storage.



Their findings are published in the latest issue of the geochemistry journal Geochimica et Cosmochimica Acta.



In order for CO2 storage – also known as CO2 sequestration – to be considered a viable method of reducing CO2 emissions to the atmosphere, the public must be reassured that the CO2 pumped underground will be stored safely for a long time.


Dr Stuart Gilfillan, the University of Manchester researcher who led the project, said: “By measuring the noble gases within the Colorado carbon dioxide, we have been able to ‘fingerprint’ the CO2 for the first time. This has allowed us to show that the gas in all of the fields is the result of the degassing of molten magma within the Earth’s crust.



“In all of these gas fields, the last known magma melting event was over eight thousand years ago. In three of the fields magma melting last occurred over a million years ago, and in one it was at least 40 million years ago.



“We already know that oil and gas have been stored safely in oil and gas fields over millions of years and this study clearly shows that the CO2 has been stored naturally and safely for a very long time in these fields.



“So, underground CO2 storage, in the correct place, should be a safe option to help us cope with emissions until we can develop cleaner energy sources.”



The team hope that this study will pave the way for selection of similar safe sites for storage of CO2 captured in power plants in both the UK and abroad.



The research was funded by the Natural Environment Research Council (NERC).

New Greenland Ice Sheet Data Will Impact Climate Change Models





Research conducted by a UB geologist on the Greenland ice sheet shows the trimline (broken brown line) that marks the maximum extent of the ice sheet at the end of the 19th century and the subsequent retreat of the glacier and land exposed since 1944.

A comprehensive new study authored by University at Buffalo scientists and their colleagues for the first time documents in detail the dynamics of parts of Greenland’s ice sheet, important data that have long been missing from the ice sheet models on which projections about sea level rise and global warming are based.



The research, published online this month in the Journal of Glaciology, also demonstrates how remote sensing and digital imaging techniques can produce rich datasets without field data in some cases.



Traditionally, ice sheet models are very simplified, according to Beata Csatho, Ph.D., assistant professor of geology in the UB College of Arts and Sciences and lead author of the paper.



“Ice sheet models usually don’t include all the complexity of ice dynamics that can happen in nature,” said Csatho. “This research will give ice sheet modelers more precise, more detailed data.”



The implications of these richer datasets may be dramatic, Csatho said, especially as they impact climate projections and sea-level rise estimates, such as those made by the United Nations Intergovernmental Panel on Climate Change (IPCC).



“If current climate models from the IPCC included data from ice dynamics in Greenland, the sea level rise estimated during this century could be twice as high as what they are currently projecting,” she said.



The paper focuses on Jakobshavn Isbrae, Greenland’s fastest-moving and largest glacier, measuring four miles wide.



During the past decade, Jakobshavn Isbrae has begun to thin rapidly, and the amount of ice it discharges into Disko Bay has doubled.



“Although the thinning started as early as the end of the 19th century, the changes we are seeing now are bigger than can be accounted for by normal, annual perturbations in climate,” Csatho said.



In order to document the most comprehensive story possible of the behavior of Jakobshavn Isbrae since the Little Ice Age in the late 1800s, Csatho and her colleagues at Ohio State University, the University of Kansas and NASA used a combination of techniques.



These included field mapping, remote sensing, satellite imaging and the application of digital techniques in order to glean “hidden” data from historic aerial photographs as many as 60 years after they were taken.





This satellite image, colored for emphasis and taken in 2001, shows the ice sheet margins where land (pink areas) has become exposed and lakes have formed (bright blue). The large bluish-green area is the glacier.

By themselves, Csatho explained, the two-dimensional pictures were of limited value.



“But now we can digitize them, removing the boundaries between them and turning several pictures into a single ‘mosaic’ that will produce one data set that can be viewed in three dimensions,” she said.



“By reprocessing old data contained in these old photographs and records, we have been able to construct a long-term record of the behavior of the glacier,” said Csatho. “This was the first time that the data from the ’40s could be reused in a coherent way.”



The data from the historic photos were combined with data from historical records, ground surveys, field mapping and measurements taken from the air to document important signs of change in the glacier’s geometry.



Csatho explained that conventional methods of assessing change in glaciers have depended on documenting “iceberg calving,” in which large pieces at the front of the glacier break off.



“But we found that you can get significant changes in the ice sheet without seeing a change in front,” she said.



Other key findings of the paper are that two different parts of the same glacier may behave quite differently and that a glacier does not necessarily react to climate change as a single, monolithic entity.



“Climate forces are complex,” Csatho said. “For example, we found that the northern part of Jakobshavn was still thinning while the climate was colder between the 1960s and the 1990s.”



Csatho, who is a geophysicist, added that the research is the result of a strong interdisciplinary team involving experts in glaciology, ice sheet modeling and photogrammetry, the science of making measurements based on photographs.



At UB, research in Csatho’s remote sensing laboratory focuses on a multidisciplinary approach that integrates information across the geosciences.



Csatho’s co-authors on the paper are Tony Schenk of the Ohio State University Department of Civil and Environmental Engineering and Geodetic Science; Kees van der Veen of the Center for Remote Sensing of Ice Sheets at the University of Kansas, and William B. Krabill of the National Aeronautics and Space Administration’s Cryospheric Sciences Branch.



The research was funded by the National Science Foundation and NASA.

Lake Mead Could Be Dry by 2021





A map of the Colorado River basin.

There is a 50 percent chance Lake Mead, a key source of water for millions of people in the southwestern United States, will be dry by 2021 if climate changes as expected and future water usage is not curtailed, according to a pair of researchers at Scripps Institution of Oceanography, UC San Diego.



Without Lake Mead and neighboring Lake Powell, the Colorado River system has no buffer to sustain the population of the Southwest through an unusually dry year, or worse, a sustained drought. In such an event, water deliveries would become highly unstable and variable, said research marine physicist Tim Barnett and climate scientist David Pierce.



Barnett and Pierce concluded that human demand, natural forces like evaporation, and human-induced climate change are creating a net deficit of nearly 1 million acre-feet of water per year from the Colorado River system that includes Lake Mead and Lake Powell. This amount of water can supply roughly 8 million people. Their analysis of federal Bureau of Reclamation records of past water demand and calculations of scheduled water allocations and climate conditions indicates that the system could run dry even if mitigation measures now being proposed are implemented.
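The published estimate rests on a detailed water-budget model, but the flavor of the calculation can be sketched with a toy Monte Carlo simulation in Python; the starting storage, mean deficit and variability below are assumptions chosen for illustration, not the paper’s inputs.

    import random

    # Toy reservoir bookkeeping (illustrative numbers only).
    START_STORAGE_MAF = 25.0   # assumed live storage at "half capacity", million acre-feet
    MEAN_DEFICIT_MAF = 1.0     # net annual deficit: demand + evaporation - inflow
    DEFICIT_SD_MAF = 4.0       # assumed year-to-year variability in the net deficit
    N_RUNS, N_YEARS = 10_000, 30

    dry_years = []  # year (counted from 2007) in which each run, if any, goes dry
    for _ in range(N_RUNS):
        storage = START_STORAGE_MAF
        for year in range(1, N_YEARS + 1):
            storage -= random.gauss(MEAN_DEFICIT_MAF, DEFICIT_SD_MAF)
            if storage <= 0:
                dry_years.append(year)
                break

    for horizon in (7, 14):  # 2014 and 2021, counting from a 2007 start
        p = sum(1 for y in dry_years if y <= horizon) / N_RUNS
        print(f"P(dry within {horizon} years) = {p:.2f}")

Even a crude random walk like this shows how a modest mean deficit combined with large year-to-year variability produces substantial odds of hitting empty within a decade or two, which is the qualitative point behind the study’s probabilities.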



The paper, “When will Lake Mead go dry?,” has been accepted for publication in the peer-reviewed journal Water Resources Research, published by the American Geophysical Union (AGU), and is accessible via the AGU’s website.



“We were stunned at the magnitude of the problem and how fast it was coming at us,” said Barnett. “Make no mistake, this water problem is not a scientific abstraction, but rather one that will impact each and every one of us that live in the Southwest.”



“It’s likely to mean real changes to how we live and do business in this region,” Pierce added.



The Lake Mead/Lake Powell system includes the stretch of the Colorado River in northern Arizona. Aqueducts carry the water to Las Vegas, Los Angeles, San Diego, and other communities in the Southwest. Currently the system is only at half capacity because of a recent string of dry years, and the team estimates that the system has already entered an era of deficit.


“When expected changes due to global warming are included as well, currently scheduled depletions are simply not sustainable,” wrote Barnett and Pierce in the paper.



Barnett and Pierce note that a number of other studies in recent years have estimated that climate change will lead to reductions in runoff to the Colorado River system. Those analyses consistently forecast reductions of between 10 and 30 percent over the next 30 to 50 years, which could affect the water supply of between 12 and 36 million people.



The researchers estimated that there is a 10 percent chance that Lake Mead could be dry by 2014. They further predict that there is a 50 percent chance that reservoir levels will drop too low to allow hydroelectric power generation by 2017.



The researchers add that even if water agencies follow their current drought contingency plans, it might not be enough to counter natural forces, especially if the region enters a period of sustained drought and/or human-induced climate changes occur as currently predicted.



Barnett said that the researchers chose to use conservative estimates in their analysis, even though the water shortage is likely to be more dire in reality. The team based its findings on the premise that climate change effects only started in 2007, though most researchers consider human-caused changes in climate to have likely started decades earlier. They also based their river flow on averages over the past 100 years, even though flow has dropped in recent decades; averaged over the past 500 years, the annual flow is lower still.



“Today, we are at or beyond the sustainable limit of the Colorado system. The alternative to reasoned solutions to this coming water crisis is a major societal and economic disruption in the desert Southwest, something that will affect each of us living in the region,” the report concluded.



The research was supported under a joint program between UC San Diego and the Lawrence Livermore National Laboratory and by the California Energy Commission. The views expressed here do not necessarily represent the views of the California Energy Commission, its employees, or the state of California.

Robot Plumbs Wisconsin Lake on Way to Antarctica, Jovian Moon


A University of Illinois at Chicago scientist will lead a team testing a robotic probe in a polar-style, under-ice exploration that may have out-of-this-world applications.



But the team will keep to a venue that’s much closer to home.



Peter Doran, associate professor of earth and environmental sciences at UIC, will lead the team Feb. 11-15 working in the icy waters of Lake Mendota off the campus of the University of Wisconsin, Madison.



They’ll conduct an under-ice test of a NASA-funded robotic probe called ENDURANCE — an acronym for Environmentally Non-Disturbing Under-ice Robotic ANtarctic Explorer.



The wintry Wisconsin conditions are intended to simulate a polar environment and to show whether the probe’s systems can operate under ice, a first test toward using such a vehicle in a similar environment on Jupiter’s moon Europa.


ENDURANCE is a $2.3 million project funded by NASA’s Astrobiology Science and Technology for Exploring Planets Program. The probe is an underwater vehicle designed to swim untethered under ice, creating three-dimensional maps of underwater environments. The probe will also collect data on conditions in those environments and use sensors to characterize the biological environment.



“Basically the game plan this week is to test the vehicle’s performance in a cold ice-covered environment,” Doran said. “Up to now it’s only been tested in relatively balmy environments like Texas and Mexico. We want to see what issues may come up by pushing it into the frigid water.”



The next step for the research team is to ship the probe to Antarctica’s permanently frozen Lake Bonney later this year. Bonney is a two-and-a-half-mile-long, mile-wide, 130-foot-deep lake located in the continent’s McMurdo Dry Valleys. It lies perpetually trapped beneath 12 to 15 feet of ice.



ENDURANCE will map Bonney for a month, then do a second mapping in 2009. Data gathered will be relayed back to Chicago where it will be used by UIC’s Electronic Visualization Laboratory to generate various 3-D images, maps and data renderings of the lake.



Science teams are developing and testing the technology for a possible underwater exploration mission on Europa far in the future. The probe is a follow-up to the Deep Phreatic Thermal explorer, a NASA-funded project led by Stone Aerospace that completed a series of underwater field tests in Mexico in 2007.

What is a red tide?





Red tide caused by dinoflagellates off the Scripps Institution of Oceanography Pier, La Jolla, California. (Credit: Wikipedia)

Although its name sounds like a low-budget horror movie, you won’t find “Red Tide” at a theater near you. To take in this natural phenomenon, you’ll have to venture to the ocean, because red tide – or, more scientifically, an HAB or harmful algal bloom – occurs when a harmful variety of algae reproduces so densely that the water appears red, yellowish-brown or green from the high concentrations of photosynthetic pigments.



As Michael Arthur, a Penn State geosciences professor with research expertise in marine ecosystems, explained, most varieties of algae are harmless and occupy an essential tier of the food chain.



The algae responsible for red tide, however, belong to a group of a few dozen phytoplankton species called dinoflagellates that, under the right conditions, produce toxins. The level of toxicity depends on the specific algal variety involved, with the most dangerous species emitting potent neurotoxins that can kill marine and coastal wildlife and can even cause illness or death in humans.


But why do these blooms occur?



According to Arthur, “Blooms can’t occur unless the algae is already present in the water, and the right nutrients are present – particularly nitrogen and phosphorus – to supply the raw materials that stimulate rapid growth of the algae population.”



Aided by warm water and stagnant surface conditions, nutrients play a large part in allowing the algae to reproduce so quickly. Arthur said one source of these nutrients comes from upwelling, a process where colder, deeper ocean water is brought up to shallower water, bringing nutrients with it. Upwelling along the U.S. West Coast may have a lot to do with the high incidence of blooms in that region. The bluish-green glow, or bioluminescence, of the relatively harmless dinoflagellate species Noctiluca scintillans, is a common sight in California’s waters.



However, Karenia brevis – the microorganism responsible for most red tides in Florida’s Gulf of Mexico – is a much redder and more dangerous strain. A 2005 outbreak along Florida’s southwest coast was described in The New York Times as one of the state’s “worst red tides in decades.” At its peak, it caused a 2,000-square-mile “dead zone” off the city of St. Petersburg that was responsible for “more than 950 tons of dead creatures” that washed up on area beaches. Ironically, hurricanes typically bring relief by diluting the concentration of algae, but even Hurricane Katrina didn’t end Florida’s ferocious 2005 outbreak.


Areas where pollution drains into the ocean often experience algae blooms, Arthur noted. He suspects excess nutrification is the cause. “Blooms have been observed around aquaculture sites and fish farms where nutrients are being shed into the surrounding water,” he said. “Fertilizer run-off from farmland and sewage waste also add to the nutrient content, supplying the algae with a more abundant food source.”



According to the National Oceanic and Atmospheric Administration (NOAA), over 50 percent of unusual marine mammal mortality events are due to harmful algal blooms. While shellfish such as clams, mussels and oysters are not directly harmed by red tide, their filter-feeding systems concentrate the algae’s toxins in their tissue. If tainted shellfish are ingested, the toxins can have a deadly effect.



As Arthur explained, “Paralytic Shellfish Poisoning (PSP) and other syndromes caused when people eat tainted shellfish can cause catastrophic damage to victims’ cardiovascular, digestive and/or neurological systems. Some, including PSP, are potentially fatal. If you contract this condition,” he continued, “the first symptom you’ll likely notice is a numbness on your tongue. If this occurs, you should get medical help immediately. Breathing machines have saved lives in this situation. Otherwise, the heart keeps beating but the paralyzed lungs fail, leading to death.”



Reported occurrences of PSP in coastal areas have increased in the last 30 to 40 years. Arthur said it’s possible that the condition is simply more widely recognized today and therefore more accurately reported. There are also more people living in coastal areas, as well as better monitoring systems to locate PSP-carrying algae. However, he suggested that the increase is likely due at least in part to an actual increase in HABs. Warmer water temperatures resulting from climate change have tentatively been associated with algal blooms, though research is continuing on the subject.



To those who consume seafood, Arthur suggested, “You have to be careful and know where things come from. Don’t just walk along the beach, pick up an oyster or mussel, and pop it into your mouth.” Watch the news for “red tide alerts” and, when dining at a seafood restaurant, “use the buddy system,” he advised, “so that you can notice physical or behavioral changes in each other that might signal shellfish poisoning.”



Caution is the watchword, because although “Red Tide” may not be a horror movie, its impact can be pretty horrific.