Glimpses of global warming
Advanced climate models and faster supercomputers must be developed to ensure that policymakers have better global and regional predictions of future climate and its effects
Sulfate particles from the combustion of fossil fuels contribute to the haze in Great Smoky Mountains National Park and exert a cooling effect that is considered in climate models.
As greenhouse gases accumulate in the atmosphere, many questions arise concerning how fast and in what ways Earth's environment will change. For example, in the United States, will increased emissions of carbon dioxide from coal combustion make the Southeast wetter or drier over the next 100 years?
Will changes in temperature and moisture conditions make certain U.S. regions more vulnerable to insect-borne diseases? By the year 2100, will the world's glaciers be largely melted and will some low-lying coastal lands be flooded by rising sea levels?
Detailed answers to questions like these being asked by policymakers and researchers will require more sophisticated climate models, faster supercomputers to run them, and larger data storage repositories that can be networked nationwide to store and exchange large data files. The Department of Energy's climate science mission is to improve the scientific basis for assessing potential consequences of climate change on decade-to-century time scales. The goals of the Climate and Carbon Research Institute (CCRI) at DOE's Center for Computational Sciences at ORNL are centered on this endeavor. DOE's computer centers at ORNL and elsewhere are making great progress in understanding future climate--a predicted average of weather patterns--under different scenarios. Already scientists are running global warming "experiments" on supercomputers.
Because carbon dioxide is a waste product of fossil fuel combustion in power plants and in cars and trucks, DOE's primary interest is the effect of carbon dioxide emissions on climate warming. DOE wants researchers to represent the carbon cycle correctly in their models to provide an objective framework for investigating the interactions and feedbacks among processes involving the atmosphere, land, and oceans.
CCRI, led by David Erickson and John Drake, is coupling carbon and climate models with the help of researchers in ORNL's Environmental Sciences Division. Earlier, under Drake's guidance, CCS researchers modified an important climate model so it could run on massively parallel supercomputers. The model predicts interactions between the atmosphere and land and between the atmosphere and oceans. CCRI has collaborated with the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, and with Los Alamos National Laboratory in modeling interactions among processes generated by and affecting the land, atmosphere, and oceans.
Climate modelers face difficult challenges. For example, changes in the chemical makeup of the atmosphere can have confounding effects. Greenhouse gases such as carbon dioxide emitted from the land to the atmosphere can absorb infrared radiation from Earth's surface and prevent the escape of heat. However, sulfate aerosols from coal-fired power plant emissions can have a cooling effect, moderating the temperature signal and altering weather patterns.
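The competing warming and cooling influences described above are often compared in terms of radiative forcing, the change in the planet's energy balance in watts per square meter. A minimal sketch of that bookkeeping is below, using the widely cited simplified expression for CO2 forcing, 5.35 ln(C/C0); the aerosol value used here is an illustrative placeholder, not a measured quantity.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) from CO2 relative to a
    pre-industrial baseline, using the simplified logarithmic
    expression dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Warming influence of CO2 at roughly the concentration of the
# early 2000s (~370 ppm).
f_co2 = co2_forcing(370.0)

# Sulfate aerosols exert a cooling (negative) forcing that partly
# offsets the CO2 term; -1.0 W/m^2 is an assumed illustrative value.
f_aerosol = -1.0

net = f_co2 + f_aerosol
print(f"CO2 forcing: {f_co2:.2f} W/m^2")
print(f"Net forcing with aerosol offset: {net:.2f} W/m^2")
```

Because the two terms have opposite signs and very different spatial patterns, a model that omits or misrepresents the aerosol term can overstate or misplace the warming signal, which is why the haze in the photo above matters to the simulations.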
DOE, which has funded research that has led to important breakthroughs in climate modeling, is pushing researchers to make climate simulation models more comprehensive and more detailed over the next 20 years. These models will consume a record number of compute cycles using the fastest computers ever built.
Oak Ridge and Japan
David Erickson (left, with José Hernandez) is director of CCS's Climate and Carbon Research Institute. He is a member of the scientific planning team of the Surface Ocean-Lower Atmosphere Study (SOLAS). He co-authored a chapter on the interactive effects of ozone depletion and climate change, which appears in the United Nations Environment Programme document entitled Environmental Effects of Ozone Depletion and Its Interactions with Climate Change: 2002 Assessment.
Climate simulation is an international research activity of importance to policy-makers in a variety of nations. The Japanese government has invested in climate science by building the world's largest supercomputer dedicated to fine-scale climate simulations.
The Japan Earth Simulator, ranked number one in power and speed on the latest Top500 list compiled by Jack Dongarra of the University of Tennessee and his colleagues, has a theoretical computing speed of 35 to 40 teraflops, or 35 to 40 trillion calculations per second. Significantly, the scientific codes used on supercomputers at DOE sites such as the CCS have kept America competitive with Japan in advancing the understanding of climate. DOE's Office of Science has ranked as its second priority for future facilities the development of a supercomputer at ORNL that will surpass Japanese computational abilities in climate prediction and other areas.
U.S. computing sites also are collaborating with Japan in climate prediction. In 2001 the Intergovernmental Panel on Climate Change (IPCC), relying partly on results from global climate models run on the world's supercomputers, concluded that "there is new and stronger evidence that most of the warming observed over the last 50 years is attributable to human activities." The IPCC projected that, by 2100, the global average temperature of the earth could rise by 2.7 to 10.4°F.
ORNL's CCRI has joined Japan, NCAR, and the National Energy Research Scientific Computing (NERSC) Center in California in running simulations to provide answers for IPCC's Fourth Assessment, due out in 2007. The CCRI simulations require an average of 25% of the compute hours on the IBM Power4 at CCS. The four participants are all using the same code on scenarios with different carbon concentrations in the atmosphere.
NERSC is predicting the climate and its effects through 2100 assuming that the carbon content stays the same as it is today--370 parts per million (ppm). NCAR is predicting the effects of a more optimistic scenario in which humankind finds a way to stop the buildup of atmospheric carbon when it reaches 876 ppm by 2100.
About a half dozen researchers at CCRI, aided by 20 staff members in CCS, have collaborated in building and optimizing the Community Climate System Model on the IBM Power4. The scenario simulated at ORNL assumes that humankind stabilizes atmospheric carbon concentrations at 550 ppm by 2100 and then reverses the buildup by sequestering carbon and replacing the fossil fuel economy with a hydrogen economy in which buildings and transportation vehicles are powered by fuel cells and fossil power is replaced with fission and fusion power. The IPCC runs at ORNL will generate 30 terabytes of data. Much of the data will be retained at the High Performance Storage System at CCS in Oak Ridge.
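A back-of-the-envelope comparison shows why the three scenario endpoints (370, 550, and 876 ppm) lead to very different futures. The sketch below again uses the simplified logarithmic CO2 forcing expression, plus an assumed climate sensitivity parameter of 0.8 K per W/m^2 chosen only for illustration; the real models resolve these feedbacks explicitly rather than using a single number.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    # Simplified CO2 radiative forcing (W/m^2) relative to a
    # pre-industrial baseline of 280 ppm.
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, sensitivity=0.8):
    """Rough equilibrium temperature change (deg C) for a given
    stabilized CO2 level; the sensitivity value is an assumed
    illustrative mid-range number, not a model result."""
    return sensitivity * co2_forcing(c_ppm)

# The three scenario endpoints described in the text.
scenarios = [("NERSC, hold at 370 ppm", 370.0),
             ("ORNL, stabilize at 550 ppm", 550.0),
             ("NCAR, cap at 876 ppm", 876.0)]

for label, ppm in scenarios:
    dt = equilibrium_warming(ppm)
    print(f"{label}: ~{dt:.1f} deg C equilibrium warming")
```

The logarithmic form means each doubling of CO2 adds roughly the same forcing, so the jump from 550 to 876 ppm adds nearly as much warming as all the increase from pre-industrial levels to 550 ppm.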
IPCC runs must be completed in 2004, so that the results can be analyzed and papers can be written, peer-reviewed, and published by 2006. The schedule will allow IPCC participants to assess the papers and prepare a report by 2007.
Depiction of the amount of carbon flux between the oceans and the atmosphere (peak heights) and amount of biological activity (color).
Drake says that the simulation at CCS for the IPCC is the largest set of coordinated runs he has ever seen for any project. Indeed, the climate simulation may consume the largest amount of computing resources ever used on a single set of codes with a single objective.
There are many questions about carbon and the climate that policymakers would like answered. For example, is the U.S. terrestrial system in the East taking up more carbon from the atmosphere than it is giving off? The forest that has grown in New England over the last century stores carbon, but when its leaves fall, considerable carbon is released from the decaying litter and re-enters the atmosphere.
This and many other questions involving interactions and feedbacks in the climate system remain unanswered. To answer them, historical datasets must be better developed, the climate models must be improved to make more accurate predictions about regional as well as global climate, and the supercomputers must be faster to accommodate more data and model calculations. The CCRI hopes that leadership-class computing will provide the detailed answers that researchers and policymakers need.