Parallel computers 'evolutionize' research
[Image: Two views of a 20-km-resolution simulation of the "Perfect Storm." The top view depicts atmospheric moisture; the bottom shows rainfall and cloud water.]
A major research trend is harnessing advanced computers to complement theory and experiment. Advanced computing allows scientists to conduct experiments that could not otherwise be done, to test possible experiments before investing the time and money to physically carry them out, and to create models of complex phenomena.
Fueling the growth of scientific computing are the rapid expansion and availability of parallel computing facilities, such as Argonne's Chiba City, a 512-processor parallel computer based on the Linux operating system. A typical desktop computer has only one central processing unit. A parallel computer, however, has many processors that coordinate to work on smaller parts of a larger problem. Modern technology can even link computers around the world, providing still greater power to solve complex problems (see Globus Toolkit enables Grid computing).
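The divide-and-coordinate idea can be sketched in a few lines. This is a generic illustration of splitting one large computation across worker processes, not code from Chiba City; the squared-sum problem is invented for the example.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Each worker handles one independent slice of the full range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    # Split the big problem into smaller pieces, one per processor.
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    # The combined result matches the serial computation.
    assert total == sum(i * i for i in range(n))
```

Real scientific codes add a harder ingredient: the pieces usually depend on each other, so the processors must exchange boundary data as they go.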
Argonne computer scientist Mike Minkoff sees scientific computing as a natural evolution of science. "Research," he said, "combines three activities: experimental observation; the development of mathematical models, such as Newton's laws of motion, to describe the observations; and computation to test the models by applying them to new experimental observations."
In the past 20 years, the computer has given experimentalists more direct feedback, partly by speeding computation and lowering the costs of solving larger problems and partly by enabling researchers to develop and test models quickly.
Modeling the "Perfect Storm"
John Taylor of Argonne's Regional Climate Center develops large-scale models to estimate regional impacts of climate change. He focuses on the American Midwest and Great Plains, but his tools and techniques have been applied to other regions.
"Global climate models don't do a good job of predicting regional climate," Taylor said, "because the smallest area they examine is typically a square that measures 200 by 200 kilometers. A grid cell that large can't reveal extreme weather, which tends to be local in scale."
Taylor's models use 1- to 10-km cells. "At this level of detail," he said, "the model can include specific regional features--mountains, valleys, bodies of water--that shape local winds and precipitation. You begin to see extreme events, such as more intense rainfall and wind storms."
Taylor's group has developed a model of the "Perfect Storm," which hit the north Atlantic in October 1991 and subsequently inspired a best-selling novel and a Hollywood movie. At 20-km resolution, the model revealed a second hurricane that weather services never reported.
Because computing time rises sharply as resolution increases, regional climate modeling requires considerable computing power. To meet this requirement, Taylor's group uses Argonne's Chiba City. "The calculations scale well," he said, "and we can access a large number of processors to perform the runs cost effectively. Argonne is one of the few places in the world with a large-scale cluster testbed available for this kind of research."
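A back-of-envelope count shows why finer grids are so expensive. The 2000-by-2000-km domain below is an illustrative size, not a figure from Taylor's model:

```python
# Grid cells needed to cover a fixed 2000 x 2000 km domain
# (illustrative domain size) at each horizontal resolution.
domain_km = 2000

for cell_km in (200, 20, 1):
    cells = (domain_km // cell_km) ** 2
    print(f"{cell_km:>3}-km cells: {cells:>9,} grid cells")

# 200-km cells:       100
#  20-km cells:    10,000
#   1-km cells: 4,000,000
# Each tenfold refinement multiplies the cell count by 100 in 2-D,
# and finer grids also force shorter time steps, so cost grows faster still.
```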
His research is funded by the U.S. Department of Energy's (DOE) Office of Biological and Environmental Research and the U.S. Environmental Protection Agency.
Mapping reactor behavior
David Weber's group in Argonne's Reactor Analysis and Engineering Division is working with the Korea Atomic Energy Research Institute (KAERI) and Purdue University to model the core of pressurized-water reactors, the world's most common type of commercial reactor. Their work, funded by DOE's Office of Nuclear Energy, could improve reactor performance, extend operating lifetime and increase output without compromising safety.
The project uses advanced computing tools to predict fuel and coolant temperatures throughout the core during normal and abnormal conditions. Because of the expense of operating massively parallel computers and the large size of the model, these tools will be used to improve and verify smaller, more economical models for routine use.
"This project is using massively parallel computers to look at the complex feedback relationships between reactor fuel and coolant in an integrated reactor system," said Weber, Reactor Analysis and Engineering Division director.
Nuclear reactors use fission to produce heat. The heat boils water to produce steam, and the steam turns a turbine, which generates electricity. Heat generation in the core depends on two phenomena:
- Neutron flux, the rate at which neutrons pass through the core, and
- Neutron cross section, the probability that a neutron strikes a fissile nucleus and drives the chain reaction.
Higher neutron flux can raise fuel temperature, but higher fuel temperature can reduce the effective neutron cross section, which lowers the fission rate and in turn reduces fuel temperature.
"Feedback is balanced when the reactor operates normally at constant power," Weber said, "because heat production and heat removal are equal. But when something changes--an operator adjusts the power or some short-term incident occurs--the system's response depends on feedback and operator actions."
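The stabilizing effect of this feedback can be shown with a toy model. The constants below are invented for illustration; this is not the Argonne-KAERI-Purdue simulation, just the feedback idea in miniature:

```python
# Toy sketch of negative temperature feedback (invented constants).
def simulate(steps=5000, dt=0.1, rho0=0.05, alpha=0.005, cooling=1.0):
    power, temp = 1.0, 300.0  # startup power, coolant inlet temperature
    for _ in range(steps):
        # Higher fuel temperature reduces reactivity (negative feedback).
        reactivity = rho0 - alpha * (temp - 300.0)
        power += dt * reactivity * power
        # Temperature rises with heat production, falls with heat removal.
        temp += dt * (power - cooling * (temp - 300.0))
    return power, temp

power, temp = simulate()
# The system settles where heat production equals heat removal:
# power -> cooling * rho0 / alpha = 10, temp -> 300 + rho0 / alpha = 310.
```

When something perturbs the system, the same feedback pulls it back toward that balance point, which is the behavior Weber describes for normal operation.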
The Argonne-KAERI-Purdue collaboration is building on previous Argonne work, which churned through a 240-million-cell model of a reactor core in 58 hours on a 200-processor parallel computer at IBM's SP Benchmark and Enablement Center, Poughkeepsie, N.Y.
"The earlier work," he said, "looked only at the temperature of the coolant as it flows through the core. This project incorporates details of neutron flux and fuel temperatures that we couldn't attempt without massively parallel computers."
Additional details include turbulent mixing when the water flow encounters components in the core. The mixing is beneficial, Weber said, because it creates more homogeneous coolant temperature and enhances heat transfer, but the turbulence makes the pumps work harder. "Our work may help redesign components to promote mixing while reducing pumping losses."
Modeling combustion

Mike Minkoff collaborates with chemist Al Wagner to model the basic chemical reactions of burning fuels. They are improving methods for calculating combustion-rate coefficients, which industry uses to model cleaner-burning, more efficient energy systems.
"Coefficients for reactions involving simple chemical species--such as oxygen reacting with hydrogen--can be calculated precisely on a desktop PC," Minkoff said. But as molecular size increases, the calculations rapidly outstrip the capabilities of most massively parallel computers.
Minkoff and Wagner use a matrix-based approach that allows parallel computers to handle systems of molecules containing many atoms. The elements of the matrix are combinations of kinetic and potential energy associated with the molecules' relative proximity and orientation. The computations span a multidimensional space and identify the most stable energy states among the molecules. Key to their research is PETSc, the "Portable, Extensible Toolkit for Scientific Computation," developed by Argonne's Mathematics and Computer Science Division to solve large-scale, specialized problems.
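The core matrix idea can be illustrated at toy scale. The 3-by-3 symmetric "Hamiltonian" below is invented, and the real calculations use PETSc on vastly larger matrices, but the goal is the same: find the lowest, most stable energy state.

```python
# Miniature sketch: a small symmetric energy matrix whose entries mix
# kinetic and potential contributions (invented numbers), solved for
# its lowest-energy state by power iteration on a shifted matrix.
import math

H = [[1.0, 0.5, 0.0],
     [0.5, 2.0, 0.5],
     [0.0, 0.5, 3.0]]

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def lowest_state(H, shift=10.0, iters=500):
    """Power iteration on (shift*I - H): its dominant eigenvector is
    H's lowest-energy state when shift exceeds H's largest eigenvalue."""
    v = [1.0] * len(H)
    for _ in range(iters):
        w = [shift * vi - hv for vi, hv in zip(v, matvec(H, v))]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    energy = sum(vi * hvi for vi, hvi in zip(v, matvec(H, v)))  # Rayleigh quotient
    return energy, v

energy, state = lowest_state(H)
# Coupling between levels pushes the ground-state energy below the
# lowest diagonal entry, so energy < 1.0 for this matrix.
```

Libraries like PETSc exist because at realistic sizes the matrix no longer fits on one machine, and the matrix-vector products themselves must be distributed across processors.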
Minkoff and Wagner start by modeling reactions of simple, two-atom molecules and test their results against the precise mathematical solution. They then expand their model to more complex molecules containing many atoms and compare their results with those from the traditional approach, which uses statistical methods to estimate the coefficients. Their work is funded by DOE's Office of Basic Energy Sciences and Office of Advanced Scientific Computing Research.
Diving into the nucleus
Argonne physicists Steve Pieper and Bob Wiringa work deeper inside the atom. They use parallel supercomputers to calculate the forces that bind together nucleons--protons and neutrons--to form atomic nuclei. Their goal is to develop theory that matches observation.
"Good theory has to explain, for example, why there's no stable eight-body nucleus," Wiringa said. "This imposes a limit on the nuclei created in the earliest moments of the Big Bang. In the beginning, there were no nuclei with more than seven nucleons."
A key challenge is to find models that work for both neutron-rich nuclei--those with a high ratio of neutrons to protons--and for those with equal numbers.
"If we want to compute the forces in neutron stars, which are essentially all neutrons," Wiringa said, "we need to understand neutron-rich nuclei like helium-10, which is unstable." With two protons and eight neutrons, helium-10 is the most neutron-rich nucleus known.
Pieper and Wiringa study nuclei with five to 10 nucleons. Their work begins with a model that calculates the binding energies for two-body nuclei.
Binding energies, the precise energies required to break a nucleus apart, are known from thousands of collision experiments. Each nucleus has more than one binding energy: one for its "ground," or most stable, state, and others for excited intermediate states reached when an interaction imparts some energy but not enough to break the nucleus apart.
Pieper and Wiringa's two-body model is the "Argonne potential," published in 1995 by Wiringa and colleagues from Flinders University, South Australia, and Old Dominion University, Va. In 2000, their paper was the world's most cited theoretical nuclear physics publication.
To study nuclei with more than two nucleons, Pieper and Wiringa extend the two-body model by adding the "Illinois family," a collection of three-body models they developed with Vijay Pandharipande of the University of Illinois at Urbana-Champaign.
"The required computing power," explained Pieper, "increases exponentially with the number of nucleons. Calculations for up to six bodies can be done on a modern PC. For seven or eight, we can use Chiba City comfortably, but it's a stretch for nine. For 10, we use the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory."
To compute a single 10-body energy state, 500 processors work in parallel at 250 million operations a second for eight to 15 hours. "And," said Pieper, "we have to test a whole family of energy states."
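The figures Pieper quotes imply a staggering operation count, which a quick calculation makes concrete:

```python
# Total operations for one 10-body energy state, from the quoted
# figures: 500 processors at 250 million operations per second,
# running for 8 to 15 hours.
processors = 500
ops_per_sec = 250e6

for hours in (8, 15):
    total = processors * ops_per_sec * hours * 3600
    print(f"{hours:>2} hours: {total:.2e} operations")

# About 3.6e15 to 6.8e15 operations per energy state -- and a whole
# family of states must be computed for each nucleus.
```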
Their recent work, funded by DOE's Nuclear Physics Division, provides a fairly consistent picture of binding in nuclei with up to 10 nucleons. Their next step, as computing power continues to expand, will be to extend their work to larger nuclei.
For more information, please contact David Baurac.