A passion for computation benefits every discipline
DOE/Lawrence Livermore National Laboratory
It's no secret that computers are in Lawrence Livermore's blood. That passion for computation was one of the principal factors that brought me to Livermore three years ago to become associate director of Computation. Livermore's research is invariably accomplished with the aid of computers, which include the most powerful supercomputers available. Our reliance on computers to simulate the physical world started at the Laboratory's founding in 1952, and computational excellence is a major reason for our continued success as one of the world's preeminent scientific research institutions.
Supercomputer simulations allow us to understand the intricacies of physical phenomena that occur at vanishingly short time scales and at extreme pressures and temperatures. Sometimes, an experiment cannot be conducted because it is too costly, too difficult to perform or diagnose, or involves toxic materials. The computer, in effect, serves as a virtual laboratory, one whose experimental results point to scientific "truth" as effectively as other means. Physical experiments and computational simulations often complement one another, in that physical experiments validate computational simulations, and simulations aid in the design of laboratory experiments.
During the 1970s and 1980s, as computer processing power continued to increase, Livermore researchers anticipated the central role that sophisticated, three-dimensional simulations would come to play. Indeed, Livermore coined the phrase, "Simulation is a peer to theory and experiment." However, we have not yet fully arrived. The Department of Energy's (DOE's) Stockpile Stewardship Program is leading the way because, as a matter of national policy, the U.S. does not conduct underground nuclear weapons testing to learn about the performance of nuclear warheads in our stockpile. Rather, researchers use the world's most powerful supercomputers to help characterize nuclear reactions and the aging of materials. Working on stockpile stewardship pushes us to develop new computer codes, better simulation methods, and improved visualization technologies. It also sets the requirements for new generations of machines. The resulting capabilities, in turn, are applicable to other disciplines.
As the article "Experiment and Theory Have a New Partner: Simulation" describes, many Livermore research areas, such as chemistry, materials science, climate change, and physics, have also been among the leaders in taking advantage of supercomputers to advance their fields. Other disciplines, such as the biological sciences, do not yet leverage supercomputers to such an extent. However, DOE's Human Genome Project demonstrated the power of computers in biological research. Follow-on efforts, including those describing the shape and function of proteins, can only be performed with the help of powerful simulation tools.
The competition to produce or acquire the fastest or most capable supercomputer keeps everyone sharp and raises the bar for the entire high-performance computing community. Although we avidly participate in that competition, it is important not to lose sight of why we find these computers essential. We acquire the world's most powerful machines to strengthen national security and to advance our understanding of the physical world.
To continue to provide robust computational resources to our scientists and engineers, we look for emerging technologies that can offer increased power at lower cost. The new supercomputer Purple, built by IBM, is a massively parallel system with a peak performance of 100 trillion operations per second (100 teraops); it will be in operation at Livermore later this year. In 2004, we installed Thunder, a cluster of standard desktop microprocessors running the Linux operating system and capable of 23 teraops. Using Thunder can be somewhat risky for some projects, but it offers enormous potential for reducing costs and compute time.
Another computer from IBM, BlueGene/L, will eventually be capable of 360 teraops. We call its design "disruptive" because it uses 131,072 commercial microprocessors to achieve unprecedented computing power. Clever packaging allows the machine to use less floor space and consume less energy than previous computers.
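For readers curious how such peak ratings arise, here is a back-of-the-envelope sketch in Python. A machine's peak rate is simply its processor count multiplied by each processor's theoretical throughput; the per-processor figures below are illustrative assumptions chosen to roughly reproduce the quoted numbers, not published specifications.

```python
def peak_teraops(processors: int, ops_per_second_each: float) -> float:
    """Aggregate peak rate: processor count x per-processor throughput, in teraops."""
    return processors * ops_per_second_each / 1e12

# Thunder: assuming ~4,096 processors at ~5.6 billion ops/s each
# (e.g., 4 operations per cycle at a 1.4-gigahertz clock) -- assumed figures.
print(f"Thunder:    {peak_teraops(4096, 5.6e9):.1f} teraops")     # ~22.9, close to the quoted 23

# BlueGene/L: 131,072 processors (quoted above), assuming ~2.8 billion ops/s each.
print(f"BlueGene/L: {peak_teraops(131_072, 2.8e9):.0f} teraops")  # ~367, close to the quoted 360
```

Peak figures like these are theoretical ceilings; the performance an application sustains in practice is typically a fraction of peak.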
BlueGene/L still will not be powerful enough to simulate all the complexities of matter at extreme pressures and temperatures, so we look to acquire a petaops (1 quadrillion operations per second or 1,000 teraops) supercomputer by 2010. The JASONs, a prestigious advisory committee to DOE, validated our need for such a machine.
Will Livermore and DOE scientists be satisfied with the astonishing capability of a petaops machine? Not likely. Solving grand challenge scientific problems will continue to require ever faster and more capable supercomputers. We look forward to continuing to meet that need.