It's all in motion when simulating fluids
DOE/Lawrence Livermore National Laboratory
Whether it's the mechanics of a supernova, the ignition of an inertial confinement fusion (ICF) capsule, or the detonation of a nuclear weapon, simulating the motion of fluids is anything but simple. Every piece of the model is moving. Fluids interact with each other and with solid materials, and those interactions occur quickly and at intense pressures and temperatures. In some instances, the fluid has almost nothing with which to collide, as happens in simulations of a nuclear explosion in the upper atmosphere or of the electrons in a fluorescent light bulb.
Scientists in many fields are interested in the motions of fluids. In Livermore's Defense and Nuclear Technologies (DNT) Directorate, researchers study fluid motion to answer questions in astrophysics, atomic and nuclear physics, computational physics, fluid dynamics, turbulence, high-energy-density physics, radiation transfer, and particle transport. Computer scientists work with physicists to write codes that simulate experiments, providing data to augment experimental results or to verify and test the codes used to predict results in regimes where experiments cannot be performed.
Livermore physicist Jim Rathkopf, who is associate leader for DNT's AX Division, says, "We're looking at a variety of high-energy-density problems related to ICF and national security issues. Many of them require codes with different approaches for modeling the hydrodynamics of gases and plasmas in motion. Our multiphysics codes also simulate physical processes such as radiation, neutron, and charged-particle transport in addition to hydrodynamics."
Three Livermore codes that explore this world of motion are HYDRA, MIRANDA, and CPK. The radiation hydrodynamics code HYDRA is used to help design targets for the National Ignition Facility (NIF). MIRANDA looks at hydrodynamic instabilities, and the complex particle kinetics code CPK bridges the hard-to-model regime between collisionless and collision-dominated plasmas.
Simulating Targets for NIF
When all of NIF's 192 laser beams are operational, they will deliver up to 1.8 million joules of ultraviolet laser energy and 500 terawatts of power to millimeter-size targets. Any asymmetry in how each beam delivers this energy or any perturbation (roughness) on the surface of a target capsule can affect the target's performance. Computational scientists developed HYDRA to evaluate different target designs in realistic three-dimensional (3D) geometries.
The baseline target design for ICF experiments on NIF uses a small metal cylinder, called a hohlraum, surrounding a spherical plastic or beryllium fusion capsule that contains a small amount of deuterium–tritium fuel. When NIF's laser beams simultaneously deposit their energy on the hohlraum's inner surface, much of it is converted to thermal x rays. When the x rays ablate the surface of the capsule, they create a rocketlike effect that causes the capsule to implode, producing high temperatures and pressures in the fusion fuel.
HYDRA can simulate the entire ignition target in 3D, including the hohlraum, capsule, and all relevant features. The code is flexible enough to model intrinsic asymmetries that result from the ideal laser illumination pattern and those that result from effects of irregularities in laser pointing and power balance. It also simulates the hydrodynamic instabilities that occur when the capsule implodes. HYDRA calculates all of the radiation, electron, ion, and charged-particle transport and the hydrodynamics from first principles--that is, no adjustments are made to the modeling parameters.
These simulations allow scientists to evaluate the robustness of a target design. For example, a designer can place realistic roughness on the capsule surfaces and calculate how these features evolve into irregularities--bubble and spike patterns--as a result of hydrodynamic instabilities. Three-dimensional simulations indicate that the ultimate amplitudes of the bubbles and spikes are greater than those predicted by 2D simulations. Thus, the 3D calculations provide more accurate information on the peak amplitudes of these irregularities and how they affect target performance.
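To get a rough feel for how quickly such ripples can grow, textbook linear theory (not the first-principles physics HYDRA actually solves) gives a Rayleigh–Taylor growth rate of sqrt(A k g), where k is the ripple's wavenumber, g the acceleration, and A the Atwood number built from the two densities. A minimal sketch, with illustrative numbers only:

```python
import math

def atwood(rho_heavy, rho_light):
    """Atwood number A = (rho2 - rho1) / (rho2 + rho1)."""
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

def rt_growth_rate(wavelength_m, accel_m_s2, rho_heavy, rho_light):
    """Classical linear Rayleigh-Taylor growth rate, gamma = sqrt(A * k * g).

    Valid only for small perturbations on an ideal, inviscid interface;
    real ICF capsules add ablative stabilization and other effects.
    """
    k = 2.0 * math.pi / wavelength_m          # wavenumber of the ripple
    return math.sqrt(atwood(rho_heavy, rho_light) * k * accel_m_s2)

# Illustrative, made-up numbers: a 50-micrometer ripple on an interface
# between fluids of density 3 and 1 (arbitrary units), accelerating at 1e12 m/s^2.
gamma = rt_growth_rate(50e-6, 1e12, 3.0, 1.0)
print(f"growth rate ~ {gamma:.3e} 1/s; amplitude e-folds every {1/gamma:.3e} s")
```

The exponential sensitivity to wavenumber and acceleration is why even tiny capsule-surface roughness matters, and why resolving the full range of instability modes in 3D pays off.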
HYDRA calculations can now be run on Livermore's Linux-based MCR supercomputing cluster, and with that processing capability, the 7- to 10-day run times have been cut in half compared with run times from a year ago. "For the first time, we can model the entire ignition target in an integrated, 3D simulation," says Livermore physicist Marty Marinak, who leads the HYDRA development team. "We also can resolve the 3D evolution of the full range of interesting Rayleigh–Taylor instability modes. As a result, we have a substantially clearer understanding of how our target designs perform."
Designers are also using HYDRA to evaluate alternative target designs, including one with two concentric spherical shells and direct-drive targets that eliminate the need for a hohlraum. The HYDRA development team continues to enhance the code's capabilities in response to user requests. One new physics package will treat magnetic fields in 3D. Says Marinak, "This addition will enable unprecedented completeness in our modeling and further improve our understanding of the target physics."
Observing Hydrodynamic Instabilities
Another Livermore-developed hydrodynamics code, MIRANDA, is being used to study the behavior of instabilities that evolve when materials of different densities are accelerated. Weapon physicists use MIRANDA simulations to better understand the physics involved in several stockpile stewardship issues they need to address. It also can provide information of interest in astrophysics research.
Developed by physicists Andrew Cook and Bill Cabot, MIRANDA can run as either a direct numerical simulation (DNS) or a large-eddy simulation (LES). In DNS mode, physical processes are calculated from first principles or as close as possible. In LES mode, models are added to describe some of the finer-scale physical processes and, thus, reduce the computational time.
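The computational motivation for LES is easy to quantify with a standard rule of thumb (Kolmogorov scaling, not anything specific to MIRANDA): a 3D DNS needs on the order of Re^(9/4) grid points to resolve every eddy in a turbulent flow at Reynolds number Re. A minimal sketch of that scaling, using the Reynolds numbers quoted later in this article:

```python
def dns_grid_points(reynolds):
    """Rough Kolmogorov estimate: a 3D DNS needs ~ Re**(9/4) grid points
    to resolve everything from the largest eddies down to the dissipation
    scale. Order-of-magnitude reasoning only, not MIRANDA's actual meshing.
    """
    return reynolds ** 2.25

for re in (5_500, 40_000):
    print(f"Re = {re:>6,}: ~{dns_grid_points(re):.2e} grid points for full DNS")

# LES sidesteps this wall: it resolves only the large eddies on the grid and
# hands the unresolved small scales to a subgrid model, cutting the cost sharply.
```

Under this estimate, pushing from Re of 5,500 to 40,000 multiplies the grid-point count by nearly two orders of magnitude, which is why the subgrid modeling in LES mode buys so much.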
"With this code, we can explore topics of interest, such as Rayleigh–Taylor instabilities and shock-induced mixing, at higher resolutions than ever before," says Livermore physicist Paul Miller. For instance, a high-resolution simulation of Rayleigh–Taylor instability revealed how the flow evolves through four stages. MIRANDA can even model the stage beyond the mixing transition, where the quantity of mixed fluid increases dramatically.
MIRANDA simulations are also allowing physicists to examine situations that are difficult to measure in experiments, such as fluid flow at high Reynolds numbers. The Reynolds number is a dimensionless parameter used to characterize the ratio of inertial effects to viscous effects in fluid flow. Flows at low Reynolds numbers are slow, small in size, viscous, or a combination of all three--for example, a raindrop sliding down a window, liquid being slowly sipped through a straw, or motor oil being poured from a container. Because these low-Reynolds-number flows involve a small range of scales, they are relatively easy to simulate.
In contrast, flows at high Reynolds numbers are fast, large scale, less viscous, or some combination of those properties. High-Reynolds-number flows include air flowing around a truck on a freeway, swirling inside a thunderstorm, or jetting from the nozzle of a leaf blower. Such high-speed fluid dynamics are frequently of interest for scientific applications but are more challenging to compute. By running MIRANDA in both DNS and LES modes, physicists can examine fluid dynamics at a wider range of Reynolds numbers. "With DNS techniques, we can only model flows at Reynolds numbers up to about 5,500," says Cook. "But with LES techniques, we can look at systems up to about 40,000." At these conditions, the simulations can be validated only by statistical comparisons with experiments, so the team is "bootstrapping" results: they compare results generated at the limit of DNS techniques with those generated by LES. Agreement between the DNS and LES data helps validate the LES results.
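For concreteness, the Reynolds number is Re = rho * v * L / mu: density times speed times a characteristic length, divided by the dynamic viscosity. A back-of-the-envelope comparison of two of the examples above, using rough, assumed property values rather than measurements:

```python
def reynolds(density_kg_m3, speed_m_s, length_m, viscosity_pa_s):
    """Re = rho * v * L / mu: the ratio of inertial to viscous effects."""
    return density_kg_m3 * speed_m_s * length_m / viscosity_pa_s

# Rough, illustrative property values only:
drop  = reynolds(1000.0, 0.01, 0.002, 1.0e-3)   # water drop creeping down glass
truck = reynolds(1.2,    25.0, 3.0,   1.8e-5)   # air around a freeway truck
print(f"raindrop on a window: Re ~ {drop:.0f}")   # ~ 20: viscosity dominates
print(f"air past a truck:     Re ~ {truck:.1e}")  # ~ 5e6: fully turbulent
```

The spread of several orders of magnitude between everyday flows is exactly the range-of-scales problem that separates "relatively easy to simulate" from "more challenging to compute."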
The runs are complex and use the tremendous computational resources of two Livermore supercomputers, MCR and ALC. In one calculation, 1,728 processors were used for nearly a month. When the BlueGene/L supercomputer is available, scientists will have the computing power needed to run DNS calculations above the mixing transition, an important regime in which to validate the LES results.
Bridging Fluid and Particle Motion
The CPK code, like HYDRA, can be applied to ICF problems. CPK's forte is semicollisional plasmas, where two fluids only partially collide. For example, when two puffs of gas are released, they "splash" into each other, and some--but not all--of their particles collide. Most of these particles flow in the same direction and at the same rate, near the average flow velocity. However, a few of them move in other directions, expanding away from the direction of flow and at significantly different velocities.
Although current simulation algorithms accurately model collisionless and collision-dominated plasmas, semicollisional plasmas have proven to be more difficult. "As it turns out, neither fluid nor kinetic models are adequate for the semicollisional regime," says Livermore scientist Dennis Hewett. "The simple hydrodynamics or fluid model assumes that the plasma is collision-dominated, and the velocity can be represented by a smooth Maxwellian distribution. However, because the hydrodynamics model doesn't look at the system particle by particle, it can't simulate ionization or particles that move against the flow."
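To illustrate the limitation Hewett describes, consider two cold plasma beams streaming through each other. A fluid model would replace them with a single Maxwellian carrying their combined mean velocity and spread, erasing the interpenetration entirely. A minimal sketch (the function names and numbers are illustrative, not from CPK):

```python
import math

def maxwellian_1d(v, u, vth):
    """1D Maxwellian f(v): a bell curve with drift velocity u and thermal spread vth."""
    return math.exp(-((v - u) ** 2) / (2.0 * vth ** 2)) / (vth * math.sqrt(2.0 * math.pi))

def beams(v):
    """Two cold counter-streaming beams drifting at v = -1 and v = +1."""
    return 0.5 * (maxwellian_1d(v, -1.0, 0.05) + maxwellian_1d(v, +1.0, 0.05))

def fluid(v):
    """The single Maxwellian a fluid model would impose: mean 0, spread ~1."""
    return maxwellian_1d(v, 0.0, 1.0)

# At v = 0 the two descriptions disagree badly: the real distribution is nearly
# empty between the beams, while the fluid picture puts its peak there.
print(f"two beams at v=0:   {beams(0.0):.4f}")   # ~ 0
print(f"fluid model at v=0: {fluid(0.0):.4f}")   # ~ 0.4
```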
A kinetic model, such as a particle-in-cell (PIC) code, performs calculations for "macroparticles," which are used to represent the system and thus capture its kinetic behavior. "The drawback with a kinetic model is that macroparticles populate all of the areas--even those where nothing is going on that we're interested in," says Hewett, "and those calculations become expensive in terms of computational time and effort. Instead, what we need is an algorithm to bridge this gap by accurately simulating semicollisional systems."

The CPK code uses an ensemble of small, fluidlike macroparticles to represent particle distributions in phase space, a mathematical construct that has the number of dimensions needed to define the state of a given substance or system. These macroparticles are Gaussian-shaped in both position and velocity; that is, plots of the positions and velocities of the particles that make up each macroparticle form bell-shaped curves. As the calculation progresses, the internal dynamics of the macroparticles may cause their shape in phase space to "tilt" as the velocity and spatial distributions evolve. The macroparticles may then break apart or combine with other macroparticles as the simulation continues.
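One way to picture such a macroparticle is as a small Gaussian blob in phase space, carried by a mean position and velocity plus spreads and a position–velocity correlation. The sketch below is a simplified 1D illustration of that idea, including how free streaming "tilts" the blob; it is not the CPK code itself, and every name in it is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Macroparticle:
    """A 1D Gaussian blob in (x, v) phase space: one CPK-like macroparticle.

    Illustrative only; the fields and names are hypothetical, not CPK's.
    """
    weight: float   # how many physical particles the blob represents
    x: float        # mean position
    v: float        # mean velocity
    sxx: float      # position variance
    svv: float      # velocity variance
    sxv: float      # position-velocity covariance (zero means an upright blob)

    def free_stream(self, dt: float) -> None:
        """Advance with no forces or collisions: x -> x + v*dt.

        The covariance picks up a velocity-dependent term, which is exactly
        the 'tilt' of the blob in phase space as faster particles outrun
        slower ones.
        """
        self.x   += self.v * dt
        self.sxx += 2.0 * self.sxv * dt + self.svv * dt * dt
        self.sxv += self.svv * dt

p = Macroparticle(weight=1e12, x=0.0, v=1.0, sxx=1.0, svv=0.25, sxv=0.0)
p.free_stream(dt=2.0)
print(p)   # sxv is now nonzero: the blob has tilted in phase space
```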
If a lot of kinetic activity emerges, the code allows one macroparticle to be fragmented into many smaller particles that can be simulated with a kinetic model to make the important features easier to observe. Conversely, when collisions dominate a region with many macroparticles, those macroparticles can be merged, and their fluidlike behavior modeled with a hydrodynamics calculation. With this approach, computational effort is reserved for areas with the more relevant events and not squandered on less interesting areas. "The key to this approach," says Hewett, "is to design fragmentation and merging algorithms so that the code can reassemble the fragmented pieces and preserve the initial distribution without adding incidental new features or 'pseudo-physics' to the problem."
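Continuing the toy 1D sketch above (and reusing its hypothetical Macroparticle class), a moment-matched merge shows the flavor of the bookkeeping: the merged blob carries the same total weight, mean position and velocity, and combined spreads as the pair it replaces, so no pseudo-physics enters at the level of these low-order moments. This illustrates the idea, not CPK's actual merging algorithm:

```python
def merge(a: Macroparticle, b: Macroparticle) -> Macroparticle:
    """Merge two Gaussian macroparticles into one with the same total weight
    and the same low-order moments (a mixture-of-Gaussians moment match)."""
    w = a.weight + b.weight
    fa, fb = a.weight / w, b.weight / w
    x = fa * a.x + fb * b.x          # weight-averaged position (mass center)
    v = fa * a.v + fb * b.v          # weight-averaged velocity (momentum)
    # Law of total (co)variance: within-blob spread plus between-blob spread.
    sxx = fa * (a.sxx + (a.x - x) ** 2) + fb * (b.sxx + (b.x - x) ** 2)
    svv = fa * (a.svv + (a.v - v) ** 2) + fb * (b.svv + (b.v - v) ** 2)
    sxv = (fa * (a.sxv + (a.x - x) * (a.v - v))
           + fb * (b.sxv + (b.x - x) * (b.v - v)))
    return Macroparticle(weight=w, x=x, v=v, sxx=sxx, svv=svv, sxv=sxv)

# Example: merging two blobs from a collision-dominated region.
left  = Macroparticle(weight=1e12, x=-0.1, v=0.9, sxx=0.5, svv=0.2, sxv=0.0)
right = Macroparticle(weight=2e12, x=+0.2, v=1.1, sxx=0.5, svv=0.2, sxv=0.0)
print(merge(left, right))
```

Fragmentation runs the same logic in reverse, splitting one blob into several whose weighted moments sum back to the original, which is what lets the code reassemble the pieces without inventing new features.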
The CPK code works well at the limits of fluid and particle behavior. More importantly, results from semicollisional simulations, developed by coworker David Larson, agree with those from experiments, which helps validate the code's accuracy.
For example, Larson used CPK to simulate an experiment conducted by Livermore physicist Alan Wan in which a laser beam strikes two metal slabs, which are oriented 45 degrees from the beam and 90 degrees from each other. When the metal ionizes, two plasma beams stream off the surfaces and interpenetrate along the symmetry axis. Simulations performed with a fluid model do not allow interpenetration, so in those simulations, a strong density peak is formed where the two beams collide. This central peak eventually blows apart and forms two off-axis density peaks.
In the CPK simulation, the same general sequence evolves. However, because the beams initially interpenetrate rather than stagnating abruptly, the softer on-axis stagnation yields a significantly lower density. The subsequent off-axis density peaks are also lower than those predicted by the fluid model. Results from the CPK simulations match the experimental results.
"We're now adding ionization physics to the code," says Hewett. "Livermore is one of the few places working on such a code to bridge fluid and collisionless models."
Just the Beginning
These three codes are not, of course, the only ones in AX Division's computational arsenal, says Rathkopf. There's RAPTOR, which, like MIRANDA, can be used to study hydrodynamic instabilities but can adapt its underlying computational mesh in response to fluid conditions. LASNEX is a radiation and hydrodynamics code for ICF simulations. F3D models laser–plasma interactions, and KULL is a radiation and hydrodynamics code with capabilities similar to HYDRA's.
"We also have a suite of codes that we use to simulate the performance of a nuclear weapon's secondary stage," says Rathkopf. "Many of these codes have methods for simulating gas dynamics--exploring what happens when things are in motion."
--Ann Parker