Large-scale simulation reveals novel insights about turbulence
Using HLRS’s Hawk supercomputer, University of Stuttgart scientists have for the first time produced high-resolution simulation data characterizing the transition from low to high Reynolds numbers in turbulent boundary layers.
Gauss Centre for Supercomputing
image: After a certain point in the development of a turbulent flow — for example, as air moves over a wing in flight — the outer region of the turbulent boundary layer (where blue dominates) maintains a persistent, self-similar physical structure.
Credit: IAG, University of Stuttgart
Scientists at the University of Stuttgart’s Institute of Aerodynamics and Gas Dynamics (IAG) have produced a novel dataset that will improve the development of turbulence models. With the help of the Hawk supercomputer at the High-Performance Computing Center Stuttgart (HLRS), investigators in the laboratory of Dr. Christoph Wenzel conducted a large-scale direct numerical simulation of a spatially evolving turbulent boundary layer. The simulation, which consumed more than 100 million CPU hours on Hawk, is unique in that it captures the onset of a canonical, fully-developed turbulent state in a single computational domain. The study also identified with unprecedented clarity an inflection point at which the outer region of the turbulent boundary layer begins to maintain a self-similar structure as it moves toward high Reynolds numbers. The results appear in a new paper published in the Journal of Fluid Mechanics.
“Our team’s goal is to understand unexplored parameter regimes in turbulent boundary layers,” said Jason Appelbaum, a PhD candidate in the Wenzel Lab and leader of this research. “By running a large-scale simulation that fully resolves the entire development of turbulence from an early to an evolved state, we have generated the first reliable, full-resolution dataset for investigating how high-Reynolds-number effects emerge.”
Why it's difficult to study moderate Reynolds numbers
During flight, turbulence causes very high shear stress on the surface of an aircraft. The resulting drag can reduce flight performance and fuel efficiency. To predict this effect, aerospace engineers rely on computational models of the turbulent boundary layer, the millimeters-thin region where the surface of the aircraft interacts with free-flowing air.
For industrial applications, turbulence models do not need to replicate physics down to the finest details; they must only be accurate enough for practical use and be capable of running smoothly on modest computing resources. Before engineers can use such simplified models, however, scientific research using high-performance computing systems is necessary to provide the data on which they are based. This is why the Wenzel Lab has long used HLRS’s supercomputers to run NS3D, direct numerical simulation software it created to investigate fundamental physical properties of turbulent boundary layers at extremely high resolution.
Scientists in the field of computational fluid dynamics (CFD) use a quantity called the Reynolds number to characterize the developmental state of a turbulent boundary layer. The Reynolds number is the ratio of inertial forces to viscous forces in a fluid flow, and it governs the local range of turbulent eddy sizes. At low Reynolds numbers, which occur early in the development of a boundary layer along a surface moving through air, the nonlinear convective instabilities responsible for turbulence are quickly damped by viscous action at small scales. With increasing Reynolds number, the turbulent boundary layer becomes thicker. Large, coherent structures emerge, creating a more complex turbulent system that is not simply an extrapolation of trends at low Reynolds numbers, but has its own distinct properties.
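As a rough illustration of that definition (using order-of-magnitude placeholder values, not the flow parameters of the IAG study), the Reynolds number can be computed directly as flow speed times a length scale, divided by viscosity:

```python
# Illustrative sketch only: placeholder values, not the parameters of the IAG simulation.
# Reynolds number Re = U * L / nu  (inertial forces relative to viscous forces)

U = 500.0     # characteristic flow speed in m/s (a supersonic free stream)
L = 0.01      # characteristic length in m, e.g. a boundary layer about 1 cm thick
nu = 1.5e-5   # kinematic viscosity of air in m^2/s (near sea-level conditions)

Re = U * L / nu
print(f"Re = {Re:.1e}")  # ~3e5; a higher Re means a wider range of turbulent eddy sizes
```

The larger this ratio, the less viscosity is able to damp the smallest eddies, which is why the range of turbulent scales widens as the boundary layer grows.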
In the past, CFD simulations have generated rich datasets for understanding turbulence at low Reynolds numbers. This is because the computational domain size and the necessary number of simulation time steps involved at this stage are still relatively small. By today's standards, this means that the simulations are not prohibitively expensive. Laboratory experiments also provide invaluable data for turbulence research. For the quantities relevant to the present study, however, experiments have focused only on the high-Reynolds-number regime because of physical limitations: sensors can only be machined so small, and some fundamental physical quantities, such as shear stress, are notoriously difficult to measure in the lab with high accuracy.
As a result, scientists have accumulated a wealth of simulation data for low Reynolds numbers and reliable experimental data for high Reynolds numbers. What has been missing, however, is a clear picture of what happens in between, as both simulation and experimental methods are of limited use. Appelbaum and his collaborators in the Wenzel Lab set out to attack this problem directly.
A sharp turn
Using HLRS’s Hawk supercomputer, Appelbaum ran a series of simulations that, when the results were stitched together, replicate the entire evolution of a turbulent boundary layer from low to high Reynolds numbers. Although the “real-life” situation the simulation represented might seem vanishingly brief — traveling at Mach 2 for approximately 20 milliseconds — the campaign required large-scale computing power. The team used 1,024 computing nodes (more than 130,000 cores) on Hawk — one-fourth of the entire machine — for hundreds of short runs, each of which lasted 4 to 5 hours. In total, the simulation required more than 30 days of computer runtime.
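As a back-of-the-envelope consistency check (assuming 128-core Hawk compute nodes, a figure inferred from the core count above rather than stated in the study), the numbers in this paragraph line up with one another:

```python
# Rough consistency check of the reported figures (assumes 128 cores per Hawk node).
nodes = 1024
cores = nodes * 128                      # ~131,000 cores, i.e. "more than 130,000"

total_cpu_hours = 100e6                  # "more than 100 million CPU hours"
wall_hours = total_cpu_hours / cores     # wall-clock hours at this node count

print(f"cores: {cores}")                                # 131072
print(f"wall-clock days: {wall_hours / 24:.0f}")        # ~32 days of runtime
print(f"4.5-hour runs needed: {wall_hours / 4.5:.0f}")  # on the order of 170 restarts
```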
“Most research groups would not take the risk of spending so much computational time on a problem like this, and might instead look at other interesting research problems that aren’t as expensive,” Appelbaum said. “We’re the weird ones who put all of our eggs in this one basket to investigate a long-standing gap in the research.”
The investment paid off. In their large-scale simulation the investigators focused on (among other factors) the skin friction coefficient, a value that expresses the ratio of the shear stress a moving fluid exerts on a solid surface to the momentum of the free-flowing fluid above it. It is a key parameter describing the shape of the mean velocity profile and is fundamental in determining viscous drag.
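In its standard textbook form (the exact normalization used in the paper may differ), the skin friction coefficient compares the wall shear stress to the dynamic pressure of the free stream, as in this minimal sketch with illustrative values:

```python
# Textbook definition of the skin friction coefficient (the paper's exact convention may differ):
#   c_f = tau_w / (0.5 * rho_inf * u_inf**2)
# tau_w: shear stress at the wall; rho_inf, u_inf: free-stream density and velocity.

def skin_friction_coefficient(tau_wall, rho_inf, u_inf):
    """Ratio of wall shear stress to free-stream dynamic pressure."""
    return tau_wall / (0.5 * rho_inf * u_inf**2)

# Illustrative values only, not data from the IAG simulation:
c_f = skin_friction_coefficient(tau_wall=200.0, rho_inf=0.5, u_inf=600.0)
print(f"c_f = {c_f:.1e}")  # ~2e-3, a typical order of magnitude for turbulent boundary layers
```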
Appelbaum used the results of the simulation to show how the previously separate datasets for low and high Reynolds numbers blend together. Whereas past research could only estimate through interpolation how the datasets might intersect, the IAG team’s results reveal a sharp turn. Notably, they identified a change in skin friction scaling that is linked to the establishment of a fully-developed state in the outer 90% of the boundary layer. This self-similar state is a milestone in the turbulent boundary layer’s development, signaling that scaling behavior continues in a predictable way as it evolves to industrially relevant Reynolds numbers.
“To understand self-similarity, it helps to think about the aspect ratio of a photograph,” Appelbaum explained. “If I have a rectangular picture where the lengths of the sides have a ratio of 1:2, it doesn’t matter whether the picture is the size of my hand or if I scale it to the size of a bus. The relationships among the elements in the photo remain self-similar, regardless of how large it is. Our work confirms that the outer region of the turbulent boundary layer takes on the same kind of self-similarity once the system reaches a specific Reynolds number. Importantly, this state is coupled with the change in scaling behavior of the skin friction coefficient, which experiments have shown to remain in effect up to the very high Reynolds numbers seen in aerospace applications. This allows us to get an early but realistic glimpse of the turbulent behavior in this ultimate regime of turbulence.”
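A schematic way to see what this kind of collapse means in practice (a conceptual sketch only, not the paper's data or its actual scaling variables): profiles of different size fall onto a single curve once each is normalized by its own thickness and amplitude, just as photos with the same aspect ratio look alike at any size.

```python
import numpy as np

# Conceptual sketch of self-similarity (not the study's data or scaling variables):
# two "profiles" of different size share one shape once each is scaled by its own
# thickness and amplitude.

def profile(thickness, amplitude, n=5):
    y = np.linspace(0.0, thickness, n)
    return y, amplitude * np.sqrt(y / thickness)   # same shape, different size

y_small, u_small = profile(thickness=1.0, amplitude=10.0)
y_large, u_large = profile(thickness=4.0, amplitude=20.0)

# After normalizing by each profile's own scales, the curves are identical:
print(np.allclose(u_small / 10.0, u_large / 20.0))  # True
print(np.allclose(y_small / 1.0,  y_large / 4.0))   # True
```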
Increased performance offers new opportunities for research and engineering
This new dataset offers a unique resource that will better enable researchers in the computational fluid dynamics community to investigate turbulent boundary layers at moderate Reynolds numbers. For the Wenzel Lab, the next step will be to dive deeper into the physics behind the inflection point they identified. Appelbaum says that the team already has some ideas about this and plans to publish a follow-up paper soon.
In other ongoing work in the Wenzel Lab, the scientists have been busy porting the NS3D code to GPUs on HLRS’s newest supercomputer, Hunter. With the help of user support staff at HLRS and computing processor manufacturer AMD, they have already verified that the code remains physically accurate and performant on the new, GPU-accelerated system. In the coming months they will be optimizing NS3D to ensure that it takes full advantage of the increased performance that Hunter offers.
“We anticipate being able to simulate larger domains at even higher turbulent states,” Appelbaum said. “More computing performance will also make it more feasible to do studies in which we might run several simulations to investigate the scaling behavior of two or more parameters simultaneously.”
In work that points toward this future, Tobias Gibis, a member of the Wenzel Lab and co-author of the present work, recently defended his thesis, in which he unified the scaling behavior of heat transfer and pressure gradients in turbulent boundary layers. Appelbaum added, “Building on Christoph and Tobias’s transformative work on the influence of heat transfer and pressure gradients to include Reynolds number effects would undoubtedly have very high scientific value. The support and resources from HLRS are the bedrock for this type of heavy computational work.”
In the meantime, the team’s dataset at moderate Reynolds numbers will contribute to a pool of wall-bounded turbulent flow data and could aid in the development of more comprehensive, more accurate turbulence models. This will give engineers new capabilities for optimizing aircraft designs for a wider range of operating conditions, and for improving other kinds of machines like fans or automobiles whose efficiency relies on managing effects in turbulent boundary layers.
— Christopher Williams
Funding for HLRS's Hawk and Hunter supercomputers was provided by the Baden-Württemberg Ministry for Science, Research, and the Arts and the German Federal Ministry of Research, Technology and Space through the Gauss Centre for Supercomputing (GCS).