Wind tunnels won’t go away—we’ll need them to store the printouts from our CFD solutions!—Dean Chapman, NASA Ames
This is my second-hand memory of a quote attributed to Dean Chapman of NASA Ames as he gave the 1979 AIAA Dryden Lecture¹. He was joking, of course, but the lecture was a serious look at the future computing requirements for various types of CFD calculations, and it forecast the rapid growth of computer power through the 1990s. The highest-fidelity CFD simulation he described was a large-eddy simulation (LES) of a large, full airplane configuration, a simulation that requires a grid of one trillion cells (or grid points).
Even Reynolds-averaged Navier-Stokes (RANS) solutions will require more grid points as the complexity of the problems and the geometry increases. For example, an airplane in landing configuration has a complex array of flaps and slats that allow it to land at slower speeds. In this flight regime, aerodynamicists use flow-control devices, like the nacelle fence in the image below, to tailor the behavior of the flow on the wing. In this case the fence creates a vortex that allows the flow on the wing to remain attached to a higher angle of attack than it would otherwise.
However, designers don’t want the fence vortex to be too effective, or it will lead to wing-tip stall and poor low-speed behavior. So it is very important to accurately resolve the creation and propagation of the vortex, and that requires a lot of grid.
Even in 1979 it was understood that computer performance was, and would remain for some time, the primary limit on what can be accomplished with CFD. The largest full-airplane CFD calculations being done today are in the hundreds of millions of cells, but they are not LES. Instead they solve the RANS equations, which employ turbulence models to account for sub-grid flow features. These models have limitations and must be applied with care. LES calculations make fewer modeling assumptions and therefore should have broader applicability.
In CFD, the size of the grid (in cells or grid points) increases as computer power increases. It grows with Moore’s law!
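As a rough illustration of that growth (my own back-of-the-envelope sketch, not a figure from the post or the lecture), suppose the largest practical grid doubles on a fixed cadence. The doubling period here is an assumption chosen to match two data points from this post: roughly one billion cells was workstation-scale around 2000, and one trillion cells is the 2030 target, a 1000× (about 2¹⁰) increase over 30 years, i.e. a doubling every ~3 years.

```python
def feasible_cells(year, base_year=2000, base_cells=1e9, doubling_years=3.0):
    """Estimated largest practical grid size under Moore's-law-style growth.

    base_year/base_cells/doubling_years are illustrative assumptions, not
    measured data: ~1e9 cells in 2000 and a ~3-year doubling period chosen
    so that 2030 lands near one trillion cells.
    """
    return base_cells * 2 ** ((year - base_year) / doubling_years)

for year in (2000, 2015, 2030):
    print(f"{year}: ~{feasible_cells(year):.1e} cells")
```

Under these assumptions the 2030 estimate comes out at about 10¹² cells, consistent with the trillion-cell target; the exact doubling period matters less than the conclusion that geometric growth closes the gap.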
Dean Chapman was an early visionary in CFD, and his ideas on grid requirements have held up well. Recently, NASA released its CFD Vision 2030 Study². In its Technology Development Roadmap, under the category “Knowledge Extraction,” it includes technology demonstrations for the following:
- In 2020: On-demand analysis/visualization of a 10-billion-point unsteady CFD simulation.
- In 2025: On-demand analysis/visualization of a 100-billion-point unsteady CFD simulation.
- In 2025: Creation of a real-time multi-fidelity database of 1,000 unsteady CFD simulations, plus test data, with complete uncertainty quantification of all data sources.

By extrapolation of technology demonstrations 1 and 2, we would expect:
- In 2030: On-demand analysis/visualization of a one-trillion-point unsteady CFD simulation.
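The extrapolation above can be sketched in a few lines (a hedged illustration of the arithmetic, not something from the NASA study itself): the two roadmap milestones imply a 10× growth in point count every five years, so one more step of the same geometric trend lands on the 2030 figure.

```python
# Roadmap milestones quoted above: 10 billion points in 2020,
# 100 billion points in 2025.
milestones = {2020: 10e9, 2025: 100e9}

# Growth per 5-year step implied by the two milestones (10x).
step_growth = milestones[2025] / milestones[2020]

# Extrapolate one more 5-year step to 2030.
projected_2030 = milestones[2025] * step_growth
print(f"Projected 2030 simulation size: {projected_2030:.0e} points")
```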
In other words, in 2030 we should expect the first LES calculations of a large, full airplane configuration, and they will require one trillion cells. But, you say, why does our challenge call for visualizing one trillion cells in 2015 when the first real calculations won’t occur until 2030?
Because we need to ensure our designs scale appropriately. The truth is, we don’t expect one-trillion-cell visualization to be snappy in 2015. It is the equivalent of visualizing one billion cells on the workstation you had in 2000 (for me, two cores and 2 GB of memory). But we want to write our software to handle future growth, and as we’ll discuss in the next blog, that means sub-linear scaling for all algorithms!
¹Chapman, D. R., “Computational Aerodynamics Development and Outlook,” AIAA Journal, Vol. 17, 1979, p. 1293.
²Slotnick, J., Khodadoust, A., Alonso, J., Darmofal, D., Gropp, W., Lurie, E., and Mavriplis, D., “CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences,” NASA/CR-2014-218178, 2014, https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20140003093.pdf.
Blogs in the Trillion Cell Grand Challenge Series
Blog #1 The Trillion Cell Grand Challenge
Blog #2 Why One Trillion Cells?
Blog #3 What Obstacles Stand Between Us and One Trillion Cells?
Blog #4 Intelligently Defeating the I/O Bottleneck
Blog #5 Scaling to 300 Billion Cells – Results To Date
Blog #6 SZL Data Analysis—Making It Scale Sub-linearly
Blog #7 Serendipitous Side Effect of SZL Technology