
Overcoming the curse of dimensionality in hypersonic design


Tecplot Chorus helps University of Colorado researchers demonstrate a method that uses kriging surrogate models to increase the accuracy of low fidelity results by an order of magnitude.

May 2012

The decades-old quest to develop hypersonic vehicles powered by airbreathing1 engines for space exploration, defense, and air travel has yielded some promising results. Yet the goal of developing viable designs for commercial use remains elusive.

Designing manned and unmanned vehicles for hypersonic speeds—anything over Mach 5—creates unique challenges, the greatest of which is the non-linear interdependency of the vehicle's systems. First, add one parameter to the initial vehicle design sweep and the number of single point solutions in the design space increases exponentially. Second, if high fidelity simulations are used to converge the design loop, the process can take on the order of months, depending on how far off the initial design approximation is from the final configuration.

This is a long way from NASA’s directive to reduce an iteration to two days by 2012 in the Integrated Design and Engineering Analysis (IDEA) environment.

It also is expensive.

Dr. Ryan Starkey, director of the Busemann Advanced Concepts Lab (BACLab) at the University of Colorado and an expert in supersonic and hypersonic aerodynamics, is leading a group of graduate and senior students to find methods that can reduce iteration time and costs. One of his graduate students, Kevin Basore, recently helped demonstrate an effective method using kriging surrogate models to increase the accuracy of the computed metrics in low fidelity data by an order of magnitude. More specifically, Basore and Dr. Starkey used Uncertainty Quantification (UQ) methods across several computational models to understand the trade-offs between computational time and fidelity. This allowed them to identify the best options for generating reliable results quickly without sacrificing the accuracy necessary for designing hypersonic aircraft.

“Getting this level of accuracy in low fidelity data means you don’t have to spend months running high fidelity data [in design iterations] to get the information you need to make these design decisions,” says Dr. Starkey.

Non-linear interaction of hypersonic vehicle design. Image credit: Dr. Ryan Starkey.

The curse of dimensionality

When moving objects exceed the speed of sound, the physics change and so does their behavior. When they hit Mach 5, the physics and subsequent behaviors change yet again, creating new design challenges with little room for error.

Basore used a simple equation, taken from the book Engineering Design via Surrogate Modelling: A Practical Guide,2 by Prof. Andy J. Keane, Dr. Alexander I. Forrester, and Dr. Andras Sobester, to explain this complex issue and the computational expense involved in running high fidelity data analysis:

n^k

where n is the number of sample points per dimension and k is the number of dimensions. If 10 points are sampled for each dimension in a 5-dimensional space, the design space works out to 100,000 points. Assuming each point solution in a low fidelity design is optimized in on the order of a minute, populating the design space would require approximately 1,700 CPU hours (1,667 exact), or roughly $167 at an average CPU cost of $0.10 per hour.

Sampling 20 points in each dimension of a 5-dimensional space results in 3,200,000 points. At 1 minute per solution, this comes to approximately 53,000 CPU hours (53,333 exact). At $0.10 per CPU hour, populating this design space would cost roughly $5,300.
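The arithmetic above is easy to reproduce. Below is a minimal Python sketch of the n^k cost estimate; the helper function and its default values are illustrative, not part of the BACLab toolchain.

    # Full-factorial design-space cost estimate (illustrative helper).
    def design_space_cost(points_per_dim, dims, minutes_per_point=1.0,
                          dollars_per_cpu_hour=0.10):
        """Return (points, CPU hours, dollars) for an n^k design sweep."""
        n_points = points_per_dim ** dims              # n^k single point solutions
        cpu_hours = n_points * minutes_per_point / 60.0
        return n_points, cpu_hours, cpu_hours * dollars_per_cpu_hour

    # 10 samples per dimension in 5 dimensions: 100,000 points, ~1,667 CPU hours, ~$167
    print(design_space_cost(10, 5))

    # 20 samples per dimension in 5 dimensions: 3,200,000 points, ~53,333 CPU hours, ~$5,333
    print(design_space_cost(20, 5))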

“We call it ‘the curse of dimensionality’, a term coined by the mathematician Richard Bellman,” says Basore. “Not only can a high fidelity iteration take weeks or months to complete, you can see that it gets costly fast.”

Kriging surrogate models

Kriging surrogate models are an interpolation method that can accurately predict deterministic data in a sparsely populated n-dimensional random field. Using these surrogate models, Basore was able to find an empirical approximation to the difference between a high and a low fidelity design solution in order to increase the accuracy of the low fidelity design by an order of magnitude.
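The study built its surrogate models with Dakota's Surfpack (see Tools below). Purely to illustrate the idea, the sketch below fits a kriging model, formulated as a Gaussian-process regressor, to sparse samples using scikit-learn; the sample data, kernel settings, and variable names are placeholders, not values from the study.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    # Sparse samples of a deterministic response in a 2-D design space
    # (the response function here is purely illustrative).
    rng = np.random.default_rng(0)
    X_train = rng.uniform(0.0, 1.0, size=(15, 2))
    y_train = np.sin(4.0 * X_train[:, 0]) * np.cos(3.0 * X_train[:, 1])

    # Kriging model: constant trend times a squared-exponential correlation
    kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
    surrogate = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    surrogate.fit(X_train, y_train)

    # Interpolate at unsampled points; the returned standard deviation
    # quantifies the surrogate's own uncertainty away from the samples.
    X_new = rng.uniform(0.0, 1.0, size=(5, 2))
    y_pred, y_std = surrogate.predict(X_new, return_std=True)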

“This ability to determine the trends within the multidimensional data and also get a baseline surrogate approximation accounted for a huge savings in the amount of time it would have taken otherwise to post process.”
— Kevin Basore

Currently, the accepted standard for multi-fidelity design incorporates the co-kriging surrogate model. This surrogate model, a variation on the kriging model, can directly account for and correctly weight the different fidelity levels in a multi-fidelity data set. This current practice, though, is always dependent on multiple fidelity levels, which is why Basore and Starkey began examining UQ methods to directly train low fidelity solvers to be more accurate.

Basore used two geometrical design configurations to prove the method: a diamond airfoil, also called a double wedge, and a generic cavity scramjet combustor. The airfoil was used as a simple test case to verify the methodology and the third-party software that was used in the calculations. It was selected over other simplified geometries because there is an analytical inviscid solution to the diamond airfoil and published works are readily available for comparison. After solving the empirical error between solution spaces on a diamond airfoil, the lessons learned were applied to the scramjet combustion case for a relatively small computational expense.

The process

Basore began by meshing the external flow around the airfoil and then comparing it at three different parameter levels. From the initial points taken in each of these design spaces, a discrete error space was created between the model levels. This discrete space was then turned into a continuous spectrum across a multidimensional domain using a kriging surrogate model. An empirical correction between fidelity levels was then superimposed on the low fidelity model to increase the accuracy of the metric results. The same method was then applied to the scramjet combustor: discrete points were sampled to populate the continuous error spectrum between the fidelity levels, and after additional work to account for differences in the geometry, chemical fidelity levels, and computational requirements, the methodology was successfully applied to the scramjet geometry.
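As a rough illustration of that workflow, the sketch below fits a kriging model to the discrete error between fidelity levels and superimposes the resulting continuous correction on a low fidelity model. It again uses scikit-learn rather than Surfpack, and every function and variable name is hypothetical.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def build_corrected_low_fidelity(X_sampled, y_low, y_high, low_fidelity_model):
        """Correct a low fidelity model with a kriged high/low error surrogate."""
        # Discrete error space at the sampled design points
        error = y_high - y_low

        # Kriging turns the discrete errors into a continuous spectrum
        # across the design space.
        error_surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                                   normalize_y=True)
        error_surrogate.fit(X_sampled, error)

        def corrected_model(X):
            # Low fidelity result plus the superimposed empirical correction
            return low_fidelity_model(X) + error_surrogate.predict(X)

        return corrected_model

    # Example with stand-in 1-D models: a cheap solver nudged toward a
    # handful of expensive samples.
    X = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
    def cheap(x):     return np.sin(6.0 * x).ravel()
    def expensive(x): return np.sin(6.0 * x).ravel() + 0.3 * x.ravel()
    corrected = build_corrected_low_fidelity(X, cheap(X), expensive(X), cheap)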

Tools

The Sandia National Laboratories Dakota Surfpack library was used to produce the surrogate models because it supports n-dimensional models, an important criterion for the high dimensionality required by the scramjet chemistry section. It also supports other surrogate models, which were used to verify that the kriging model was the appropriate choice for this work. The meshing programs and solver chosen for this thesis were Chimera, ANSYS ICEM, and VULCAN.

Tecplot Chorus

Finally, Basore used Tecplot Chorus, the new simulation analytics tool that helps engineers manage CFD projects with tens to thousands of simulation cases, to verify the convergence in his design spaces. Once he got the data into Tecplot Chorus, it took just a few steps in a simple interface to verify the data.

Sample Tecplot Chorus output: inspect convergence.

Tecplot Chorus enables engineers to manage CFD projects by bringing together results from simulation cases, derived quantities, and plot images in a single environment. An engineer using Tecplot Chorus can evaluate overall system performance and visually compare tens, hundreds, or thousands of simulation cases without writing scripts. It also allows them to analyze a single parameter over the entire project, both visually and quantitatively, giving engineers the ability to make decisions faster and with more confidence. Tecplot Chorus changes the post-processing paradigm by pre-computing the plots, making downstream analysis faster; currently, CFD engineers use a time-consuming combination of scripting and manual steps to do that work.

Basore uses Tecplot Chorus in other areas of his work as well. To create the multidimensional kriging models for each of the fidelity levels, for example, he had huge datasets with at least four independent variables for each of the design metrics.

“In Chorus, I was able to quickly switch between these independent parameters to determine the trends within the data without having to reimport for different dimensional subsets,” he says.

Basore also used the surrogate models of the design spaces to create an error vector between the multi-fidelity levels. He created huge datasets, both at high and low fidelity, which he needed to visualize in one database.

“Chorus allowed me to do this, as well as do my initial approximations with the built-in surrogate models until I required an n-dimensional capability,” he adds. “These initial surrogate approximations, plus the ability to switch between multiple parameter subsets, helped me determine the trends in the data very quickly. These capabilities, combined, allowed me to save a huge amount of time in post-processing the data.”
 
1Airbreathing engines utilize oxygen from the air to burn the fuel.
2Prof. Andy J. Keane, Dr. Alexander I. Forrester, and Dr. Andras Sobester, Engineering Design via Surrogate Modelling: A Practical Guide (The American Institute of Aeronautics and Astronautics, 2008).