A company in Japan has built a computer that’s just surpassed 10 petaflops. IBM is now launching its own entry into the supercomputer market. And you can bet many more will follow.
We’ve been living with the Moore’s Law paradigm for a long time now, and engineers have grown used to the constant increase in computational resources. They’re nowhere near as constrained by their tools as they were even five years ago. The notion of a teraflop on your desktop is no longer out of the realm of possibility.
The Appetite of Computational Fluid Dynamics Analysis
Recently, Dr. Scott Imlay, Director of Research at Tecplot, Inc., attended the Supercomputing Conference in Seattle. One overwhelming impression he took away from the show was that all this power will create a tremendous quantity of data via simulations. As Dr. Imlay pointed out, “The need for simulation data is essentially unlimited.”
He went on to give the example of climate science and the scale of study from tens of thousands of kilometers down to nanometers.
“If you were to capture all those scales, you can see that computation power measured in petaflops wouldn’t even come close to getting the job done. Exaflops will get you down to the kilometer scale.”
While this is an extreme example of simulations that have an unlimited appetite for computing power, on the engineering side there is already adequate computing power to do acceptable simulations for many kinds of studies. But even in situations where there’s enough computational power already, the addition of more computing power will soon change the way engineers work. The reason is simple: they’ll have the capability to generate more simulations, so they will generate more simulations. It’s inevitable. And the ability to generate more simulations means the engineer will begin to ask more interesting questions.
Dr. Durrell Rittenberg, Director of Customer Development at Tecplot, Inc., added that while there are many applications that don’t yet seem to need more computing power, the addition of more computing power, and the greater understanding that comes from it, will radically change the types of questions engineers ask.
Power, Yes, But Computational Fluid Dynamics Software Must Keep Up
Dr. Imlay recalled the term most used at the Supercomputing Conference in Seattle: “the data tsunami.” As computing power increases, engineers working in computational fluid dynamics analysis will see exponential growth in data. “The problem for engineers in that situation will be not so much asking the question, but finding the answers among this vast amount of data that will be generated,” Dr. Imlay said.
In the near future, engineers will not have enough time in the day to analyze hundreds of thousands of cases. They will have to rely more and more on software to extract the knowledge from all that data.
The Computational Fluid Data Tsunami Reaches the Shores of Government
Beyond simulations, much of the data growth will come from sensors. The vast amount of data that flows into military and police entities, much of which is now analyzed by humans, will soon need to be analyzed by software. For instance, in war zones, unmanned vehicles are filming at an alarming rate. On the streets of London, it’s been said that at any one moment, there are three cameras aimed at every citizen. There’s simply more data than there are eyeballs and brains to analyze it. Luckily, the computing power is coming that can help. The question is, will the software be there to lend a hand? At Tecplot, Inc., we know the answer is “yes.”