
Tecplot SZL File Output from FUN3D


FUN3D version 13, released in September 2016, supports writing binary files in the new SZL format. I’ve written in previous blogs about the benefits of SZL technology for visualization of large data files. With SZL technology, common visualization tasks are up to two orders of magnitude faster and data files are compressed by up to 50%.

This change offers additional benefits to FUN3D/Tecplot users, especially those running in parallel on an HPC system, for two reasons:

  1. Tecplot data is now written in parallel to a single file. Previous versions of FUN3D supported parallel output in Tecplot format by writing a file for each MPI rank, resulting in a large number of files per CFD case. Not only was this cumbersome to manage, but it unnecessarily consumed additional inodes, which are limited on many parallel file systems.
  2. TecIO-MPI uses MPI-IO to write the data in parallel, which offers performance benefits when writing to parallel file systems.

How to obtain TecIO libraries

There are now separate TecIO libraries for scalar (serial) and MPI-based parallel output. Both are included in our TecIO Library download. Selecting “Download TecIO source code for Windows, Linux, and Mac” retrieves a “tar” file that contains the scalar and MPI versions of TecIO in separate folders: teciosrc and teciompisrc. To build these libraries, follow the steps outlined in the readme.txt file located in each folder.
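As a rough sketch of that process (the authoritative steps are in each folder's readme.txt; the archive name and the CMake-based build shown here are assumptions that may differ from your distribution):

```shell
# Unpack the TecIO source distribution (archive name is illustrative)
tar xzf tecio.tgz

# Enter the folder for the version you need:
#   teciosrc     - scalar (serial) library
#   teciompisrc  - MPI parallel library
cd teciompisrc

# Build per the steps in readme.txt (exact commands may vary by platform)
mkdir build && cd build
cmake ..
make
```

Build the serial and MPI libraries in their own folders so you can point FUN3D's configure step at whichever one matches your run mode.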

How to build FUN3D with TecIO

The FUN3D documentation has not been completely updated for the new version of TecIO. For example, it refers to tecio64.a (a name no longer used; all TecIO libraries are now 64-bit) and makes no mention of the TecIO-MPI library. However, the configure option in section A.7.12 of the FUN3D Manual is still accurate. Simply configure using

--with-tecio=/path/to/tecio

where “/path/to/tecio” is the path to the appropriate version of TecIO: teciosrc for scalar output or teciompisrc for MPI parallel output.
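For example, a configure line for an MPI parallel build might look like the following. Only the --with-tecio option comes from the FUN3D Manual; the install paths and the --with-mpi flag are illustrative and should be adapted to your system:

```shell
# Hypothetical paths; substitute your own MPI and TecIO locations
./configure \
  --with-mpi=/opt/mpi \
  --with-tecio=/opt/tecio/teciompisrc
```

For a serial build, point --with-tecio at the teciosrc folder instead.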

Performance Considerations

TecIO-MPI uses MPI-IO to write data in parallel to a single file. The parallel efficiency of this process depends on a number of factors, including the parallel file system. If performance is lower than you expect, try tuning file-system parameters. One user found that, on a Lustre file system, increasing the stripe count of the output directory improved performance.
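On Lustre, striping can be inspected and changed with the lfs utility. The directory path and the stripe count of 8 below are only example values; an appropriate count depends on your file system and should be confirmed with your system administrator:

```shell
# Show the current striping of the output directory
lfs getstripe /path/to/output

# Stripe new files in this directory across 8 OSTs
lfs setstripe -c 8 /path/to/output
```

The new stripe count applies to files created in the directory after the change, so set it before launching the FUN3D run.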