FLUENT - Two common errors when using the planar conduction model (floating point exception or MPI error) and tips to address the underlying issues


Most issues with planar conduction in parallel mode are not actually caused by parallel execution.
There are two other common causes:

I) If one gets a floating point exception in serial mode, one should check that every wall with planar conduction turned ON has a
non-zero thickness. It is easy to make such a mistake. Fortunately, it is also rather easy to check for this condition by examining the bc file.

To perform this quick check:

- Write out a bc file by going to FLUENT's Text User Interface (TUI) command line and typing: /file/write-bc/any_name
- Open the bc file in any text editor that has search/replace functionality.
- Search for lines in the bc file that contain: (planar-conduction? . #t)
The #t means that planar conduction is turned on. The eighth line above this one is the thickness definition, which looks like:
(d . 0.001). Search through the file for all the walls and make sure the thickness is not zero - you should not see: (d . 0). A small script can automate this check; see the sketch after this list.
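If the case has many walls, a short script can scan the bc file instead of searching by hand. The sketch below is a hypothetical helper, not an official Fluent tool; it assumes the thickness entry sits exactly eight lines above the (planar-conduction? . #t) entry, as described above, which may vary between Fluent versions, so verify the offset against your own bc file first.

# check_bc_thickness.py - hypothetical helper; verify the 8-line offset
# against your own bc file before trusting the output.
import re
import sys

def check_bc(path):
    lines = open(path).read().splitlines()
    for i, line in enumerate(lines):
        if "(planar-conduction? . #t)" in line:
            # Thickness definition assumed eight lines above, e.g. (d . 0.001)
            d_line = lines[i - 8] if i >= 8 else ""
            m = re.search(r"\(d \. ([-+0-9.eE]+)\)", d_line)
            if m and float(m.group(1)) == 0.0:
                print("planar conduction ON with zero thickness near line", i + 1)

if __name__ == "__main__":
    check_bc(sys.argv[1])

Run it as: python check_bc_thickness.py any_name.bc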

If you find a wall with zero thickness, you can either visit that wall in the boundary conditions panel and enter a positive, non-zero thickness, or remove planar conduction. You can also make these changes directly in the bc file. If you want to turn OFF planar conduction in the bc file, replace (planar-conduction? . #t) with (planar-conduction? . #f). You can read the modified bc file back in by typing in the TUI: /file/read-bc bc_name.
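If many walls need the same fix, the search/replace can be scripted as well. This is a minimal sketch (keep a backup of the original bc file first); it simply disables planar conduction on every wall by replacing #t with #f:

# disable_planar_conduction.py - minimal sketch; back up the bc file first.
import sys

path = sys.argv[1]
text = open(path).read()
# Turn planar conduction OFF for every wall in the file.
open(path, "w").write(text.replace("(planar-conduction? . #t)",
                                   "(planar-conduction? . #f)"))

After running it, read the modified bc file back into Fluent with /file/read-bc as above.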



II) The floating point exception can also be caused by reading a bc file written from a case that differs from the current one. Let's say 'wall-1' is defined as coupled in the bc file. In the new case, 'wall-1' is one-sided, so no coupled option is available. When such a bc file is read (/file/read-bc bc.bc) onto the new case, even though no warning is returned, NONE of the buttons under Thermal Conditions in the boundary conditions panel for 'wall-1' will be selected.

If nothing is selected under 'Thermal Conditions' for any wall in the boundary conditions panel, the energy solver will return a 'floating point exception' in serial mode. In parallel mode, it will return an MPI error similar to the following:

fluent_vmpi.5.5.14: MPI Daemon (hsg): MPI Application received signal 10
fluent_vmpi.5.5.14: Rank 2: MPI_Reduce: Internal MPI error: Invalid argument
fluent_vmpi.5.5.14: MPI Daemon (auto2): Aborting the application: mpirun exited
fluent_vmpi.5.5.14: MPI Daemon (auto1): Aborting the application: mpirun exited
fluent_vmpi.5.5.14: MPI Daemon (auto1): Aborting the application: mpirun exited
fluent_vmpi.5.5.14: MPI Daemon (hsg): Aborting the application: mpirun exited
fluent_vmpi.5.5.14: MPI Daemon (auto2): Aborting the application: mpirun exited

There is no easy way to find such a problem other than visiting these walls one by one in interactive mode. It helps to know the
differences between the two cases: the old case from which the bc file was written, and the new case into which the bc file is being read. Writing a fresh bc file from the new case and comparing its zone names against the old bc file can also narrow the search, as in the sketch below.
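The following is a rough sketch of such a comparison, not an official Fluent utility. It assumes each zone entry in a bc file begins at the start of a line with "(zone-name ..."; the exact layout varies between Fluent versions, so adapt the pattern to your own files. Zones that exist in only one file (for example, a shadow wall that disappears when a coupled wall becomes one-sided) are the first candidates to inspect.

# compare_bc_zones.py - rough sketch (hypothetical helper, not part of Fluent)
import re
import sys

def zone_names(path):
    text = open(path).read()
    # Assumed pattern: a zone entry opens with "(name" at the start
    # of a line; tighten this to match your bc file's actual layout.
    return set(re.findall(r"^\((\S+)", text, flags=re.MULTILINE))

old_zones = zone_names(sys.argv[1])  # bc file written from the old case
new_zones = zone_names(sys.argv[2])  # bc file written from the new case

print("Only in old bc file:", sorted(old_zones - new_zones))
print("Only in new bc file:", sorted(new_zones - old_zones))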




