OpenFOAM

OpenFOAM is a suite of computational fluid dynamics (CFD) applications. It contains a wide range of solvers for both compressible and incompressible flow, along with many utilities and libraries.

Availability and Restrictions

Versions

The following versions of OpenFOAM are available on OSC clusters:

VERSION  GLENN  OAKLEY
1.7.x      X      X
2.1.0             X
2.2.2      X

Feel free to contact OSC Help if you need other versions for your work.

Access 

OpenFOAM is available to all OSC users without restriction.

Basic Structure for an OpenFOAM Case

The basic directory structure for an OpenFOAM case is shown below:

/home/yourusername/OpenFOAM_case
|-- 0
|   |-- U
|   |-- epsilon
|   |-- k
|   |-- p
|   `-- nut
|-- constant
|   |-- RASProperties
|   |-- polyMesh
|   |   |-- blockMeshDict
|   |   `-- boundary
|   |-- transportProperties
|   `-- turbulenceProperties
`-- system
    |-- controlDict
    |-- fvSchemes
    |-- fvSolution
    `-- snappyHexMeshDict
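
The files under system and constant are plain-text OpenFOAM dictionaries. For reference, a minimal sketch of system/controlDict is shown below; the solver name and time settings follow the standard icoFoam cavity tutorial and should be adapted to your own case:

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      controlDict;
}

application     icoFoam;
startFrom       startTime;
startTime       0;
stopAt          endTime;
endTime         0.5;
deltaT          0.005;
writeControl    timeStep;
writeInterval   20;
writeFormat     ascii;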

IMPORTANT: To run in parallel, you also need to create a "decomposeParDict" file in the system directory. If this file is missing, the decomposePar command will fail. A minimal example is shown below.
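
A minimal sketch of what system/decomposeParDict might look like for a 16-process run with the simple decomposition method follows; the (4 2 2) split is only an illustration, and the product of its entries must equal numberOfSubdomains:

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 16;

method          simple;

simpleCoeffs
{
    n               (4 2 2);
    delta           0.001;
}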

Usage

Usage on Glenn

Set-up on Glenn

To set up your environment for OpenFOAM 1.7.x on the Glenn cluster, use the following commands:

module load gcc-4.4.5
module switch mpi mvapich2-1.6-gnu
module load openfoam-1.7.x

To set up your environment for OpenFOAM 2.2.2 on the Glenn cluster, use the following commands:

module load gcc-4.9.1
module switch mpi mvapich2-1.6-gnu
module load openfoam-2.2.2
source $FOAM_INST_DIR/OpenFOAM-2.2.2/etc/bashrc
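
To confirm that the environment is set up, you can check that the solver binaries are on your path and that the expected version is reported; WM_PROJECT_VERSION is set by the OpenFOAM bashrc:

which icoFoam
echo $WM_PROJECT_VERSION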

Batch Usage on Glenn

Batch jobs can request multiple nodes/cores and compute time up to the limits of the OSC systems. Refer to Queues and Reservations for Glenn and Scheduling Policies and Limits for more information.

Interactive Batch Session

For an interactive batch session on Glenn, one can run the following command:

qsub -I -l nodes=1:ppn=8 -l walltime=1:00:00

which requests 8 cores ( -l nodes=1:ppn=8 ) for 1 hour ( -l walltime=1:00:00 ). You may adjust these numbers as needed.

Once your job starts and your environment has been configured (refer to Set-up on Glenn), you can start any of the OpenFOAM utilities or solvers with the associated command, for example:

blockMesh

or 

icoFoam

or 

sonicFoam
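
If you just want a quick end-to-end test in an interactive session, one option is to copy a bundled tutorial case and run it; the FOAM_TUTORIALS variable is set once the OpenFOAM environment is loaded, and the icoFoam cavity case used here is only an example:

cp -r $FOAM_TUTORIALS/incompressible/icoFoam/cavity .
cd cavity
blockMesh
icoFoam
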
Non-interactive Batch Job (Serial Run)

A batch script can be created and submitted for a serial or parallel run. You can create the batch script using any text editor you like in a working directory on the system of your choice. Below is an example batch script ( job.txt ) for a serial run:

#PBS -N serial_OpenFOAM 
#PBS -l nodes=1:ppn=1 
#PBS -l walltime=24:00:00 
#PBS -j oe 
#PBS -S /bin/bash 
#Initialize OpenFOAM on Glenn Cluster
module load gcc-4.4.5
module switch mpi mvapich2-1.6-gnu
module load openfoam-1.7.x
#Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR
# Copy files to $TMPDIR and move there to execute the program
cp -r * $TMPDIR
cd $TMPDIR
#Mesh the geometry 
blockMesh
#Run the solver 
icoFoam
#Finally, copy the results back to the directory you submitted the job from
cp -r * $PBS_O_WORKDIR

To run it via the batch system, submit the job.txt file with the following command:

qsub job.txt
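
Once submitted, you can check the status of the job with the standard batch commands, for example:

qstat -u yourusername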

Non-interactive Batch Job (Parallel Run)

Below is an example batch script ( job.txt ) for a parallel run:

#PBS -N parallel_OpenFOAM
#PBS -l nodes=2:ppn=8
#PBS -l walltime=6:00:00
#PBS -j oe
#PBS -S /bin/bash 
#Initialize OpenFOAM on Glenn Cluster
module load gcc-4.4.5
module switch mpi mvapich2-1.6-gnu
module load openfoam-1.7.x
#Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR
#Mesh the geometry
blockMesh
#Decompose the mesh for the parallel run (requires system/decomposeParDict)
decomposePar
#Run the solver
mpirun -np 16 simpleFoam -parallel 
#Reconstruct the parallel results
reconstructPar
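
By default the solver output goes to the job's output file. If you would rather keep a separate log that you can follow with tail -f while the job runs, you could redirect it, for example:

mpirun -np 16 simpleFoam -parallel > log.simpleFoam 2>&1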

Usage on Oakley

Set-up on Oakley

To set up your environment for OpenFOAM 1.7.x on the Oakley cluster, use the following command:

. /usr/local/OpenFOAM/OpenFOAM-1.7.x/etc/bashrc

To set up your environment for OpenFOAM 2.1.0 on the Oakley cluster, use the following command:

. /usr/local/OpenFOAM/OpenFOAM-2.1.0/etc/bashrc

Batch Usage on Oakley

Batch jobs can request multiple nodes/cores and compute time up to the limits of the OSC systems. Refer to Queues and Reservations for Oakley and Scheduling Policies and Limits for more information.

Interactive Batch Session

For an interactive batch session on Oakley, one can run the following command:

qsub -I -l nodes=1:ppn=12 -l walltime=1:00:00

which requests 12 cores ( -l nodes=1:ppn=12 ) for 1 hour ( -l walltime=1:00:00 ). You may adjust these numbers as needed.

Non-interactive Batch Job (Serial Run)

A batch script can be created and submitted for a serial or parallel run. You can create the batch script using any text editor you like in a working directory on the system of your choice. Below is an example batch script ( job.txt ) for a serial run:

#PBS -N serial_OpenFOAM 
#PBS -l nodes=1:ppn=1 
#PBS -l walltime=24:00:00 
#PBS -j oe 
#PBS -S /bin/bash 
#Initialize OpenFOAM on Oakley Cluster
. /usr/local/OpenFOAM/OpenFOAM-1.7.x/etc/bashrc
#Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR
# Copy files to $TMPDIR and move there to execute the program
cp -r * $TMPDIR
cd $TMPDIR
#Mesh the geometry 
blockMesh
#Run the solver 
icoFoam
#Finally, copy the results back to the directory you submitted the job from
cp -r * $PBS_O_WORKDIR

To run it via the batch system, submit the job.txt file with the following command:

qsub job.txt

Non-interactive Batch Job (Parallel Run)

Below is an example batch script ( job.txt ) for a parallel run:

#PBS -N parallel_OpenFOAM
#PBS -l nodes=2:ppn=12
#PBS -l walltime=6:00:00
#PBS -j oe
#PBS -S /bin/bash 
#Initialize OpenFOAM on Oakley Cluster
. /usr/local/OpenFOAM/OpenFOAM-1.7.x/etc/bashrc
#Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR
#Mesh the geometry
blockMesh
#Decompose the mesh for the parallel run (requires system/decomposeParDict)
decomposePar
#Run the solver (mpiexec launches one process per core allocated to the job)
mpiexec simpleFoam -parallel 
#Reconstruct the parallel results
reconstructPar
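
After reconstruction, one way to post-process results away from the cluster is to convert the case to VTK format for ParaView with the bundled foamToVTK utility, run from the case directory:

foamToVTK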

Further Reading

The OpenFOAM home page: http://www.openfoam.org/
