
Intel MPI

Intel's implementation of the Message Passing Interface (MPI) library.

Availability & Restrictions

This library may be used as an alternative to - but not in conjunction with - the MVAPICH2 MPI libraries.

Version       Glenn   Oakley
3.2.2p-006    X       X
4.0.0pu-027   X       X



On Oakley, simply load the module:

module load intelmpi

On Glenn, the modules are named slightly differently:

module load intel-mpi-3.2.2p-006

Since this module conflicts with MVAPICH installations, you should unload those modules first.
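For example, if an MVAPICH2 module is loaded by default in your session, a typical sequence would be (module names taken from this page; your default module list may differ):

```shell
# Unload the conflicting MVAPICH2 module before loading Intel MPI.
module unload mvapich2
module load intelmpi
```

Alternatively, `module swap mvapich2 intelmpi` performs both steps in one command, as shown in the batch script below.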

Using Intel MPI

Software compiled against this module will use the libraries at runtime.
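One way to confirm that a binary will resolve the Intel MPI libraries at runtime is to inspect its shared-library resolution with the module loaded. A quick check (the binary name here is a placeholder):

```shell
# List the MPI shared libraries the binary resolves at runtime.
# Paths pointing into the Intel MPI installation indicate the
# module's libraries will be used.
ldd my-impi-application | grep -i mpi
```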

Building With Intel MPI

On Glenn, we do not recommend building against these libraries. The modules are not configured to set up useful environment variables.

On Oakley, we have defined several environment variables to make it easier to build and link with the Intel MPI libraries.

Variable Use
$MPI_CFLAGS Use during your compilation step for C programs.
$MPI_CXXFLAGS Use during your compilation step for C++ programs.
$MPI_FFLAGS Use during your compilation step for Fortran programs.
$MPI_F90FLAGS Use during your compilation step for Fortran 90 programs.
$MPI_LIBS Use when linking your program to Intel MPI.

In general, for any application already set up to use mpicc (or similar), compilation should be fairly straightforward.
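As a sketch, both build styles might look like the following on Oakley (the source file name is hypothetical, and the Intel compiler `icc` is assumed as the non-wrapper compiler; substitute your own):

```shell
# 1) Using the MPI wrapper compiler provided by the module:
mpicc hello_mpi.c -o hello_mpi

# 2) Using an ordinary compiler with the environment variables
#    defined by the module on Oakley:
icc $MPI_CFLAGS hello_mpi.c $MPI_LIBS -o hello_mpi
```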

Batch Usage

Running a program compiled against Intel MPI (here called my-impi-application) for five hours on Oakley:

#PBS -N MyIntelMPIJob
#PBS -l nodes=4:ppn=12
#PBS -l walltime=5:00:00

module swap mvapich2 intelmpi

mpiexec my-impi-application
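Assuming the script above is saved as, say, intelmpi-job.pbs (the file name is hypothetical), it would be submitted to the batch system with qsub:

```shell
# Submit the batch script; qsub prints the assigned job ID.
qsub intelmpi-job.pbs

# Check the job's status in the queue.
qstat -u $USER
```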
