Gaussian is a very popular general-purpose electronic structure program. Recent versions can perform density functional theory, Hartree-Fock, Møller-Plesset, coupled-cluster, and configuration interaction calculations, among others. Geometry optimizations, vibrational frequencies, magnetic properties, and solvation modeling are available. It performs well as black-box software on closed-shell ground-state systems.
Availability and Restrictions
Versions
Gaussian is available on the Owens and Pitzer clusters. The following versions are currently available at OSC (S means single-node serial/parallel and C means CUDA, i.e., GPU-enabled):
Version | Owens | Pitzer | Ascend | Cardinal |
---|---|---|---|---|
g09e01 | S | | | |
g16a03 | S | S | | |
g16b01 | SC | S | | |
g16c01 | SC* | SC* | | |
g16c02 | SC* | SC* | | |
You can use module spider gaussian to view available modules for a given machine. Feel free to contact OSC Help if you need other versions for your work.
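For example, the following commands list the available Gaussian modules and show details for one particular version; module spider is a standard Lmod command, and the version queried here is only an example:
module spider gaussian
module spider gaussian/g16c01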
Access for Academic Users
Use of Gaussian for academic purposes requires validation. In order to obtain validation, please contact OSC Help for further instruction.
Publisher/Vendor/Repository and License Type
Gaussian, commercial
Usage
Usage on Owens
Set-up on Owens
To load the default version of the Gaussian module, which initializes your environment for Gaussian, use module load gaussian. To select a particular software version, use module load gaussian/version. For example, use module load gaussian/g09e01 to load Gaussian version g09e01 on Owens.
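For example, the following sequence loads a specific version and verifies that it is in your environment; the version shown is only an example, and module list and which are standard commands:
module load gaussian/g09e01
module list
which g09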
Using Gaussian
To execute Gaussian, simply run the Gaussian binary (g16 or g09) with the input file on the command line:
g16 < input.com
When the input file is redirected as above ( < ), the output goes to standard output. In this form the output can be viewed with editors or viewers while the job is running in a batch queue, because the batch output file, which captures standard output, is written to the directory from which the job was submitted. Alternatively, Gaussian can be invoked without file redirection:
g16 input.com
in which case the output file will be named 'input.log' and written to the working directory at the time the command was started; in this form the output may not be viewable while the job is running in a batch queue, for example if the working directory was $TMPDIR.
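If you want standard output captured in a file of your choosing while still using input redirection, the output can be redirected as well; the name output.log here is arbitrary:
g16 < input.com > output.log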
Batch Usage on Owens
When you log into owens.osc.edu you are logged into a login node. To gain access to the multiple processors in the computing environment, you must submit your computations to the batch system for execution. Batch jobs can request multiple processors and compute time up to the limits of the OSC systems. Refer to Queues and Reservations and Batch Limit Rules for more information.
Interactive Batch Session
For an interactive batch session on Owens, one can run the following command:
sinteractive -A <project-account> -N 1 -n 28 -t 1:00:00
which gives you 28 cores (-N 1 -n 28) for 1 hour (-t 1:00:00). You may adjust the numbers per your need.
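Once the interactive session starts, Gaussian can be run directly at the shell prompt. A minimal sketch, assuming an input file named input.com exists in the current directory:
module load gaussian
g16 < input.com > input.log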
Non-interactive Batch Job (Serial Run)
A batch script can be created and submitted for a serial or parallel run. You can create the batch script using any text editor you like in a working directory on the system of your choice. Sample batch scripts and Gaussian input files are available here:
/users/appl/srb/workshops/compchem/gaussian/
This simple batch script demonstrates the important points:
#!/bin/bash
#SBATCH --job-name=GaussianJob
#SBATCH --nodes=1 --ntasks-per-node=28
#SBATCH --time=1:00:00
#SBATCH --account=<project-account>

cp input.com $TMPDIR  # Use TMPDIR for best performance.
cd $TMPDIR
module load gaussian
g16 input.com
cp -p input.log *.chk $SLURM_SUBMIT_DIR
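If the script above is saved as, for example, gaussian_job.sh (the file name is arbitrary), it can be submitted and monitored with standard Slurm commands:
sbatch gaussian_job.sh
squeue -u $USER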
Usage on Pitzer
Set-up on Pitzer
To load the default version of the Gaussian module, which initializes your environment for Gaussian, use module load gaussian.
Using Gaussian
To execute Gaussian, simply run the Gaussian binary (g16 or g09) with the input file on the command line:
g16 < input.com
When the input file is redirected as above ( < ), the output goes to standard output. In this form the output can be viewed with editors or viewers while the job is running in a batch queue, because the batch output file, which captures standard output, is written to the directory from which the job was submitted. Alternatively, Gaussian can be invoked without file redirection:
g16 input.com
in which case the output file will be named 'input.log' and written to the working directory at the time the command was started; in this form the output may not be viewable while the job is running in a batch queue, for example if the working directory was $TMPDIR.
Batch Usage on Pitzer
When you log into pitzer.osc.edu you are logged into a login node. To gain access to the multiple processors in the computing environment, you must submit your computations to the batch system for execution. Batch jobs can request multiple processors and compute time up to the limits of the OSC systems. Refer to Queues and Reservations and Batch Limit Rules for more information.
Interactive Batch Session
For an interactive batch session on Pitzer, one can run the following command:
sinteractive -A <project-account> -N 1 -n 40 -t 1:00:00
which gives you 40 cores (-n 40) for 1 hour (-t 1:00:00). You may adjust the numbers per your need.
Non-interactive Batch Job (Serial Run)
A batch script can be created and submitted for a serial or parallel run. You can create the batch script using any text editor you like in a working directory on the system of your choice. Sample batch scripts and Gaussian input files are available here:
/users/appl/srb/workshops/compchem/gaussian/
This simple batch script demonstrates the important points:
#!/bin/bash
#SBATCH --job-name=GaussianJob
#SBATCH --nodes=1 --ntasks-per-node=40
#SBATCH --time=1:00:00
#SBATCH --account=<project-account>

cp input.com $TMPDIR  # Use TMPDIR for best performance.
cd $TMPDIR
module load gaussian
g16 input.com
cp -p input.log *.chk $SLURM_SUBMIT_DIR
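The Gaussian input should request resources consistent with the batch script. A minimal sketch of the Link 0 section for the 40-core Pitzer job above; the %mem value is only illustrative and should be adjusted to your job and to the memory available on the node:
%nproc=40
%mem=8gb
%chk=input.chk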
Running Gaussian jobs with GPU
Gaussian jobs can utilize the P100 GPUs on Owens. GPUs are not helpful for small jobs, but they are effective for larger molecules when computing DFT energies, gradients, and frequencies (for both ground and excited states). They are also not used effectively by post-SCF calculations such as MP2 or CCSD. For more information on GPU usage, consult the Gaussian documentation.
In the sample input files below, the %CPU line selects the CPU cores Gaussian will use (cores 0 through 27 on Owens, 0 through 47 on Pitzer), and %GPUCPU=0=0 designates CPU 0 as the controlling core for GPU 0.
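For illustration only, on a hypothetical node with two GPUs the corresponding Link 0 lines could list both GPUs and their controlling cores (Gaussian's %GPUCPU directive takes a list of GPUs followed by a matching list of controlling CPUs); the samples below use the single-GPU form %GPUCPU=0=0:
%CPU=0-27
%GPUCPU=0,1=0,1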
A sample batch script for GPU on Owens is as follows:
#!/bin/bash
#SBATCH --job-name=GaussianJob
#SBATCH --nodes=1 --ntasks-per-node=28
#SBATCH --gpus-per-node=1
#SBATCH --time=1:00:00
#SBATCH --account=<project-account>

set -x
cd $TMPDIR
INPUT=methane.com
# SLURM_SUBMIT_DIR refers to the directory from which the job was submitted.
cp $SLURM_SUBMIT_DIR/$INPUT .
module load gaussian/g16b01
g16 < ./$INPUT
ls -al
cp -p *.chk $SLURM_SUBMIT_DIR
A sample input file for GPU on Owens is as follows:
%nproc=28
%mem=8gb
%CPU=0-27
%GPUCPU=0=0
%chk=methane.chk
#b3lyp/6-31G(d) opt

methane B3LYP/6-31G(d) opt freq

0,1
C        0.000000        0.000000        0.000000
H        0.000000        0.000000        1.089000
H        1.026719        0.000000       -0.363000
H       -0.513360       -0.889165       -0.363000
H       -0.513360        0.889165       -0.363000
A sample batch script for GPU on Pitzer is as follows:
#!/bin/tcsh
#SBATCH --job-name=methane
#SBATCH --output=methane.log
#SBATCH --nodes=1 --ntasks-per-node=48
#SBATCH --gpus-per-node=1
#SBATCH --time=1:00:00
#SBATCH --account=<project-account>

set echo
cd $TMPDIR
set INPUT=methane.com
# SLURM_SUBMIT_DIR refers to the directory from which the job was submitted.
cp $SLURM_SUBMIT_DIR/$INPUT .
module load gaussian/g16b01
g16 < ./$INPUT
ls -al
cp -p *.chk $SLURM_SUBMIT_DIR
A sample input file for GPU on Pitzer is as follows:
%nproc=48
%mem=8gb
%CPU=0-47
%GPUCPU=0=0
%chk=methane.chk
#b3lyp/6-31G(d) opt

methane B3LYP/6-31G(d) opt freq

0,1
C        0.000000        0.000000        0.000000
H        0.000000        0.000000        1.089000
H        1.026719        0.000000       -0.363000
H       -0.513360       -0.889165       -0.363000
H       -0.513360        0.889165       -0.363000
Known Issues
Out of Memory Problems for Large TMPDIR Jobs
For some Gaussian jobs, the operating system will start swapping and may trigger the out-of-memory (OOM) killer because of memory consumed by the local filesystem (TMPDIR) cache. For these jobs %mem may not be critical, i.e., they may not be big-memory jobs per se; it is the disk usage that causes the OOM. Known examples of this case are large ONIOM calculations.
While an investigation is ongoing, a simple workaround is to avoid putting the Gaussian internal files on TMPDIR. The most obvious alternative to TMPDIR is PFSDIR, in which case the commands are
...
#SBATCH --gres=pfsdir
...
module load gaussian
export GAUSS_SCRDIR=$PFSDIR
...
Other workarounds exist; contact oschelp@osc.edu for details.
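A minimal sketch of how this workaround fits into a complete batch script, based on the serial Owens example above (core counts and file names should be adjusted to your job):
#!/bin/bash
#SBATCH --job-name=GaussianJob
#SBATCH --nodes=1 --ntasks-per-node=28
#SBATCH --time=1:00:00
#SBATCH --account=<project-account>
#SBATCH --gres=pfsdir

cp input.com $TMPDIR
cd $TMPDIR
module load gaussian
# Put Gaussian scratch files on the parallel filesystem instead of TMPDIR.
export GAUSS_SCRDIR=$PFSDIR
g16 input.com
cp -p input.log *.chk $SLURM_SUBMIT_DIR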
g16b01 G4 Problem
See the known issue and note that g16c01 is the current default module version.
Further Reading