The Assisted Model Building with Energy Refinement (AMBER) package, which includes AmberTools, contains many molecular simulation programs targeted at biomolecular systems. A wide variety of modelling techniques are available. It generally scales well on modest numbers of processors, and the GPU-enabled CUDA programs are very efficient.
Availability and Restrictions
Versions
AMBER is available on the Owens, Pitzer, Ascend, and Cardinal clusters. The following versions are currently available at OSC (S means serial executables, P means parallel, and C means CUDA, i.e., GPU-enabled):
Version | Owens | Pitzer | Ascend | Cardinal | Notes
---|---|---|---|---|---
18 | SPC | SPC | | |
19 | SPC* | SPC* | | |
20 | SPC | SPC | SPC | |
22 | SPC | SPC | SPC | |
24 | SPC | | | |

* Current default version
You can use module spider amber to view available modules and module spider amber/{version} to view installation details, including applied Amber updates. Feel free to contact OSC Help if you need other versions or executables for your work.
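For example, to list all installed Amber modules and then inspect one of them (version 22 is used here purely as an illustration; substitute any version from the table above):

module spider amber
module spider amber/22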
Access for Academic Users
OSC's Amber is available to not-for-profit OSC users; simply contact OSC Help to request the appropriate form for access.
Access for Commercial Users
For-profit OSC users must obtain their own Amber license.
Publisher/Vendor/Repository and License Type
University of California, San Francisco, Commercial
Usage
Usage on Owens
Set-up
Use module load amber to configure your environment for the default version of AMBER. To select a particular software version, use module load amber/version. For example, use module load amber/22 to load AMBER version 22.
Using AMBER
For short runs, a serial Amber program can be executed interactively on the command line, e.g.:
tleap
Parallel Amber programs must be run in a batch environment with srun, e.g.:
srun pmemd.MPI
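pmemd.MPI accepts the same command-line flags as sander. A minimal sketch of a full invocation inside a batch job, assuming input files md.in, prmtop, and inpcrd that you supply yourself:

srun pmemd.MPI -O -i md.in -o md.out -p prmtop -c inpcrd -r restrt -x mdcrd

Here -O overwrites existing output files, -i/-o name the control input and output, -p/-c give the topology and starting coordinates, and -r/-x name the restart and trajectory files.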
Batch Usage
When you log into owens.osc.edu you are actually logged into a Linux box referred to as the login node. To gain access to the multiple processors in the computing environment, you must submit your AMBER simulation to the batch system for execution. Batch jobs can request multiple nodes/cores and compute time up to the limits of the OSC systems. Refer to Queues and Reservations and Batch Limit Rules for more info.
Interactive Batch Session
For an interactive batch session, one can run the following command:

sinteractive -A <project-account> -N 1 -n 28 -t 1:00:00

which requests one node with 28 cores (-N 1 -n 28) for one hour (-t 1:00:00). You may adjust the numbers per your need.
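Once the interactive job starts, you are placed on a compute node and can run Amber programs directly; a minimal sketch (any input files are whatever your own workflow requires):

module load amber
tleap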
Non-interactive Batch Job (Serial Run)
A batch script can be created and submitted for a serial or parallel run. You can create the batch script using any text editor you like in a working directory on the system of your choice. Sample batch scripts and Amber input files are available here:
~srb/workshops/compchem/amber/
Below is the example batch script (job.txt) for a serial run:
#!/bin/bash
# AMBER Example Batch Script for the Basic Tutorial in the Amber manual
#SBATCH --job-name 6pti
#SBATCH --nodes=1 --ntasks-per-node=28
#SBATCH --time=0:20:00
#SBATCH --account=<project-account>

module load amber
# Use TMPDIR for best performance.
cd $TMPDIR
# SLURM_SUBMIT_DIR refers to the directory from which the job was submitted.
cp -p $SLURM_SUBMIT_DIR/6pti.prmtop .
cp -p $SLURM_SUBMIT_DIR/6pti.prmcrd .
# Running minimization for BPTI
cat << eof > min.in
# 200 steps of minimization, generalized Born solvent model
&cntrl
  maxcyc=200, imin=1, cut=12.0, igb=1, ntb=0, ntpr=10,
/
eof
sander -i min.in -o 6pti.min1.out -p 6pti.prmtop -c 6pti.prmcrd -r 6pti.min1.xyz
cp -p min.in 6pti.min1.out 6pti.min1.xyz $SLURM_SUBMIT_DIR
To run it via the batch system, submit the job.txt file with the command sbatch job.txt.
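The version table shows CUDA (C) builds are installed, and the GPU executable pmemd.cuda is typically much faster than the CPU codes for molecular dynamics. Below is a minimal sketch of a GPU batch script; the --gpus-per-node request syntax and the MD input file md.in are assumptions to adapt to your system and workflow:

#!/bin/bash
#SBATCH --job-name 6pti-gpu
#SBATCH --nodes=1 --ntasks-per-node=1
#SBATCH --gpus-per-node=1
#SBATCH --time=0:20:00
#SBATCH --account=<project-account>

module load amber
cd $TMPDIR
cp -p $SLURM_SUBMIT_DIR/6pti.prmtop .
cp -p $SLURM_SUBMIT_DIR/6pti.prmcrd .
cp -p $SLURM_SUBMIT_DIR/md.in .
# pmemd.cuda runs the whole simulation on a single GPU
pmemd.cuda -O -i md.in -o 6pti.md.out -p 6pti.prmtop -c 6pti.prmcrd -r 6pti.md.rst
cp -p 6pti.md.out 6pti.md.rst $SLURM_SUBMIT_DIR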
Usage on Pitzer
Set-up
Use module load amber to configure your environment for AMBER.
Using AMBER
For short runs, a serial Amber program can be executed interactively on the command line, e.g.:
tleap
Parallel Amber programs must be run in a batch environment with srun, e.g.:
srun pmemd.MPI
Batch Usage
When you log into pitzer.osc.edu you are actually logged into a Linux box referred to as the login node. To gain access to the multiple processors in the computing environment, you must submit your AMBER simulation to the batch system for execution. Batch jobs can request multiple nodes/cores and compute time up to the limits of the OSC systems. Refer to Queues and Reservations and Batch Limit Rules for more info.
Interactive Batch Session
For an interactive batch session, one can run the following command:

sinteractive -A <project-account> -N 1 -n 48 -t 1:00:00

which requests one node with 48 cores (-N 1 -n 48) for one hour (-t 1:00:00). You may adjust the numbers per your need.
Non-interactive Batch Job (Serial Run)
A batch script can be created and submitted for a serial or parallel run. You can create the batch script using any text editor you like in a working directory on the system of your choice. Sample batch scripts and Amber input files are available here:
~srb/workshops/compchem/amber/
Below is the example batch script (job.txt) for a serial run:
#!/bin/bash
# AMBER Example Batch Script for the Basic Tutorial in the Amber manual
#SBATCH --job-name 6pti
#SBATCH --nodes=1 --ntasks-per-node=48
#SBATCH --time=0:20:00
#SBATCH --account=<project-account>

module load amber
# Use TMPDIR for best performance.
cd $TMPDIR
# SLURM_SUBMIT_DIR refers to the directory from which the job was submitted.
cp -p $SLURM_SUBMIT_DIR/6pti.prmtop .
cp -p $SLURM_SUBMIT_DIR/6pti.prmcrd .
# Running minimization for BPTI
cat << eof > min.in
# 200 steps of minimization, generalized Born solvent model
&cntrl
  maxcyc=200, imin=1, cut=12.0, igb=1, ntb=0, ntpr=10,
/
eof
sander -i min.in -o 6pti.min1.out -p 6pti.prmtop -c 6pti.prmcrd -r 6pti.min1.xyz
cp -p min.in 6pti.min1.out 6pti.min1.xyz $SLURM_SUBMIT_DIR
To run it via the batch system, submit the job.txt file with the command sbatch job.txt.
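For a genuinely parallel run of the same tutorial example, replace sander with srun pmemd.MPI and request more tasks. A minimal sketch is below; note that pmemd supports only a subset of sander's functionality, so consult the Amber manual before substituting it in your own workflow:

#!/bin/bash
#SBATCH --job-name 6pti-parallel
#SBATCH --nodes=2 --ntasks-per-node=48
#SBATCH --time=0:20:00
#SBATCH --account=<project-account>

module load amber
cd $TMPDIR
cp -p $SLURM_SUBMIT_DIR/6pti.prmtop .
cp -p $SLURM_SUBMIT_DIR/6pti.prmcrd .
cat << eof > min.in
# 200 steps of minimization, generalized Born solvent model
&cntrl
  maxcyc=200, imin=1, cut=12.0, igb=1, ntb=0, ntpr=10,
/
eof
# srun starts one pmemd.MPI rank per requested task across both nodes
srun pmemd.MPI -O -i min.in -o 6pti.min1.out -p 6pti.prmtop -c 6pti.prmcrd -r 6pti.min1.xyz
cp -p min.in 6pti.min1.out 6pti.min1.xyz $SLURM_SUBMIT_DIR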
Troubleshooting
In general, the scientific method should be applied to usage problems. Users should check all inputs and examine all outputs for the first signs of trouble. When one cannot find issues with one's inputs, it is often helpful to ask fellow humans, especially labmates, to review the inputs and outputs. Reproducibility of molecular dynamics simulations is subject to many caveats; see page 24 of the Amber18 manual for a discussion.