The CSF2 has been replaced by the CSF3 - please use that system! This documentation may be out of date; please read the CSF3 documentation instead.
CHARMM
Overview
CHARMM is a many-particle molecular simulation program which focuses on biological systems, e.g. peptides, proteins, prosthetic groups, small molecule ligands, nucleic acids, lipids, and carbohydrates.
Versions c35b2 and c36b2 of CHARMM are available, both compiled with the xlarge, MPI and em64t options using the Intel v11.1 compilers. Version c36b2 has also been compiled with the Intel 12.0 compiler (e.g., for use with ChemShell 3.5.0, which has also been compiled with the Intel 12.0 compiler).
Restrictions on use
The CHARMM installs are restricted to specific research groups. Only these groups may use the software and only the version they have a license for. The head of any research group holding a license can email its-ri-team@manchester.ac.uk to find out how to get access to CHARMM.
Version c35b2 – A full copy of the license is available from the Head of the Research Group.
Version c36b2 – A full copy of the license is available in /opt/gridware/apps/intel-11.1/charmm/c36b2/CharmmLicenseForm.pdf
Important points to note are:
- The software can only be used internally for non-commercial purposes and cannot be used for any work that is funded by a commercial organisation. Check the license (usually Section 5) for a full definition of what is classed as non-commercial.
- The software may only be used on campus.
- Any reports or publications must contain an acknowledgement as set out in the license (usually Section 12).
Users wishing to access the software should request access via its-ri-team@manchester.ac.uk. A confirmation email will be sent once a user has been added to the appropriate unix group; users who are not part of an eligible research group will be given additional advice on how to proceed.
Set up procedure
To use CHARMM load one of the following modulefiles:
# Version c35b2
module load apps/intel-11.1/charmm/c35b2

# Version c36b2 (with different compiler versions and possibly infiniband)
module load apps/intel-12.0/charmm/c36b2-ib    # Use for multi-node MPI jobs
module load apps/intel-12.0/charmm/c36b2
module load apps/intel-11.1/charmm/c36b2
The above modulefiles will automatically load the appropriate compiler and MPI modulefiles.
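To confirm that the environment has been set up, the following standard commands can be run from a login node (they are not specific to CHARMM):

module list                 # the charmm modulefile and its compiler/MPI dependencies should be listed
which charmm                # should report the path to the charmm binary added by the modulefile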
Running the application
Charmm c36b2
Charmm will read a file named charmm.inp in the current directory if no input files are specified on the charmm command-line in your jobscript (never run charmm on the login node!).
Alternatively you may specify input and output files using command-line flags:
charmm -i myinputfile.inp -o myoutputfile.out
A third method, in which input is read from stdin and output written to stdout using shell redirection, will fail when running in parallel. For example, a command-line such as:
# This will FAIL when running charmm in parallel
charmm < myinputfile.inp > myoutputfile.out
It is safest to always use the -i and -o flags.
Charmm c35b2
First load the required module (see above) and create a directory containing a file called charmm.inp containing the job parameters.
Note: a feature of the MPI, em64t c35b2 version of CHARMM is that it will read input only from a file named charmm.inp, i.e. CHARMM will not read from standard input. This is described here. If you see messages which look like:
forrtl: No such file or directory
forrtl: severe (29): file not found, unit 5, file /opt/gridware/apps/intel-11.1/charmm/c35b2/test/charmm.inp
it is likely to be for this reason.
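For example, a job directory for the c35b2 version might be prepared as follows. The directory name is arbitrary, and the title block plus STOP in the placeholder charmm.inp are only there to illustrate the required file name; replace them with your actual CHARMM commands:

# Prepare a job directory containing the required charmm.inp (names are illustrative)
mkdir c35b2_job && cd c35b2_job
# Placeholder input: a CHARMM title block followed by STOP; replace with your real input
cat > charmm.inp << 'EOF'
* Example title line for an illustrative run
*
STOP
EOF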
Serial job
1. An example batch submission script:
### SGE job options
#$ -cwd        # Run the job in the current directory
#$ -V          # Export the environment (including loaded modulefiles) to the job
charmm
2. Submit with the command: `qsub scriptname`
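Once submitted, the job can be monitored with the usual SGE commands (standard SGE usage, not specific to CHARMM):

qstat                    # list your queued and running jobs
qstat -j <jobid>         # show detailed information for a specific job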
Parallel job
Single node job – 2 to 24 cores
Ensure you have loaded a charmm modulefile without 'ib' in the name.
An example batch submission script that will read a file named charmm.inp in the current directory:
#!/bin/bash
#$ -S bash
#$ -cwd
#$ -V
#$ -pe smp.pe 8        # Use 8 cores on a single node. Min 2, max 24.
mpiexec -n $NSLOTS charmm
Submit with the command qsub scriptname, where scriptname is the name of your jobscript from above.
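Since the -i and -o flags are the safest way to specify files (see above), a single-node jobscript can also name its input/output files explicitly. For example (myinputfile.inp and myoutputfile.out are placeholder names):

#!/bin/bash
#$ -S bash
#$ -cwd
#$ -V
#$ -pe smp.pe 8        # Use 8 cores on a single node. Min 2, max 24.
mpiexec -n $NSLOTS charmm -i myinputfile.inp -o myoutputfile.out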
Multi-node jobs – 48+ cores
Ensure you have loaded a charmm modulefile with -ib in the name.
An example batch submission script that will read a file named charmm.inp in the current directory:
#!/bin/bash
#$ -S bash
#$ -cwd
#$ -V
#$ -pe orte-24-ib 48   # Use two nodes, each with 24 cores
mpiexec -n $NSLOTS charmm
Submit with the command qsub scriptname, where scriptname is the name of your jobscript from above.
The following is an example batch submission script that will read and write named files:
#!/bin/bash
#$ -S bash
#$ -cwd
#$ -V
#$ -pe orte-24-ib 48   # Use two nodes, each with 24 cores
mpiexec -n $NSLOTS charmm -i myinputfile.inp -o myoutputfile.out
Submit the job using qsub jobscript, where jobscript is the name of your script above.
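When the job finishes, CHARMM's output is in the file named by -o, and SGE writes the job's stdout/stderr to files named after the jobscript (scriptname.o<jobid> and scriptname.e<jobid>). For example:

ls                          # scriptname.o<jobid> / scriptname.e<jobid> hold SGE and MPI messages
tail myoutputfile.out       # check that the CHARMM run terminated normally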
If your jobscript contains file input/output using redirection, such as:
# WARNING: This will FAIL!
mpiexec -n $NSLOTS charmm < myinputfile.inp > myoutputfile.out
it will fail when run in parallel. Use the -i and -o flags instead.
Further info
Updates
None at this time.