CHARMM
Overview
CHARMM is a many-particle molecular simulation program which focuses on biological systems, e.g. peptides, proteins, prosthetic groups, small molecule ligands, nucleic acids, lipids, and carbohydrates.
Version c36b2 of CHARMM is available, compiled with the xxlarge, MPI, COLFFT, MKL, PIPF, and DOMDEC options using Intel compilers v17.0.7.
Version 46b2 of CHARMM is available, compiled with the MPI, FFTW3, MKL, and COLFFT options and Gaussian09 support, using Intel compilers v19.1.2.
Restrictions on use
CHARMM c36b2 – this version is limited to one research group. The terms and conditions are very similar to those for 46b2.
CHARMM 46b2 may be used by UoM staff and students.
Visitors/Collaborators, even if they have a UoM IT username, are not permitted to access the software without the written consent of the CHARMM development project.
All users who wish to access this software must read and agree to the license and confirm this in an email to its-ri-team@manchester.ac.uk
A summary of some important points from the license:
- The software can only be used for internal non-commercial purposes, defined in the license as:
“academic or other not-for-profit scholarly research which (a) is not undertaken for profit, or (b) is not intended to produce works, services, or data for commercial use, or (c) is neither conducted, nor funded, by a person or an entity engaged in the commercial use, application or exploitation of works similar to CHARMM, unless such funding confers no commercial rights to the funding person or entity.”
- There is no access to the source code on the CSF.
- You must cite your use of the software as per clause 11 of the license.
Set up procedure
To use CHARMM load one of the following modulefiles, depending on the version required:
module load apps/intel-19.1/charmm/46b2
module load apps/intel-17.0/charmm/c36b2
Either modulefile will automatically load the appropriate compiler and MPI modulefiles.
Running the application
Please do not run CHARMM on the login node. You must submit all work to the batch system.
CHARMM 46b2
CHARMM will by default read a file named charmm.inp in the current directory if no input files are specified on the charmm command line in your jobscript.
Alternatively you may specify input and output files using command-line flags:
charmm -i myinputfile.inp -o myoutputfile.out
A third method, reading from stdin and writing to stdout with shell redirection, will fail when running in parallel:
# This will FAIL when running charmm in parallel
charmm < myinputfile.inp > myoutputfile.out
It is safest to always use the -i and -o flags.
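For reference, the smallest valid CHARMM input file is just a title block (lines beginning with *, terminated by a line containing only *) followed by a stop command. The sketch below is illustrative only; a real charmm.inp would read topology/parameter files and issue simulation commands before stop:

```
* Minimal illustrative CHARMM input file
* A real input would read topology/parameter files and run commands here
*

stop
```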
Serial job
An example batch submission script:
#!/bin/bash --login
### SGE Job Stuff
#$ -cwd
module load apps/intel-19.1/charmm/46b2
charmm
Submit with the command: qsub scriptname
where scriptname is the name of your jobscript.
Parallel job
Single node job – 2 to 32 cores
An example batch submission script that will read a file named charmm.inp in the current directory:
#!/bin/bash --login
#$ -cwd
#$ -pe smp.pe 8     # Use 8 cores on a single node. Min 2, max 32.
module load apps/intel-19.1/charmm/46b2
mpirun -n $NSLOTS charmm
Submit with the command qsub scriptname
where scriptname is the name of your jobscript.
Multi-node jobs – 48 or more cores in multiples of 24
Example 1: A jobscript that will read a file named charmm.inp in the current directory:
#!/bin/bash --login
#$ -cwd
#$ -pe mpi-24-ib 48   # Use two nodes, each with 24 cores
module load apps/intel-19.1/charmm/46b2
mpirun -n $NSLOTS charmm
Submit with the command qsub scriptname
where scriptname is the name of your jobscript.
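Since multi-node jobs must request 48 or more cores in multiples of 24, a small helper can sanity-check a core count before you edit the -pe line. This is a hedged sketch; valid_mpi24 is a hypothetical name, not part of CHARMM or the batch system:

```shell
#!/bin/bash
# Hypothetical helper: succeeds if N is a valid core count for the
# mpi-24-ib PE (at least 48 cores, and a multiple of 24).
valid_mpi24() {
  local n=$1
  [ "$n" -ge 48 ] && [ $((n % 24)) -eq 0 ]
}

valid_mpi24 48 && echo "48 cores: OK"        # two full 24-core nodes
valid_mpi24 50 || echo "50 cores: invalid"   # not a multiple of 24
```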
Example 2: A jobscript that will read and write named files:
#!/bin/bash --login
#$ -cwd
#$ -pe mpi-24-ib 48   # Use two nodes, each with 24 cores
module load apps/intel-19.1/charmm/46b2
mpirun -n $NSLOTS charmm -i myinputfile.inp -o myoutputfile.out
Submit the job using qsub scriptname
where scriptname is the name of your jobscript.
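Since jobscripts are plain text files, they can also be generated from the shell. A hedged sketch that writes Example 2's jobscript with a here-document (the filename charmm_multinode.sh is illustrative, not a site convention):

```shell
#!/bin/bash
# Write the Example 2 jobscript to a file (filename is illustrative).
# The quoted 'EOF' stops $NSLOTS being expanded here; the batch system
# sets it when the job runs.
cat > charmm_multinode.sh <<'EOF'
#!/bin/bash --login
#$ -cwd
#$ -pe mpi-24-ib 48   # Use two nodes, each with 24 cores
module load apps/intel-19.1/charmm/46b2
mpirun -n $NSLOTS charmm -i myinputfile.inp -o myoutputfile.out
EOF

echo "Jobscript written; submit with: qsub charmm_multinode.sh"
```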
If your jobscript contains file input/output using redirection, such as:
# WARNING: This will FAIL!
mpirun -n $NSLOTS charmm < myinputfile.inp > myoutputfile.out
it will fail when run in parallel. Use the -i and -o flags instead.
Further info
Updates
None at this time.