OpenMolcas
Overview
OpenMolcas is an open source version of MOLCAS.
Versions 18.09, 19.11, 21.02 and 22.06 are installed on the CSF.
Restrictions on use
OpenMolcas is open source software released under the Lesser General Public License (LGPL). It is free to use by all members of the University.
Set up procedure
To access the software you must first load one of the following modulefiles:
module load apps/gcc/openmolcas/18.09
module load apps/gcc/openmolcas/19.11
module load apps/gcc/openmolcas/21.02
module load apps/gcc/openmolcas/22.06
We now recommend that for batch jobs you load the modulefile in the jobscript rather than loading it on the command line prior to submission. See below for examples.
Running the application
Please do not run OpenMolcas on the login node. Jobs should be submitted to the compute nodes via the batch system. Note that we now recommend loading modulefiles within your batch scripts.
Serial batch job submission
Create a batch submission script, for example:
#!/bin/bash --login
#$ -cwd                  # Job will run from the current directory

module load apps/gcc/openmolcas/22.06

pymolcas mymol.input

### Use pymolcas -clean mymol.input to have the temporary scratch directory
### deleted at the end of the job (see below)
Submit the jobscript using:
qsub scriptname
where scriptname is the name of your jobscript.
Parallel batch job submission
Parallel jobs on a single node using OpenMP are currently possible. Multi-node calculations using MPI are not currently supported.
Create a batch submission script, for example:
#!/bin/bash --login
#$ -cwd                  # Job will run from the current directory
#$ -pe smp.pe 12         # 12 will be the number of OpenMP threads

module load apps/gcc/openmolcas/22.06

export OMP_NUM_THREADS=$NSLOTS
pymolcas mymol.input

### Use pymolcas -clean mymol.input to have the temporary scratch directory
### deleted at the end of the job (see below)
Submit the jobscript using:
qsub scriptname
where scriptname is the name of your jobscript.
OpenMolcas Scratch (temp) files
It is possible to modify how OpenMolcas uses your scratch directory for temporary files. Please read the following section so that you are aware of what OpenMolcas is doing with your scratch directory (you may create a lot of temporary junk files you do not need to keep).
The modulefiles above set the following environment variable:
MOLCAS_WORKDIR=/scratch/username

where username is your CSF username. This instructs OpenMolcas to create a directory in your scratch area named after your input file. For example, if your input file is called test000.input then OpenMolcas will create a directory named

/scratch/username/test000
in which to store temporary files used during the computation. This directory will not be deleted at the end of the job. Hence you may end up with a lot of these temporary directories if you run many jobs!
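Before running many jobs it can be worth checking your scratch area for leftover per-input directories. A minimal sketch (the /scratch/$USER fallback path is an assumption for illustration; the modulefiles above normally set MOLCAS_WORKDIR for you):

```shell
# List any per-input scratch directories left over from previous jobs.
# Falls back to /scratch/$USER if MOLCAS_WORKDIR is not set (assumption).
WORKDIR="${MOLCAS_WORKDIR:-/scratch/$USER}"
ls -ld "$WORKDIR"/*/ 2>/dev/null || echo "No leftover OpenMolcas scratch directories in $WORKDIR"
```

You can then remove any directories you no longer need (check the names carefully before deleting anything).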
To instruct OpenMolcas to delete this directory at the end of the job, add the -clean flag to the pymolcas command in your jobscript. For example:
# Automatically delete the temporary scratch directory at the end of the job (RECOMMENDED)
pymolcas -clean test000.input
If you wish to keep temporary directories and use a different temporary directory name each time you run (and rerun) the same input file (e.g., if you run the test000.input input with a different number of CPU cores to do some timing tests), instruct OpenMolcas to add a random number to the directory name by adding the following to your jobscript:
# OpenMolcas will add a random number to the temporary directory name
export MOLCAS_PROJECT=NAMEPID
Removing the -clean flag from the pymolcas command in your jobscript will prevent OpenMolcas from deleting these directories.
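Putting these settings together, a serial jobscript that keeps a uniquely named scratch directory for each run might look like the following sketch (the input filename is a placeholder):

```shell
#!/bin/bash --login
#$ -cwd                  # Job will run from the current directory

module load apps/gcc/openmolcas/22.06

# Append a random number to the scratch directory name so that repeated
# runs of the same input file do not reuse the same directory.
export MOLCAS_PROJECT=NAMEPID

# No -clean flag: the scratch directory is kept at the end of the job.
pymolcas test000.input
```

Remember that each run will then leave a new directory in your scratch area, so clean up periodically.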
Using a Job Array
If running OpenMolcas in a job array you may need to create a directory per task, otherwise the temporary directories and files created by OpenMolcas will overwrite each other when several job array tasks run at the same time. Remember that OpenMolcas uses the name of your input file when creating its temporary directory, so if each task in the job array uses the same OpenMolcas input filename this will cause a problem. To fix this, add the following to your jobscript before the line that runs OpenMolcas:
export MOLCAS_WORKDIR=/scratch/$USER/molcas_${JOB_ID}_${SGE_TASK_ID}
mkdir -p $MOLCAS_WORKDIR
Each task in the job array will then have its own directory, within which OpenMolcas will create a directory named after the input file (see above).
Further info
Updates
None.