The CSF2 has been replaced by the CSF3 - please use that system! This documentation may be out of date. Please read the CSF3 documentation instead.
EMAN2
Overview
EMAN2 is a broadly based greyscale scientific image processing suite with a primary focus on processing data from transmission electron microscopes. It performs single particle reconstructions (3-D volumetric models from 2-D cryo-EM images) at the highest possible resolution, and also offers support for single particle cryo-ET, as well as tools useful in many other subdisciplines such as helical reconstruction, 2-D crystallography and whole-cell tomography. EMAN2 is capable of processing very large data sets (>100,000 particles) very efficiently.
Note that EMAN2 is the successor to EMAN1 (which is not available on the CSF).
Version 2.11 of EMAN2 is installed on the CSF.
MPI support is provided by PyDusa v1.15, which required a custom build of the CSF OpenMPI 1.6 gcc installation.
Restrictions on use
The software is free to use for all CSF users. However, all users must make themselves aware of the citation requirements requested by the authors.
Set up procedure
To access the software you must first load one of the following modulefiles:
- For serial / multi-threaded batch jobs and interactive usage:
module load apps/binapps/eman/2.11
- For larger, multi-process (multi-node) jobs using fast InfiniBand networking
module load apps/binapps/eman/2.11-mpi-ib
Running the application
Please do not run EMAN2 on the login node. Jobs should be submitted to the compute nodes via batch, or run interactively via qrsh to obtain an interactive session on a backend node.
Interactive Usage
A number of EMAN2 tools can be run interactively, including e2projectmanager.py, e2display.py and an interactive python shell e2.py (for advanced users). Do not run these directly on the login node.
To run these commands:
- Start an interactive session on a backend compute node:
qrsh -l inter -l short
Wait for the session to begin. If no session can be started, try again later.
- Now load the serial modulefile on the backend node
module load apps/binapps/eman/2.11
- Now run the required EMAN2 program, for example:
e2projectmanager.py      # Could also be e2display.py or e2.py for an interactive python shell.
Note that the project manager can be used to launch other commands. It is OK to do so provided you do not add any parallel options to the commands generated by the EMAN2 project manager.
Once you have finished with the interactive session please terminate it using:
exit
which will return you to the login node.
Serial batch job submission
Make sure you have the serial modulefile loaded then create a batch submission script, for example:
#!/bin/bash
#$ -S /bin/bash
#$ -cwd             # Job will run from the current directory
#$ -V               # Job will inherit current environment settings

e2appname.py args   # Replace with the required EMAN2 tool (e.g., e2boxer.py)
Submit the jobscript using:
qsub scriptname
where scriptname is the name of your jobscript.
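If you submit many similar serial jobs, a small shell helper that prints the jobscript can save retyping the boilerplate. The sketch below is illustrative only (the make_eman_job name is our own, not a CSF or EMAN2 command); it emits the same script shown above:

```shell
# Illustrative helper (not part of the CSF or EMAN2): print a minimal
# serial EMAN2 jobscript for a given tool and its arguments.
make_eman_job() {
    local tool=$1    # EMAN2 program, e.g. e2boxer.py
    local args=$2    # arguments to pass to it
    cat <<EOF
#!/bin/bash
#$ -S /bin/bash
#$ -cwd
#$ -V
$tool $args
EOF
}

# Write a jobscript, then inspect it before submitting with qsub:
make_eman_job e2boxer.py "input.hdf" > myjob.sh
cat myjob.sh
```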
Multithreaded Single-node Parallel Job Submission
Make sure you have the non-mpi modulefile loaded then create a batch submission script, for example:
#!/bin/bash
#$ -S /bin/bash
#$ -cwd                  # Job will run from the current directory
#$ -V                    # Job will inherit current environment settings
#$ -pe smp.pe 4          # Can use between 2 and 16 cores in smp.pe

# The batch system will replace $NSLOTS with the number of cores given above
e2appname.py --parallel=thread:$NSLOTS args   # Replace with the required EMAN2 tool (e.g., e2boxer.py)
                                              # (not all e2 programs support parallel execution)
Submit the jobscript using:
qsub scriptname
where scriptname is the name of your jobscript.
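To preview what the jobscript will actually run, you can fake the batch system's variable locally: under SGE, $NSLOTS is set to the core count requested with -pe smp.pe. A quick sketch (e2appname.py stands in for any EMAN2 tool, as above):

```shell
# Simulate the batch system setting $NSLOTS (it would be 4 for '-pe smp.pe 4')
NSLOTS=4

# Build the command line exactly as the jobscript does
cmd="e2appname.py --parallel=thread:$NSLOTS args"
echo "$cmd"   # prints: e2appname.py --parallel=thread:4 args
```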
Multi-node Parallel Job Submission
Make sure you have one of the parallel (MPI) modulefiles loaded. We recommend the -mpi-ib version for large parallel jobs because they will run on nodes connected by faster InfiniBand networking. To use these nodes you must specify at least 24 cores and the number of cores must be a multiple of 12. Then create a batch submission script, for example:
#!/bin/bash
#$ -S /bin/bash
#$ -cwd                     # Job will run from the current directory
#$ -V                       # Job will inherit current environment settings
#$ -pe orte-24-ib.pe 48     # Minimum 48, must be a multiple of 24

# The batch system will replace $NSLOTS with the number of cores given above
e2appname.py --parallel=mpi:$NSLOTS:/scratch/$USER args   # Replace with the required EMAN2 tool (e.g., e2boxer.py)
                                                          # (not all e2 programs support parallel execution)

# Do not call mpirun directly - the e2 app will do that for you.
Submit the jobscript using:
qsub scriptname
where scriptname is the name of your jobscript.
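Before submitting, it is worth checking that your core request satisfies the rule shown in the example jobscript for orte-24-ib.pe (minimum 48 cores, in multiples of 24). A small sketch of our own (the valid_ib_cores helper is illustrative, not a CSF command):

```shell
# Returns success only if the request meets the orte-24-ib.pe rule
# from the example jobscript: at least 48 cores, a multiple of 24.
valid_ib_cores() {
    [ "$1" -ge 48 ] && [ $(( $1 % 24 )) -eq 0 ]
}

valid_ib_cores 48 && echo "48 cores: OK"        # two full 24-core nodes
valid_ib_cores 36 || echo "36 cores: invalid"   # not a multiple of 24
```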
Further info
Updates
None.