{"id":2749,"date":"2015-10-14T14:47:28","date_gmt":"2015-10-14T14:47:28","guid":{"rendered":"http:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/?page_id=2749"},"modified":"2016-07-22T12:42:11","modified_gmt":"2016-07-22T12:42:11","slug":"eman2","status":"publish","type":"page","link":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/software\/applications\/eman2\/","title":{"rendered":"EMAN2"},"content":{"rendered":"<h2>Overview<\/h2>\n<p><a href=\"http:\/\/blake.bcm.edu\/emanwiki\/EMAN2\">EMAN2<\/a> is a broadly based greyscale scientific image processing suite with a primary focus on processing data from transmission electron microscopes. It performs single particle reconstructions (3-D volumetric models from 2-D cryo-EM images) at the highest possible resolution, and also offers support for single particle cryo-ET, and tools useful in many other subdisciplines such as helical reconstruction, 2-D crystallography and whole-cell tomography. EMAN2 is capable of processing very large data sets (>100,000 particle) very efficiently<\/p>\n<p>Note that EMAN2 is the successor to EMAN1 (which is not available on the CSF).<\/p>\n<p>Version 2.11 of EMAN2 is installed on the CSF. <\/p>\n<p>MPI support is provided by the <a href=\"http:\/\/ncmi.bcm.edu\/ncmi\/software\/counter_222\/software_121\">PyDusa<\/a> v1.15. This has required a custom build of the CSF OpenMPI 1.6 gcc installation.<\/p>\n<h2>Restrictions on use<\/h2>\n<p>The software is free to use to all CSF users. 
However, all users should make themselves aware of the <a href=\"http:\/\/blake.bcm.tmc.edu\/emanwiki\/EMAN2\/FAQ\/CiteEman2\">citation requirements<\/a> requested by the authors.<\/p>\n<h2>Set up procedure<\/h2>\n<p>To access the software you must first load <em>one<\/em> of the following modulefiles:<\/p>\n<ul>\n<li>For serial \/ multi-threaded batch jobs and interactive usage:\n<pre>\r\nmodule load apps\/binapps\/eman\/2.11\r\n<\/pre>\n<\/li>\n<li>For larger, multi-process (multi-node) jobs using fast InfiniBand networking:\n<pre>\r\nmodule load apps\/binapps\/eman\/2.11-mpi-ib\r\n<\/pre>\n<\/li>\r\n<\/ul>\r\nThe above modulefiles will automatically load any dependent modulefiles (e.g., mpi).\r\n\r\n<h2>Running the application<\/h2>\r\n\r\nPlease <strong>do not run<\/strong> EMAN2 on the login node. Jobs should be submitted to the compute nodes via the batch system, or run interactively via <em>qrsh<\/em>, which provides an interactive session on a backend node.\r\n\r\n<h3>Interactive Usage<\/h3>\r\nA number of EMAN2 tools can be run interactively, including <code>e2projectmanager.py<\/code>, <code>e2display.py<\/code> and an interactive python shell <code>e2.py<\/code> (for advanced users). Do <strong>not<\/strong> run these directly on the login node.\r\n\r\nTo run these commands:\r\n<ol>\r\n<li>Start an interactive session on a backend compute node:\r\n<pre>\r\nqrsh -l inter -l short\r\n<\/pre>\n<p>Wait for the session to begin. If no session can be started, try again later.<\/p>\n<\/li>\n<li>Now load the serial modulefile on the backend node:\n<pre>\r\nmodule load apps\/binapps\/eman\/2.11\r\n<\/pre>\n<\/li>\n<li>Now run the required EMAN2 program, for example:\n<pre>\r\ne2projectmanager.py\r\n   #\r\n   # Could also be e2display.py or e2.py for an interactive python shell.\r\n<\/pre>\n<p>Note that the project manager can be used to launch other commands. 
It is OK to do so provided you do not add any <em>parallel<\/em> options to the commands generated by the EMAN2 project manager.<\/p>\n<\/li>\n<\/ol>\n<p>Once you have finished with the interactive session please terminate it using:<\/p>\n<pre>\r\nexit\r\n<\/pre>\n<p>which will return you to the login node.<\/p>\n<h3>Serial Batch Job Submission<\/h3>\n<p>Make sure you have the serial modulefile loaded, then create a batch submission script, for example:<\/p>\n<pre>\r\n#!\/bin\/bash\r\n#$ -S \/bin\/bash\r\n#$ -cwd             # Job will run from the current directory\r\n#$ -V               # Job will inherit current environment settings\r\n\r\ne2<em>appname<\/em>.py <em>args<\/em>\r\n    #\r\n    # replace with the required EMAN2 tool (e.g., e2boxer.py)\r\n<\/pre>\n<p>Submit the jobscript using:<\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<p>where <em>scriptname<\/em> is the name of your jobscript.<\/p>\n<h3>Multithreaded Single-node Parallel Job Submission<\/h3>\n<p>Make sure you have the non-MPI modulefile loaded, then create a batch submission script, for example:<\/p>\n<pre>\r\n#!\/bin\/bash\r\n#$ -S \/bin\/bash\r\n#$ -cwd             # Job will run from the current directory\r\n#$ -V               # Job will inherit current environment settings\r\n#$ -pe smp.pe 4     # Can use between 2 and 16 cores in smp.pe\r\n\r\n# The batch system will replace $NSLOTS with the number of cores given above\r\n\r\ne2<em>appname<\/em>.py --parallel=thread:$NSLOTS <em>args<\/em>\r\n    #\r\n    # replace with the required EMAN2 tool (e.g., e2boxer.py)\r\n    # (not all e2 programs support parallel execution)\r\n<\/pre>\n<p>Submit the jobscript using:<\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<p>where <em>scriptname<\/em> is the name of your jobscript.<\/p>\n<h3>Multi-node Parallel Job Submission<\/h3>\n<p>Make sure you have one of the parallel (MPI) modulefiles loaded. 
We recommend the <code>-mpi-ib<\/code> version for large parallel jobs because these jobs will run on nodes connected by faster InfiniBand networking. To use these nodes you must specify at least 48 cores, and the number of cores must be a multiple of 24. Then create a batch submission script, for example:<\/p>\n<pre>\r\n#!\/bin\/bash\r\n#$ -S \/bin\/bash\r\n#$ -cwd                     # Job will run from the current directory\r\n#$ -V                       # Job will inherit current environment settings\r\n#$ -pe orte-24-ib.pe 48     # Minimum 48, must be a multiple of 24\r\n\r\n# The batch system will replace $NSLOTS with the number of cores given above\r\n\r\ne2<em>appname<\/em>.py --parallel=mpi:$NSLOTS:\/scratch\/$USER <em>args<\/em>\r\n    #\r\n    # replace with the required EMAN2 tool (e.g., e2boxer.py)\r\n    # (not all e2 programs support parallel execution)\r\n    # Do <strong>not<\/strong> call mpirun directly - the e2 app will do that for you.\r\n<\/pre>\n<p>Submit the jobscript using:<\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<p>where <em>scriptname<\/em> is the name of your jobscript.<\/p>\n<h2>Further info<\/h2>\n<ul>\n<li><a href=\"http:\/\/blake.bcm.tmc.edu\/emanwiki\/EMAN2#Documentation\">EMAN2 Documentation<\/a><\/li>\n<li><a href=\"http:\/\/blake.bcm.tmc.edu\/emanwiki\/EMAN2\/Parallel\">EMAN2 Parallel options<\/a><\/li>\n<li><a href=\"http:\/\/blake.bcm.tmc.edu\/emanwiki\/EMAN2\">EMAN2 website<\/a><\/li>\n<\/ul>\n<h2>Updates<\/h2>\n<p>None.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Overview EMAN2 is a broadly based greyscale scientific image processing suite with a primary focus on processing data from transmission electron microscopes. It performs single particle reconstructions (3-D volumetric models from 2-D cryo-EM images) at the highest possible resolution, and also offers support for single particle cryo-ET, and tools useful in many other subdisciplines such as helical reconstruction, 2-D crystallography and whole-cell tomography. 
EMAN2 is capable of processing very large data sets (>100,000 particle) very.. <a href=\"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/software\/applications\/eman2\/\">Read more &raquo;<\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"parent":31,"menu_order":0,"comment_status":"open","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-2749","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/2749","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/comments?post=2749"}],"version-history":[{"count":8,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/2749\/revisions"}],"predecessor-version":[{"id":3221,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/2749\/revisions\/3221"}],"up":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/31"}],"wp:attachment":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/media?parent=2749"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}