{"id":4404,"date":"2018-02-09T16:08:55","date_gmt":"2018-02-09T16:08:55","guid":{"rendered":"http:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/?page_id=4404"},"modified":"2018-02-12T09:56:21","modified_gmt":"2018-02-12T09:56:21","slug":"plumed","status":"publish","type":"page","link":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/software\/applications\/plumed\/","title":{"rendered":"Plumed"},"content":{"rendered":"<h2>Overview<\/h2>\n<p><a href=\"http:\/\/www.plumed.org\/\">PLUMED<\/a> is an open-source library for free-energy calculations in molecular systems that works together with some of the most popular molecular dynamics engines.<\/p>\n<p>Version 2.4.0 is installed on the CSF. It was compiled with the Intel 15.0.3 compiler and uses the Intel MKL for BLAS and LAPACK functions.<\/p>\n<h2>Restrictions on use<\/h2>\n<p>There are no restrictions on accessing this software on the CSF. It is released under the <a href=\"https:\/\/github.com\/plumed\/plumed2\/blob\/master\/COPYING.LESSER\">GNU Lesser GPL v3.0<\/a> and any usage must adhere to that license.<\/p>\n<h2>Set up procedure<\/h2>\n<p>To access the software you must first load one of the following modulefiles:<\/p>\n<pre>\r\nmodule load apps\/intel-15.0\/plumed\/2.4.0            # Serial or Single-node OpenMP parallel\r\nmodule load apps\/intel-15.0\/plumed\/2.4.0-mpi        # Single-node MPI parallel\r\nmodule load apps\/intel-15.0\/plumed\/2.4.0-mpi-ib     # Multi-node MPI parallel using the InfiniBand network\r\n<\/pre>\n<h2>Running the application<\/h2>\n<p>Please do not run plumed on the login node. Jobs should be submitted to the compute nodes via the batch system.<\/p>\n<p>Plumed provides a number of commands \/ tools that are run via the main <code>plumed<\/code> executable. 
To see a list of the available commands \/ tools, you may run the following on the login node:<\/p>\n<pre>\r\nplumed -h\r\n<\/pre>\n<p>The tools are:<\/p>\n<pre>\r\ndriver\t\tkt\t\tsimplemd\tmklib\r\ndriver-float\tmanual\t\tsum_hills\tpartial_tempering\r\ngentemplate\tpathtools\tnewcv\t\tvim2html\r\ninfo\t\tpesmd\t\tconfig\t\tpatch\r\n<\/pre>\n<h3>Serial batch job submission<\/h3>\n<p>Make sure you have the OpenMP plumed modulefile loaded, then create a batch submission script, for example:<\/p>\n<pre>\r\n#!\/bin\/bash\r\n#$ -S \/bin\/bash\r\n#$ -cwd             # Job will run from the current directory\r\n#$ -V               # Job will inherit current environment settings\r\n\r\n# Inform plumed how many cores to use (1 for serial)\r\nexport OMP_NUM_THREADS=$NSLOTS\r\nplumed <em>toolname<\/em> <em>list of input flags for that tool<\/em>\r\n         #\r\n         # See above for the list of tool names. \r\n         # Example: To display the plumed version use:\r\n         # plumed info --version\r\n<\/pre>\n<p>Submit the jobscript using: <\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<p>where <em>scriptname<\/em> is the name of your jobscript.<\/p>\n<h3>OpenMP Parallel batch job submission<\/h3>\n<p>Make sure you have the OpenMP plumed modulefile loaded, then create a batch submission script, for example:<\/p>\n<pre>\r\n#!\/bin\/bash\r\n#$ -S \/bin\/bash\r\n#$ -cwd             # Job will run from the current directory\r\n#$ -V               # Job will inherit current environment settings\r\n#$ -pe smp.pe 8     # Number of cores (2-23 for single-node multi-core jobs)\r\n\r\n# Inform plumed how many cores to use\r\nexport OMP_NUM_THREADS=$NSLOTS\r\nplumed <em>toolname<\/em> <em>list of input flags for that tool<\/em>\r\n         #\r\n         # See above for the list of tool names. 
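\r\n         # Illustrative (hedged) example, not from the CSF docs: the\r\n         # driver tool can re-analyse a saved trajectory, assuming you\r\n         # have prepared a PLUMED input file (plumed.dat) and an xyz\r\n         # trajectory (traj.xyz) in the job directory:\r\n         # plumed driver --plumed plumed.dat --ixyz traj.xyz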
\r\n         # Example: To display the plumed version use:\r\n         # plumed info --version\r\n<\/pre>\n<p>Submit the jobscript using: <\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<p>where <em>scriptname<\/em> is the name of your jobscript.<\/p>\n<h3>MPI Parallel batch job submission<\/h3>\n<p>Make sure you have one of the MPI plumed modulefiles loaded (the -ib version should be used when running multi-node parallel jobs), then create a batch submission script, for example:<\/p>\n<pre>\r\n#!\/bin\/bash\r\n#$ -S \/bin\/bash\r\n#$ -cwd             # Job will run from the current directory\r\n#$ -V               # Job will inherit current environment settings\r\n\r\n#### Choose one of the following\r\n#$ -pe smp.pe 8           # Number of cores (2-23 for single-node parallel jobs)\r\n### OR\r\n#$ -pe orte-24-ib.pe  48  # Number of cores (48 or more in multiples of 24 for multi-node jobs)\r\n\r\n# Run the requested number of parallel instances of plumed\r\nmpirun -n $NSLOTS plumed <em>toolname<\/em> <em>list of input flags for that tool<\/em>\r\n                            #\r\n                            # See above for the list of tool names. \r\n                            # Example: To display the plumed version use:\r\n                            # plumed info --version\r\n<\/pre>\n<p>Submit the jobscript using: <\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<p>where <em>scriptname<\/em> is the name of your jobscript.<\/p>\n<h2>Further info<\/h2>\n<ul>\n<li><a href=\"https:\/\/plumed.github.io\/doc-v2.4\/user-doc\/html\/index.html\">PLUMED 2.4 documentation<\/a><\/li>\n<li><a href=\"http:\/\/www.plumed.org\/\">PLUMED<\/a> website<\/li>\n<\/ul>\n<h2>Updates<\/h2>\n<p>None.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Overview PLUMED is an open-source library for free-energy calculations in molecular systems that works together with some of the most popular molecular dynamics engines. Version 2.4.0 is installed on the CSF. 
It was compiled with the Intel 15.0.3 compiler and uses the Intel MKL for BLAS and LAPACK functions. Restrictions on use There are no restrictions on accessing this software on the CSF. It is released under the GNU Lesser GPL v3.0. <a href=\"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/software\/applications\/plumed\/\">Read more &raquo;<\/a><\/p>\n","protected":false},"author":15,"featured_media":0,"parent":31,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-4404","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/4404","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/users\/15"}],"replies":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/comments?post=4404"}],"version-history":[{"count":4,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/4404\/revisions"}],"predecessor-version":[{"id":4413,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/4404\/revisions\/4413"}],"up":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/31"}],"wp:attachment":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/media?parent=4404"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}