{"id":633,"date":"2018-10-22T16:16:02","date_gmt":"2018-10-22T15:16:02","guid":{"rendered":"http:\/\/ri.itservices.manchester.ac.uk\/csf3\/?page_id=633"},"modified":"2021-08-03T17:31:56","modified_gmt":"2021-08-03T16:31:56","slug":"molpro","status":"publish","type":"page","link":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/software\/applications\/molpro\/","title":{"rendered":"Molpro"},"content":{"rendered":"<h2>Overview<\/h2>\n<p><a href=\"https:\/\/www.molpro.net\/\">Molpro<\/a> is a comprehensive system of ab initio programs for advanced molecular electronic structure calculations<\/p>\n<p>2019.2.0, 2018.2.0 &amp; 2015.1.27 OpenMP are installed on the CSF3. Multi-node parallelism using MPI is supported by this version (though there are no multi-node resources in CSF3 yet).<\/p>\n<h2>Restrictions on use<\/h2>\n<p>This software is restricted to a specific research group. Please <a href=\"http:\/\/www.molpro.net\/info\/authors?portal=visitor&amp;choice=Authors\">cite<\/a> the software when used in your research.<\/p>\n<h2>Set up procedure<\/h2>\n<p>To access the software you must load one of the following modulefiles in your jobscript:<\/p>\n<pre>\r\nmodule load apps\/binapps\/molpro\/2021.2.1_mpipr\r\nmodule load apps\/binapps\/molpro\/2021.2.1_sockets\r\nmodule load apps\/binapps\/molpro\/2021.1.0_omp\r\nmodule load apps\/binapps\/molpro\/2018.2.0_omp\r\nmodule load apps\/binapps\/molpro\/2015.1.27_omp\r\nmodule load apps\/binapps\/molpro\/2019.2.0_omp<\/pre>\n<h2>Running the application<\/h2>\n<p>Please do not run molpro on the login node. Jobs should be submitted to the compute nodes via batch. Molpro is supplied with a script named <code>molpro<\/code> which will run the actual molpro binary <code>molpro.exe<\/code> with the requested number of cores. To see the available options run<\/p>\n<pre>molpro -h\r\n<\/pre>\n<p>on the login node. 
But please do NOT run simulations on the login node.<\/p>\n<h3>Serial batch job submission<\/h3>\n<p>Make sure you have your input file in the current directory and then create a jobscript in that directory. For example:<\/p>\n<pre>#!\/bin\/bash --login\r\n#$ -S \/bin\/bash\r\n#$ -cwd             # Job will run from the current directory\r\n\r\nmodule load apps\/binapps\/molpro\/2018.2.0_omp\r\n\r\nmolpro <em>args<\/em>\r\n<\/pre>\n<p>Submit the jobscript using:<\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<p>where <em>scriptname<\/em> is the name of your jobscript.<\/p>\n<h3>Single-node parallel batch job submission<\/h3>\n<p>Single-node parallel jobs can be run using MPI (multiple molpro processes are started) or with OpenMP (a single molpro process is started that runs multiple threads). The results and efficiency of these two methods may differ.<\/p>\n<p>Make sure you have your input file in the current directory and then create a jobscript in that directory. For example:<\/p>\n<pre>#!\/bin\/bash --login\r\n#$ -S \/bin\/bash\r\n#$ -cwd             # Job will run from the current directory\r\n#$ -pe smp.pe 8     # Number of cores (max 32)\r\n\r\nmodule load apps\/binapps\/molpro\/2018.2.0_omp\r\n\r\n# Run molpro with multiple MPI processes on a single node.\r\n# $NSLOTS is automatically set to the number of cores requested above.\r\nmolpro -n $NSLOTS <em>args<\/em>\r\n\r\n### OR ###\r\n\r\n# Run molpro.exe with multiple threads (using OpenMP) on a single node.\r\n# Note: running the molpro helper script always tries to start MPI\r\n# processes.\r\nmolpro.exe -t $NSLOTS <em>args<\/em>\r\n<\/pre>\n<p>Submit the jobscript using:<\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<h3>Multi-node parallel batch job submission &#8211; not available on CSF3 at the moment<\/h3>\n<p>A multi-node parallel job must use the MPI method of starting molpro.<\/p>\n<p>Make sure you have the modulefile loaded, then create a batch submission script, for example:<\/p>\n<pre>#!\/bin\/bash --login\r\n#$ -S \/bin\/bash\r\n#$ -cwd                   # Job will run from the current directory\r\n#$ -pe ??????.pe ??   # Minimum of 48 cores, must be a multiple of 24\r\n\r\nmolpro -n $NSLOTS <em>args<\/em>\r\n<\/pre>\n<p>Submit the jobscript using:<\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<h3>Experimental multi-node parallel job &#8211; not applicable to CSF3 at the moment<\/h3>\n<p>It is also possible to start <code>molpro.exe<\/code> directly with <code>mpirun<\/code>, as we do with other MPI applications. In this case you must also load an MPI modulefile. For example:<\/p>\n<pre># This is suitable for fully-populated nodes (where you are using all cores on the node)\r\nmodule load mpi\/intel-14.0\/openmpi\/1.8.3m-ib\r\nmodule load apps\/binapps\/molpro\/2015.1.0\r\n<\/pre>\n<p>Then submit a batch job containing:<\/p>\n<pre>#!\/bin\/bash --login\r\n#$ -S \/bin\/bash\r\n#$ -cwd                   # Job will run from the current directory\r\n#### See previous example for other PEs\r\n#$ -pe ?????.pe ???   # Minimum of 48 cores, must be a multiple of 24\r\n\r\n# We start molpro.exe with mpirun:\r\n\r\nmpirun -n $NSLOTS molpro.exe <em>args<\/em>\r\n<\/pre>\n<h2>Further info<\/h2>\n<ul>\n<li><a href=\"https:\/\/www.molpro.net\/\">MOLPRO website<\/a><\/li>\n<\/ul>\n<h2>Updates<\/h2>\n<p>None.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Overview Molpro is a comprehensive system of ab initio programs for advanced molecular electronic structure calculations 2019.2.0, 2018.2.0 &amp; 2015.1.27 OpenMP are installed on the CSF3. Multi-node parallelism using MPI is supported by this version (though there are no multi-node resources in CSF3 yet). Restrictions on use This software is restricted to a specific research group. Please cite the software when used in your research. Set up procedure To access the software you must load.. 
<a href=\"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/software\/applications\/molpro\/\">Read more &raquo;<\/a><\/p>\n","protected":false},"author":3,"featured_media":0,"parent":86,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-633","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/pages\/633","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/comments?post=633"}],"version-history":[{"count":11,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/pages\/633\/revisions"}],"predecessor-version":[{"id":5528,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/pages\/633\/revisions\/5528"}],"up":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/pages\/86"}],"wp:attachment":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf3\/wp-json\/wp\/v2\/media?parent=633"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}