{"id":1565,"date":"2014-06-11T14:01:39","date_gmt":"2014-06-11T14:01:39","guid":{"rendered":"http:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/?page_id=1565"},"modified":"2017-08-10T13:16:23","modified_gmt":"2017-08-10T13:16:23","slug":"wrf","status":"publish","type":"page","link":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/software\/applications\/wrf\/","title":{"rendered":"WRF and WPS"},"content":{"rendered":"<h2>Overview<\/h2>\n<p>The <a href=\"http:\/\/www.wrf-model.org\/index.php\">Weather Research &amp; Forecasting (WRF)<\/a> Model is a next-generation mesoscale numerical weather prediction system designed to serve both atmospheric research and operational forecasting needs.<\/p>\n<h3>Versions Installed<\/h3>\n<p>WPS + WRF ARW (Advanced Research WRF), compiled for Real cases (distributed memory parallelism &#8211; dmpar &#8211; version) with basic nesting, is installed on the CSF.<\/p>\n<h4>3.6<\/h4>\n<p>Version 3.6 has been compiled for use on AMD Bulldozer nodes only.<\/p>\n<h4>3.8<\/h4>\n<p>Version 3.8 has been compiled for use on Intel nodes only.<\/p>\n<h3>WRF Tutorial for CSF Users<\/h3>\n<p>All users of WRF on the CSF are strongly encouraged to work through the tutorial at the UoM WRF Community <a href=\"http:\/\/wiki.rac.manchester.ac.uk\/community\/WRF\/public\/CSFmodules#v3.6\">WRF \/ CSF tutorial page<\/a>. This provides a complete example of a WRF simulation including pre-processing with WPS and post-processing with NCL. Thanks to Jonathan Fairman (<a href=\"http:\/\/www.cas.manchester.ac.uk\/\">CAS<\/a>) for developing this tutorial.<\/p>\n<h3>Compilation Info<\/h3>\n<h4>3.6<\/h4>\n<p>WPS + WRF and their dependency libraries were compiled using the PGI 13.6 compiler with ACML math libraries optimized for Bulldozer FMA4 instructions (see modulefiles below). 
The following compiler flags were used:<\/p>\n<pre>-tp bulldozer -O3 -fast\r\n<\/pre>\n<p>See <a href=\"http:\/\/wiki.rac.manchester.ac.uk\/community\/WRF\/public\/Installation\/\">RAC Community Wiki WRF Build Pages<\/a> for how this version was compiled.<\/p>\n<h4>3.8<\/h4>\n<p>WPS + WRF and their dependency libraries were compiled using Intel 15.0 with the following compiler flags:<\/p>\n<pre>-w -O3 -ip -msse2 -axSSE4.2,AVX,CORE-AVX2\r\n<\/pre>\n<p><a href=\"https:\/\/github.com\/gcapes\/software-installation-scripts\/tree\/master\/WRF\/3.8\/CSF2\">See here<\/a> for build details.<\/p>\n<h3>Related Tools<\/h3>\n<p>For post-processing of WRF results, <a href=\"\/csf-apps\/software\/applications\/ncl\/\">NCL<\/a> is installed on the CSF. This is a binary install, so it can be run on any CSF node (not just AMD Bulldozer).<\/p>\n<h2>Restrictions on use<\/h2>\n<p>None &#8212; <a href=\"http:\/\/www2.mmm.ucar.edu\/wrf\/users\/public.html\">Public Domain<\/a>.<\/p>\n<h2>Set up procedure<\/h2>\n<p>To access the software you must first load the appropriate modulefile from the options below (if you&#8217;re not sure, it&#8217;s probably the first one):<\/p>\n<pre>module load apps\/intel-15.0\/WRF\/3.8\r\nmodule load apps\/pgi-13.6-acml-fma4\/wrf\/3.6-ib-amd-bd\r\n<\/pre>\n<h3>Settings applied by the WRF modulefile<\/h3>\n<p>The <code>WRF<\/code> modulefiles will automatically load the following modulefiles, which indicate the dependencies used when compiling WRF:<\/p>\n<h4>3.6<\/h4>\n<pre># These are automatically loaded for you\r\n\r\ncompilers\/PGI\/13.6-acml-fma4                   # PGI 13.6 with optimized maths libraries\r\nmpi\/pgi-13.6-acml-fma4\/openmpi\/1.6-ib-amd-bd   # OpenMPI 1.6 with InfiniBand\r\nlibs\/pgi-13.6-acml-fma4\/zlib\/1.2.8-ib-amd-bd   # ZLIB compression\r\nlibs\/pgi-13.6-acml-fma4\/hdf\/5\/1.8.13-ib-amd-bd # HDF-5 (serial I\/O)\r\nlibs\/pgi-13.6-acml-fma4\/netcdf\/4.3.2-ib-amd-bd # NetCDF4 including Fortran libraries\r\n<\/pre>\n<h4>3.8<\/h4>\n<pre># These 
are automatically loaded for you\r\n\r\ncompilers\/intel\/c\/15.0.3\r\ncompilers\/intel\/fortran\/15.0.3\r\nmpi\/intel-15.0\/openmpi\/1.8.30-ib\r\nlibs\/intel-15.0\/netcdf\/4.4.0\r\nlibs\/intel-15.0\/zlib\/1.2.8\r\nlibs\/intel-15.0\/hdf\/5\/1.8.16<\/pre>\n<h4>Both versions<\/h4>\n<p>Both <code>wrf<\/code> modulefiles will set the following environment variables (for convenient access to the installation directories and for some compilation settings):<\/p>\n<ul>\n<li><code>$WRF_DIR<\/code> &#8211; the WRFV3 installation directory<\/li>\n<li><code>$WPS_DIR<\/code> &#8211; the WPS installation directory<\/li>\n<li><code>$WPS_GEOG<\/code> &#8211; the static WPS_GEOG data installation directory<\/li>\n<li><code>$JASPERLIB<\/code> &#8211; location of the JPEG libraries for GRIB2<\/li>\n<li><code>$JASPERINC<\/code> &#8211; location of the JPEG headers for GRIB2<\/li>\n<li><code>$WRFIO_NCD_LARGE_FILE_SUPPORT<\/code> &#8211; set to 1 to enable compilation of large file support in NetCDF<\/li>\n<li><code>$NETCDF4<\/code> &#8211; set to 1 to enable compilation of NetCDF4 support<\/li>\n<\/ul>\n<p>To see the actual values of these variables (e.g. so you can put the WPS_GEOG directory in a namelist file) run:<\/p>\n<pre>echo $WPS_GEOG\r\n<\/pre>\n<p>The <code>$WRF_DIR\/run\/<\/code> and <code>$WPS_DIR<\/code> directories will also be added to your <code>$PATH<\/code> environment variable.<\/p>\n<h2>Running the application<\/h2>\n<p>Please do not run WRF on the login node. Jobs should be submitted to the compute nodes via batch.<\/p>\n<p><strong>Important:<\/strong> you must run WRF in your <em>scratch<\/em> directory, not your <em>home<\/em> directory. WRF input and output files can be large and you can generate lots of them. You will very likely fill up the <em>home<\/em> filesystem, which is shared with other members of your group. This will cause your jobs and <strong>other users&#8217;<\/strong> jobs to fail &#8211; resulting in very unhappy colleagues! 
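<\/p>\n<p>As a sketch of the intended pattern (the directory names below are illustrative &#8211; substitute your own scratch and home paths):<\/p>\n<pre># Run from your scratch directory, not home (illustrative paths)\r\ncd ~\/scratch\/wrf_run\/\r\nqsub wrf-job.sh\r\n\r\n# When the job completes, copy small config files and key results to home\r\ncp namelist.input wrf-job.sh ~\/wrf_runs\/\r\n<\/pre>\n<p>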
Please <strong>run in scratch<\/strong>. Any important results (and small input\/config files such as <em>namelist<\/em> files and batch jobscripts) can be copied back to <em>home<\/em> for safe-keeping (home is backed up, scratch is not).<\/p>\n<h3>Parallel batch job submission &#8211; all versions<\/h3>\n<p>The WRF and WPS executables were built as dmpar executables during WRF config (i.e., they use MPI). The following are available:<\/p>\n<ul>\n<li>In <code>$WRF_DIR\/run\/<\/code>: ndown.exe, nup.exe, real.exe, tc.exe, wrf.exe<\/li>\n<li>In <code>$WPS_DIR\/<\/code>: geogrid.exe, metgrid.exe, ungrib.exe (this one is not an MPI executable)<\/li>\n<\/ul>\n<h3>Version 3.8 only<\/h3>\n<p>Make sure you have loaded the modulefile, then create a batch submission script using the template <code>$WRF_DIR\/run\/qsub-wrf.sh<\/code> as an example:<\/p>\n<pre>#!\/bin\/bash\r\n#$ -S \/bin\/bash\r\n#$ -cwd                                      # Job will run from the current directory\r\n#$ -V                                        # Job will inherit current environment settings\r\n#$ -N WRF_job                                # Give the job a name of your choosing.\r\n#$ -m bea                                    # Send yourself an email when job starts, ends, or on error.\r\n#$ -M firs&#116;&#110;&#97;&#109;&#101;&#46;&#115;&#x75;&#x72;&#x6e;&#x61;&#x6d;&#x65;&#x40;&#x6d;&#x61;nche&#115;&#116;&#101;&#114;&#46;&#97;&#99;&#x2e;&#x75;&#x6b;     # Change to your email address.\r\n\r\n##### Multi-node MPI #####\r\n#$ -pe orte-24-ib.pe 48 -l haswell           # Use a multiple of 24\r\n\r\nmpirun -n $NSLOTS .\/wrf.exe                  # $NSLOTS is automatically set to number of cores<\/pre>\n<h3>Version 3.6 only<\/h3>\n<p>Note: if you try to run any of the above executables on the login node you will get the error message:<\/p>\n<pre>Illegal instruction\r\n<\/pre>\n<p>This is because the executables are compiled for the AMD Bulldozer architecture. 
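<\/p>\n<p>If you are unsure which CPU a node has, one way to check (a generic Linux command, not CSF-specific) is to inspect <code>\/proc\/cpuinfo<\/code>:<\/p>\n<pre>grep -m1 'model name' \/proc\/cpuinfo    # prints the CPU model of the current node\r\n<\/pre>\n<p>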
The login node uses an Intel CPU.<\/p>\n<p>Make sure you have the modulefile loaded, then create a batch submission script to run on the AMD Bulldozer nodes, for example:<\/p>\n<pre>#!\/bin\/bash\r\n#$ -S \/bin\/bash\r\n#$ -cwd                      # Job will run from the current directory\r\n#$ -V                        # Job will inherit current environment settings\r\n\r\n########### Choose ONE of these PEs ##########\r\n\r\n##### Multi-node MPI #####\r\n\r\n#$ -pe orte-64bd-ib.pe  128  # 128 cores or more in this PE, multiples of 64 only\r\n\r\n##### Single-node MPI or OpenMP #####\r\n\r\n#$ -pe smp-64bd.pe 64        # 64 cores or fewer\r\n\r\n\r\nmpirun -n $NSLOTS real.exe\r\n<\/pre>\n<p>Submit the jobscript using:<\/p>\n<pre>qsub <em>scriptname<\/em><\/pre>\n<p>where <em>scriptname<\/em> is the name of your jobscript.<\/p>\n<h3>Interactive Use and Compilation<\/h3>\n<p>If you need to compile a model against the WRF installation or wish to run one of the WPS tools quickly, you&#8217;ll need to start an interactive session on the Bulldozer <em>short<\/em> node as follows:<\/p>\n<pre>qrsh -l bulldozer -l short\r\n\r\n# Wait for the new prompt (or try again later if asked),\r\n# then set up the environment on the backend Bulldozer node:\r\nmodule load apps\/pgi-13.6-acml-fma4\/wrf\/3.6-ib-amd-bd\r\n<\/pre>\n<p>You can now run WRF and WPS tools interactively. However, please note:<\/p>\n<ul>\n<li>There is only one interactive Bulldozer node. Do not run on all 64 cores.<\/li>\n<li>Maximum runtime is 12 hours.<\/li>\n<li>The above <code>qrsh<\/code> command reserves you only one core. 
If you plan to run small MPI jobs interactively you must reserve the correct number of cores using:\n<pre>qrsh -l bulldozer -l short -pe smp-64bd.pe 8    # e.g., 8 cores\r\n<\/pre>\n<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<h2>Further info<\/h2>\n<ul>\n<li><a href=\"http:\/\/wiki.rac.manchester.ac.uk\/community\/WRF\/public\/CSFmodules#v3.6\">WRF \/ CSF tutorial page<\/a><\/li>\n<li><a href=\"\/csf-apps\/software\/applications\/ncl\/\">NCL on the CSF<\/a> for post-processing and visualization of WRF results.<\/li>\n<li><a href=\"http:\/\/www.wrf-model.org\/index.php\">WRF website<\/a> &#8211; General WRF info<\/li>\n<li><a href=\"http:\/\/www2.mmm.ucar.edu\/wrf\/users\/docs\/user_guide_V3\/contents.html\">WRF ARW info<\/a> &#8211; Details about the Advanced Research WRF (ARW) version used on CSF<\/li>\n<li><a href=\"http:\/\/www.dtcenter.org\/wrf-nmm\/users\/\">WRF NMM info<\/a> &#8211; Details about the Nonhydrostatic Mesoscale Model (NMM) version which is NOT currently installed on CSF (but may be of interest)<\/li>\n<li><a href=\"http:\/\/www2.mmm.ucar.edu\/wrf\/OnLineTutorial\/Introduction\/index.html\">WRF ARW tutorial<\/a> including the build procedure followed on the CSF<\/li>\n<li><a href=\"http:\/\/wiki.rac.manchester.ac.uk\/community\/WRF\/public\/Installation\/\">RAC Community Wiki WRF Build Pages<\/a> Step-by-step details of the build of WRF 3.6 on the CSF<\/li>\n<\/ul>\n<h2>Updates<\/h2>\n<p>None.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Overview Weather Research &amp; Forecasting (WRF) Model is a next-generation mesoscale numerical weather prediction system designed to serve both atmospheric research and operational forecasting needs. Version Installed WPS + WRF ARW (Advanced Research WRF) v3.6 compiled for Real cases, (distributed memory parallelism &#8211; dmpar &#8211; version) with basic nesting is installed on the CSF. 3.6 Version 3.6 has been compiled for use on AMD Bulldozer nodes only. 3.8 Version 3.8 has been compiled for use.. 
<a href=\"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/software\/applications\/wrf\/\">Read more &raquo;<\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"parent":31,"menu_order":0,"comment_status":"open","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-1565","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/1565","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/comments?post=1565"}],"version-history":[{"count":20,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/1565\/revisions"}],"predecessor-version":[{"id":4134,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/1565\/revisions\/4134"}],"up":[{"embeddable":true,"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/pages\/31"}],"wp:attachment":[{"href":"https:\/\/ri.itservices.manchester.ac.uk\/csf-apps\/wp-json\/wp\/v2\/media?parent=1565"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}