Ansys Mechanical

Overview

Ansys Mechanical can perform a variety of engineering simulations, including stress, thermal, vibration, thermo-electric, and magnetostatic simulations. The program has many finite-element analysis capabilities, ranging from a simple, linear, static analysis to a complex, nonlinear, transient dynamic analysis.

Version 2021R1 is installed.

Restrictions on Use

Only users who have been added to the Fluent group can run the application (yes, the Fluent group – this is due to the way all Ansys products are installed). Owing to licence restrictions, only users from the School of MACE and one specific CEAS research group can be added to this group. Requests to be added to the Fluent group should be emailed to

its-ri-team@manchester.ac.uk

Ansys Mechanical jobs must not be run on the login node. If you need to run an interactive job, please use qrsh as detailed below.

Set Up Procedure

Once you have been added to the Fluent group, you will be able to access the executables by using one of the following module commands:

module load apps/binapps/ansys/19.2
module load apps/binapps/ansys/19.5
module load apps/binapps/ansys/2021R1

We now recommend loading modulefiles within your jobscript so that you have a full record of how the job was run. See the example jobscript below for how to do this. Alternatively, you may load modulefiles on the login node and let the job inherit these settings.
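As a quick sketch, you can check which Ansys versions are available and confirm what is loaded using the standard environment-modules commands (the modulefile path is as shown above; these commands are intended for the CSF login node):

```shell
# List the Ansys modulefiles installed (path as given above)
module avail apps/binapps/ansys

# Load one version and confirm it is active
module load apps/binapps/ansys/2021R1
module list
```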

Running the application

Please do not run Ansys Mechanical on the login node. Jobs should be submitted to the compute nodes via the batch system.

The software can be run as a single-node multicore (shared-memory) application, a single-node distributed MPI application or a multi-node distributed MPI application. Please include all of the steps shown in the example jobscripts below so that your jobs run correctly.

Note that the parallel method used to run the application may restrict which parallel solvers or other features can be used. Please consult the Ansys documentation about this – see end of this page for how to access the Ansys online documentation.

Single-node 2-32 cores – shared-memory

Note that the default parallel mode is distributed memory (MPI) mode so we must use the -smp flag on the Ansys command-line to indicate we are running in shared-memory (single compute-node) parallel mode. Create a jobscript similar to the following:

#!/bin/bash --login
#$ -cwd
#$ -pe smp.pe 16        # Number of cores - can be 2--32

module load apps/binapps/ansys/2021R1

# $NSLOTS is set automatically to the number of cores requested above
ansys211 -smp -np $NSLOTS -b -i mysim.inp -o mysim.out -j mysim
                                                            #
                                                            # -j sets the job name, which is used to name
                                                            # the per-process output files (e.g., .rst, .err, .db)

Submit the job using

qsub jobscript

where jobscript is the name of your jobscript file.
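Once submitted, the job can be monitored with the usual SGE batch-system commands. A brief sketch (the job ID below is only an example – use the ID that qsub reports):

```shell
# Show the status of your queued and running jobs
qstat

# Delete a job if needed (123456 is an example job ID reported by qsub)
qdel 123456
```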

Single-node 2-32 cores – distributed memory MPI

Distributed memory mode is usually used for larger multi-node jobs. However, it can also be used for a single-node job. You may wish to do this if a particular solver can only be run in this mode and so is not supported by the shared-memory parallel mode described above. You may also want to compare the timing of both methods, if possible, to see which is faster. Create a jobscript similar to the following:

#!/bin/bash --login
#$ -cwd
#$ -pe smp.pe 8        # Number of cores - can be 2--32

# Load your required version
module load apps/binapps/ansys/2021R1

# Note: An extra setup step is required for Ansys Mechanical MPI jobs. You must do this!
source setup_ansys

ansys211 -mpi openmpi -mpifile $HOSTS_FILE -b -i mysim.inp -o mysim.out -j mysim
                                  # 
                                  # This is set by the "setup_ansys" script above

Submit the job using

qsub jobscript

where jobscript is the name of your jobscript file.
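If an MPI job fails to start, one thing worth checking is the hosts file generated by the setup step. This is only a sketch, assuming $HOSTS_FILE is exported by the setup_ansys script as noted in the jobscript above; the lines could be added to the jobscript just before the ansys211 command:

```shell
# Sanity check: print the MPI hosts file that will be passed via -mpifile.
# $HOSTS_FILE is assumed to be set by "source setup_ansys" (see above).
echo "MPI hosts file: $HOSTS_FILE"
cat "$HOSTS_FILE"
```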

Multi-node 48 cores or more in multiples of 24 – distributed memory MPI

Distributed memory mode must be used for larger multi-node jobs. Create a jobscript similar to the following:

#!/bin/bash --login
#$ -cwd
#$ -pe mpi-24-ib.pe 48        # Number of cores - can be 48 or more in multiples of 24

# Load your required version
module load apps/binapps/ansys/2021R1

# Note: An extra setup step is required for Ansys Mechanical MPI jobs. You must do this!
source setup_ansys

ansys211 -mpi openmpi -mpifile $HOSTS_FILE -b -i mysim.inp -o mysim.out -j mysim
                                  # 
                                  # This is set by the "setup_ansys" script above

Submit the job using

qsub jobscript

where jobscript is the name of your jobscript file.

GPU job with shared memory CPU parallelism

Please note that access to GPUs is not automatic and you must request access before running the following type of job. Please email its-ri-team@manchester.ac.uk to request access to the GPUs.

The GPU does not replace CPU usage in Ansys Mechanical – computation will be offloaded from the CPUs to the GPU when appropriate. You should also check the Ansys documentation for more details on which solvers can use the GPU and how to get the best performance from those solvers.

The following example uses two GPUs and 16 CPU cores – the number of GPUs you have access to depends on the contributing group under which you run CSF jobs.

#!/bin/bash --login
#$ -cwd
#$ -l v100=2              # Number of GPUs - can be 1-4 depending on your level of access
#$ -pe smp.pe 16          # Can be up to 8 cores per GPU

# Load your required version
module load apps/binapps/ansys/2021R1

# $NSLOTS is set automatically to the number of cores requested above.
# $NGPUS is set automatically to the number of GPUs requested above.
ansys211 -acc nvidia -na $NGPUS -smp -np $NSLOTS -b -i mysim.inp -o mysim.out -j mysim
                #          #
                #          # -na flag says how many accelerators (GPUs) are to be used
                #
                # -acc flag says which type of accelerator (GPU) is to be used

Submit the job using

qsub jobscript

where jobscript is the name of your jobscript file.
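The limit of 8 cores per GPU noted in the jobscript above can be checked before launching the solver. This is a minimal sketch, not part of the CSF-provided scripts; on the CSF the NGPUS and NSLOTS variables are set automatically by the batch system, so they are assigned manually here only to keep the sketch self-contained:

```shell
#!/bin/bash
# Sketch: verify the rule of at most 8 CPU cores per GPU before running ansys211.
# NGPUS and NSLOTS are set by the batch system on the CSF; hard-coded here
# only so the sketch can run standalone.
NGPUS=2
NSLOTS=16
MAX_CORES=$((8 * NGPUS))
if [ "$NSLOTS" -gt "$MAX_CORES" ]; then
    echo "Requested $NSLOTS cores but only $MAX_CORES allowed for $NGPUS GPU(s)" >&2
    exit 1
fi
echo "Core/GPU request OK: $NSLOTS cores, $NGPUS GPU(s)"
```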

Further Information

To access the Ansys online help, run the following command on the login node after loading the required modulefile:

anshelp

Attempting to open help page "https://ansyshelp.ansys.com/account/Secured?Token=.....".
If this page does not open, you may need to install ...
   #
   # A web-browser will NOT be opened on the CSF. Instead, paste the generated URL
   # in to your own web-browser. The URL is valid for a short period of time.

Paste the generated URL (https://ansyshelp.ansys.com/account/Secured?Token=.....) into your web-browser. Note that you need to do this soon after running the anshelp command because the URL is only valid for a short time. You should NOT need to log in to the Ansys support portal. If you are asked to log in to the Ansys portal, run the above anshelp command again to generate a new URL.

Last modified on April 21, 2023 at 3:26 pm by George Leaver