FEPX

Overview

FEPX is a finite element software package for polycrystal plasticity. It can model both the global and local mechanical behaviors of large polycrystalline aggregates with complex microstructures via a scalable parallel framework.

FEPX interfaces primarily with the program Neper, which acts as the primary pre- and post-processor. Neper handles the generation of FEPX's microstructure input (the mesh), the conversion of raw output to more accessible formats, and a wide array of output visualizations.

Version 1.1.1 is available on the CSF3.

Restrictions on use

FEPX is currently maintained and developed by the Advanced Computational Materials Engineering Laboratory (ACME Lab) at The University of Alabama.

FEPX is distributed as a free / open-source software, under the terms of the GNU General Public License (GPL). In short, this means that everyone is free to use FEPX and to redistribute it on a free basis. FEPX is not in the public domain; it is copyrighted and there are restrictions on its distribution (see the license and the related FAQ).

Set up procedure

To use the software you will need to load the modulefile:

# FEPX 1.1.1
module load apps/gcc/fepx/1.1.1

Running the application

FEPX must be run in batch.

By default, a minimum of two files is necessary to completely define a simulation: the configuration file (e.g. simulation.config) and the mesh file (e.g. simulation.msh). The configuration file defines the material parameters, the control of the simulation (i.e., boundary conditions and loading history), the printing of output files, and various optional inputs. The mesh file contains the polycrystal finite element mesh information (grain morphologies, phases and crystal orientations) as well as simulation-related information on the domain faces and the mesh partitions used for parallel simulations.
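
For orientation, a heavily abridged configuration file might look like the sketch below. This is based on the `1_uniaxial' example shipped with FEPX; the parameter names and values here are illustrative only, so consult the FEPX docs for the authoritative keywords.

```
## Material Parameters

    number_of_phases 1

    phase 1

    crystal_type FCC
    m 0.05
    gammadot_0 1.0
    h_0 200.0
    g_0 210.0
    g_s0 330.0
    n 1.0
    c11 245.0e3
    c12 155.0e3
    c44 62.5e3

## Boundary Condition

    boundary_conditions uniaxial_grip
    loading_direction X
    loading_face x_max
    strain_rate 1e-2

## Deformation History

    def_control_by uniaxial_strain_target
    number_of_strain_steps 1
    target_strain 0.01 1 print_data

## Printing Results

    print stress
    print strain
```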

The mesh file is generally generated by Neper and not modified directly. Please note that Neper is loaded alongside FEPX and can therefore be called directly from your FEPX jobscript. See below for an example script, generate_mesh.sh, that generates a suitable mesh file using Neper; this script is then executed within the FEPX jobscripts that follow.

#!/bin/bash

# Generate the tessellation and mesh files for use in example `1_uniaxial'
# - This bash script requires a configured installation of Neper 4.0.0
#   in order to be properly executed.

# First, generate a centroidal tessellation for the domain with `Neper -T':
neper -T -n 100 -reg 1 -rsel 1.25 -mloop 4 \
    -morpho "diameq:1,1-sphericity:lognormal(0.145,0.03)" -morphooptistop val=5e-3 \
    -oricrysym "cubic" \
    -o simulation

# Then, generate a coarse finite element mesh for the domain with `Neper -M':
neper -M simulation.tess -order 2 -rcl 1.25 -part 2

# This script produces the output files:
# - simulation.tess
# - simulation.msh

exit 0

Both the configuration file and the mesh file should reside in the working directory from where the job was submitted.
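
A small pre-flight check along these lines can catch a missing input before the job is submitted. This is a convenience sketch, not part of FEPX; the file names are the examples used in this document.

```shell
# Report which of the two required input files, if any, are missing
# from the current (working) directory.
check_inputs() {
    missing=""
    for f in simulation.config simulation.msh; do
        [ -f "$f" ] || missing="$missing $f"
    done
    # Prints "ok" when both files are present, otherwise the missing names.
    echo ${missing:-ok}
}
```

Run `check_inputs' in the working directory before calling qsub.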

Example simulations can be found in the directory $FEPX_examples; their usage is outlined in the FEPX documentation (FEPX docs). Feel free to copy them to your local directory.
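
For instance, copying the `1_uniaxial' example (the name used in the mesh script above) could be wrapped up as follows. The helper function is a sketch for illustration, not part of FEPX; $FEPX_examples is set by the modulefile.

```shell
# Copy one named example from the shared examples directory into the
# current directory.
copy_example() {
    cp -r "$1/$2" . && echo "Copied $2"
}

# On the CSF you would run:
#   copy_example "$FEPX_examples" 1_uniaxial
```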

Examples

Serial batch job submission

  • Make sure you have the module loaded.
  • Create or upload a simulation file and mesh file to your working directory.
  • Write a submission script, for example:
    #!/bin/bash --login
    
    #$ -cwd
    
    module load apps/gcc/fepx/1.1.1    # This will also load Neper
    
    # A mesh file is required: either copy one to the working directory
    # or use generate_mesh.sh (see above) to generate one.
    ./generate_mesh.sh
    
    # Then, run FEPX.
    fepx
    
  • Submit with: qsub scriptname

Parallel (Single-Node) batch job submission

  • Make sure you have the module loaded.
  • Create or upload a simulation file and mesh file to your working directory.
  • Write a submission script, for example:
    #!/bin/bash --login
    
    #$ -cwd
    #$ -pe smp.pe 2 # Number of cores (2-32)
    
    export OMP_NUM_THREADS=$NSLOTS
    
    module load apps/gcc/fepx/1.1.1     # This will also load Neper
    
    # A mesh file is required: either copy one to the working directory
    # or use generate_mesh.sh (see above) to generate one.
    ./generate_mesh.sh
    
    # Then, run FEPX in parallel with OpenMPI's `mpirun' on $NSLOTS cores.
    mpirun -np $NSLOTS fepx
    
  • Submit with: qsub scriptname

Parallel (Multi-Node) batch job submission

  • Make sure you have the module loaded.
  • Create or upload a simulation file and mesh file to your working directory.
  • Write a submission script, for example:
    #!/bin/bash --login
    
    #$ -cwd
    #$ -pe mpi-24.ib.pe 48 # Number of cores (48-128 in multiples of 24)
    
    module load apps/gcc/fepx/1.1.1    # This will also load Neper
    
    # A mesh file is required: either copy one to the working directory
    # or use generate_mesh.sh (see above) to generate one.
    ./generate_mesh.sh
    
    # Then, run FEPX in parallel with OpenMPI's `mpirun' on $NSLOTS cores.
    mpirun -np $NSLOTS fepx
    
  • Submit with: qsub scriptname
Acknowledgment

If you use FEPX for your own work, please mention it explicitly in your reports and cite the following publication: P.R. Dawson, D.E. Boyce. FEPX Finite Element Polycrystals: Theory, Finite Element Formulation, Numerical Implementation and Illustrative Examples. arXiv:1504.03296.

Further info

  • Further information, including detailed documentation, can be found in the FEPX docs

Last modified on June 11, 2021 at 4:10 pm by Chris Grave