MOOSE Configuration
Configure and test your own "MOOSE" framework
Basic Steps
- mkdir projects
- cd projects
- module load moose/26-jul-21
- git clone https://github.com/idaholab/moose.git
- cd moose
- git checkout master
- export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
- Run ./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required --with-vtk-lib=${HPC_VTK_LIB} --with-vtk-include=${HPC_VTK_INC} and allow it to complete; it should finish without errors.
- cd test; ./run_tests -j 4
- Build and test the "phase field" module:
- cd ~/projects/moose/modules/phase_field
- make -j 4
- ./run_tests -j 4
- Build and test the "combined" module:
- cd ~/projects/moose/modules/combined
- make -j 4
- ./run_tests -j 4
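The basic steps above form a linear pipeline in which each step only makes sense if the previous one succeeded. A minimal fail-fast sketch of that flow (the real cluster commands are replaced by echo stand-ins so the control flow itself can be run anywhere; substitute the actual module/git/build commands on the cluster):

```shell
#!/bin/bash
# Fail-fast runner: abort the whole setup on the first failed step.
run_step() {
  echo ">> $*"
  "$@" || { echo "step failed: $*" >&2; exit 1; }
}

# echo stand-ins for the real commands listed above
run_step echo "module load moose/26-jul-21"
run_step echo "git clone https://github.com/idaholab/moose.git"
run_step echo "./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required ..."
run_step echo "make -j 4 && ./run_tests -j 4"
echo "all steps completed"
```

Because `run_step` exits on the first failure, a broken libmesh build cannot be silently followed by a misleading test run.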
Additional Information
More Detailed Instructions
Load a dev Session
Build libmesh and the MOOSE executables in the dev session, but run the tests on the login node. The following alias, which you can add to your ~/.bash_profile, starts a dev session:
vim ~/.bash_profile
alias devsess='srundev --mem=32gb --ntasks=4 --cpus-per-task=4 --time=12:00:00'
Load the MOOSE Module
There are multiple versions of the moose module, each built against different versions of the required libraries; in general, the latest module uses the latest libraries. The module listed below is a good default, but you may have specific reasons to use a different version.
module load moose/26-jul-21
Check the module list; you should see something like:
module list
Currently Loaded Modules:
1) ufrc
2) gcc/9.3.0
3) openmpi/4.1.1
4) petsc/3.15.1
5) qt/5.12.9
6) vtk/8.2.0
7) moose/26-jul-21
Clone MOOSE
mkdir ~/projects
cd ~/projects
git clone https://github.com/idaholab/moose.git
cd moose
git checkout master
Export VTK paths
Make sure you export the VTK paths after the MOOSE module is loaded; otherwise the paths may be empty.
export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
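A small guard makes the "empty path" failure mode explicit before you continue. This is a sketch; HPC_VTK_LIB and HPC_VTK_INC are assumed to be set by the moose module on this cluster:

```shell
# Refuse to proceed if the VTK paths are empty, which happens when the
# export is run before the moose module is loaded.
check_vtk_paths() {
  if [ -z "${VTKLIB_DIR:-}" ] || [ -z "${VTKINCLUDE_DIR:-}" ]; then
    echo "VTK paths are empty -- load the moose module, then re-export" >&2
    return 1
  fi
  echo "VTK lib: ${VTKLIB_DIR}"
  echo "VTK include: ${VTKINCLUDE_DIR}"
}
```

Run `check_vtk_paths` right after the export; a non-zero return means the libmesh build below would be misconfigured.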
Peacock
To use the peacock input-file front-end, add the 'python' directory of your MOOSE install to PYTHONPATH, since peacock requires the mooseutils python module. For example:
export PYTHONPATH=/my/moose/dir/python:$PYTHONPATH
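If the export line above ends up in your ~/.bash_profile, it will add a duplicate entry every time the file is sourced. A de-duplicating sketch (/my/moose/dir is the placeholder path from above; substitute your actual clone location):

```shell
# Prepend a directory to PYTHONPATH only if it is not already there,
# so repeated sourcing of ~/.bash_profile does not grow the variable.
prepend_pythonpath() {
  case ":${PYTHONPATH:-}:" in
    *":$1:"*) ;;  # already present: do nothing
    *) export PYTHONPATH="$1${PYTHONPATH:+:$PYTHONPATH}" ;;
  esac
}

unset PYTHONPATH                          # clean slate for the demo
prepend_pythonpath /my/moose/dir/python   # placeholder path from above
prepend_pythonpath /my/moose/dir/python   # second call is a no-op
echo "$PYTHONPATH"                        # -> /my/moose/dir/python
```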
Update and Build libmesh
Run the libmesh update-and-rebuild script; it should finish without errors.
cd ~/projects/moose
./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required --with-vtk-lib=${HPC_VTK_LIB} --with-vtk-include=${HPC_VTK_INC}
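The libmesh build is long, and capturing a log while still watching the output makes failures easier to report. A bash sketch of that pattern (`true` stands in for the real script invocation so the pattern itself is runnable; `pipefail` is needed so the `if` sees the build's exit status rather than tee's):

```shell
#!/bin/bash
set -o pipefail   # make the pipeline fail when the build fails, not tee

# Stand-in for: ./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required ...
build_cmd="true"

if $build_cmd 2>&1 | tee libmesh_build.log; then
  echo "libmesh build finished without errors"
else
  echo "libmesh build FAILED -- inspect libmesh_build.log" >&2
fi
```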
Compile MOOSE Tests
Build the MOOSE test executable.
cd ~/projects/moose/test
make -j 4
Run the Test Suites
Now run the tests. The test suite in moose/test verifies that all libraries are set up correctly. You need to exit the dev session and run the tests on the login node. If the build was successful, build the module you are interested in, then exit the dev session and run both test suites. Don't forget that you may need to reload the moose module on the login node.
cd ~/projects/moose/test
./run_tests -j 4 (login node)
Compile and Test Your Own Module
Build the executable and run the specific tests.
cd ~/projects/moose/modules/combined
make -j16 (dev session)
./run_tests -j 4 (login node)
General suggestions
If you are not working from a fresh clone of MOOSE, it is recommended to do a clean-up before you recompile your executables.
cd ~/projects/moose/modules/combined
make clobberall
make -j 4
You may consider adding the module load and export steps to your ~/.bash_profile:
vim ~/.bash_profile
module purge
module load moose/12-aug-20
export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
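In shells where the `module` command is not available (e.g. some non-login contexts), those lines would print errors on every login. A guarded sketch of the same ~/.bash_profile fragment (module version as recommended earlier; adjust to taste):

```shell
# ~/.bash_profile fragment: set up the MOOSE environment only when the
# `module` command (often a shell function) actually exists here.
if command -v module >/dev/null 2>&1; then
  module purge
  module load moose/26-jul-21
  export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
fi
```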
MOOSE-based Applications
Perform all the previous steps related to the MOOSE installation. Then build your application and test it.
cd ~/projects/your_app
make clobberall
make -j 4
./run_tests -j 4 (login node)
Slurm Job Script
This is an example of a slurm job script:
#!/bin/sh
#SBATCH --job-name=moose #Job name
#SBATCH --nodes=1 #Number of nodes (servers, 32 proc/node)
#SBATCH --ntasks=16 #Number of tasks/MPI ranks
#SBATCH --ntasks-per-node=16 #Tasks per node
#SBATCH --ntasks-per-socket=8 #Tasks per socket
#SBATCH --cpus-per-task=1 #Number of CPU per task
#SBATCH --mem-per-cpu=3600mb #Memory (120 gig/server)
#SBATCH --distribution=cyclic:cyclic #Distribute tasks cyclically
#SBATCH --time=12:00:00 #Walltime days-hh:mm:ss
#SBATCH --output=moose-%j.out #Output and error log
#SBATCH --mail-type=END,FAIL #When to email user
#SBATCH --mail-user=your-email@ufl.edu #Email address to send mail to
#SBATCH --account=michael.tonks #Allocation group name, add -b for burst job
srun --mpi=pmix_v3 ~/projects/moose/modules/combined-opt -i moose_input_file.i
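The memory request in the script above must fit on the node: ntasks-per-node times mem-per-cpu gives the per-node footprint. A quick arithmetic check (values taken from the script; the 120 GB figure is the per-server limit noted in its comments):

```shell
# Sanity-check the per-node memory request of the job script above.
ntasks_per_node=16
mem_per_cpu_mb=3600
node_limit_mb=$((120 * 1024))               # ~120 GB per server

total_mb=$((ntasks_per_node * mem_per_cpu_mb))
echo "requested per node: ${total_mb} MB"   # -> 57600 MB
if [ "$total_mb" -le "$node_limit_mb" ]; then
  echo "fits within the ${node_limit_mb} MB node limit"
fi
```

At 57600 MB (~57.6 GB), this request fits comfortably within a 120 GB server.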