MOOSE Configuration

Configure and test your own "MOOSE" framework

Note
Make sure your .bashrc and .bash_profile shell initialization scripts are 'clean'. If you encounter errors while setting up MOOSE, check that those scripts do not make any environment changes (for example, loading modules or exporting variables).
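
A quick way to check is to search your startup scripts for anything that changes the environment; this is only a suggested check, and the patterns may need adjusting for your setup:

grep -nE 'module (load|purge)|export |PATH=' ~/.bashrc ~/.bash_profile 2>/dev/null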

Basic Steps

  1. mkdir projects
  2. cd projects
  3. module load moose/26-jul-21
  4. git clone https://github.com/idaholab/moose.git
  5. cd moose
  6. git checkout master
  7. export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
  8. Run "./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required --with-vtk-lib=${HPC_VTK_LIB} --with-vtk-include=${HPC_VTK_INC}" and allow it to complete. It should finish without errors.
  9. cd test; make -j 4; ./run_tests -j 4
  10. Build and test the "phase field" module:
    1. cd moose/modules/phase_field
    2. make -j 4
    3. ./run_tests -j 4
  11. Build and test the "combined" module:
    1. cd moose/modules/combined
    2. make -j 4
    3. ./run_tests -j 4
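
For convenience, the steps above can also be run as a single shell session; the commands below simply collect what is listed in this guide, using the ~/projects clone location from the detailed instructions:

mkdir ~/projects
cd ~/projects
module load moose/26-jul-21
git clone https://github.com/idaholab/moose.git
cd moose
git checkout master
export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required --with-vtk-lib=${HPC_VTK_LIB} --with-vtk-include=${HPC_VTK_INC}
cd test && make -j 4 && ./run_tests -j 4
# Then build and test the modules you need (steps 10 and 11 above)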

Additional Information

See Bugzilla Request #43807: https://support.rc.ufl.edu/show_bug.cgi?id=43807

More Detailed Instructions

Load the MOOSE Module

There are multiple versions of the MOOSE module, each built against different versions of the required libraries. In general, the latest module uses the latest libraries. The module listed below is a good default, but you may have specific reasons to use a different version.
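
To see which MOOSE modules are installed, you can query the module system; module spider is the usual command when Lmod is the module system, which is an assumption here:

module spider moose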

module load moose/26-jul-21


Check the module list; you should see something like:

module list

Currently Loaded Modules:
  1) gcc/9.3.0
  2) mkl/2020.0.166 
  3) openmpi/4.1.1 
  4) petsc/3.15.1 
  5) qt/5.12.9 
  6) vtk/8.2.0 
  7) moose/26-jul-21

Clone MOOSE

mkdir ~/projects
cd ~/projects
git clone https://github.com/idaholab/moose.git
cd moose
git checkout master

Export VTK paths

Make sure you export the VTK paths after the MOOSE module is loaded; otherwise, the paths may be empty.

export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
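
To confirm that the module actually set these paths, you can print the underlying variables; empty output means the MOOSE module is not loaded:

echo "HPC_VTK_LIB=${HPC_VTK_LIB}"
echo "HPC_VTK_INC=${HPC_VTK_INC}"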

Peacock

To use the peacock input-file front-end (https://mooseframework.inl.gov/application_usage/peacock.html), add the 'python' directory of your MOOSE install to PYTHONPATH, since peacock requires the mooseutils Python module. For example:

export PYTHONPATH=/my/moose/dir/python:$PYTHONPATH
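
With the clone location used in this guide (~/projects/moose), that placeholder path would typically become the following; adjust it if you cloned MOOSE somewhere else:

export PYTHONPATH=$HOME/projects/moose/python:$PYTHONPATH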

Update and Build libmesh

Run the libmesh update-and-rebuild script. This step takes a while, but it should finish without errors (though you will see some warnings).

cd ~/projects/moose
./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required --with-vtk-lib=${HPC_VTK_LIB} --with-vtk-include=${HPC_VTK_INC}
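
Because the build is long-running, it can be convenient to capture the output in a log file so the warnings can be reviewed afterwards; this is just an optional shell pattern, not a required step:

cd ~/projects/moose
./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required --with-vtk-lib=${HPC_VTK_LIB} --with-vtk-include=${HPC_VTK_INC} 2>&1 | tee libmesh_build.log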

Compile MOOSE Tests

Build the MOOSE test executables.

cd ~/projects/moose/test
make -j 4

Run the Test Suites

Now run the tests. The test suite in moose/test is important for checking that all libraries are correct. You need to exit the dev session and run the tests on the login node. If the build was successful, it is recommended to build the module you are interested in, then exit the dev session and run both test suites. Don't forget that you may need to reload the moose module on the login node.

cd ~/projects/moose/test
./run_tests -j 4 (login node)
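
Build in the development session and run the tests on the login node. A minimal sketch of that workflow, assuming UFRC's srundev command is used to start the interactive development session (the resource values are only examples):

srundev --mem=32gb --ntasks=4 --cpus-per-task=4 --time=12:00:00   # start a dev session from the login node

# Inside the dev session: build, then exit back to the login node
cd ~/projects/moose/test
make -j 4
exit

# Back on the login node: reload the module if needed, then run the tests
module load moose/26-jul-21
cd ~/projects/moose/test
./run_tests -j 4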

Compile and Test Your Own Module

Build the executable and run the specific tests.

cd ~/projects/moose/modules/combined
make -j16 (dev session)
./run_tests -j 4 (login node)

General suggestions

If you are not starting from a fresh clone of MOOSE, it is recommended to clean up before recompiling your executables.

cd ~/projects/moose/modules/combined
make clobberall
make -j 4

MOOSE-based Applications

Perform all the previous steps related to the MOOSE installation. Then build your application and test it.

cd ~/projects/YourAppName
make clobberall
make -j 4
./run_tests -j 4 (login node)

SLURM Job Script

This is an example of a SLURM job script:

#!/bin/sh
#SBATCH --job-name=moose                 #Job name
#SBATCH --nodes=1                        #Number of nodes (servers, 32 proc/node)
#SBATCH --ntasks=16                      #Number of tasks/MPI ranks
#SBATCH --ntasks-per-node=16             #Tasks per node
#SBATCH --ntasks-per-socket=8            #Tasks per socket
#SBATCH --cpus-per-task=1                #Number of CPUs per task
#SBATCH --mem-per-cpu=3600mb             #Memory (120 gig/server)
#SBATCH --distribution=cyclic:cyclic     #Distribute tasks cyclically 
#SBATCH --time=12:00:00                  #Walltime days-hh:mm:ss
#SBATCH --output=moose-%j.out            #Output and error log
#SBATCH --mail-type=END,FAIL             #When to email user
#SBATCH --mail-user=your-email@ufl.edu   #Email address to send mail to
#SBATCH --account=michael.tonks          #Allocation group name, add -b for burst job

srun --mpi=pmix_v3 ~/projects/moose/modules/combined-opt -i moose_input_file.i
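
Assuming the script above is saved to a file (the name moose_job.sh here is just an example), it can be submitted and monitored with the standard SLURM commands:

sbatch moose_job.sh        # submit the job
squeue -u $USER            # check the status of your jobs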