Configure and test your own "MOOSE" framework.

==Note==
* Make sure your .bashrc and .bash_profile shell initialization scripts are 'clean'. If you encounter errors while setting up MOOSE, make sure there are no environment changes caused by those scripts.

==Basic Steps==
Run the build steps from a development session ([[Development and Testing]]) and the run_tests commands from a login node:
* mkdir ~/projects
* cd ~/projects
* module load moose/26-jul-21
* git clone https://github.com/idaholab/moose.git
* cd moose
* git checkout master
* export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
* Run "./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required --with-vtk-lib=${HPC_VTK_LIB} --with-vtk-include=${HPC_VTK_INC}" and allow it to complete. It should finish without errors.
* cd test; make -j 4; ./run_tests -j 4 (run_tests from a login node)
* Build and test the "phase_field" module:
** cd ~/projects/moose/modules/phase_field
** make -j 4
** ./run_tests -j 4
* Build and test the "combined" module:
** cd ~/projects/moose/modules/combined
** make -j 4
** ./run_tests -j 4

==Additional Information==
See [https://support.rc.ufl.edu/show_bug.cgi?id=43807 Bugzilla Request #43807]

==More Detailed Instructions==
The sections below expand on the Basic Steps above. Run the build steps from a development session ([[Development and Testing]]); a sketch of requesting one follows.
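See [[Development and Testing]] for the supported way to start a development session. As a rough sketch only, with placeholder resource values you should adjust to your needs, an interactive session can be requested with SLURM:
<pre>
# Hypothetical resource values; adjust cores, memory, and walltime to your needs.
srun --ntasks=1 --cpus-per-task=8 --mem=16gb --time=08:00:00 --pty bash -i
</pre>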
===Load the MOOSE Module===
There are multiple versions of the MOOSE module, each built against different versions of the required libraries; in general, the newest module uses the newest libraries. The module below is a good default, but you may have specific reasons to use a different version.
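To see which MOOSE module versions are available before loading one (assuming the Lmod module system, which produces the "Currently Loaded Modules" listing shown below), you can query the module tree:
<pre>
module spider moose            # list all available moose module versions
module spider moose/26-jul-21  # show details for one specific version
</pre>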
<pre>
module load moose/26-jul-21
</pre>
Check the module list; you should see something like:
<pre>
module list

Currently Loaded Modules:
  1) gcc/9.3.0
  2) mkl/2020.0.166
  3) openmpi/4.1.1
  4) petsc/3.15.1
  5) qt/5.12.9
  6) vtk/8.2.0
  7) moose/26-jul-21
</pre>
===Clone MOOSE===
<pre>
mkdir ~/projects
cd ~/projects
git clone https://github.com/idaholab/moose.git
cd moose
git checkout master
</pre>
===Export VTK paths===
Make sure you export the VTK paths '''after''' the MOOSE module is loaded, otherwise the path may be empty.
<pre>
export VTKLIB_DIR=${HPC_VTK_LIB} VTKINCLUDE_DIR=${HPC_VTK_INC}
</pre>
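As a quick sanity check that the exports picked up real paths from the HPC_VTK_* variables set by the vtk module, print them and confirm they are not empty:
<pre>
echo "VTKLIB_DIR=${VTKLIB_DIR}"
echo "VTKINCLUDE_DIR=${VTKINCLUDE_DIR}"
</pre>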
===Peacock===
To use the [https://mooseframework.inl.gov/application_usage/peacock.html peacock] input file front-end, add the 'python' directory of your MOOSE checkout to PYTHONPATH, since peacock requires the mooseutils python module. With the layout used above this is ~/projects/moose/python (adjust if your clone lives elsewhere):
<pre>
export PYTHONPATH=~/projects/moose/python:$PYTHONPATH
</pre>
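To confirm that the path is picked up (this assumes a suitable python is on your PATH), try importing the module; printing the file path means it was found:
<pre>
python -c "import mooseutils; print(mooseutils.__file__)"
</pre>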
===Update and Build ''libmesh''===
Run the update and rebuild libmesh script. This step takes a while but should finish without errors (though you '''will''' see some warnings).
<pre>
cd ~/projects/moose
./scripts/update_and_rebuild_libmesh.sh --enable-vtk-required --with-vtk-lib=${HPC_VTK_LIB} --with-vtk-include=${HPC_VTK_INC}
</pre>
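As a rough check that the libmesh build and install finished (this assumes the script's default install location inside the MOOSE tree):
<pre>
ls ~/projects/moose/libmesh/installed/lib | head
</pre>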
===Compile MOOSE Tests===
Build the MOOSE test executables.
<pre>
cd ~/projects/moose/test
make -j 4
</pre>
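If the build succeeds, the test executable should appear in the test directory (moose_test-opt is the name assumed here for the default optimized METHOD; adjust if you build a different method):
<pre>
ls ~/projects/moose/test/moose_test-opt
</pre>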
===Run the Test Suites===
Now run the tests. The test suite in moose/test is important for checking that all of the libraries are set up correctly. You need to exit the development session to run the tests: if the build was successful, build the module you are interested in from the dev session, then exit and run both test suites from a login node. Don't forget that you may need to reload the moose module on the login node.
<pre>
cd ~/projects/moose/test
./run_tests -j 4    # run from a login node
</pre>
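While iterating, run_tests can also run just a subset of the suite, for example by filtering test names with a regular expression (the pattern below is only an illustration):
<pre>
./run_tests -j 4 --re=kernels   # only tests whose names match 'kernels'
./run_tests --help              # list all run_tests options
</pre>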
===Compile and Test Your Own Module===
Build the executable and run the specific tests.
<pre>
cd ~/projects/moose/modules/combined
make -j16           # from a dev session
./run_tests -j 4    # from a login node
</pre>
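The same pattern applies to any other module, for example the phase_field module used in the Basic Steps above:
<pre>
cd ~/projects/moose/modules/phase_field
make -j 4           # from a dev session
./run_tests -j 4    # from a login node
</pre>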
===General Suggestions===
If you are reusing an existing clone of MOOSE rather than starting from a fresh one, do a clean-up before you recompile your executables.
<pre>
cd ~/projects/moose/modules/combined
make clobberall
make -j 4
</pre>
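To bring an existing clone up to date before that rebuild (plain git usage, nothing specific to this setup):
<pre>
cd ~/projects/moose
git checkout master
git pull
# If libmesh changed upstream, rerun ./scripts/update_and_rebuild_libmesh.sh as shown above.
</pre>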
===MOOSE-based Applications===
Perform all of the previous MOOSE installation steps first. Then build your application and test it.
<pre>
cd ~/projects/YourAppName
make clobberall
make -j 4
./run_tests -j 4    # from a login node
</pre>
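If you do not have an application yet, MOOSE ships a stork.sh helper script that generates a new application skeleton; a sketch of its use ('YourAppName' is a placeholder, and the created directory name is reported by the script):
<pre>
cd ~/projects
./moose/scripts/stork.sh YourAppName
</pre>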
===SLURM Job Script===
This is an example of a SLURM job script:
<pre>
#!/bin/sh
#SBATCH --job-name=moose                #Job name
#SBATCH --nodes=1                       #Number of nodes (servers, 32 proc/node)
#SBATCH --ntasks=16                     #Number of tasks/MPI ranks
#SBATCH --ntasks-per-node=16            #Tasks per node
#SBATCH --ntasks-per-socket=8           #Tasks per socket
#SBATCH --cpus-per-task=1               #Number of CPUs per task
#SBATCH --mem-per-cpu=3600mb            #Memory per CPU (120 gig/server)
#SBATCH --distribution=cyclic:cyclic    #Distribute tasks cyclically
#SBATCH --time=12:00:00                 #Walltime days-hh:mm:ss
#SBATCH --output=moose-%j.out           #Output and error log
#SBATCH --mail-type=END,FAIL            #When to email user
#SBATCH --mail-user=your-email@ufl.edu  #Email address to send mail to
#SBATCH --account=michael.tonks         #Allocation group name, add -b for burst job

module load moose/26-jul-21             #Load the MOOSE environment in the job

srun --mpi=pmix_v3 ~/projects/moose/modules/combined/combined-opt -i moose_input_file.i
</pre>
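Assuming the script above is saved as moose_job.sh (a hypothetical file name), submit it and check on it with the standard SLURM commands:
<pre>
sbatch moose_job.sh    # submit the job
squeue -u $USER        # show your jobs in the queue
</pre>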
==Running MOOSE==
Now that MOOSE is configured, learn how to use it at [[Running MOOSE]], or run it from a development session ([[Development and Testing]]).