MCNPX

Introduction

MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code for modeling the interaction of radiation with nearly everything. It extends the capabilities of MCNP4C3 to nearly all particles, nearly all energies, and nearly all applications without an additional computational time penalty.

Please note that MCNPX is installed on the HPC Center's RedHat EL6 systems. See the "Execution Instructions Using the Module System" section below for information on how to access the software.

Version 2.7.0

The configuration and compilation flags used are:

configure --with-FC=ifort --with-CC=icc

Installation location:

/apps/mcnpx/2.7.0 

Execution Instructions Using the Module System

  • What is the module system:
The module system is a utility for managing the application execution environment: compilers, runtime libraries, and application executables. Your environment is set simply by loading the appropriate modules. For more information, please refer to From_mpi-selector_to_environment_modules.
  • To use MCNPX, load the compiler and application modules (a quick check of the environment is sketched below):
module load intel mcnpx
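
As a quick sanity check, the following minimal sketch lists and loads the modules and confirms that the mcnpx executable is on your PATH (module avail, module list, and which are standard commands; the module names are those given above):

# List the MCNPX modules available on the system
module avail mcnpx
# Load the Intel compiler runtime and MCNPX
module load intel mcnpx
# Confirm the loaded modules and locate the executable
module list
which mcnpx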

Single-Threaded Execution

Required Modules

module load intel
module load mcnpx

Sample Submission Script

#!/bin/bash
#PBS -N mcnpx
#PBS -r n
#PBS -o mcnpout
#PBS -e mcnperr
#PBS -j oe
#PBS -m abe
#PBS -M <your_email_address>
#PBS -l walltime=99:00:00
#PBS -l nodes=1:ppn=1
#PBS -l pmem=900mb
#PBS -q submit
# Load the compiler and MCNPX environment
module load intel
module load mcnpx
# Run MCNPX from the directory the job was submitted from
cd $PBS_O_WORKDIR
mcnpx i=m24-0-0B o=m24-0-0Bo
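
Assuming the script above is saved as mcnpx_single.pbs (a file name chosen here for illustration), it can be submitted and monitored with the standard PBS commands:

# Submit the job to the batch system
qsub mcnpx_single.pbs
# Check the status of your queued and running jobs
qstat -u $USER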

Parallel (MPI) Execution

Required Modules

module load intel
module load openmpi
module load mcnpx

Sample Submission Script

#!/bin/bash
#PBS -N mpitest
#PBS -r n
#PBS -o mcnpout
#PBS -e mcnperr
#PBS -j oe
#PBS -m abe
#PBS -M <your_email_address>
#PBS -l walltime=99:00:00
#PBS -l nodes=2:ppn=8:infiniband
#PBS -l pmem=900mb
#PBS -q submit
# Load the compiler, MPI, and MCNPX environment
module load intel
module load openmpi
module load mcnpx
# Run MCNPX in parallel from the directory the job was submitted from
cd $PBS_O_WORKDIR
mpiexec mcnpx i=m24-0-0B o=m24-0-0Bo
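
With the resource request above (nodes=2:ppn=8), an MPI stack that is integrated with the batch system will typically start one MCNPX task per allocated core without any extra flags. If the process count needs to be given explicitly, it can be derived from $PBS_NODEFILE, as in this minimal sketch (the input and output file names follow the example above):

# Count the cores allocated to this job
NPROCS=$(wc -l < $PBS_NODEFILE)
# Launch one MCNPX task per allocated core
mpiexec -np $NPROCS mcnpx i=m24-0-0B o=m24-0-0Bo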

Reference

MCNPX website: http://mcnpx.lanl.gov/