GARLI is a program that performs phylogenetic inference using the maximum-likelihood criterion. Several sequence types are supported, including nucleotide, amino acid and codon. Version 2.0 adds support for partitioned models and morphology-like datatypes.
Garli 2.0 is installed in /apps/garli/2.0
Running the application using modules
To use GARLI with the environment modules system at HPC, the following commands are available:
Get module information for garli:
$ module spider garli
Load the default application module:
$ module load garli
The modulefile for this software adds the directory with executable files to the shell execution PATH and sets the following environment variable:
- HPC_GARLI_DIR - directory where GARLI is located.
Three builds of GARLI are available:
- garli/2.0 - the serial version of GARLI (default).
- Load the intel module as well to enable the OpenMP (multi-threaded) version of GARLI:
$ module load intel/2012 garli
- Load both the intel and openmpi modules to enable the MPI version of GARLI:
$ module load intel/2012 openmpi/1.6 garli
How To Run
See the Garli FAQ. While the garli module provides the serial version of GARLI, the garli/2.0-mp module provides both the multithreaded (OpenMP) and the MPI (OpenMPI) versions of GARLI.
To use the OpenMP version of GARLI, you must set the environment variables OMP_NUM_THREADS and OMP_THREAD_LIMIT to values appropriate for the number of cores requested in your submission script. For example, if you use #PBS -l nodes=1:ppn=8 in your script, also include export OMP_NUM_THREADS=7; export OMP_THREAD_LIMIT=8 (for a bash script) in the script itself (see the user manual for more information). To decide whether the performance gain from the multithreaded version is worth it for your particular job, see the Garli FAQ entry on using the OpenMP version.
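Rather than hard-coding the two export values, they can be derived from the requested core count. A minimal bash sketch, assuming the ppn=8 request above and the leave-one-core-for-the-master convention from the example:

```shell
#!/bin/bash
# PPN mirrors the ppn value in the #PBS -l nodes=1:ppn=8 request (assumption:
# adjust it whenever the resource request changes).
PPN=8

# Leave one core free, matching the OMP_NUM_THREADS=7 / OMP_THREAD_LIMIT=8
# example in the text above.
export OMP_NUM_THREADS=$((PPN - 1))
export OMP_THREAD_LIMIT=$PPN

echo "OMP_NUM_THREADS=$OMP_NUM_THREADS OMP_THREAD_LIMIT=$OMP_THREAD_LIMIT"
```

With PPN=8 this prints OMP_NUM_THREADS=7 OMP_THREAD_LIMIT=8; changing PPN keeps both variables consistent with the resource request.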
The MPI version of GARLI is discussed in the Garli FAQ; see that entry to help decide whether the MPI version is appropriate for your job. If you decide to use the MPI version of GARLI, launch it with the mpirun wrapper. For example:
mpirun -np 12 Garli-mpi 12
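Putting the pieces together, a hypothetical PBS submission script for the MPI version might look like the sketch below. The job name, node/core counts, and walltime are placeholder assumptions to adapt to your job; it assumes a garli.conf sits in the directory the job is submitted from.

```shell
#!/bin/bash
#PBS -N garli-mpi             # job name (placeholder)
#PBS -l nodes=2:ppn=6         # 12 cores total, matching -np 12 below (placeholder)
#PBS -l walltime=24:00:00     # placeholder walltime

# Run from the directory the job was submitted from (where garli.conf lives).
cd "$PBS_O_WORKDIR"

# Load the MPI build of GARLI, as described in the modules section above.
module load intel/2012 openmpi/1.6 garli

# Launch 12 MPI processes; the trailing argument is the number of search
# replicates, as in the mpirun example above.
mpirun -np 12 Garli-mpi 12
```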