SLURM Partition Limits

Different sets of hardware resources, presented as separate SLURM partitions, have individual time limits.

Interactive Work

Partitions: hpg-dev, gpu, hpg-ai

  • Default time limit when none is specified (Default): 10 min
  • hpg-dev Maximum: 12 hours
  • gpu
    • Maximum: 12 hours for srun .... --pty bash -i sessions (see the examples after this list)
    • Maximum: 72 hours for Jupyter sessions in Open OnDemand
  • hpg-ai
    • Maximum: 12 hours for srun .... --pty bash -i sessions
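
For example, a 12-hour interactive shell on the gpu partition, or a shorter one on hpg-dev, could be requested along the following lines. The resource sizes are only illustrative, and hpg-ai is requested the same way as gpu:

  # Interactive session with one GPU on the gpu partition, at the 12-hour maximum
  srun --partition=gpu --gpus=1 --mem=8gb --time=12:00:00 --pty bash -i

  # Interactive development session on hpg-dev (12-hour maximum)
  srun --partition=hpg-dev --ntasks=1 --cpus-per-task=2 --mem=4gb --time=04:00:00 --pty bash -i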

Jupyter

  • JupyterHub: Sessions are preset with individual limits shown in the menu
  • JupyterLab in Open OnDemand: maximum of 72 hours on the gpu partition; other partitions follow the standard partition limits

GPU/HPG-AI Partitions

  • Default: 10 min
  • Maximum: 7 days (168 hours)
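
For batch work on these partitions, the time limit is requested with --time up to the 7-day maximum. A minimal job-script sketch; the job name, resource sizes, and program are placeholders:

  #!/bin/bash
  #SBATCH --job-name=gpu_job        # placeholder job name
  #SBATCH --partition=gpu           # or hpg-ai
  #SBATCH --gpus=1
  #SBATCH --mem=16gb
  #SBATCH --time=7-00:00:00         # 7 days, the partition maximum

  ./my_gpu_program                  # placeholder for the actual workload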

Compute Partitions

Partitions: hpg-default, hpg2-compute, bigmem

Investment QOS

  • Default: 10 min
  • Maximum: 31 days (744 hours)
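
A job under a group's investment QOS can request up to the 744-hour maximum. A sketch of a job script, where "mygroup" is a placeholder for your group's account and QOS names:

  #!/bin/bash
  #SBATCH --job-name=long_job       # placeholder job name
  #SBATCH --partition=hpg-default
  #SBATCH --account=mygroup         # placeholder account name
  #SBATCH --qos=mygroup             # placeholder investment QOS name
  #SBATCH --ntasks=1
  #SBATCH --cpus-per-task=4
  #SBATCH --mem=8gb
  #SBATCH --time=31-00:00:00        # 744 hours, the investment QOS maximum

  ./my_long_running_program         # placeholder for the actual workload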

Burst QOS

  • Default: 10 min
  • Maximum: 4 days (96 hours)
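
The same script can be submitted under the burst QOS by overriding the QOS and time limit at submission time. The QOS name below assumes the common convention of the group name with a "-b" suffix; check the QOS names actually available to your account, for example with "sacctmgr show assoc user=$USER format=account,qos":

  # Submit under the burst QOS at its 96-hour maximum; command-line options
  # override the corresponding #SBATCH directives in job.sh (a placeholder script)
  sbatch --qos=mygroup-b --time=96:00:00 job.sh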

Hardware Accelerated GUI

Partition: hwgui

  • Default: 10 min
  • Maximum: 4 days (96 hours)
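
The hwgui limits are requested the same way as on the other partitions. A sketch of an interactive request at the 96-hour maximum; the memory size is only an example:

  # Interactive session on the hwgui partition for up to 4 days
  srun --partition=hwgui --mem=8gb --time=96:00:00 --pty bash -i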