SLURM Partition Limits

Different sets of hardware resources, presented as SLURM partitions, have individual time limits.

Interactive Work

Partitions: hpg-dev, gpu, hpg-ai

  • Default time limit if not specified: 10 minutes
  • hpg-dev maximum: 12 hours
  • gpu and hpg-ai (see the example after this list)
    • Maximum: 12 hours for srun .... --pty bash -i sessions
    • Maximum: 72 hours for Jupyter sessions in Open OnDemand
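
For example, an interactive shell on the hpg-dev partition at its 12-hour maximum might be requested like this; the CPU and memory values are illustrative assumptions, not recommended settings:

  srun --partition=hpg-dev --time=12:00:00 --ntasks=1 --cpus-per-task=2 --mem=8gb --pty bash -i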

Jupyter

  • JupyterHub: Sessions are preset with individual limits shown in the menu
  • JupyterLab in Open OnDemand: maximum 72 hours

GPU/HPG-AI Partitions

  • Default: 10 min
  • Maximum: 7 days
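
A batch job on the gpu partition at the 7-day maximum could be sketched as follows; the job name, GPU count, and resource requests are illustrative assumptions:

  #!/bin/bash
  #SBATCH --job-name=example-gpu-job
  #SBATCH --partition=gpu
  #SBATCH --gpus=1
  #SBATCH --cpus-per-task=4
  #SBATCH --mem=32gb
  #SBATCH --time=7-00:00:00

  # Placeholder workload; replace with the real application
  srun hostname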

Compute Partitions

Partitions: hpg-default, hpg2-compute, bigmem

Investment QOS

  • Default: 10 min
  • Maximum: 31 days (744 hours)

Burst QOS

  • Default: 10 min
  • Maximum: 4 days (96 hours)
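
The QOS and the corresponding time limit are requested in the submission script. The two sets of #SBATCH directives below are a sketch of where these limits apply; mygroup and mygroup-b are placeholder QOS names, not real accounts:

  # Investment QOS job: up to 31 days
  #SBATCH --partition=hpg2-compute
  #SBATCH --qos=mygroup
  #SBATCH --time=31-00:00:00

  # Burst QOS job: up to 4 days
  #SBATCH --partition=hpg2-compute
  #SBATCH --qos=mygroup-b
  #SBATCH --time=96:00:00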

Hardware Accelerated GUI

Partition: hwgui

  • Default: 10 min
  • Maximum: 4 days (96 hours)
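
As a minimal sketch, assuming the hwgui partition is requested the same way as the other partitions, a session at the 4-day maximum might use:

  #SBATCH --partition=hwgui
  #SBATCH --time=96:00:00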