Monai Usage

Back to Monai

MONAI Core

NGC container usage:

ml purge
ml ngc-monai/<version>
python <your_python_script>
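
Before running your own script, it can help to sanity-check the container environment. The following is a minimal sketch; <version> is a placeholder, and it assumes the module's python command runs inside the container, as the step above implies:

  ml purge
  ml ngc-monai/<version>
  # Print MONAI's version and configuration from inside the container.
  python -c "from monai.config import print_config; print_config()"
  # Confirm a GPU is visible to PyTorch (run this on a GPU node).
  python -c "import torch; print(torch.cuda.is_available())"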

MONAI Label

NGC container usage:

1. Server: start the server as a SLURM job

  • Load modules:
  ml purge
  ml ngc-monailabel/<version>
  • Copy this file to your directory:
  /apps/nvidia/containers/monai/start_monai_server_readonly.sh
  • Copy these directories to a location you own, e.g. <my_place>:
  cp -r /apps/nvidia/containers/monai/apps/deepedit <my_place>
  cp -r /apps/nvidia/containers/monai/datasets/Task09_Spleen <my_place>
  • Edit the singularity line in start_monai_server_readonly.sh to read (a full example script is sketched after this list):
  singularity exec -B /apps/nvidia/containers/monai /apps/nvidia/containers/monai/monailabel/ monailabel start_server --app <my_place>/deepedit --studies <my_place>/Task09_Spleen/imagesTr
  • Start the server as a batch job:
  sbatch start_monai_server_readonly.sh
  • Note the server address in the job output; it is used in the next step.
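
For reference, after the edit the batch script might look roughly like the sketch below. Only the module load and the singularity line come from the steps above; the #SBATCH values (job name, partition, memory, walltime) and the echoed address are assumptions to adjust for your account and allocation:

  #!/bin/bash
  #SBATCH --job-name=monailabel
  #SBATCH --partition=gpu          # assumed partition name
  #SBATCH --gpus=geforce:1         # one GPU, matching the client session below
  #SBATCH --mem=16gb               # assumed memory request
  #SBATCH --time=08:00:00          # assumed walltime

  ml purge
  ml ngc-monailabel/<version>

  # Print the server address to the job output so it can be copied into
  # 3DSlicer in the next step (port 8000 matches the example address below).
  echo "MONAI Label server: http://$(hostname -s):8000/"

  singularity exec -B /apps/nvidia/containers/monai /apps/nvidia/containers/monai/monailabel/ monailabel start_server --app <my_place>/deepedit --studies <my_place>/Task09_Spleen/imagesTr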

2. 3DSlicer client

  • Start an Open OnDemand (OOD) session
  • Start a Console session in hwgui with 1 GPU: gpu:geforce:1
  • In the console, load and start Slicer:
  ml qt/5.15.4 slicer/4.13.0
  vglrun -d :0.$CUDA_VISIBLE_DEVICES Slicer
  • In Slicer GUI:
    • Select module: Active Learning -> MONAILabel
    • Fill in server address, e.g.: http://c1007a-s17:8000/
    • Click the refresh button next to the server address
    • Load Next Sample
  • You are good to go! Enjoy!