Monai Usage
Back to Monai
MONAI core
NGC container usage:
ml purge
ml ngc-monai/<version>
python <your_python_script>
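For non-interactive runs, the same commands can be wrapped in a Slurm batch script. The sketch below is illustrative only; the job name, partition, and resource values are assumptions that must be adapted to your cluster and workload, and <version> and <your_python_script> are the same placeholders as above.

#!/bin/bash
# Sketch only: adjust job name, partition, GPU type, time, and memory to your site
#SBATCH --job-name=monai-core
#SBATCH --partition=gpu
#SBATCH --gres=gpu:1
#SBATCH --time=01:00:00
#SBATCH --mem=16gb

# Load the MONAI core NGC container module
ml purge
ml ngc-monai/<version>

# Run your MONAI script inside the containerized environment
python <your_python_script>

Submit it with sbatch, as with any other Slurm job script.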
MONAI Label
NGC container usage:
1. Server - start the server as a Slurm job:
- Load modules:
ml purge
ml ngc-monailabel/<version>
- Copy the following file to your directory:
/apps/nvidia/containers/monai/start_monai_server_readonly.sh
- Copy, for example, these directories to a location you own (the exact directories may vary by use case):
cp -r /apps/nvidia/containers/monai/apps/deepedit <my_place>
cp -r /apps/nvidia/containers/monai/datasets/Task09_Spleen <my_place>
- Modify the singularity command line in start_monai_server_readonly.sh to read (a sketch of the full batch script appears after this list):
singularity exec -B /apps/nvidia/containers/monai /apps/nvidia/containers/monai/monailabel/ monailabel start_server --app <my_place>/deepedit --studies <my_place>/Task09_Spleen/imagesTr

or, for a newer version, e.g. (the apps/... and datasets/... directories may differ):

singularity exec -B /apps/nvidia/containers/monai /apps/nvidia/containers/monai/monailabel.0.6.0/0.6.0 monailabel start_server --app <my_place>/deepedit --studies <my_place>/Task09_Spleen/imagesTr
- Start the server as a batch job:
sbatch start_monai_server_readonly.sh
- Note the server address from the job output; it is used in the next step.
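For orientation, here is a minimal sketch of what the modified start_monai_server_readonly.sh could look like as a Slurm batch script. The #SBATCH values and the echo line are assumptions added for illustration; the read-only copy at /apps/nvidia/containers/monai/start_monai_server_readonly.sh is the authoritative starting point.

#!/bin/bash
# Sketch only: adjust job name, partition, GPU type, and wall time to your site
#SBATCH --job-name=monailabel-server
#SBATCH --partition=gpu
#SBATCH --gres=gpu:1
#SBATCH --time=08:00:00

# Load the MONAI Label NGC container module
ml purge
ml ngc-monailabel/<version>

# Print the node name so the server address can be read from the job output
echo "MONAI Label server starting on $(hostname)"

# Bind the container tree and start the server against your own copies of the app and dataset
singularity exec -B /apps/nvidia/containers/monai \
    /apps/nvidia/containers/monai/monailabel/ \
    monailabel start_server --app <my_place>/deepedit --studies <my_place>/Task09_Spleen/imagesTr

The MONAI Label server listens on port 8000 by default, so the address to note from the job output has the form http://<node_name>:8000/, as in the example used in step 2.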
2. 3DSlicer client
- Start an Open OnDemand (OOD) session
- Start a Console session in hwgui with 1 GPU (gpu:geforce:1)
- In the console, load and start Slicer:
ml qt/5.15.4 slicer/4.13.0
vglrun -d :0.$CUDA_VISIBLE_DEVICES Slicer
- In the Slicer GUI:
  - Select module: Active Learning -> MONAILabel
  - Fill in the server address noted in step 1, e.g.: http://c1007a-s17:8000/ (a quick connectivity check is sketched after this list)
  - Click the refresh button next to the server address
  - Click Load Next Sample
- You are good to go! Enjoy!
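If the refresh step fails, you can check from a terminal that the server is reachable. A quick sketch, assuming the server address noted in step 1 (c1007a-s17 is just the example node above); the /info/ endpoint is part of the MONAI Label REST API and returns a JSON description of the loaded app.

# Replace the host with the node name from your own job output
curl http://c1007a-s17:8000/info/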