AI Help

From UFRC

New User's Guide

For new users on HiPerGator, please read Getting Started to familiarize yourself with the HiPerGator system, and take the New user training, which provides step-by-step instructions on how to use HiPerGator.

AI Education and Training provides learning materials and training videos on various AI topics. JupyterHub and Jupyter Notebooks on HiPerGator are popular platforms for developing and running AI programs.

AI Software

A comprehensive software stack for AI research is available on HiPerGator for both CPU and GPU accelerated applications. The NLP page has more information on the software environment for Natural Language Processing.

AI Frameworks

AI frameworks provide building blocks for designing and training machine learning and deep learning models. The following AI frameworks are available on HiPerGator.

PyTorch

PyTorch is a deep learning framework developed by the Facebook AI Research Lab. It has interfaces for Python, Java, and C++, but is most commonly used with Python. It supports training on both GPUs and CPUs, as well as distributed training and multi-GPU models. See our PyTorch quickstart page for help getting started using PyTorch on HiPerGator.
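As a minimal sketch of the device selection and training loop described above (the layer sizes and synthetic data are illustrative, not taken from any HiPerGator example), a PyTorch script that trains on a GPU when one is present and on the CPU otherwise looks like:

```python
import torch
import torch.nn as nn

# Select a GPU if one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A tiny fully connected network; the sizes here are illustrative.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Synthetic data standing in for a real dataset.
torch.manual_seed(0)
x = torch.randn(64, 8, device=device)
y = torch.randn(64, 1, device=device)

# Standard training loop: forward pass, loss, backward pass, update.
losses = []
for step in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
```

The same script runs unchanged on a GPU node or a CPU-only node, since the `device` object routes both the model and the tensors.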

Tensorflow/Keras

TensorFlow is an open-source AI framework/platform developed by the Google Brain team. Keras is an open-source neural network library that runs on top of TensorFlow. As of TensorFlow 2.0, the Keras API has been integrated into TensorFlow's core library and serves as a high-level Python interface for TensorFlow. TensorFlow supports both GPUs and CPUs, as well as multi-GPU and distributed training. APIs are available for Python, Java, Go, and C++. See our TensorFlow quickstart page for help getting started using TensorFlow on HiPerGator.
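A brief sketch of the integrated Keras API mentioned above (the model shape and synthetic data are illustrative assumptions, not a HiPerGator-specific recipe):

```python
import numpy as np
from tensorflow import keras

# A small model defined through TensorFlow's built-in Keras API.
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Synthetic data standing in for a real dataset.
rng = np.random.default_rng(0)
x = rng.random((32, 4)).astype("float32")
y = rng.random((32, 1)).astype("float32")

# Keras handles batching and the training loop internally.
history = model.fit(x, y, epochs=5, verbose=0)
predictions = model.predict(x, verbose=0)
```

TensorFlow automatically places operations on a GPU when one is visible to the process, so the same code serves both CPU and GPU jobs.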

TensorBoard is a visualization tool for monitoring neural network training.

Scikit-learn

Scikit-learn is a Python library for machine learning and statistical modeling. It is available in many of the Python modules on HiPerGator.
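As a short illustration of the library's fit/predict workflow (the dataset and estimator are arbitrary choices for the sketch, not a recommendation):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Load a small dataset bundled with scikit-learn and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Fit a classifier and evaluate accuracy on the held-out data.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

The same estimator interface (`fit`, `predict`, `score`) applies across scikit-learn's classifiers, regressors, and clustering models.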

MATLAB

MATLAB provides convenient toolboxes for machine learning, deep learning, computer vision, and automated driving, which are supported on both CPUs and GPUs.

Fastai

Fastai simplifies training fast and accurate neural nets using modern best practices. It can be used without any installation by using Google Colab.

NVIDIA AI Software

NVIDIA provides comprehensive, GPU-accelerated software libraries, toolkits, frameworks, and packages for big-data and AI applications. Many of these libraries, such as cuDNN, are included in the CUDA installation on HiPerGator. The following domain-specific, CUDA-enabled tools are available on HiPerGator:

Clara Parabricks

Clara Parabricks is a computation framework for genomics applications. It provides GPU-accelerated libraries, pipelines, and reference AI workflows for genomics research. We have a license for this software through 2021.

Clara MONAI

Clara MONAI is the open-source foundation being created by Project MONAI. MONAI is a freely available, community-supported, PyTorch-based framework for deep learning in healthcare imaging. It provides domain-optimized foundational capabilities for developing healthcare imaging training workflows in a native PyTorch paradigm.

Megatron-LM

Megatron-LM can train several language model architectures, including GPT, T5, and an improved BERT. Megatron also recently added a transformer-based image classification architecture.

Modulus

Modulus is a neural network framework that blends the power of physics, in the form of governing partial differential equations (PDEs), with data to build high-fidelity, parameterized surrogate models with near-real-time latency.

NeMo

NeMo is an open-source, GPU-accelerated Python toolkit for conversational AI, including automatic speech recognition (ASR), natural language processing (NLP), and text-to-speech (TTS) applications. NeMo is available via a container in the apps folder.

RAPIDS

The RAPIDS suite of open-source software libraries and APIs makes it possible to execute end-to-end data science and analytics pipelines entirely on GPUs.

Triton

NVIDIA Triton Inference Server provides a cloud and edge inferencing solution optimized for both CPUs and GPUs. Triton supports HTTP/REST and gRPC protocols that allow remote clients to request inferencing for any model managed by the server.

AI Reference Datasets

A variety of reference machine learning and AI datasets are located in /data/ai/ref-data. Browse the catalog of all available AI reference datasets to learn more.