NLP

From UFRC
Revision as of 16:20, 25 August 2022 by Ericeric (talk | contribs)

Description

This page describes natural language processing (NLP) software and resources on HiperGator. NLP intersects with many other fields of AI, such as image recognition. Research Computing can help with language modeling for knowledge exploration, measurement, classification, summarization, conversational AI, and other uses via support requests or consulting. NVIDIA Megatron and NeMo are open-source packages built on transformer neural networks that can scale across multiple nodes of GPUs. See the directory /data/ai for more information.

Environment Modules for NLP

  • nlp: module load nlp provides a Python environment with PyTorch, torchtext, NLTK, spaCy, transformers, sentence-transformers, Flair, BERTopic for topic modeling, SentencePiece, RAPIDS for data processing and machine-learning algorithms, Gensim, scikit-learn, and more.
  • ngc-pytorch: module load ngc-pytorch provides a Singularity container Python environment with PyTorch, including the NVIDIA Apex optimizers required for Megatron-LM. Research Computing has pretrained, large-parameter Megatron language models available to HiperGator users. See /data/ai/examples/nlp or AI_Examples for more information.
  • nemo: module load nemo provides a Singularity container environment with Python and NVIDIA NeMo. NeMo supports NLP task training, plus speech-to-text and text-to-speech models, and the option to apply your own pretrained Megatron language models.
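Once one of these environments is loaded, a quick sanity check is to probe for the libraries you need before submitting a job. This is a minimal sketch; the package list below is an illustrative subset of what the nlp module description mentions, not an authoritative inventory:

```python
import importlib.util

# Illustrative subset of packages named in the nlp environment description above.
packages = ["torch", "torchtext", "nltk", "spacy", "transformers", "gensim", "sklearn"]

for pkg in packages:
    # find_spec returns None when the package is not importable in this environment
    status = "found" if importlib.util.find_spec(pkg) else "missing"
    print(f"{pkg}: {status}")
```

Any package reported as missing is a sign you are in the wrong module environment, or that you need your own Conda environment (see below).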


  • pytorch or tensorflow: Note, use module spider pytorch or module spider tensorflow to list the versions we have available. If the nlp environment or these environments do not have the libraries you require, you may need to create a Conda environment. See Conda and Managing_Python_environments_and_Jupyter_kernels for more details.
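If you do need your own environment, the workflow looks roughly like the sketch below. The environment name and package list are placeholders; see the Conda page for the supported HiperGator workflow:

```shell
# Load the Conda module provided on HiperGator (exact module name may differ; see the Conda doc)
module load conda

# Create a personal environment with the NLP libraries you need (names are illustrative)
conda create -n my-nlp-env python=3.10 pytorch nltk -c pytorch -c conda-forge

# Activate it and verify the packages import cleanly
conda activate my-nlp-env
python -c "import torch, nltk"
```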


  • spark-nlp: See our Spark help doc to start a Spark cluster. The spark-nlp Python module is available in tensorflow/2.4.1.
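As a sketch of what a Spark NLP session looks like once the cluster is up (hedged: this assumes the sparknlp package from that module is importable, and the pretrained pipeline name is one of Spark NLP's published English pipelines; the import guard lets the script degrade gracefully elsewhere):

```python
# Minimal Spark NLP sketch; requires an environment with spark-nlp installed.
try:
    import sparknlp
    from sparknlp.pretrained import PretrainedPipeline

    spark = sparknlp.start()  # starts or attaches to a Spark session
    pipeline = PretrainedPipeline("explain_document_dl", lang="en")
    result = pipeline.annotate("HiperGator runs NLP workloads at scale.")
    print(result["token"])
except ImportError:
    print("sparknlp is not installed here; load the module environment first")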


  • parlai: ParlAI, a conversational AI framework by Facebook, includes a wide variety of models from 110M to 9B parameters.

UF Language Models

Research Computing provides starter large-parameter language models for researchers. These will be followed by improved models available to researchers and UF business operations. See the examples or /data/ai/models/nlp/UFLMs for details. These models are the property of UF Research Computing and should not be duplicated.

Examples and Reference Data

Please see the /data/ai folder, AI_Examples, and AI_Reference_Datasets for helpful resources. Notebooks and batch scripts cover everything from pretraining and inference to summarization, information extraction, and topic modeling. Additional reference data, including benchmarks such as the popular SuperGLUE, is already available in /data/ai/benchmarks/nlp.
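A quick way to see which benchmark datasets are present is to list that directory. This is a minimal sketch; the path comes from this page and only resolves on HiperGator nodes, so the script prints a fallback elsewhere:

```python
from pathlib import Path

# Benchmark location from this page; only exists on HiperGator.
bench_dir = Path("/data/ai/benchmarks/nlp")

if bench_dir.is_dir():
    # Print one entry per available benchmark dataset
    for entry in sorted(bench_dir.iterdir()):
        print(entry.name)
else:
    print(f"{bench_dir} not found; run this on a HiperGator node")
```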