NLP

From UFRC

Latest revision as of 15:06, 26 September 2023

Description

Natural language processing (NLP) software and resources on HiperGator include several software environments and examples built around Nvidia Megatron and other frameworks. NLP techniques also appear in many other areas of AI, such as image recognition. Research Computing can help with language modeling for knowledge exploration, measurement, classification, summarization, conversational AI, and other uses via support requests (https://support.rc.ufl.edu/) or consulting.

Environment Modules for NLP

  • nlp: module load nlp provides a Python environment with PyTorch, torchtext, NLTK, spaCy, transformers, sentence-transformers, Flair, BERTopic for topic modeling, SentencePiece, RAPIDS for data processing and machine learning algorithms, Gensim, scikit-learn, and more.


  • ngc-pytorch: module load ngc-pytorch provides a Singularity container Python environment with PyTorch, including the Nvidia Apex optimizers required for Megatron-LM (https://github.com/NVIDIA/Megatron-LM). Research Computing has pretrained, large-parameter Megatron language models available to HiperGator users. See /data/ai/examples/nlp or AI_Examples for more information.


  • Flair NLP: See FlairNLP for more information.


  • nemo: module load nemo provides a Singularity container environment with Python and Nvidia NeMo. NeMo supports NLP task training, plus speech-to-text and text-to-speech models, and can apply your own pretrained Megatron language models.


  • pytorch or tensorflow: Use module spider pytorch or module spider tensorflow to list the versions available. If the nlp environment or these environments do not have the libraries you require, you may need to create a Conda environment. See Conda and Managing_Python_environments_and_Jupyter_kernels for more details.


  • spark-nlp: See our Spark help doc to start a Spark cluster. The spark-nlp Python module is available in the tensorflow/2.4.1 environment module.


  • parlai: A conversational AI framework by Facebook that includes a wide variety of models, from 110M to 9B parameters.
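After loading one of the environment modules above, a quick way to confirm which libraries a session actually provides is to probe for them with the standard library. This is a minimal sketch, not an official utility: the package list follows the nlp bullet above, and the import names (e.g. scikit-learn importing as sklearn) are assumptions that may not match every module version.

```python
import importlib.util

def check_nlp_stack(libs):
    """Return {library_name: True/False} for whether each library is importable."""
    return {name: importlib.util.find_spec(name) is not None for name in libs}

# Import names assumed for the libraries listed under `module load nlp` above.
NLP_LIBS = ["torch", "torchtext", "nltk", "spacy", "transformers",
            "sentence_transformers", "flair", "bertopic", "gensim", "sklearn"]

if __name__ == "__main__":
    for name, ok in check_nlp_stack(NLP_LIBS).items():
        print(f"{name}: {'ok' if ok else 'MISSING'}")
```

Running this inside a session (or a batch job) before a long run catches a missing dependency early, which is when a Conda environment (see the pytorch or tensorflow bullet) becomes the fallback.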

Large Language Models

Many large models are available for open-source download, although they may require different software frameworks or end-user license agreements. Starter LLMs trained using Megatron-LM are available in the examples and reference data folder. These models can be applied as-is, trained further, or fine-tuned. Starter models include a 20B-parameter GPT and a 9B-parameter BERT. Please create a help ticket for more information.


Examples and Reference Data

Please see the /data/ai/ folder, AI_Examples, and AI_Reference_Datasets for helpful resources. Notebooks and batch scripts cover everything from pretraining and inference to summarization, information extraction, and topic modeling. Additional reference data, including benchmarks such as the popular SuperGLUE (https://super.gluebenchmark.com/tasks), are available in /data/ai/benchmarks/nlp.
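To browse those resources programmatically, a small helper like the following can enumerate the notebooks and scripts under a directory. This is an illustrative sketch: the helper name and glob patterns are invented here, and the HiperGator path in the usage line is the one given on this page.

```python
from pathlib import Path

def list_examples(root, patterns=("*.ipynb", "*.sh", "*.py")):
    """Recursively collect example files under root matching the given globs."""
    root = Path(root)
    if not root.is_dir():
        return []  # path not present, e.g. when run off-cluster
    return sorted({p for pat in patterns for p in root.rglob(pat)})

if __name__ == "__main__":
    # On HiperGator, point this at the shared NLP examples folder:
    for path in list_examples("/data/ai/examples/nlp"):
        print(path)
```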