Llama

From UFRC

Revision as of 03:44, 10 May 2024

Description

Llama website

The goal of this module is to provide a scalable library for fine-tuning Meta Llama models, along with example scripts and notebooks for quickly getting started with the models in a variety of use cases, including fine-tuning for domain adaptation and building LLM-based applications with Meta Llama and other tools in the LLM ecosystem. The module also shows how to run Meta Llama locally, in the cloud, and on premises.
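As an illustration, if the module packages Meta's llama-recipes library (an assumption; run module help llama to confirm what is installed), a parameter-efficient LoRA fine-tuning run might look like the following. The model name, PEFT settings, and output path are placeholders, not values prescribed by this module.

```shell
# Hypothetical llama-recipes fine-tuning invocation (single GPU).
# Model name and output directory are placeholders; adjust to your job.
python -m llama_recipes.finetuning \
    --use_peft --peft_method lora \
    --model_name meta-llama/Meta-Llama-3-8B \
    --output_dir ./lora-checkpoints
```

On HiPerGator this would normally be run inside a SLURM job with a GPU allocation rather than on a login node.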

Environment Modules

Run module spider llama to find out what environment modules are available for this application.
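A typical session first queries the modules system and then loads the application. This is standard Lmod usage; omitting the version loads the cluster's default.

```shell
# List all available versions of the llama environment module.
module spider llama

# Load the default version (append /<version> to pick a specific one).
module load llama
```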

System Variables

  • HPC_LLAMA_DIR - installation directory
  • HPC_LLAMA_BIN - bin subdirectory
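The variables above are set automatically by module load llama; the values below are illustrative stand-ins (the real paths depend on the installed version), sketching how they relate and how the bin directory is typically used.

```shell
# Illustrative values only: on the cluster these are exported by
# 'module load llama'; the actual install path will differ.
HPC_LLAMA_DIR="/apps/llama/1.0"
HPC_LLAMA_BIN="$HPC_LLAMA_DIR/bin"

# Typical use: put the module's executables on PATH.
export PATH="$HPC_LLAMA_BIN:$PATH"
echo "$HPC_LLAMA_BIN"
```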