Llama
Description
This module provides a scalable library for fine-tuning Meta Llama models, along with example scripts and notebooks to get started quickly in a variety of use cases, including fine-tuning for domain adaptation and building LLM-based applications with Meta Llama and other tools in the LLM ecosystem. The module also shows how to run Meta Llama locally, in the cloud, and on-prem.
Environment Modules
Run module spider llama to find out what environment modules are available for this application.
System Variables
- HPC_LLAMA_DIR - installation directory
- HPC_LLAMA_BIN - bin subdirectory
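A typical session on an Lmod-based cluster might look like the sketch below. It loads the module and uses the system variables above; the exact module version string and the presence of these variables depend on the cluster's configuration, so treat the specifics as assumptions to verify with module spider.

```shell
# Discover available versions of the llama module (output is site-specific)
module spider llama

# Load the module; the version string here is a hypothetical example
module load llama

# After loading, the system variables point at the installation:
echo "Install dir: $HPC_LLAMA_DIR"   # installation directory
echo "Bin dir:     $HPC_LLAMA_BIN"   # bin subdirectory

# Example: list the bundled scripts in the bin subdirectory
ls "$HPC_LLAMA_BIN"
```

Loading the module normally also prepends HPC_LLAMA_BIN to your PATH, so the bundled scripts can usually be invoked by name without the full path; check with `module show llama` on your system.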