Llama


Description

llama website  

This module provides a scalable library for fine-tuning Meta Llama models, helping users get started quickly across a range of use cases, including fine-tuning for domain adaptation and building LLM-based applications with Meta Llama and other tools in the LLM ecosystem. It also includes examples of running Meta Llama locally, in the cloud, and on premises.
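
As an illustration of the kind of fine-tuning workflow the library supports, the sketch below shows LoRA fine-tuning of a Llama checkpoint with the Hugging Face transformers, peft, and datasets libraries. This is a generic example, not the module's own entry point; the model ID (meta-llama/Llama-2-7b-hf), the training file domain_corpus.txt, the output paths, and all hyperparameters are placeholder assumptions to adapt to your own data and hardware.

# Minimal LoRA fine-tuning sketch (illustrative only; not the module's own CLI).
# Assumes transformers, peft, and datasets are available and that you have
# access to the Meta Llama weights on the Hugging Face Hub.
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_id = "meta-llama/Llama-2-7b-hf"  # placeholder; any Llama checkpoint works

tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model = get_peft_model(model, LoraConfig(task_type="CAUSAL_LM", r=8, lora_alpha=16))

# Toy domain-adaptation corpus; replace with your own text dataset.
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})["train"]
dataset = dataset.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                      remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama-lora-out", per_device_train_batch_size=1,
                           gradient_accumulation_steps=8, num_train_epochs=1,
                           learning_rate=2e-4, bf16=True, logging_steps=10),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("llama-lora-out/adapter")  # saves only the LoRA adapter weights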

Environment Modules

Run module spider llama to find out what environment modules are available for this application.

System Variables

  • HPC_LLAMA_DIR - installation directory
  • HPC_LLAMA_BIN - bin subdirectory
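
As a small sketch, assuming the llama environment module has been loaded so that these variables are set, a Python script can pick up the installation paths from the environment:

# Locate the installation after loading the llama module.
# Assumes HPC_LLAMA_DIR and HPC_LLAMA_BIN are exported as listed above.
import os

llama_dir = os.environ.get("HPC_LLAMA_DIR")  # installation directory
llama_bin = os.environ.get("HPC_LLAMA_BIN")  # bin subdirectory
print(f"llama installed in {llama_dir}, executables in {llama_bin}")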