Llama

Description

llama website: https://github.com/meta-llama/llama-recipes

This module provides a scalable library for fine-tuning Meta Llama models, along with example scripts and notebooks to help you get started quickly. Use cases include fine-tuning for domain adaptation and building LLM-based applications with Meta Llama and other tools in the LLM ecosystem.

Environment Modules

Run module spider llama to find out what environment modules are available for this application.
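
For example, a typical Lmod workflow looks like the following (exact version names will vary; check the output of module spider llama for what is actually installed):

  module spider llama        # list available llama module versions
  module load llama          # load the default llama module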

System Variables

  • HPC_LLAMA_DIR - installation directory
  • HPC_LLAMA_BIN - bin subdirectory
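
Once the module is loaded, these variables are set in your environment and can be used interactively or in job scripts, for example (a minimal sketch, assuming the llama module has been loaded):

  echo $HPC_LLAMA_DIR        # print the installation directory
  ls $HPC_LLAMA_BIN          # list contents of the bin subdirectory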