Llama
Description
This module provides a scalable library for fine-tuning Meta Llama models, helping users get started quickly with the models across a range of use cases, including fine-tuning for domain adaptation and building LLM-based applications with Meta Llama and other tools in the LLM ecosystem. It also demonstrates how to run Meta Llama locally, in the cloud, and on premises.
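As an illustration, a LoRA fine-tuning run with the llama-recipes command-line entry point might look like the session below. The entry point, model name, and flags are assumptions about the installed version and should be checked against the library's documentation and the cluster's policies on downloading model weights.

 # Assumed example: parameter-efficient (LoRA) fine-tuning with the llama-recipes CLI.
 # Model path, output directory, and available flags depend on the installed version.
 python -m llama_recipes.finetuning \
     --use_peft --peft_method lora \
     --model_name meta-llama/Llama-2-7b-hf \
     --output_dir ./peft_checkpoint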
Environment Modules
Run module spider llama to find out what environment modules are available for this application.
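A typical interactive session might look like the following; the exact module name and version string are assumptions and will vary with what is installed on the cluster.

 module spider llama     # list the llama versions available on the cluster
 module load llama       # load the default version; a specific version (e.g. llama/3) may be required
 module list             # confirm the module is loaded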
System Variables
- HPC_LLAMA_DIR - installation directory
- HPC_LLAMA_BIN - bin subdirectory
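Once the module is loaded, these variables can be used in job scripts or interactive sessions. A minimal sketch, assuming the variables are exported as listed above:

 module load llama
 echo "$HPC_LLAMA_DIR"    # installation directory
 ls "$HPC_LLAMA_BIN"      # executables and scripts provided by the module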