# Llama
An example of generating text with Llama (1 or 2) using MLX.
Llama is a set of open source language models from Meta AI Research[^1][^2]
ranging from 7B to 70B parameters. This example also supports Llama Chat and
Code Llama.
### Setup
Install the dependencies:
```
pip install -r requirements.txt
```
Next, download and convert the model. If you do not have access to the model
weights you will need to request access from Meta:
- [Request Llama v1](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform)
- [Request Llama v2](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
Alternatively, you can download select converted checkpoints from the
[mlx-llama](https://huggingface.co/mlx-llama) community organisation on Hugging
Face and skip the conversion step.
Convert the weights with:
```
python convert.py --model_path <path_to_torch_model>
```
The conversion script will save the converted weights in the same location as
the original PyTorch weights.
### Run
Once you've converted the weights to MLX format, you can interact with the
Llama model:
```
python llama.py <path_to_model> <path_to_tokenizer.model> "hello"
```
Run `python llama.py --help` for more details.
[^1]: For Llama v1 refer to the [arXiv paper](https://arxiv.org/abs/2302.13971) and [blog post](https://ai.meta.com/blog/large-language-model-llama-meta-ai/) for more details.
[^2]: For Llama v2 refer to the [blog post](https://ai.meta.com/llama/) for more details.