# Mistral
An example of generating text with Mistral using MLX.
Mistral 7B is one of the top large language models in its size class. It is
also fully open source with a permissive license[^1].
### Setup
Install the dependencies:
```
pip install -r requirements.txt
```
Next, download the model and tokenizer:
```
curl -O https://files.mistral-7b-v0-1.mistral.ai/mistral-7B-v0.1.tar
tar -xf mistral-7B-v0.1.tar
```
Then, convert the weights with:
```
python convert.py
```
The conversion script will save the converted weights in the same location.
> [!TIP]
> Alternatively, you can download a few converted checkpoints from the [MLX Community](https://huggingface.co/mlx-community) organisation on Hugging Face and skip the conversion step.
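
The conversion step produces the model weights as NumPy arrays. As a rough sketch of how to inspect such an archive (the `.npz` format and the parameter name below are assumptions for illustration; check the output of `convert.py` for the actual filename and keys):

```python
import numpy as np

# Create a tiny stand-in archive so the example is self-contained;
# in practice you would open the file written by convert.py.
np.savez("demo_weights.npz", tok_embeddings=np.zeros((8, 4), dtype=np.float16))

# An .npz archive maps parameter names to arrays.
weights = np.load("demo_weights.npz")
for name in weights.files:
    print(name, weights[name].shape, weights[name].dtype)
```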
### Run
Once you've converted the weights to MLX format, you can generate text with
the Mistral model:
```
python mistral.py --prompt "It is a truth universally acknowledged," --temp 0
```
Run `python mistral.py --help` for more details.
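
The `--temp` flag sets the sampling temperature; `--temp 0` makes generation deterministic by always taking the most likely token. A minimal sketch of the general idea (standalone, not the script's actual implementation), using NumPy over a toy logit vector:

```python
import numpy as np

def sample(logits, temp=1.0, seed=0):
    """Pick a token id from raw logits; temp == 0 means greedy argmax."""
    if temp == 0:
        return int(np.argmax(logits))
    # Scale logits by temperature, softmax, then draw one categorical sample.
    scaled = logits / temp
    probs = np.exp(scaled - np.max(scaled))
    probs /= probs.sum()
    return int(np.random.default_rng(seed).choice(len(probs), p=probs))

logits = np.array([1.0, 3.0, 2.0])
print(sample(logits, temp=0))  # greedy: picks index 1, the largest logit
```

Higher temperatures flatten the distribution and make generation more varied; lower temperatures concentrate probability on the top tokens.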
[^1]: Refer to the [blog post](https://mistral.ai/news/announcing-mistral-7b/)
and [GitHub repository](https://github.com/mistralai/mistral-src) for more
details.