
# Mistral

An example of generating text with Mistral using MLX.

Mistral 7B is one of the top large language models in its size class. It is also fully open source with a permissive license.[^1]

## Setup

Install the dependencies:

```shell
pip install -r requirements.txt
```

Next, download the model and tokenizer:

```shell
curl -O https://files.mistral-7b-v0-1.mistral.ai/mistral-7B-v0.1.tar
tar -xf mistral-7B-v0.1.tar
```

Then, convert the weights with:

```shell
python convert.py
```

The conversion script will save the converted weights in the same location.
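If you want to sanity-check the conversion, the saved weights can be loaded directly with MLX. A minimal sketch, assuming the script writes a `weights.npz` file into the `mistral-7B-v0.1/` directory (adjust the path if your layout differs):

```python
# Inspect the converted checkpoint (the weights.npz path is an assumption).
import mlx.core as mx

weights = mx.load("mistral-7B-v0.1/weights.npz")  # dict of name -> mx.array
print(f"Loaded {len(weights)} tensors")
for name, w in list(weights.items())[:5]:
    print(name, w.shape, w.dtype)
```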

> **Tip:** Alternatively, you can download a few pre-converted checkpoints from the [MLX Community](https://huggingface.co/mlx-community) organization on Hugging Face and skip the conversion step.
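For example, a checkpoint can be fetched programmatically with the `huggingface_hub` package (not listed in `requirements.txt`). This is only a sketch; the repo id below is a placeholder, so substitute the actual MLX Community checkpoint you want:

```python
# Hypothetical download sketch; replace repo_id with a real checkpoint
# from https://huggingface.co/mlx-community before running.
from huggingface_hub import snapshot_download

path = snapshot_download(
    repo_id="mlx-community/Mistral-7B-v0.1",  # placeholder repo id
    local_dir="mistral-7B-v0.1",
)
print("Checkpoint downloaded to", path)
```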

## Run

Once you've converted the weights to MLX format, you can generate text with the Mistral model:

```shell
python mistral.py --prompt "It is a truth universally acknowledged," --temp 0
```

Run `python mistral.py --help` for more details.
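The `--temp` flag sets the sampling temperature; at `0` the model always picks the most likely next token (greedy decoding). As a rough sketch of that rule, assuming `logits` holds the model's scores for the next token:

```python
import mlx.core as mx

def sample(logits: mx.array, temp: float) -> mx.array:
    # temp == 0 means greedy decoding: take the single most likely token.
    if temp == 0:
        return mx.argmax(logits, axis=-1)
    # Otherwise scale the logits by 1/temp and sample from the softmax.
    return mx.random.categorical(logits * (1 / temp))
```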

[^1]: Refer to the blog post and GitHub repository for more details.