Mirror of https://github.com/ml-explore/mlx-examples.git, synced 2025-12-16 02:08:55 +08:00
mixtral runs a bit faster
@@ -2,7 +2,8 @@
 
 An example of generating text with Mistral using MLX.
 
-Mistral 7B is one of the top large language models in its size class. It is also fully open source with a permissive license[^1].
+Mistral 7B is one of the top large language models in its size class. It is
+also fully open source with a permissive license[^1].
 
 ### Setup
 
@@ -25,6 +26,8 @@ Then, convert the weights with:
 python convert.py
 ```
 
 The conversion script will save the converted weights in the same location.
 
 ### Run
 
 Once you've converted the weights to MLX format, you can generate text with
@@ -36,4 +39,6 @@ python mistral.py --prompt "It is a truth universally acknowledged," --temp 0
 
 Run `python mistral.py --help` for more details.
 
-[^1]: Refer to the [blog post](https://mistral.ai/news/announcing-mistral-7b/) and [github repository](https://github.com/mistralai/mistral-src) for more details.
+[^1]: Refer to the [blog post](https://mistral.ai/news/announcing-mistral-7b/)
+and [github repository](https://github.com/mistralai/mistral-src) for more
+details.
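The `--temp 0` flag in the run command above selects deterministic (greedy) decoding. As a minimal sketch of what a temperature parameter does during sampling — a hypothetical standalone helper, not taken from the mlx-examples source:

```python
import math
import random

def sample(logits, temp=1.0):
    """Pick a token index from raw logits.

    temp == 0 falls back to argmax (greedy decoding); higher
    temperatures flatten the distribution before sampling.
    """
    if temp == 0:
        return max(range(len(logits)), key=logits.__getitem__)
    # Softmax over temperature-scaled logits, shifted by the max
    # for numerical stability; random.choices accepts unnormalized weights.
    scaled = [l / temp for l in logits]
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    return random.choices(range(len(logits)), weights=weights)[0]
```

With `temp=0` the same prompt always yields the same continuation, which is why the README's example command is reproducible.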