llama v2 with sharded weights

This commit is contained in:
Awni Hannun
2023-12-12 12:48:15 -08:00
parent 9a02dce35c
commit f0c57c1361
5 changed files with 189 additions and 123 deletions

@@ -46,7 +46,7 @@ rm mixtral-8x7b-32kseqlen/*.pth*
As easy as:
 ```
-python mixtral.py --model_path mixtral mixtral-8x7b-32kseqlen/
+python mixtral.py --model_path mixtral-8x7b-32kseqlen/
 ```
[^mixtral]: Refer to Mistral's [blog post](https://mistral.ai/news/mixtral-of-experts/) for more details.