Commit Graph

14 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Awni Hannun | 485fb9ac0f | quantize linear (#250) | 2024-01-07 18:48:59 -08:00 |
| Lawrence Wu | 37856f70a8 | add numpy as a requirement to run lora.py (#238) — also removed unused imports | 2024-01-05 16:16:28 -08:00 |
| Awni Hannun | 37b41cec60 | Qlora (#219) | 2024-01-04 21:05:59 -08:00 |
| Todsaporn Banjerdkit | 7ae445f6c7 | feat: add mistral tps (#173) — eval params before timing + format; Co-authored-by: Awni Hannun | 2023-12-22 07:55:57 -08:00 |
| wyanzhao | 22620de3ee | Add user warning for sequences over 2048 tokens in iterate_batches. (#166) | 2023-12-21 06:29:31 -08:00 |
| Awni Hannun | 27c0a8c002 | Add llms subdir + update README (#145) — nits; use same pre-commit as mlx; update readmes a bit; format | 2023-12-20 10:22:25 -08:00 |
| Awni Hannun | 1e7f4a5921 | fix use for llama 2 from meta (#144) | 2023-12-18 19:33:17 -08:00 |
| Awni Hannun | 84f02ef58b | use lower precision base weights | 2023-12-15 10:29:42 -08:00 |
| Awni Hannun | d108c558fc | more nits | 2023-12-15 10:06:14 -08:00 |
| Awni Hannun | 985f413f99 | custom data with lora | 2023-12-15 09:56:10 -08:00 |
| Awni Hannun | 98f4346c81 | black format | 2023-12-09 14:15:25 -08:00 |
| Awni Hannun | b8332a1e66 | generalize lora finetuning for llama and mistral | 2023-12-09 14:13:55 -08:00 |
| Awni Hannun | 31bc57c4ff | add copyright in source | 2023-11-30 11:08:53 -08:00 |
| Awni Hannun | 5d6353aab7 | lora | 2023-11-29 14:14:11 -08:00 |