Mirror of https://github.com/ml-explore/mlx-examples.git, synced 2025-06-25 01:41:19 +08:00.

Latest commit: Adding full model weights fine-tuning

* Adding full model weights fine-tuning
* Updating the LORA.md and ACKNOWLEDGMENTS.md files
* Removing --use-dora and --full-training and adding --fine-tune-type
* Reformatting and fixing DoRA training
* Updating CONFIG_DEFAULTS and the config example file
* Updating LORA.md
* Adding an argument for the DoRA linear layer
* Small addition to the md file
* Fix for loading the fully trained model by saving all the files and configs correctly
* Removing the unnecessary files
* Changing lora layers back to 16
* Removing the max file size
* General cleanup, merge resolution, and consistency changes

Co-authored-by: Awni Hannun <awni@apple.com>
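The commit above consolidates the earlier `--use-dora` and `--full-training` flags into a single `--fine-tune-type` option. As a minimal sketch of how the updated fine-tuning CLI might be invoked (the model name and data path below are placeholders, not taken from the commit):

```shell
# Hypothetical usage sketch: full-weight fine-tuning via the consolidated flag.
# --fine-tune-type is assumed to accept lora, dora, or full after this change.
python -m mlx_lm.lora \
    --model mistralai/Mistral-7B-v0.1 \
    --train \
    --data ./data \
    --fine-tune-type full
```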
# Individual Contributors

If you wish to be acknowledged for your contributions, please list your name
with a short description of your contribution(s) below. For example:

- Jane Smith: Added the `foo` example.

MLX Examples was developed with contributions from the following individuals:

- Juarez Bochi: Added support for T5 models.
- Sarthak Yadav: Added the `cifar` and `speechcommands` examples.
- Shunta Saito: Added support for PLaMo models.
- Gabrijel Boduljak: Implemented `CLIP`.
- Markus Enzweiler: Added the `cvae` examples.
- Prince Canuma: Helped add support for `Starcoder2` models.
- Shiyu Li: Added the `Segment Anything Model`.
- Gökdeniz Gülmez: Added support for `MiniCPM`, `Mamba`, and support for `full-fine-tuning`.