mlx-examples/llms/mlx_lm/tuner
Ivan Fioravanti (commit 4576946151)
Add checkpoints directory for adapter weights (#431)
* Add checkpoints directory for adapter weights

The code now creates a checkpoints directory if it doesn't exist yet and saves the adapter weights there during the training iterations.
Also corrected the indentation of the save-adapter-weights code, which was unintentionally nested inside the "if eval" block.

* Fix a blank line added by mistake
2024-02-12 10:50:05 -08:00
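
Conceptually, the change described above amounts to something like the sketch below. This is a minimal illustration, not the exact trainer.py code: the helper name save_adapter_checkpoint, the adapter_file argument, and the "{iteration}_adapters.npz" naming scheme are assumptions; only the create-directory-if-missing and save-trainable-parameters pattern comes from the commit description.

    from pathlib import Path

    import mlx.core as mx
    from mlx.utils import tree_flatten


    def save_adapter_checkpoint(model, adapter_file: str, iteration: int):
        """Save the current adapter weights under a checkpoints/ directory.

        Sketch only: names and file layout are assumptions, not the
        actual trainer.py implementation.
        """
        # Create the checkpoints directory next to the adapter file
        # if it doesn't exist yet.
        checkpoint_dir = Path(adapter_file).parent / "checkpoints"
        checkpoint_dir.mkdir(parents=True, exist_ok=True)

        # Save only the trainable (LoRA adapter) parameters, not the
        # full model weights.
        adapter_weights = dict(tree_flatten(model.trainable_parameters()))
        mx.savez(str(checkpoint_dir / f"{iteration}_adapters.npz"), **adapter_weights)

Called once per save interval inside the training loop, this keeps a separate adapter snapshot per iteration rather than repeatedly overwriting a single adapter file.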
__init__.py feat: move lora into mlx-lm (#337) 2024-01-23 08:44:37 -08:00
lora.py feat(mlx-lm): add de-quant for fuse.py (#365) 2024-01-25 18:59:32 -08:00
trainer.py Add checkpoints directory for adapter weights (#431) 2024-02-12 10:50:05 -08:00
utils.py fix(mlx-lm): apply lora layer doesn't update the lora weights (#396) 2024-01-31 11:51:26 -08:00