mlx-examples/llms/mlx_lm/tuner
Ivan Fioravanti d2a99172a6
Add dropout parameter to lora configuration (#599)
* Add dropout parameter to lora configuration

A dropout parameter has been added to the LoRA configuration settings in lora_config.yaml, and the LoRALinear class in utils.py has been updated to accept this new parameter. Additionally, a stale reference to `args.prompt` in lora.py, which raised `AttributeError: 'types.SimpleNamespace' object has no attribute 'prompt'`, has been removed.

* Update lora_config.yaml

Set dropout to 0.0 in the sample config file

* format

---------

Co-authored-by: Awni Hannun <awni@apple.com>
2024-03-20 08:44:40 -07:00
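
The change described above applies dropout inside the LoRA adapter path. As a rough illustration of where such a parameter acts, here is a minimal NumPy sketch of a LoRA-style forward pass with inverted dropout on the input to the low-rank branch; this is an assumption-labeled sketch, not the mlx-lm `LoRALinear` implementation, and the function name and signature are hypothetical.

```python
import numpy as np

def lora_forward(x, W, A, B, scale=1.0, dropout=0.0, training=True, rng=None):
    """Sketch of a LoRA forward pass: frozen base linear plus a low-rank update.

    The `dropout` argument mirrors the new config knob: it is applied only to
    the input of the low-rank path (a common LoRA formulation; whether mlx-lm
    places it exactly here is an assumption).
    """
    rng = rng or np.random.default_rng(0)
    y = x @ W.T  # frozen base projection, W has shape (out, in)
    z = x
    if training and dropout > 0.0:
        keep = 1.0 - dropout
        mask = rng.random(z.shape) < keep
        z = np.where(mask, z / keep, 0.0)  # inverted dropout: rescale kept units
    # low-rank update: A has shape (r, in), B has shape (out, r)
    return y + scale * (z @ A.T) @ B.T
```

With `dropout=0.0` this reduces to the standard LoRA form `x @ W.T + scale * (x @ A.T) @ B.T`, so setting `dropout: 0.0` in the sample config (as the second commit does) leaves training behavior unchanged.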
__init__.py feat: move lora into mlx-lm (#337) 2024-01-23 08:44:37 -08:00
datasets.py Support for OpenAI’s fine-tuning dataset format (#548) 2024-03-19 16:45:46 -07:00
lora.py LoRA on all linear transformer block layers (#546) 2024-03-12 07:37:40 -07:00
trainer.py LoRA: report last train info (#595) 2024-03-19 17:29:50 -07:00
utils.py Add dropout parameter to lora configuration (#599) 2024-03-20 08:44:40 -07:00