mlx-examples/llms/mlx_lm/models

Latest commit: d4666615bb by Awni Hannun (2024-02-12 10:51:02 -08:00)
Lazy import + refactor Lora layer addition (#426)

* lazy model import in mlx_lm
* change lora loading
* fix olmo lora
* remove a bunch of unused stuff from plamo
* move phixtral to mlx-lm and out of llms/
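The "lazy model import" item above refers to the general pattern of resolving a single architecture module at load time from the checkpoint's model_type field, rather than importing every model file in this directory eagerly. The sketch below illustrates that pattern only; the helper name load_model_classes and the error handling are illustrative assumptions, not the literal mlx_lm code.

```python
import importlib


def load_model_classes(config: dict):
    """Lazily import the one model module named by the config.

    Assumes each file in this directory (llama.py, mixtral.py, ...)
    exposes `Model` and `ModelArgs` classes, and that the checkpoint
    config carries a `model_type` key such as "llama" or "qwen2".
    """
    model_type = config["model_type"]
    try:
        # Import only the architecture that is actually needed,
        # instead of importing every model file up front.
        module = importlib.import_module(f"mlx_lm.models.{model_type}")
    except ImportError as e:
        raise ValueError(f"Unsupported model type: {model_type}") from e
    return module.Model, module.ModelArgs
```

With this approach, adding a new architecture is a matter of dropping a new file into the directory listed below; nothing else needs to import it explicitly.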
File               Last commit message                                Last commit date
__init__.py        Mlx llm package (#301)                             2024-01-12 10:25:56 -08:00
base.py            Mlx llm package (#301)                             2024-01-12 10:25:56 -08:00
llama.py           Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00
mixtral.py         Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00
olmo.py            Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00
phi.py             Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00
phixtral.py        Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00
plamo.py           Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00
qwen2.py           Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00
qwen.py            Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00
stablelm_epoch.py  Lazy import + refactor Lora layer addition (#426)  2024-02-12 10:51:02 -08:00