mlx-examples/llms/mlx_lm
Latest commit: llms: convert() add 'revision' argument (#506) by Miller Liang (5b1043a458), co-authored by Awni Hannun, 2024-03-02 06:28:26 -08:00
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| examples | Support for slerp merging models (#455) | 2024-02-19 20:37:15 -08:00 |
| models | Update to StableLM code (#514) | 2024-03-01 09:53:38 -08:00 |
| tuner | Update to StableLM code (#514) | 2024-03-01 09:53:38 -08:00 |
| __init__.py | Fix import warning (#479) | 2024-02-27 08:47:56 -08:00 |
| convert.py | Fix import warning (#479) | 2024-02-27 08:47:56 -08:00 |
| fuse.py | feat(mlx-lm): add de-quant for fuse.py (#365) | 2024-01-25 18:59:32 -08:00 |
| generate.py | chore(mlx-lm): add adapter support in generate.py (#494) | 2024-02-28 07:49:25 -08:00 |
| LORA.md | chore(mlx-lm): add adapter support in generate.py (#494) | 2024-02-28 07:49:25 -08:00 |
| lora.py | chore(mlx-lm): add adapter support in generate.py (#494) | 2024-02-28 07:49:25 -08:00 |
| MERGE.md | Support for slerp merging models (#455) | 2024-02-19 20:37:15 -08:00 |
| merge.py | Prevent llms/mlx_lm from serving the local directory as a webserver (#498) | 2024-02-27 19:40:42 -08:00 |
| py.typed | Add py.typed to support PEP-561 (type-hinting) (#389) | 2024-01-30 21:17:38 -08:00 |
| README.md | feat: move lora into mlx-lm (#337) | 2024-01-23 08:44:37 -08:00 |
| requirements.txt | [mlx-lm] Add precompiled normalizations (#451) | 2024-02-22 12:40:55 -08:00 |
| SERVER.md | Prevent llms/mlx_lm from serving the local directory as a webserver (#498) | 2024-02-27 19:40:42 -08:00 |
| server.py | Prevent llms/mlx_lm from serving the local directory as a webserver (#498) | 2024-02-27 19:40:42 -08:00 |
| UPLOAD.md | Mlx llm package (#301) | 2024-01-12 10:25:56 -08:00 |
| utils.py | llms: convert() add 'revision' argument (#506) | 2024-03-02 06:28:26 -08:00 |
| version.py | Fix import warning (#479) | 2024-02-27 08:47:56 -08:00 |

Generate Text with MLX and 🤗 Hugging Face

This is an example of large language model text generation that can pull models from the Hugging Face Hub.
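
As a rough sketch of typical usage, the package exposes `load` and `generate` helpers that pull a model from the Hub and run generation; the model repo below is only an illustrative placeholder:

```python
# Minimal sketch: load a model from the Hugging Face Hub and generate text.
# The repo id is an illustrative placeholder; any MLX-compatible model
# (or a local path to converted weights) should work.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")
response = generate(
    model,
    tokenizer,
    prompt="Write a haiku about the ocean.",
    verbose=True,  # print tokens as they are generated, plus timing stats
)
print(response)
```

The same functionality is available from the command line via `python -m mlx_lm.generate`.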

For more information on this example, see the README in the parent directory.

This package also supports fine-tuning with LoRA or QLoRA. For more information, see the LoRA documentation.
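
Training is run through the `mlx_lm.lora` module, for example `python -m mlx_lm.lora --model <path_or_hf_repo> --train --data <path/to/data>` (see LORA.md for the data format and the full option list). For QLoRA the base model is first quantized, which the package's `convert` helper can do from Python; the sketch below uses an illustrative repo id:

```python
# Sketch: quantize a Hugging Face model so it can be fine-tuned with QLoRA.
# The repo id is an illustrative placeholder.
from mlx_lm import convert

convert(
    "mistralai/Mistral-7B-v0.1",  # Hub repo (or local path) to convert
    mlx_path="mlx_model",         # output directory for the MLX weights
    quantize=True,                # write quantized weights
)
```

`convert()` also accepts a `revision` argument (added in #506) for pinning a specific Hub revision.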