| Name | Last commit message | Last commit date |
| --- | --- | --- |
| `examples` | Example of response generation with optional arguments (#853) | 2024-07-09 06:49:59 -07:00 |
| `models` | Unify attention mask in LLMs (#911) | 2024-07-25 16:45:22 -07:00 |
| `tuner` | Add GPT-neox model (#863) | 2024-07-11 06:13:17 -07:00 |
| `__init__.py` | mlx_lm: Add Streaming Capability to Generate Function (#807) | 2024-06-03 09:04:39 -07:00 |
| `convert.py` | Create executables for generate, lora, server, merge, convert (#682) | 2024-04-16 16:08:49 -07:00 |
| `fuse.py` | Block sparse MM MoEs (#782) | 2024-05-21 15:58:08 -07:00 |
| `generate.py` | mlx_lm: Add Streaming Capability to Generate Function (#807) | 2024-06-03 09:04:39 -07:00 |
| `gguf.py` | fix(mlx-lm): type hints in gguf.py (#621) | 2024-03-26 07:56:01 -07:00 |
| `LORA.md` | Configuration-based use of HF hub-hosted datasets for training (#701) | 2024-06-26 10:20:50 -07:00 |
| `lora.py` | Pass use_dora parameter to linear_to_lora_layers (#885) | 2024-07-11 14:34:34 -07:00 |
| `MANAGE.md` | Add model management functionality for local caches (#736) | 2024-05-03 12:20:13 -07:00 |
| `manage.py` | Add model management functionality for local caches (#736) | 2024-05-03 12:20:13 -07:00 |
| `MERGE.md` | Create executables for generate, lora, server, merge, convert (#682) | 2024-04-16 16:08:49 -07:00 |
| `merge.py` | Create executables for generate, lora, server, merge, convert (#682) | 2024-04-16 16:08:49 -07:00 |
| `py.typed` | Add py.typed to support PEP-561 (type-hinting) (#389) | 2024-01-30 21:17:38 -08:00 |
| `README.md` | feat: move lora into mlx-lm (#337) | 2024-01-23 08:44:37 -08:00 |
| `requirements.txt` | Example of response generation with optional arguments (#853) | 2024-07-09 06:49:59 -07:00 |
| `sample_utils.py` | Use async eval (#670) | 2024-04-11 13:18:23 -07:00 |
| `SERVER.md` | Adapters loading (#902) | 2024-08-01 16:18:18 -07:00 |
| `server.py` | Adapters loading (#902) | 2024-08-01 16:18:18 -07:00 |
| `tokenizer_utils.py` | fix yi (#852) | 2024-06-27 06:38:19 -07:00 |
| `UPLOAD.md` | Mlx llm package (#301) | 2024-01-12 10:25:56 -08:00 |
| `utils.py` | support load model by custom get_model_classes (#899) | 2024-07-25 11:01:17 -07:00 |
| `version.py` | Configuration-based use of HF hub-hosted datasets for training (#701) | 2024-06-26 10:20:50 -07:00 |
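
For orientation, the modules listed above fit together roughly as follows: `utils.py` holds the model-loading helpers and `generate.py` the text-generation entry point. Below is a minimal usage sketch, assuming the `load`/`generate` helpers exported by the `mlx_lm` package; the model repository name is illustrative, and exact keyword arguments may differ between package versions.

```python
# Minimal sketch of generating text with mlx_lm; the model repo below is an
# example and the generate() keyword arguments may vary by version.
from mlx_lm import load, generate

# load() fetches the weights and tokenizer for a Hugging Face / MLX model repo
# (implemented in utils.py).
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

# generate() returns a completion for the prompt (implemented in generate.py).
text = generate(
    model,
    tokenizer,
    prompt="Explain LoRA fine-tuning in one sentence.",
    max_tokens=100,
)
print(text)
```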