
MLX Examples

This repo contains a variety of standalone examples using the MLX framework.

The MNIST example is a good starting point to learn how to use MLX.
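To give a flavor of what that involves, below is a minimal sketch (not the repo's exact code) of the training pattern the MNIST example follows: define an nn.Module, compute the loss and gradients with nn.value_and_grad, and apply an optimizer update. The random batch here merely stands in for real MNIST images and labels.

import mlx.core as mx
import mlx.nn as nn
import mlx.optimizers as optim

class MLP(nn.Module):
    """A small two-layer perceptron for flattened 28x28 images."""
    def __init__(self, in_dims=784, hidden=128, num_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dims, hidden)
        self.fc2 = nn.Linear(hidden, num_classes)

    def __call__(self, x):
        return self.fc2(nn.relu(self.fc1(x)))

def loss_fn(model, X, y):
    return nn.losses.cross_entropy(model(X), y, reduction="mean")

model = MLP()
optimizer = optim.SGD(learning_rate=0.1)
loss_and_grad_fn = nn.value_and_grad(model, loss_fn)

# Dummy batch standing in for flattened MNIST images and integer labels.
X = mx.random.normal((32, 784))
y = mx.random.randint(0, 10, (32,))

loss, grads = loss_and_grad_fn(model, X, y)
optimizer.update(model, grads)                # apply one SGD step to the parameters
mx.eval(model.parameters(), optimizer.state)  # force lazy evaluation of the update
print(loss.item())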

Some more useful examples are listed below.

Text Models

  • Transformer language model training.
  • Large-scale text generation with models such as LLaMA and Mistral in the LLMs directory, including the mlx-lm package.
  • Parameter-efficient fine-tuning with LoRA.
  • Text-to-text multi-task Transformers with T5.
  • Bidirectional language understanding with BERT.

Image Models

  • Generating images with Stable Diffusion.
  • Image classification on CIFAR-10 with ResNets.
  • A convolutional variational autoencoder (CVAE) for image generation on MNIST.

Audio Models

  • Speech recognition with OpenAI's Whisper.
  • Keyword spotting on the Speech Commands dataset.

Multimodal Models

  • Joint text and image embeddings with CLIP.
  • Text generation from image and text inputs with LLaVA.

Other Models

  • Semi-supervised learning on graph-structured data with GCN.
  • Real NVP normalizing flow for density estimation and sampling.

Hugging Face

Note: You can now directly download a few converted checkpoints from the MLX Community organization on Hugging Face. We encourage you to join the community and contribute new models.
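For example, with the mlx-lm package (pip install mlx-lm) you can load a converted checkpoint straight from the Hub and generate text. A brief sketch follows; the checkpoint name is only an illustrative mlx-community model, so substitute any model from the organization.

from mlx_lm import load, generate

# Illustrative checkpoint name; any model from the mlx-community org works.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")

response = generate(
    model,
    tokenizer,
    prompt="Explain what MLX is in one sentence.",
    verbose=True,  # print the generated text and generation statistics
)
print(response)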

Contributing

We are grateful for all of our contributors. If you contribute to MLX Examples and wish to be acknowledged, please add your name to the list in ACKNOWLEDGMENTS.md as part of your pull request.

Citing MLX Examples

The MLX software suite was initially developed with equal contribution by Awni Hannun, Jagrit Digani, Angelos Katharopoulos, and Ronan Collobert. If you find MLX Examples useful in your research and wish to cite it, please use the following BibTeX entry:

@software{mlx2023,
  author = {Awni Hannun and Jagrit Digani and Angelos Katharopoulos and Ronan Collobert},
  title = {{MLX}: Efficient and flexible machine learning on Apple silicon},
  url = {https://github.com/ml-explore},
  version = {0.0},
  year = {2023},
}