
MLX Examples

This repo contains a variety of standalone examples using the MLX framework.

The MNIST example is a good starting point to learn how to use MLX.

Some more useful examples are listed below.

Text Models

  • Transformer language model training.
  • Large-scale text generation with LLMs.
  • Parameter-efficient fine-tuning with LoRA.
  • Text-to-text multi-task Transformers with T5.
  • Bidirectional language understanding with BERT.

Image Models

  • Image classification on CIFAR-10.
  • Image generation with Stable Diffusion.
  • A convolutional variational autoencoder (CVAE) on MNIST.

Audio Models

  • Speech recognition with Whisper.
  • Keyword spotting on the Speech Commands dataset.

Multimodal Models

  • Joint text and image embeddings with CLIP.

Other Models

  • Semi-supervised learning on graph-structured data with GCN.
  • Real NVP normalizing flow for density estimation and sampling.

Hugging Face

Note: You can now directly download a few converted checkpoints from the MLX Community organization on Hugging Face. We encourage you to join the community and contribute new models.
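Checkpoints on the Hub can also be fetched programmatically. Below is a minimal sketch using the `huggingface_hub` client library; the helper name is ours, and the repo id in the comment is a placeholder rather than a specific model:

```python
# Sketch: fetch a converted checkpoint from the Hugging Face Hub.
# Assumes `pip install huggingface_hub`; the repo id shown in the
# usage comment below is an illustrative placeholder.
from huggingface_hub import snapshot_download

def fetch_checkpoint(repo_id: str) -> str:
    """Download a full model repo snapshot and return its local path."""
    return snapshot_download(repo_id=repo_id)

# Usage (placeholder repo id):
# local_path = fetch_checkpoint("mlx-community/<model-name>")
```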

Contributing

We are grateful for all of our contributors. If you contribute to MLX Examples and wish to be acknowledged, please add your name to the list in your pull request.

Citing MLX Examples

The MLX software suite was initially developed with equal contribution by Awni Hannun, Jagrit Digani, Angelos Katharopoulos, and Ronan Collobert. If you find MLX Examples useful in your research and wish to cite it, please use the following BibTeX entry:

@software{mlx2023,
  author = {Awni Hannun and Jagrit Digani and Angelos Katharopoulos and Ronan Collobert},
  title = {{MLX}: Efficient and flexible machine learning on Apple silicon},
  url = {https://github.com/ml-explore},
  version = {0.0},
  year = {2023},
}