# MLX Examples
This repo contains a variety of standalone examples using the [MLX
framework](https://github.com/ml-explore/mlx).
The [MNIST](mnist) example is a good starting point to learn how to use MLX.
Some more useful examples include:
- [Transformer language model](transformer_lm) training.
- Large-scale text generation with [LLaMA](llama), [Mistral](mistral), or [Phi](phi2).
- Mixture-of-experts (MoE) language modeling with [Mixtral 8x7B](mixtral).
- Parameter efficient fine-tuning with [LoRA](lora).
- Generating images with [Stable Diffusion](stable_diffusion).
- Speech recognition with [OpenAI's Whisper](whisper).
- Bidirectional language understanding with [BERT](bert).
- Semi-supervised learning on graph-structured data with [GCN](gcn).
## Contributing
We are grateful for all of [our
contributors](ACKNOWLEDGMENTS.md#Individual-Contributors). If you contribute
to MLX Examples and wish to be acknowledged, please add your name to the list
in your pull request.
## Citing MLX Examples
The MLX software suite was initially developed with equal contribution by Awni
Hannun, Jagrit Digani, Angelos Katharopoulos, and Ronan Collobert. If you find
MLX Examples useful in your research and wish to cite it, please use the
following BibTeX entry:
```
@software{mlx2023,
  author = {Awni Hannun and Jagrit Digani and Angelos Katharopoulos and Ronan Collobert},
  title = {{MLX}: Efficient and flexible machine learning on {Apple} silicon},
  url = {https://github.com/ml-explore},
  version = {0.0},
  year = {2023},
}
```