Add llms subdir + update README (#145)

* add llms subdir + update README

* nits

* use same pre-commit as mlx

* update readmes a bit

* format
Awni Hannun
2023-12-20 10:22:25 -08:00
committed by GitHub
parent aed14618ca
commit 27c0a8c002
62 changed files with 164 additions and 146 deletions

README.md

@@ -5,18 +5,37 @@ framework](https://github.com/ml-explore/mlx).
 The [MNIST](mnist) example is a good starting point to learn how to use MLX.
 
-Some more useful examples include:
+Some more useful examples are listed below.
+
+### Text Models
 
 - [Transformer language model](transformer_lm) training.
-- Large scale text generation with [LLaMA](llama), [Mistral](mistral) or [Phi](phi2).
-- Mixture-of-experts (MoE) language model with [Mixtral 8x7B](mixtral)
+- Large scale text generation with [LLaMA](llms/llama),
+  [Mistral](llms/mistral), [Phi-2](llms/phi2), and more in the [LLMs](llms)
+  directory.
+- A mixture-of-experts (MoE) language model with [Mixtral 8x7B](llms/mixtral).
 - Parameter efficient fine-tuning with [LoRA](lora).
+- Text-to-text multi-task Transformers with [T5](t5).
+- Bidirectional language understanding with [BERT](bert).
+
+### Image Models
+
 - Generating images with [Stable Diffusion](stable_diffusion).
+
+### Audio Models
+
 - Speech recognition with [OpenAI's Whisper](whisper).
-- Bidirectional language understanding with [BERT](bert)
+
+### Other Models
+
 - Semi-supervised learning on graph-structured data with [GCN](gcn).
 
-Note: You can now directly download a few converted checkpoints from the [MLX Community](https://huggingface.co/mlx-community) organisation on Hugging Face.
+### Hugging Face
+
+Note: You can now directly download a few converted checkpoints from the [MLX
+Community](https://huggingface.co/mlx-community) organization on Hugging Face.
+We encourage you to join the community and [contribute new
+models](https://github.com/ml-explore/mlx-examples/issues/155).
 
 ## Contributing
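
The new "Hugging Face" note in the README points readers at pre-converted checkpoints hosted under the mlx-community organization. As a minimal sketch of what that download step can look like (not part of this commit), one could fetch a repo with the `huggingface_hub` client; the repo id below is a placeholder, not a specific published model:

```python
# Sketch only: download a converted checkpoint from the mlx-community org.
# The repo id is a placeholder -- substitute any model actually listed at
# https://huggingface.co/mlx-community.
from huggingface_hub import snapshot_download

model_path = snapshot_download(repo_id="mlx-community/<converted-model-name>")
print(f"Checkpoint files downloaded to: {model_path}")
```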