Merge pull request #103 from arpitingle/patch-1

added phi in readme
Awni Hannun 2023-12-14 10:19:40 -08:00 committed by GitHub
commit 0e88a6afa1


@@ -8,7 +8,7 @@ The [MNIST](mnist) example is a good starting point to learn how to use MLX.
 Some more useful examples include:
 
 - [Transformer language model](transformer_lm) training.
-- Large scale text generation with [LLaMA](llama) or [Mistral](mistral).
+- Large scale text generation with [LLaMA](llama), [Mistral](mistral) or [Phi](phi2).
 - Mixture-of-experts (MoE) language model with [Mixtral 8x7B](mixtral)
 - Parameter efficient fine-tuning with [LoRA](lora).
 - Generating images with [Stable Diffusion](stable_diffusion).