From 5b08da2395191baba7421d7eeaaa9d3ffae02476 Mon Sep 17 00:00:00 2001
From: arpit
Date: Thu, 14 Dec 2023 23:40:50 +0530
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 37c977ed..7988e37a 100644
--- a/README.md
+++ b/README.md
@@ -8,7 +8,7 @@ The [MNIST](mnist) example is a good starting point to learn how to use MLX.
 Some more useful examples include:
 
 - [Transformer language model](transformer_lm) training.
-- Large scale text generation with [LLaMA](llama) or [Mistral](mistral).
+- Large scale text generation with [LLaMA](llama), [Mistral](mistral) or [Phi](phi2).
 - Mixture-of-experts (MoE) language model with [Mixtral 8x7B](mixtral)
 - Parameter efficient fine-tuning with [LoRA](lora).
 - Generating images with [Stable Diffusion](stable_diffusion).