From 68c4282766f7db0b6e3b9bba1089372133f5bee9 Mon Sep 17 00:00:00 2001
From: bbelescot
Date: Fri, 8 Dec 2023 13:55:26 +0100
Subject: [PATCH] =?UTF-8?q?=F0=9F=93=9D=20clarify=20python=20command=20for?=
 =?UTF-8?q?=20llama=20example?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 llama/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/llama/README.md b/llama/README.md
index 3a3e9e89..d1ffba88 100644
--- a/llama/README.md
+++ b/llama/README.md
@@ -32,7 +32,7 @@ Once you've converted the weights to MLX format, you can interact with the
 LLaMA model:
 
 ```
-python llama.py mlx_llama_weights.npz "hello"
+python llama.py "hello"
 ```
 
 Run `python llama.py --help` for more details.
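
For reference, a quick usage sketch of the command as it reads after this patch; both invocations are taken directly from the patched README section, and it is assumed the weights have already been converted to MLX format as the surrounding README describes.

```
# The converted weights no longer need to be passed as a positional argument
python llama.py "hello"

# List the remaining options for the example script
python llama.py --help
```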