From 170e4b2d4372d8a073f5c171942973825c4344a0 Mon Sep 17 00:00:00 2001
From: Awni Hannun
Date: Wed, 6 Dec 2023 08:12:06 -0800
Subject: [PATCH] fix links (#32)

---
 docs/src/examples/llama-inference.rst | 6 +++---
 docs/src/examples/mlp.rst             | 2 +-
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/src/examples/llama-inference.rst b/docs/src/examples/llama-inference.rst
index c1a6a7e84..20019e911 100644
--- a/docs/src/examples/llama-inference.rst
+++ b/docs/src/examples/llama-inference.rst
@@ -321,7 +321,7 @@ which can then be used to update the model.
 Note that the method above incurs several unnecessary copies from disk to
 numpy and then from numpy to MLX. It will be replaced in the future with
 direct loading to MLX.
-You can download the full example code in `mlx-examples `_. Assuming, the
+You can download the full example code in `mlx-examples`_. Assuming, the
 existence of ``weights.pth`` and ``tokenizer.model`` in the current working
 directory we can play around with our inference script as follows (the
 timings are representative of an M1 Ultra and the 7B parameter Llama model):
@@ -369,9 +369,9 @@ Scripts
 
 .. admonition:: Download the code
 
-   The full example code is available in `mlx-examples `_.
+   The full example code is available in `mlx-examples`_.
 
-.. code: `https://github.com/ml-explore/mlx-examples/tree/main/llama`_
+.. _mlx-examples: https://github.com/ml-explore/mlx-examples/tree/main/llama
 
 .. [1] Su, J., Lu, Y., Pan, S., Murtadha, A., Wen, B. and Liu, Y., 2021.
    Roformer: Enhanced transformer with rotary position embedding. arXiv
diff --git a/docs/src/examples/mlp.rst b/docs/src/examples/mlp.rst
index 5763eeba0..c003618ce 100644
--- a/docs/src/examples/mlp.rst
+++ b/docs/src/examples/mlp.rst
@@ -127,5 +127,5 @@ Finally, we put it all together by instantiating the model, the
    This should not be confused with :func:`mlx.core.value_and_grad`.
 
 The model should train to a decent accuracy (about 95%) after just a few passes
-over the training set. The `full example `_
+over the training set. The `full example `_
 is available in the MLX GitHub repo.