diff --git a/README.md b/README.md
index 9b82b653e..0770af0f8 100644
--- a/README.md
+++ b/README.md
@@ -9,9 +9,9 @@ by Apple machine learning research.
 
 Some key features of MLX include:
 
- - **Familiar APIs**: MLX has a Python API which closely follows NumPy.
-   MLX also has a fully featured C++ API which closely mirrors the Python API.
-   MLX has higher level packages like `mlx.nn` and `mlx.optimizers` with APIs
+ - **Familiar APIs**: MLX has a Python API that closely follows NumPy.
+   MLX also has a fully featured C++ API, which closely mirrors the Python API.
+   MLX has higher-level packages like `mlx.nn` and `mlx.optimizers` with APIs
    that closely follow PyTorch to simplify building more complex models.
 
  - **Composable function transformations**: MLX has composable function
@@ -26,15 +26,15 @@ Some key features of MLX include:
    slow compilations, and debugging is simple and intuitive.
 
  - **Multi-device**: Operations can run on any of the supported devices
-   (currently the CPU and GPU).
+   (currently, the CPU and GPU).
 
- - **Unified memory**: A noteable difference from MLX and other frameworks
+ - **Unified memory**: A notable difference between MLX and other frameworks
    is the *unified memory model*. Arrays in MLX live in shared memory.
    Operations on MLX arrays can be performed on any of the supported device
    types without moving data.
 
 MLX is designed by machine learning researchers for machine learning
-researchers. The framework is intended to be user friendly, but still efficient
+researchers. The framework is intended to be user-friendly, but still efficient
 to train and deploy models. The design of the framework itself is also
 conceptually simple. We intend to make it easy for researchers to extend and
 improve MLX with the goal of quickly exploring new ideas.
@@ -47,10 +47,10 @@ The design of MLX is inspired by frameworks like
 ## Examples
 
 The [MLX examples repo](https://github.com/ml-explore/mlx-examples) has a
-variety of examples including:
+variety of examples, including:
 
 - [Transformer language model](https://github.com/ml-explore/mlx-examples/tree/main/transformer_lm) training.
-- Large scale text generation with
+- Large-scale text generation with
   [LLaMA](https://github.com/ml-explore/mlx-examples/tree/main/llama) and
   finetuning with [LoRA](https://github.com/ml-explore/mlx-examples/tree/main/lora).
 - Generating images with [Stable Diffusion](https://github.com/ml-explore/mlx-examples/tree/main/stable_diffusion).
@@ -64,7 +64,7 @@ in the documentation.
 
 ## Installation
 
-MLX is available on [PyPi](https://pypi.org/project/mlx/). To install the Python API run:
+MLX is available on [PyPI](https://pypi.org/project/mlx/). To install the Python API, run:
 
 ```
 pip install mlx
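
For context only (not part of the patch): the README text touched above describes a NumPy-like Python API and a unified memory model where the same array can be used on either device without copying. A minimal sketch of what that looks like in practice, assuming the standard `mlx.core` import and the `stream=` keyword that MLX operations accept for selecting a device per call:

```python
import mlx.core as mx

# NumPy-style array creation and arithmetic; computation is lazy.
a = mx.array([1.0, 2.0, 3.0])
b = mx.ones([3])
c = a + b     # builds the computation, not yet evaluated
mx.eval(c)    # force evaluation

# Unified memory: the same arrays can be consumed by operations on
# either device, chosen per operation, with no explicit data movement.
d = mx.add(a, b, stream=mx.cpu)
e = mx.add(a, b, stream=mx.gpu)
print(c, d, e)
```

On a machine without a Metal-capable GPU, the `mx.gpu` line would need to be dropped or pointed at `mx.cpu`; the rest runs anywhere the `mlx` wheel installs.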