# MLX

[**Quickstart**](#quickstart) | [**Installation**](#installation) |
[**Documentation**](https://ml-explore.github.io/mlx/build/html/index.html) |
[**Examples**](#examples)

[](https://circleci.com/gh/ml-explore/mlx)

MLX is an array framework for machine learning on Apple silicon,
brought to you by Apple machine learning research.

Some key features of MLX include:

 - **Familiar APIs**: MLX has a Python API that closely follows NumPy. MLX
   also has fully featured C++, [C](https://github.com/ml-explore/mlx-c), and
   [Swift](https://github.com/ml-explore/mlx-swift/) APIs, which closely mirror
   the Python API. MLX has higher-level packages like `mlx.nn` and
   `mlx.optimizers` with APIs that closely follow PyTorch to simplify building
   more complex models (see the short sketch after this list).

 - **Composable function transformations**: MLX supports composable function
   transformations for automatic differentiation, automatic vectorization,
   and computation graph optimization.

 - **Lazy computation**: Computations in MLX are lazy. Arrays are only
   materialized when needed.

 - **Dynamic graph construction**: Computation graphs in MLX are constructed
   dynamically. Changing the shapes of function arguments does not trigger
   slow compilations, and debugging is simple and intuitive.

 - **Multi-device**: Operations can run on any of the supported devices
   (currently the CPU and the GPU).

 - **Unified memory**: A notable difference between MLX and other frameworks
   is the *unified memory model*. Arrays in MLX live in shared memory.
   Operations on MLX arrays can be performed on any of the supported
   device types without transferring data.
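
A minimal sketch of how a few of these features look from Python (illustrative only; it assumes the usual `import mlx.core as mx` alias, and the last line assumes a machine with a supported GPU):

```python
import mlx.core as mx

a = mx.array([1.0, 2.0, 3.0])
b = mx.ones_like(a)

# Lazy computation: `c` is just a node in the graph until it is needed.
c = a + b
mx.eval(c)  # explicitly materialize `c`

# Composable transformations: gradient of a scalar-valued function.
def loss(x):
    return mx.sum(mx.square(x))

print(mx.grad(loss)(a))  # gradient with respect to `a`

# Unified memory: run ops on either device without copying the arrays.
d_cpu = mx.add(a, b, stream=mx.cpu)
d_gpu = mx.add(a, b, stream=mx.gpu)
```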

MLX is designed by machine learning researchers for machine learning
researchers. The framework is intended to be user-friendly, but still
efficient for training and deploying models. The design of the framework
itself is also conceptually simple. We intend to make it easy for researchers
to extend and improve MLX with the goal of quickly exploring new ideas.

The design of MLX is inspired by frameworks like
[NumPy](https://numpy.org/doc/stable/index.html),
[PyTorch](https://pytorch.org/), [Jax](https://github.com/google/jax), and
[ArrayFire](https://arrayfire.org/).

## Examples

The [MLX examples repo](https://github.com/ml-explore/mlx-examples) has a
variety of examples, including:

- [Transformer language model](https://github.com/ml-explore/mlx-examples/tree/main/transformer_lm) training.
- Large-scale text generation with
  [LLaMA](https://github.com/ml-explore/mlx-examples/tree/main/llms/llama) and
  finetuning with [LoRA](https://github.com/ml-explore/mlx-examples/tree/main/lora).
- Generating images with [Stable Diffusion](https://github.com/ml-explore/mlx-examples/tree/main/stable_diffusion).
- Speech recognition with [OpenAI's Whisper](https://github.com/ml-explore/mlx-examples/tree/main/whisper).

## Quickstart

See the [quick start
guide](https://ml-explore.github.io/mlx/build/html/usage/quick_start.html)
in the documentation.

## Installation

MLX is available on [PyPI](https://pypi.org/project/mlx/). To install MLX on
macOS, run:

```bash
pip install mlx
```

To install the CUDA backend on Linux, run:

```bash
pip install mlx[cuda]
```

To install a CPU-only Linux package, run:

```bash
pip install mlx[cpu]
```
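
To quickly check the install, a minimal smoke test (the exact printed output may differ by version):

```python
import mlx.core as mx

print(mx.array([1, 2, 3]) + 1)  # something like: array([2, 3, 4], dtype=int32)
```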

Check out the
[documentation](https://ml-explore.github.io/mlx/build/html/install.html#)
for more information on building the C++ and Python APIs from source.

## Contributing

Check out the [contribution guidelines](https://github.com/ml-explore/mlx/tree/main/CONTRIBUTING.md) for more information
on contributing to MLX. See the
[docs](https://ml-explore.github.io/mlx/build/html/install.html) for more
information on building from source and running tests.

We are grateful for all of [our
contributors](https://github.com/ml-explore/mlx/tree/main/ACKNOWLEDGMENTS.md#Individual-Contributors). If you contribute
to MLX and wish to be acknowledged, please add your name to the list in your
pull request.

## Citing MLX

The MLX software suite was initially developed with equal contribution by Awni
Hannun, Jagrit Digani, Angelos Katharopoulos, and Ronan Collobert. If you find
MLX useful in your research and wish to cite it, please use the following
BibTeX entry:

```
@software{mlx2023,
  author = {Awni Hannun and Jagrit Digani and Angelos Katharopoulos and Ronan Collobert},
  title = {{MLX}: Efficient and flexible machine learning on Apple silicon},
  url = {https://github.com/ml-explore},
  version = {0.0},
  year = {2023},
}
```