mlx/docs/build/doctrees/index.doctree

MLX
===

MLX is a NumPy-like array framework designed for efficient and flexible machine
learning on Apple silicon, brought to you by Apple machine learning research.

The Python API closely follows NumPy with a few exceptions. MLX also has a
fully featured C++ API which closely follows the Python API.

The main differences between MLX and NumPy are:

- **Composable function transformations**: MLX has composable function
  transformations for automatic differentiation, automatic vectorization,
  and computation graph optimization.
- **Lazy computation**: Computations in MLX are lazy. Arrays are only
  materialized when needed.
- **Multi-device**: Operations can run on any of the supported devices (CPU,
  GPU, ...)
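
The first two points can be sketched in a few lines of Python. The snippet
below is an illustrative sketch, not part of the documentation itself, and
assumes ``mlx.core`` is installed and imported as ``mx``:

.. code-block:: python

   import mlx.core as mx

   def loss(w, x):
       # A toy scalar-valued function of the parameter w.
       return mx.sum(mx.square(w * x))

   w = mx.array(2.0)
   x = mx.array([1.0, 2.0, 3.0])

   # Composable function transformation: mx.grad returns a new function
   # that computes the gradient of `loss` with respect to its first argument.
   grad_fn = mx.grad(loss)
   g = grad_fn(w, x)

   # Lazy computation: `g` is only a node in the compute graph until it is
   # materialized, e.g. by mx.eval or by printing it.
   mx.eval(g)
   print(g)

Because the transformations compose, expressions such as
``mx.grad(mx.grad(loss))`` are themselves ordinary functions.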

The design of MLX is inspired by frameworks like `PyTorch
<https://pytorch.org/>`_, `Jax <https://github.com/google/jax>`_, and
`ArrayFire <https://arrayfire.org/>`_. A notable difference between these
frameworks and MLX is the *unified memory model*. Arrays in MLX live in shared
memory. Operations on MLX arrays can be performed on any of the supported
device types without performing data copies. Currently supported device types
are the CPU and GPU.
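
As an illustrative sketch of the unified memory model (assuming an Apple
silicon machine with MLX installed), the same arrays can be handed to
operations on either device without an explicit transfer:

.. code-block:: python

   import mlx.core as mx

   a = mx.random.uniform(shape=(4096, 4096))
   b = mx.random.uniform(shape=(4096, 4096))

   # Choose the device per operation via the `stream` argument; the arrays
   # themselves live in shared memory and are never copied between devices.
   c = mx.matmul(a, b, stream=mx.gpu)
   d = mx.exp(a, stream=mx.cpu)

   mx.eval(c, d)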

.. toctree::
   :caption: Install
   :maxdepth: 1

   install

.. toctree::
   :caption: Usage
   :maxdepth: 1

   usage/quick_start
   usage/lazy_evaluation
   usage/unified_memory
   usage/indexing
   usage/saving_and_loading
   usage/function_transforms
   usage/numpy
   usage/using_streams

.. toctree::
   :caption: Examples
   :maxdepth: 1

   examples/linear_regression
   examples/mlp
   examples/llama-inference

.. toctree::
   :caption: Python API Reference
   :maxdepth: 1

   python/array
   python/devices_and_streams
   python/ops
   python/random
   python/transforms
   python/fft
   python/linalg
   python/nn
   python/optimizers
   python/tree_utils

.. toctree::
   :caption: C++ API Reference
   :maxdepth: 1

   cpp/ops

.. toctree::
   :caption: Further Reading
   :maxdepth: 1

   dev/extensions