Transformer LM

This is an example of a decoder-only Transformer LM. The only dependency is MLX.
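For illustration, a decoder-only Transformer LM can be assembled from MLX's nn layers roughly as sketched below. This is not necessarily the exact model defined in main.py; the class name and hyperparameters are illustrative.

import mlx.core as mx
import mlx.nn as nn


class TransformerLM(nn.Module):
    def __init__(self, vocab_size: int, num_layers: int, dims: int, num_heads: int):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, dims)
        self.pe = nn.SinusoidalPositionalEncoding(dims)
        # A TransformerEncoder with a causal mask behaves as a decoder-only stack
        self.transformer = nn.TransformerEncoder(num_layers, dims, num_heads)
        self.out_proj = nn.Linear(dims, vocab_size)

    def __call__(self, x):
        L = x.shape[1]
        # Additive causal mask so each position only attends to earlier tokens
        mask = nn.MultiHeadAttention.create_additive_causal_mask(L)
        x = self.embedding(x) + self.pe(mx.arange(L))
        x = self.transformer(x, mask)
        return self.out_proj(x)


# Toy forward pass on random token ids (sizes chosen arbitrarily)
model = TransformerLM(vocab_size=1000, num_layers=2, dims=128, num_heads=4)
tokens = mx.random.randint(0, 1000, (1, 16))
logits = model(tokens)  # shape: (1, 16, 1000)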

Run the example on the GPU with:

python main.py --gpu

By default the dataset is the PTB corpus. Choose a different dataset with the --dataset option.
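For example, assuming wikitext2 is one of the dataset names provided by datasets.py:

python main.py --gpu --dataset wikitext2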