
T5

The T5 models are encoder-decoder models pre-trained on a mixture of unsupervised and supervised tasks [1]. They work well on a variety of tasks when a task-specific prefix is prepended to the input, e.g. translate English to German: …, summarize: …, etc.
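The task prefix is plain string concatenation before tokenization. A minimal sketch (the `make_prompt` helper is hypothetical, for illustration only, not part of this repo):

```python
def make_prompt(task: str, text: str) -> str:
    """Prepend a T5 task-specific prefix to the input text."""
    return f"{task}: {text}"

# Example prefixes used by the pre-trained T5 checkpoints
print(make_prompt("translate English to German", "A tasty apple"))
# translate English to German: A tasty apple
print(make_prompt("summarize", "The quick brown fox jumped over the lazy dog."))
# summarize: The quick brown fox jumped over the lazy dog.
```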

Setup

Download and convert the model:

python convert.py --model <model>

This will create the <model>.npz file, which MLX can read.
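A .npz file is a standard NumPy archive of named arrays. As a minimal sketch of the format (with toy weights and made-up parameter names, not the real checkpoint), you can write and read one like this:

```python
import os
import tempfile

import numpy as np

# A .npz checkpoint is a zip archive of named arrays; convert.py writes
# the model's parameters into one, and MLX can load it directly.
toy_weights = {
    "encoder.block.0.weight": np.zeros((8, 8), dtype=np.float16),
    "shared.weight": np.zeros((32, 8), dtype=np.float16),
}

path = os.path.join(tempfile.mkdtemp(), "toy-model.npz")
np.savez(path, **toy_weights)

# Reading it back exposes each array by name
loaded = np.load(path)
print(sorted(loaded.files))
# ['encoder.block.0.weight', 'shared.weight']
```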

The <model> can be any of the following:

Model Name | Model Size (parameters)
-----------|------------------------
t5-small   | 60 million
t5-base    | 220 million
t5-large   | 770 million
t5-3b      | 3 billion
t5-11b     | 11 billion

Generate

Generate text with:

python t5.py --model t5-small --prompt "translate English to German: A tasty apple"

This should give the output: Ein leckerer Apfel

To see a list of options run:

python t5.py --help

[1] For more information on T5, see the original paper or the Hugging Face page.