Phi-2

Phi-2 is a 2.7B parameter language model released by Microsoft [1] and trained on a mixture of GPT-4 outputs and clean web text. Its performance rivals that of much larger models.

Phi-2 runs efficiently in 16-bit precision on Apple silicon devices with 8 GB of memory.

Setup

Download and convert the model:

python convert.py

This creates a weights.npz file that MLX can read.
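
For reference, the conversion boils down to downloading the Hugging Face checkpoint and re-saving its parameters as a NumPy archive. The sketch below is illustrative only; it assumes the transformers, torch, and numpy packages and may differ from the actual convert.py in naming and dtype handling:

```python
# Illustrative sketch of the conversion step -- not the actual convert.py.
# Download the Hugging Face checkpoint and dump its parameters to weights.npz.
import numpy as np
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2", trust_remote_code=True
)

# Convert every parameter in the state dict to a NumPy array, keyed by name.
state = {k: v.detach().cpu().numpy() for k, v in model.state_dict().items()}
np.savez("weights.npz", **state)
```

MLX can load such an .npz archive of arrays directly (for example with mx.load), which is presumably how phi2.py picks up the converted weights.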

Generate

To generate text with the default prompt:

python phi2.py

This should produce the following output:

Answer: Mathematics is like a lighthouse that guides us through the darkness of
uncertainty. Just as a lighthouse emits a steady beam of light, mathematics
provides us with a clear path to navigate through complex problems. It
illuminates our understanding and helps us make sense of the world around us.

Exercise 2:
Compare and contrast the role of logic in mathematics and the role of a compass
in navigation.

Answer: Logic in mathematics is like a compass in navigation. It helps
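
Under the hood, generation is a token-by-token decoding loop over the model's logits. The sketch below is a hypothetical, minimal greedy-decoding loop in MLX, not the actual code in phi2.py; `model` and `tokenizer` are stand-ins for the objects the script constructs, and the real script additionally keeps a key/value cache for efficiency:

```python
# Hypothetical greedy decoding loop in MLX -- illustration only, not phi2.py.
# `model` maps token ids of shape (1, seq_len) to logits of shape
# (1, seq_len, vocab_size); `tokenizer` is a stand-in for the Phi-2 tokenizer.
import mlx.core as mx

def generate(model, tokenizer, prompt, max_tokens=100):
    tokens = mx.array([tokenizer.encode(prompt)])  # shape (1, seq_len)
    for _ in range(max_tokens):
        logits = model(tokens)
        # Greedily pick the most likely next token from the last position.
        next_token = mx.argmax(logits[:, -1, :], axis=-1).reshape(1, 1)
        tokens = mx.concatenate([tokens, next_token], axis=1)
    return tokenizer.decode(tokens[0].tolist())
```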

To use your own prompt:

python phi2.py --prompt "<your prompt here>" --max_tokens <max_tokens_to_generate>

To see a list of options, run:

python phi2.py --help

[1] For more details on the model, see the blog post and the Hugging Face repo.