# Phixtral
Phixtral is a Mixture of Experts (MoE) architecture inspired by Mixtral but made by combining fine-tuned versions of Phi-2.[^1][^2]
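
In an MoE layer of this style, a learned router scores the experts for each token, and only the top-k experts are evaluated and mixed. The sketch below is a minimal, illustrative top-k routing forward pass in plain NumPy; the function names, shapes, and use of NumPy are assumptions for illustration, not the actual Phixtral or MLX implementation.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def moe_forward(x, gate_w, experts, k=2):
    """Toy MoE forward pass for a single token.

    x:       (dim,) token activation
    gate_w:  (dim, n_experts) router weights (assumed shape)
    experts: list of callables, each mapping (dim,) -> (dim,)
    """
    logits = x @ gate_w              # one router score per expert
    top = np.argsort(logits)[-k:]    # indices of the k highest-scoring experts
    weights = softmax(logits[top])   # renormalize over the selected experts
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy usage: 4 "experts", each a random linear map.
dim, n_experts = 8, 4
rng = np.random.default_rng(0)
experts = [lambda v, W=rng.normal(size=(dim, dim)): v @ W
           for _ in range(n_experts)]
gate_w = rng.normal(size=(dim, n_experts))
print(moe_forward(rng.normal(size=dim), gate_w, experts).shape)  # (8,)
```

Because only k of the experts run per token, compute per token stays close to a single Phi-2 forward pass even though the combined model holds several experts' worth of parameters.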
## Setup

Install the dependencies:

```shell
pip install -r requirements.txt
```
## Run

```shell
python generate.py \
    --model mlabonne/phixtral-4x2_8 \
    --prompt "write a quick sort in Python"
```

Run `python generate.py --help` to see all the options.
[^1]: For more details on Phixtral, see the [Hugging Face repo](https://huggingface.co/mlabonne/phixtral-4x2_8).

[^2]: For more details on Phi-2, see Microsoft's blog post and the [Hugging Face repo](https://huggingface.co/microsoft/phi-2).