Phixtral
Phixtral is a Mixture of Experts (MoE) architecture inspired by Mixtral, made by combining fine-tuned versions of Phi-2.[1][2]
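The core MoE idea is that a small gating network scores a set of expert sub-models per token, only the top-k experts are evaluated, and their outputs are mixed with the normalized gate weights. The following is a minimal, hypothetical sketch of that routing in plain Python/NumPy; the names (moe_forward, gate_w, num_experts_per_tok) are illustrative and this is not the actual implementation in phixtral.py.

```python
import numpy as np

def moe_forward(x, gate_w, experts, num_experts_per_tok=2):
    """Illustrative top-k MoE routing (not the phixtral.py code).

    x:        (tokens, dim) input activations
    gate_w:   (dim, num_experts) gating weights
    experts:  list of callables, each mapping (dim,) -> (dim,)
    """
    scores = x @ gate_w                                   # (tokens, num_experts)
    topk = np.argsort(scores, axis=-1)[:, -num_experts_per_tok:]
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = topk[t]
        # Softmax over the selected experts' scores only.
        weights = np.exp(scores[t, sel] - scores[t, sel].max())
        weights = weights / weights.sum()
        for w, e in zip(weights, sel):
            out[t] += w * experts[e](x[t])
    return out

# Toy usage: 4 random linear "experts" over an 8-dim hidden state.
dim, n_experts = 8, 4
experts = [lambda v, W=np.random.randn(dim, dim): v @ W for _ in range(n_experts)]
gate_w = np.random.randn(dim, n_experts)
tokens = np.random.randn(5, dim)
print(moe_forward(tokens, gate_w, experts).shape)         # (5, 8)
```

Because only num_experts_per_tok experts run per token, the model's parameter count can grow with the number of experts while the per-token compute stays close to that of a single expert.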
Setup
Install the dependencies:
pip install -r requirements.txt
Run
python generate.py \
--model mlabonne/phixtral-4x2_8 \
--prompt "write a quick sort in Python"
Run python generate.py --help to see all the options.
[1] For more details on Phixtral, see the Hugging Face repo.
[2] For more details on Phi-2, see Microsoft's blog post and the Hugging Face repo.