Phixtral

Phixtral is a Mixture of Experts (MoE) architecture inspired by Mixtral, but made by combining fine-tuned versions of Phi-2.¹ ²
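To make the MoE idea concrete, here is a minimal sketch of a sparse MoE feed-forward block written with MLX. It is not the code in phixtral.py: the class name, layer sizes, and the dense evaluation of every expert are illustrative simplifications of how a gating layer routes each token to its top-k experts.

import mlx.core as mx
import mlx.nn as nn


class ToyMoEBlock(nn.Module):
    def __init__(self, dim, hidden_dim, num_experts=4, num_experts_per_tok=2):
        super().__init__()
        self.num_experts_per_tok = num_experts_per_tok
        # Router that produces one score per expert for every token.
        self.gate = nn.Linear(dim, num_experts, bias=False)
        # Each "expert" is its own MLP (in Phixtral these come from
        # different fine-tuned Phi-2 models).
        self.experts = [
            nn.Sequential(nn.Linear(dim, hidden_dim), nn.GELU(), nn.Linear(hidden_dim, dim))
            for _ in range(num_experts)
        ]

    def __call__(self, x):
        # x: (num_tokens, dim)
        scores = self.gate(x)  # (num_tokens, num_experts)
        k = self.num_experts_per_tok
        # Keep only the top-k scores per token; mask the rest to -inf so
        # their softmax weight becomes zero.
        kth_best = mx.sort(scores, axis=-1)[:, -k][:, None]
        weights = mx.softmax(mx.where(scores >= kth_best, scores, -mx.inf), axis=-1)
        # For clarity every expert is evaluated densely and then weighted;
        # a real implementation only runs the selected experts per token.
        outputs = mx.stack([expert(x) for expert in self.experts], axis=-1)
        return (outputs * weights[:, None, :]).sum(axis=-1)


if __name__ == "__main__":
    block = ToyMoEBlock(dim=64, hidden_dim=256)
    y = block(mx.random.normal((8, 64)))
    print(y.shape)  # (8, 64)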

Setup

Install the dependencies:

pip install -r requirements.txt

Run

python generate.py \
  --model mlabonne/phixtral-4x2_8 \
  --prompt "write a quick sort in Python"

Run python generate.py --help to see all the options.
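
For longer or more varied completions you can also pass sampling options. The flag names below (--max-tokens and --temp) are assumptions based on the other LLM examples in this repo, so confirm them with --help:

python generate.py \
  --model mlabonne/phixtral-4x2_8 \
  --prompt "write a quick sort in Python" \
  --max-tokens 256 \
  --temp 0.7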


  1. For more details on Phixtral, see the Hugging Face repo.

  2. For more details on Phi-2, see Microsoft's blog post and the Hugging Face repo.