# Phixtral

Phixtral is a Mixture of Experts (MoE) architecture inspired by Mixtral, but made by combining fine-tuned versions of Phi-2.[^1][^2]
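To make the MoE idea concrete, here is a minimal sketch of top-k expert routing in plain NumPy. It is purely illustrative and is not the Phixtral implementation: the four experts and top-2 routing echo the `phixtral-4x2_8` naming, but every function and variable below is hypothetical.

```python
# Illustrative top-k MoE routing. NOT the Phixtral code; the expert
# count (4) and top-k (2) only mirror the phixtral-4x2_8 naming.
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Route each token to its top-k experts and mix their outputs.

    x:        (tokens, dim) input activations
    experts:  list of callables, one per expert
    gate_w:   (dim, num_experts) router weights
    """
    logits = x @ gate_w                             # (tokens, num_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]   # top-k expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = logits[t, top[t]]
        weights = np.exp(chosen - chosen.max())
        weights /= weights.sum()                    # softmax over chosen experts
        for w, e in zip(weights, top[t]):
            out[t] += w * experts[e](x[t])          # weighted mix of expert outputs
    return out

# Example: four toy "experts" (scaled identity maps) and a random router.
rng = np.random.default_rng(0)
experts = [lambda v, s=s: v * s for s in (1.0, 0.5, 2.0, 1.5)]
x = rng.normal(size=(3, 8))
gate_w = rng.normal(size=(8, 4))
print(moe_forward(x, experts, gate_w).shape)  # (3, 8)
```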
## Setup

Install the dependencies:

```shell
pip install -r requirements.txt
```
## Run

```shell
python generate.py \
  --model mlabonne/phixtral-4x2_8 \
  --prompt "write a quick sort in Python"
```
Run `python generate.py --help` to see all the options.
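If you would rather drive generation from a script than from the shell, a thin wrapper over the same CLI works. This sketch uses only the `--model` and `--prompt` flags shown above and assumes it is run from this example's directory; check `python generate.py --help` before relying on any other option.

```python
# Hypothetical convenience wrapper around the generate.py CLI above.
# Only --model and --prompt are taken from this README.
import subprocess

def generate(prompt, model="mlabonne/phixtral-4x2_8"):
    result = subprocess.run(
        ["python", "generate.py", "--model", model, "--prompt", prompt],
        capture_output=True,
        text=True,
        check=True,  # raise if generate.py exits with an error
    )
    return result.stdout

print(generate("write a quick sort in Python"))
```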
[^1]: For more details on Phixtral, see the [Hugging Face repo](https://huggingface.co/mlabonne/phixtral-4x2_8).

[^2]: For more details on Phi-2, see Microsoft's [blog post](https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/) and the [Hugging Face repo](https://huggingface.co/microsoft/phi-2).