mirror of https://github.com/ml-explore/mlx-examples.git, synced 2025-10-31 19:18:09 +08:00
Phixtral
Phixtral is a Mixture of Experts (MoE) architecture inspired by Mixtral, but made by combining fine-tuned versions of Phi-2.[1][2]
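The core MoE idea, combining several expert networks through a learned gate that routes each input to its top-scoring experts, can be sketched as follows. This is a minimal illustration with NumPy and toy linear "experts", not Phixtral's actual implementation; all names (`moe_forward`, `gate_w`, `top_k`) are hypothetical.

```python
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Route input x to the top_k experts and mix their outputs (toy sketch)."""
    logits = x @ gate_w                       # one gate score per expert
    top = np.argsort(logits)[-top_k:]         # indices of the top_k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                  # softmax over the selected experts only
    # weighted sum of the selected experts' outputs
    return sum(w * experts[i](x) for i, w in zip(top, weights))

# Toy example: four "experts" are random linear maps on 8-dim vectors.
rng = np.random.default_rng(0)
d, n_experts = 8, 4
experts = [lambda v, W=rng.normal(size=(d, d)): v @ W for _ in range(n_experts)]
gate_w = rng.normal(size=(d, n_experts))
x = rng.normal(size=d)
y = moe_forward(x, experts, gate_w)
print(y.shape)  # (8,)
```

Only `top_k` experts run per token, which is how MoE models keep inference cost well below that of a dense model with the same total parameter count.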
Setup
Install the dependencies:
pip install -r requirements.txt
Run
python generate.py \
  --model mlabonne/phixtral-4x2_8 \
  --prompt "write a quick sort in Python"
Run python generate.py --help to see all the options.
[1] For more details on Phixtral, see the Hugging Face repo.
[2] For more details on Phi-2, see Microsoft's blog post and the Hugging Face repo.