Vaibhav Srivastav | 0eaa323c10 | Fix conversion + inference errors. - Mistral (#176) | 2023-12-22 14:10:25 -08:00
* Fix conversion + inference errors.
* wire rope_theta through to nn.RoPE
Co-authored-by: Awni Hannun <awni@apple.com>

Awni Hannun | 3cf436b529 | Quantize example (#162) | 2023-12-21 12:59:37 -08:00
* testing quantization
* conversion + quantization working
* one config processor
* quantization in mistral / nits in llama
* args for quantization
* llama / mistral conversion in good shape
* phi2 quantized
* mixtral
* qwen conversion

Pedro Cuenca | ce30cc3d8f | Use config.json in llama (#159) | 2023-12-20 10:34:44 -08:00
* Use config.json in llama
* Fix pop
* Fix convert
* Typo

Awni Hannun | 27c0a8c002 | Add llms subdir + update README (#145) | 2023-12-20 10:22:25 -08:00
* add llms subdir + update README
* nits
* use same pre-commit as mlx
* update readmes a bit
* format