Mirror of https://github.com/ml-explore/mlx.git, synced 2025-11-04 02:28:13 +08:00
	Adding Relu2 (#2582)
* in. com.
* upd. ackn.
* update __init__
* nits
* nits + format
* used mx.maximum(x, 0) instead of calling the function and moves relu6 under relu2 to make it nicer
* same with _make_activation_module
* Update python/mlx/nn/layers/activations.py

  upd

  Co-authored-by: Awni Hannun <awni.hannun@gmail.com>

* update funct.rst
* upd. layers.rst

---------

Co-authored-by: Awni Hannun <awni.hannun@gmail.com>
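For context, a minimal sketch of what this commit adds. The docs diffs below confirm a `relu2` function and a `ReLU2` module; the body shown here assumes relu2 is squared ReLU, max(x, 0)^2, written with mx.maximum(x, 0) directly as the commit message describes, so the exact code in activations.py may differ:

    import mlx.core as mx
    import mlx.nn as nn

    # Sketch (assumption): relu2 as squared ReLU, i.e. max(x, 0)^2,
    # using mx.maximum(x, 0) directly rather than calling the relu
    # function, per the commit message.
    def relu2(x: mx.array) -> mx.array:
        return mx.square(mx.maximum(x, 0))

    x = mx.array([-2.0, -0.5, 0.0, 1.5])
    print(relu2(x))  # [0, 0, 0, 2.25]

    # After this change both spellings are documented: the function
    # form under "simple functions" and the module form under Layers
    # (per the message, wrapped via _make_activation_module).
    y = nn.relu2(x)
    layer = nn.ReLU2()
    z = layer(x)

Using mx.maximum(x, 0) inline avoids an extra function-call indirection; per the message this was a review nit rather than a behavior change.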
@@ -27,6 +27,7 @@ simple functions.
    mish
    prelu
    relu
+   relu2
    relu6
    selu
    sigmoid
@@ -50,6 +50,7 @@ Layers
    QuantizedLinear
    RMSNorm
    ReLU
+   ReLU2
    ReLU6
    RNN
    RoPE