Functions
Layers without parameters (e.g. activation functions) are also provided as simple functions; see the usage sketch after the table below.
elu(x, alpha=1.0) | Applies the Exponential Linear Unit.
celu(x, alpha=1.0) | Applies the Continuously Differentiable Exponential Linear Unit.
gelu(x) -> mlx.core.array | Applies the Gaussian Error Linear Units function.
gelu_approx(x) | An approximation to Gaussian Error Linear Unit.
gelu_fast_approx(x) | A fast approximation to Gaussian Error Linear Unit.
glu(x, axis=-1) | Applies the gated linear unit function.
hard_shrink(x, lambd=0.5) | Applies the HardShrink activation function.
hard_tanh(x, min_val=-1.0, max_val=1.0) | Applies the HardTanh function.
hardswish(x) | Applies the hardswish function, element-wise.
leaky_relu(x, negative_slope=0.01) | Applies the Leaky Rectified Linear Unit.
log_sigmoid(x) | Applies the Log Sigmoid function.
log_softmax(x, axis=-1) | Applies the Log Softmax function.
mish(x: mlx.core.array) -> mlx.core.array | Applies the Mish function, element-wise.
prelu(x: mlx.core.array, alpha: mlx.core.array) -> mlx.core.array | Applies the element-wise parametric ReLU.
relu(x) | Applies the Rectified Linear Unit.
relu6(x) | Applies the Rectified Linear Unit 6.
selu(x) | Applies the Scaled Exponential Linear Unit.
sigmoid(x) | Applies the element-wise logistic sigmoid.
silu(x) | Applies the Sigmoid Linear Unit.
softmax(x, axis=-1) | Applies the Softmax function.
softmin(x, axis=-1) | Applies the Softmin function.
softplus(x) | Applies the Softplus function.
softshrink(x, lambd: float = 0.5) | Applies the Softshrink activation function.
step(x, threshold: float = 0.0) | Applies the Step Activation Function.
tanh(x) | Applies the hyperbolic tangent function.