Add Step, ELU, SELU, Swish activation functions (#117)

* Add Step, ELU, SELU, Swish activation functions

This commit adds the Step, ELU, SELU and Swish activation functions

* Add to the docs

* review
Author: Nicholas Santavas (committed via GitHub)
Date: 2023-12-12 02:04:07 +01:00
Parent: b9226c367c
Commit: f5df47ec6e
7 changed files with 132 additions and 2 deletions


@@ -97,7 +97,7 @@ Updating the parameters
MLX modules allow accessing and updating individual parameters. However, most
times we need to update large subsets of a module's parameters. This action is
performed by :meth:`Module.update`.

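Since the passage above only names :meth:`Module.update`, here is a plain-Python sketch of the idea it describes: merging a possibly partial tree of new values into an existing parameter tree. The `update` helper and the parameter layout are purely illustrative, not MLX's actual implementation.

```python
# Conceptual sketch of a bulk parameter update: merge a (possibly
# partial) tree of new values into a module's existing parameter tree.
# Plain Python for illustration only; not MLX's real Module.update.

def update(params, new_params):
    for key, value in new_params.items():
        if isinstance(value, dict) and isinstance(params.get(key), dict):
            update(params[key], value)  # recurse into nested sub-modules
        else:
            params[key] = value         # replace the leaf parameter
    return params

params = {"linear": {"weight": [1.0, 2.0], "bias": [0.0]}}
update(params, {"linear": {"bias": [0.5]}})  # update only a subset
```

The point of the recursion is that untouched leaves (here, `weight`) survive a partial update.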
Value and grad
--------------
@@ -148,6 +148,8 @@ Neural Network Layers
ReLU
GELU
SiLU
Step
SELU
Linear
Conv1d
Conv2d
@@ -170,6 +172,8 @@ simple functions.
gelu_fast_approx
relu
silu
step
selu
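For reference, the activations this commit wires into the docs follow standard formulas. The sketch below is plain Python illustrating those formulas; the default `threshold` and `alpha` parameters are assumptions for illustration, not necessarily MLX's exact signatures.

```python
import math

# Formula sketches for the activations added in this commit. The SELU
# alpha/scale constants are the standard published values; defaults
# here are assumptions, not MLX's exact API.

def step(x, threshold=0.0):
    # Binary step: 1 when the input exceeds the threshold, else 0.
    return 1.0 if x > threshold else 0.0

def elu(x, alpha=1.0):
    # Exponential Linear Unit: identity for positive inputs,
    # a saturating exponential for negative inputs.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x):
    # Scaled ELU with the fixed constants from Klambauer et al. (2017).
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    return scale * elu(x, alpha)

def silu(x):
    # SiLU / Swish: x * sigmoid(x).
    return x / (1.0 + math.exp(-x))
```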
Loss Functions
--------------