mirror of
https://github.com/ml-explore/mlx.git
synced 2025-09-01 04:24:36 +08:00
Add Step, ELU, SELU, Swish activation functions (#117)
* Add Step, ELU, SELU, Swish activation functions: this commit adds the Step, ELU, SELU and Swish activation functions
* add to the docs
* review
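For reference, a minimal pure-Python sketch of the four activations named in the commit, using their standard textbook definitions. These are illustrative formulas only, not MLX's implementation, and defaults such as the Step threshold are assumptions here:

```python
import math

def step(x, threshold=0.0):
    """Binary step: 1 if x > threshold else 0 (not differentiable at the threshold)."""
    return 1.0 if x > threshold else 0.0

def elu(x, alpha=1.0):
    """Exponential Linear Unit: x for x > 0, alpha * (exp(x) - 1) otherwise."""
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x):
    """Scaled ELU with the fixed self-normalizing constants from Klambauer et al."""
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    return scale * elu(x, alpha)

def swish(x):
    """Swish / SiLU: x * sigmoid(x)."""
    return x / (1.0 + math.exp(-x))

print(step(0.5), step(-0.5))  # 1.0 0.0
```

Note that Swish with no learned scaling is the same function as SiLU, which is why the docs lists below already contain SiLU/silu entries.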
committed by GitHub
parent b9226c367c
commit f5df47ec6e
@@ -97,7 +97,7 @@ Updating the parameters

 MLX modules allow accessing and updating individual parameters. However, most
 times we need to update large subsets of a module's parameters. This action is
 performed by :meth:`Module.update`.

 Value and grad
 --------------
@@ -148,6 +148,8 @@ Neural Network Layers

    ReLU
    GELU
    SiLU
+   Step
+   SELU
    Linear
    Conv1d
    Conv2d
@@ -170,6 +172,8 @@ simple functions.

    gelu_fast_approx
    relu
    silu
+   step
+   selu

 Loss Functions
 --------------
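The hunks above add both capitalized entries (Step, SELU) under "Neural Network Layers" and lowercase entries (step, selu) under the simple-functions list. A minimal sketch of that dual layer-vs-function pattern, where the callable module just stores configuration and delegates to the stateless function. The names here are illustrative stand-ins, not MLX's actual classes:

```python
def step(x, threshold=0.0):
    # Functional form: the lowercase ``step`` entry in the docs.
    return 1.0 if x > threshold else 0.0

class Step:
    # Module form: the capitalized ``Step`` entry in the docs.
    # Stores the threshold once, then applies the function when called.
    def __init__(self, threshold=0.0):
        self.threshold = threshold

    def __call__(self, x):
        return step(x, self.threshold)

layer = Step(threshold=0.5)
print(layer(0.4), layer(0.6))  # 0.0 1.0
```

Keeping both forms lets users drop the function into quick expressions while the module version composes with other layers in a model definition.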