Activations LeakyReLU / PReLU / Softplus / Mish (#109)

* Add leaky_relu / prelu / softplus / mish activations (see the sketch below)

* added tests

* updated bench

* remove torch refs, add init to PReLU

* added arXiv reference to mish

* added missing docs
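
For context, a minimal sketch of the activations added here, assuming the mlx.core / mlx.nn array API. Function names, signatures, and the 0.25 slope init are illustrative defaults, not necessarily what this commit merged.

# Sketch only: names and defaults are assumptions, not the committed code.
import mlx.core as mx
import mlx.nn as nn


def leaky_relu(x, negative_slope=0.01):
    # Pass positives through unchanged, scale negatives by `negative_slope`.
    return mx.maximum(negative_slope * x, x)


def softplus(x):
    # log(1 + exp(x)), written as logaddexp(x, 0) for numerical stability.
    return mx.logaddexp(x, 0)


def mish(x):
    # Mish(x) = x * tanh(softplus(x)); see arXiv:1908.08681.
    return x * mx.tanh(softplus(x))


class PReLU(nn.Module):
    # PReLU with a learnable negative slope (scalar or per-channel),
    # initialized via `init` per the "add init to PReLU" note above.
    def __init__(self, num_parameters: int = 1, init: float = 0.25):
        super().__init__()
        self.weight = mx.full([num_parameters], init)

    def __call__(self, x):
        return mx.maximum(0, x) + self.weight * mx.minimum(0, x)


x = mx.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x), mish(x), PReLU()(x))

Unlike the fixed-slope leaky_relu, PReLU keeps its slope as a module parameter so it can be trained along with the rest of the network.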
Author: Diogo
Date: 2023-12-11 22:40:57 -05:00
Committed by: GitHub
Parent: f5df47ec6e
Commit: 02de234ef0
8 changed files with 133 additions and 31 deletions


@@ -146,10 +146,12 @@ Neural Network Layers
 Embedding
 ReLU
+PReLU
 GELU
 SiLU
 Step
 SELU
+Mish
 Linear
 Conv1d
 Conv2d
@@ -171,9 +173,11 @@ simple functions.
 gelu_approx
 gelu_fast_approx
 relu
+prelu
 silu
 step
 selu
+mish
 
 Loss Functions
 --------------