.. _nn_functions:

.. currentmodule:: mlx.nn

Functions
---------

Layers without parameters (e.g. activation functions) are also provided as
simple functions.

.. autosummary::
   :toctree: _autosummary_functions
   :template: nn-module-template.rst

   elu
   celu
   gelu
   gelu_approx
   gelu_fast_approx
   glu
   hard_shrink
   hard_tanh
   hardswish
   leaky_relu
   log_sigmoid
   log_softmax
   mish
   prelu
   relu
   relu2
   relu6
   selu
   sigmoid
   silu
   softmax
   softmin
   softplus
   softshrink
   step
   tanh
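The functional forms operate directly on arrays, while the corresponding
parameter-free layer classes wrap the same computation as modules. A minimal
sketch of the two forms (it assumes ``mlx.core`` and the ``nn.ReLU`` module
class, which are documented outside this page):

.. code-block:: python

   import mlx.core as mx
   import mlx.nn as nn

   x = mx.array([-2.0, -0.5, 0.0, 1.5])

   # Functional form: apply the activation directly to an array.
   y = nn.relu(x)        # equivalent to mx.maximum(x, 0)

   # Module form: the same activation as a parameter-free layer,
   # convenient inside containers such as nn.Sequential.
   act = nn.ReLU()
   y_module = act(x)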