Functions

Layers without parameters (e.g. activation functions) are also provided as simple functions.
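For example, assuming these functions are exposed under the `mlx.nn` namespace and operate on `mlx.core` arrays, an activation can be applied directly to an array with no layer object or parameters involved (a minimal sketch):

```python
import mlx.core as mx
import mlx.nn as nn

x = mx.array([-2.0, -0.5, 0.0, 0.5, 2.0])

# Call the activation directly; no Module instance is constructed.
y = nn.relu(x)   # [0.0, 0.0, 0.0, 0.5, 2.0]
print(y)
```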

gelu(x)

Applies the Gaussian Error Linear Units function.

gelu_approx(x)

An approximation to the Gaussian Error Linear Unit.

gelu_fast_approx(x)

A fast approximation to the Gaussian Error Linear Unit.
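The relationship between the exact GELU and its two approximations can be checked with a short sketch (again assuming the `mlx.nn` functional API); the approximations trade a small amount of accuracy for speed:

```python
import mlx.core as mx
import mlx.nn as nn

x = mx.linspace(-3, 3, num=7)

exact = nn.gelu(x)               # exact GELU
approx = nn.gelu_approx(x)       # approximate variant
fast = nn.gelu_fast_approx(x)    # faster, coarser approximate variant

# Maximum absolute deviation of each approximation from the exact form.
print(mx.max(mx.abs(exact - approx)))
print(mx.max(mx.abs(exact - fast)))
```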

relu(x)

Applies the Rectified Linear Unit.

prelu(x, alpha)

Applies the element-wise parametric ReLU.
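Unlike the other functions here, `prelu` takes its negative-region slope as an explicit argument, so the learnable parameter stays outside the function. A minimal sketch, assuming the `mlx.nn` functional API:

```python
import mlx.core as mx
import mlx.nn as nn

x = mx.array([-2.0, -1.0, 0.0, 1.0, 2.0])

# alpha is the slope used for negative inputs; as a function argument it is
# just an array, whereas the PReLU layer would hold it as a learnable parameter.
alpha = mx.array([0.25])
y = nn.prelu(x, alpha)   # max(0, x) + alpha * min(0, x)
print(y)                 # [-0.5, -0.25, 0.0, 1.0, 2.0]
```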

silu(x)

Applies the Sigmoid Linear Unit.
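The Sigmoid Linear Unit (also known as swish) is simply the input scaled by its own sigmoid; the sketch below (assuming `mlx.nn` / `mlx.core`) compares that definition against the built-in:

```python
import mlx.core as mx
import mlx.nn as nn

x = mx.array([-2.0, 0.0, 2.0])

# silu(x) = x * sigmoid(x)
manual = x * mx.sigmoid(x)
print(nn.silu(x), manual)   # the two should match element-wise
```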

step(x[, threshold])

Applies the step activation function.
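The step function maps inputs below the threshold to 0 and the rest to 1, with the threshold defaulting to 0. A small sketch, assuming the `mlx.nn` functional API:

```python
import mlx.core as mx
import mlx.nn as nn

x = mx.array([-1.0, 0.2, 0.7, 1.5])

print(nn.step(x))                  # default threshold 0.0 -> [0, 1, 1, 1]
print(nn.step(x, threshold=0.5))   # -> [0, 0, 1, 1]
```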

selu(x)

Applies the Scaled Exponential Linear Unit.
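SELU rescales an exponential linear unit by fixed constants (roughly λ ≈ 1.0507 and α ≈ 1.6733) chosen so that activations self-normalize. A sketch reconstructing it from those rounded constants, assuming `mlx.nn` / `mlx.core`:

```python
import mlx.core as mx
import mlx.nn as nn

x = mx.array([-1.0, 0.0, 1.0])

# selu(x) = lambda * (x if x > 0 else alpha * (exp(x) - 1))
lam, alpha = 1.0507, 1.67326
manual = lam * mx.where(x > 0, x, alpha * (mx.exp(x) - 1))
print(nn.selu(x), manual)   # should agree up to the rounded constants
```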

mish(x)

Applies the Mish function, element-wise.
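Mish is defined as x · tanh(softplus(x)); since softplus(x) = log(1 + eˣ), it can be reconstructed with `logaddexp` and compared against the built-in (a sketch assuming `mlx.nn` / `mlx.core`):

```python
import mlx.core as mx
import mlx.nn as nn

x = mx.array([-2.0, -0.5, 0.0, 0.5, 2.0])

# mish(x) = x * tanh(softplus(x)), with softplus(x) = log(1 + exp(x))
softplus = mx.logaddexp(x, 0.0)
manual = x * mx.tanh(softplus)
print(nn.mish(x), manual)   # the two should match element-wise
```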