# Layers

Usage sketches for a few of these layers follow the table.

| Class | Description |
|---|---|
| `BatchNorm` | Applies Batch Normalization over a 2D or 3D input. |
| `Conv1d` | Applies a 1-dimensional convolution over the multi-channel input sequence. |
| `Conv2d` | Applies a 2-dimensional convolution over the multi-channel input image. |
| `Dropout` | Randomly zeros a portion of the elements during training. |
| `Dropout2d` | Applies 2D channel-wise dropout during training. |
| `Dropout3d` | Applies 3D channel-wise dropout during training. |
| `Embedding` | Implements a simple lookup table that maps each input integer to a high-dimensional vector. |
| `GELU` | Applies the Gaussian Error Linear Units function. |
| `GroupNorm` | Applies Group Normalization [1] to the inputs. |
| `InstanceNorm` | Applies instance normalization [1] on the inputs. |
| `LayerNorm` | Applies layer normalization [1] on the inputs. |
| `Linear` | Applies an affine transformation to the input. |
| `Mish` | Applies the Mish function, element-wise. |
| `MultiHeadAttention` | Implements the scaled dot product attention with multiple heads. |
| `PReLU` | Applies the element-wise parametric ReLU. |
| `QuantizedLinear` | Applies an affine transformation to the input using a quantized weight matrix. |
| `RMSNorm` | Applies Root Mean Square normalization [1] to the inputs. |
| `ReLU` | Applies the Rectified Linear Unit. |
| `RoPE` | Implements the rotary positional encoding. |
| `SELU` | Applies the Scaled Exponential Linear Unit. |
| `Sequential` | A layer that calls the passed callables in order. |
| `SiLU` | Applies the Sigmoid Linear Unit. |
| `SinusoidalPositionalEncoding` | Implements sinusoidal positional encoding. |
| `Step` | Applies the step activation function. |
| `Transformer` | Implements a standard Transformer model. |
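Several of these layers compose in the usual way. Below is a minimal sketch, assuming this index documents the `mlx.nn` module from Apple's MLX (the descriptions match MLX's layer documentation); the layer sizes and dropout probability are arbitrary illustration values.

```python
import mlx.core as mx
import mlx.nn as nn

# A small MLP built from layers listed above. Sequential calls each
# module in order; Dropout only zeros elements while training.
model = nn.Sequential(
    nn.Linear(784, 256),   # affine transformation
    nn.ReLU(),             # rectified linear unit
    nn.Dropout(0.2),       # randomly zero 20% of elements during training
    nn.Linear(256, 10),
)

x = mx.random.normal((32, 784))  # batch of 32 inputs
logits = model(x)                # shape: (32, 10)
```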
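Similarly, a sketch of `MultiHeadAttention` used for self-attention, under the same MLX assumption; `dims` must be divisible by `num_heads`, and the shapes here are illustrative.

```python
import mlx.core as mx
import mlx.nn as nn

attn = nn.MultiHeadAttention(dims=64, num_heads=4)

x = mx.random.normal((2, 10, 64))  # (batch, sequence, feature) inputs

# Self-attention: queries, keys, and values are all the same tensor.
# The additive causal mask keeps each position from attending ahead.
mask = nn.MultiHeadAttention.create_additive_causal_mask(10)
y = attn(x, x, x, mask=mask)       # shape: (2, 10, 64)
```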
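The embedding and positional-encoding entries combine as follows, again as a sketch under the MLX assumption; the vocabulary size of 100 and the token ids are hypothetical.

```python
import mlx.core as mx
import mlx.nn as nn

emb = nn.Embedding(num_embeddings=100, dims=32)  # token id -> 32-d vector
rope = nn.RoPE(dims=32)                          # rotary positional encoding

tokens = mx.array([[3, 14, 15, 92]])  # (batch=1, sequence=4) integer ids
h = rope(emb(tokens))                 # (1, 4, 32), rotated by position
```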