mlx.nn.LayerNorm
- class mlx.nn.LayerNorm(dims: int, eps: float = 1e-05, affine: bool = True)
Applies layer normalization [1] to the inputs.
Computes
\[y = \frac{x - E[x]}{\sqrt{Var[x] + \epsilon}} \gamma + \beta,\]
where \(\gamma\) and \(\beta\) are learned per-feature-dimension parameters initialized at 1 and 0 respectively, and \(\epsilon\) is a small constant added for numerical stability.
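The formula above can be sketched in plain NumPy to show what the layer computes over the last (feature) axis; this is an illustrative re-implementation under that assumption, not the MLX source, and the `layer_norm` helper name is invented for this example.

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Normalize over the last (feature) dimension.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    # y = (x - E[x]) / sqrt(Var[x] + eps) * gamma + beta
    return (x - mean) / np.sqrt(var + eps) * gamma + beta

x = np.array([[1.0, 2.0, 3.0]])
dims = x.shape[-1]
gamma = np.ones(dims)   # learned scale, initialized at 1
beta = np.zeros(dims)   # learned shift, initialized at 0
y = layer_norm(x, gamma, beta)
```

After normalization each row of `y` has (approximately) zero mean and unit variance; setting `affine=False` in `mlx.nn.LayerNorm` corresponds to dropping the `gamma`/`beta` terms entirely.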