Mirror of https://github.com/ml-explore/mlx.git (synced 2025-09-26 15:58:14 +08:00)
Transformer fix (#167)
* add transformer with dropout; fix transformer FFN and layernorm order
* add docstring, activation, norm_first
* run pre-commit
* style nits in docs

Co-authored-by: junwoo-yun <junwoo.yun@bagelcode.com>
Co-authored-by: Awni Hannun <awni@apple.com>
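The commit message mentions adding dropout to the transformer and fixing the layernorm order, with a `norm_first` option (pre-norm vs. post-norm). As a rough, framework-free sketch of what that ordering means for the feed-forward sub-block, assuming inverted dropout and a two-layer ReLU FFN (the names `layer_norm`, `ffn`, and `encoder_ffn_block` and all shapes here are illustrative, not the MLX API):

```python
import numpy as np

rng = np.random.default_rng(0)

def layer_norm(x, eps=1e-5):
    # Normalize over the feature (last) axis.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def dropout(x, p, training=True):
    # Inverted dropout: zero units with probability p,
    # scale the kept units by 1/(1-p) so the expectation is unchanged.
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def ffn(x, w1, w2):
    # Two-layer feed-forward block with ReLU activation.
    return np.maximum(x @ w1, 0.0) @ w2

def encoder_ffn_block(x, w1, w2, p=0.1, norm_first=True):
    # norm_first=True  (pre-norm):  x + Dropout(FFN(LN(x)))
    # norm_first=False (post-norm): LN(x + Dropout(FFN(x)))
    if norm_first:
        return x + dropout(ffn(layer_norm(x), w1, w2), p)
    return layer_norm(x + dropout(ffn(x, w1, w2), p))

d, h = 8, 32                      # model dim, hidden dim (arbitrary)
x = rng.standard_normal((2, 4, d))  # (batch, sequence, features)
w1 = rng.standard_normal((d, h)) * 0.1
w2 = rng.standard_normal((h, d)) * 0.1
y = encoder_ffn_block(x, w1, w2, p=0.1, norm_first=True)
print(y.shape)  # (2, 4, 8)
```

Pre-norm keeps the residual path free of normalization, which tends to train more stably in deep stacks; post-norm is the original "Attention Is All You Need" ordering.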
@@ -9,7 +9,7 @@ Layers
   :toctree: _autosummary
   :template: nn-module-template.rst

   Embedding
   Sequential
   ReLU
   PReLU
   GELU
@@ -17,17 +17,19 @@ Layers
   Step
   SELU
   Mish
   Embedding
   Linear
   QuantizedLinear
   Conv1d
   Conv2d
   BatchNorm
   LayerNorm
   RMSNorm
   GroupNorm
   RoPE
   MultiHeadAttention
   Sequential
   QuantizedLinear
   Dropout
   Dropout2d

   Transformer
   MultiHeadAttention
   ALiBi
   RoPE
   SinusoidalPositionalEncoding