mlx/python/mlx/nn/layers
Latest commit 022a944367 by Hazem Essam, 2024-01-08 06:13:16 -08:00
Added GLU activation function and Gated activation function (#329)

* Added GLU activation function and gated activation function
* Ran pre-commit
* Removed old sigmoid implementation to match main
* Removed gated activation from __init__.py
* Removed unused test cases
* Removed unused imports
* Format / docstring

Co-authored-by: Awni Hannun <awni@apple.com>
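For context, GLU splits its input into two equal halves along a given axis and gates the first half with the sigmoid of the second: glu(x) = a * sigmoid(b). Below is a minimal sketch of that computation using standard mlx.core ops (mx.split, mx.sigmoid); it illustrates the technique and is not the exact code merged in #329.

```python
import mlx.core as mx

def glu(x: mx.array, axis: int = -1) -> mx.array:
    # Split the input into two equal halves along `axis` and gate
    # the first half with the sigmoid of the second: a * sigmoid(b).
    a, b = mx.split(x, 2, axis=axis)
    return a * mx.sigmoid(b)

x = mx.random.normal((4, 8))
y = glu(x)  # the gated axis is halved: shape (4, 4)
```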
File                    Last commit                                                                  Date
__init__.py             Added GLU activation function and Gated activation function (#329)           2024-01-08 06:13:16 -08:00
activations.py          Added GLU activation function and Gated activation function (#329)           2024-01-08 06:13:16 -08:00
base.py                 revert copy (#366)                                                           2024-01-04 10:43:29 -08:00
containers.py           copyright + ack                                                              2023-11-30 11:12:53 -08:00
convolution.py          Updated default argument for stride to 1 in Conv2d() in the docstring (#22)  2023-12-06 07:17:58 -08:00
dropout.py              feat: Add Dropout3d layer to nn.layers (#313)                                2023-12-31 14:01:21 -08:00
embedding.py            copyright + ack                                                              2023-11-30 11:12:53 -08:00
linear.py               Fix the implementation of the Bilinear layer (#347)                          2024-01-02 16:46:18 -08:00
normalization.py        implemented InstanceNorm (#244)                                              2024-01-03 12:21:15 -08:00
positional_encoding.py  Fix style check (#395)                                                       2024-01-07 05:54:58 -08:00
quantized.py            Support for quantized matmul with w and w^T (#349)                           2024-01-03 14:22:36 -08:00
transformer.py          Transformer fix (#167)                                                       2023-12-27 08:48:36 -08:00
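The modules in this directory are consumed through the top-level mlx.nn namespace rather than imported directly. A brief usage sketch under that assumption, combining Sequential (containers.py), Linear (linear.py), and the GLU class added in #329 (activations.py):

```python
import mlx.core as mx
import mlx.nn as nn

# Layers defined in this directory are used via the mlx.nn namespace.
model = nn.Sequential(
    nn.Linear(8, 4),   # linear.py
    nn.GLU(axis=-1),   # activations.py (added in #329)
)
y = model(mx.random.normal((2, 8)))  # GLU halves the last axis: shape (2, 2)
```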