Mirror of https://github.com/ml-explore/mlx.git, synced 2025-09-01 12:49:44 +08:00
feat: add softsign, softmax, hardswish, logsoftmax activation functions (#309)
* feat: add softsign activation function
* run pre-commit
* Add Softsign activation function
* Add Softsign activation function
* Add documentation for ReLU6, Softplus, and Softsign activations
* Update activation functions in neural network layers
* Add LogSoftmax and Hardswish activations
* run pre-commit
* Update activations.py
* Added acknowledgements
* Fix activation function comments
* Fix activation functions in neural network layers
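For context, below is a minimal sketch of what the four activations compute, written against `mlx.core`. It illustrates the underlying formulas only and is not the code added by this commit (which lands in `mlx.nn`); the standalone function names and the example input are assumptions for demonstration.

```python
# Sketch of the four activations added by this PR, expressed with mlx.core.
# Illustrative only; the commit's versions live in mlx.nn.
import mlx.core as mx


def softsign(x: mx.array) -> mx.array:
    # softsign(x) = x / (1 + |x|): smooth, bounded squashing into (-1, 1).
    return x / (1 + mx.abs(x))


def softmax(x: mx.array, axis: int = -1) -> mx.array:
    # mlx.core already ships a softmax primitive; shown here for completeness.
    return mx.softmax(x, axis=axis)


def hardswish(x: mx.array) -> mx.array:
    # hardswish(x) = x * relu6(x + 3) / 6: a cheap piecewise approximation of swish.
    return x * mx.minimum(mx.maximum(x + 3, 0), 6) / 6


def log_softmax(x: mx.array, axis: int = -1) -> mx.array:
    # log_softmax(x) = x - logsumexp(x): more numerically stable than log(softmax(x)).
    return x - mx.logsumexp(x, axis=axis, keepdims=True)


# Assumed example input, purely for demonstration.
x = mx.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(softsign(x), hardswish(x), log_softmax(x))
```

Computing `log_softmax` via `logsumexp` rather than `log(softmax(x))` avoids taking the log of values that underflow to zero for large negative inputs.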
@@ -6,7 +6,8 @@ with a short description of your contribution(s) below. For example:
 - Jane Smith: Added the `foo` and `bar` ops.
 
 MLX was developed with contributions from the following individuals:
 
+- Nripesh Niketan: Added `softsign`, `softmax`, `hardswish`, `logsoftmax` activation functions.
 - Juarez Bochi: Fixed bug in cross attention.
 - Justin Deschenaux: Sine, Cosine, arange, randint, truncated normal, bernoulli, lion optimizer, Dropout2d, linear and logistic regression python example.
 - Diogo Da Cruz: Added tri, tril, triu and safetensor support