Nripesh Niketan | 5ad8fb7268 | 2023-12-29 11:49:36 -08:00
feat: add softsign, softmax, hardswish, logsoftmax activation function (#309)
* feat: add softsign activation function
* run pre-commit
* Add Softsign activation function
* Add Softsign activation function
* Add documentation for ReLU6, Softplus, and Softsign activations
* Update activation functions in neural network layers
* Add LogSoftmax and Hardswish activations
* run pre-commit
* Update activations.py
* Added acknowledgements
* Fix activation function comments
* Fix activation functions in neural network layers

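The PR above adds several elementwise activations. As a reference for what those names mean, here is a minimal pure-Python sketch of the standard formulas (the actual MLX implementations operate on arrays, not scalars):

```python
import math

def softsign(x: float) -> float:
    # Softsign: x / (1 + |x|), a smooth, bounded alternative to tanh.
    return x / (1 + abs(x))

def hardswish(x: float) -> float:
    # Hardswish: x * relu6(x + 3) / 6, a cheap piecewise approximation of SiLU/Swish.
    return x * min(max(x + 3.0, 0.0), 6.0) / 6

def log_softmax(xs: list[float]) -> list[float]:
    # LogSoftmax: x_i - logsumexp(x), using the max trick for numerical stability.
    m = max(xs)
    lse = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - lse for x in xs]
```

Computing `log(softmax(x))` via `logsumexp` like this avoids the overflow/underflow that naive exponentiation would cause for large-magnitude inputs.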
Diogo | a83d5d60bd | 2023-12-27 13:46:47 -08:00
Addition in acknowledgements (#302)

Justin Deschenaux | e8deca84e0 | 2023-12-22 08:02:29 -08:00
Add dropout2d (#250)

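Dropout2d differs from plain dropout in that it zeroes entire feature maps rather than individual elements. A hypothetical scalar-Python sketch of the idea (MLX's real layer works on arrays and respects a training flag):

```python
import random

def dropout2d(x, p=0.5):
    # x: nested list with shape (channels, height, width).
    # Zero out whole channels with probability p and scale survivors by
    # 1/(1-p) so the expected activation is unchanged (inverted dropout).
    keep = 1.0 - p
    out = []
    for channel in x:
        if random.random() < p:
            out.append([[0.0 for _ in row] for row in channel])
        else:
            out.append([[v / keep for v in row] for row in channel])
    return out
```

Dropping whole channels is the useful regularizer for convolutional features, where adjacent pixels within a channel are strongly correlated and elementwise dropout would do little.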
Justin Deschenaux | 4912ff3ec2 | 2023-12-20 13:54:58 -08:00
Add Lion optimizer (#209)
* Add Lion optimizer
* Update acknowledgements also with past contributions

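Lion (Chen et al., 2023) keeps a single momentum buffer and applies only the sign of the update direction. A hypothetical single-scalar sketch of one step (MLX's actual optimizer API differs; `lion_update` and its parameter names are made up for illustration):

```python
def lion_update(param, grad, momentum, lr=1e-4, betas=(0.9, 0.99), weight_decay=0.0):
    # One Lion step for a single scalar parameter.
    b1, b2 = betas
    # Update direction: sign of an interpolation between momentum and gradient.
    c = b1 * momentum + (1 - b1) * grad
    sign = (c > 0) - (c < 0)
    # Decoupled weight decay, as in AdamW.
    new_param = param - lr * (sign + weight_decay * param)
    # The momentum buffer tracks the gradient with a slower EMA.
    new_momentum = b2 * momentum + (1 - b2) * grad
    return new_param, new_momentum
```

Because the update magnitude is always `lr` (plus weight decay), Lion typically wants a smaller learning rate and larger weight decay than Adam.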
Juarez Bochi | f4f6e17d45 | 2023-12-18 12:27:27 -08:00
Fix cross-attention (#210)
* Fix cross-attention
  With the current code, ln2 is a no-op. Its output should be passed to the cross-attention layer.
* Add name to contributors

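The bug described above is easy to make in a pre-norm decoder layer: the layer norm is computed but the un-normalized tensor is passed on, so the norm becomes dead code. A hypothetical sketch of the corrected data flow (function and argument names are illustrative, not MLX's):

```python
def decoder_layer(x, memory, self_attn, cross_attn, mlp, ln1, ln2, ln3):
    # Pre-norm transformer decoder block: normalize, transform, add residual.
    y = ln1(x)
    x = x + self_attn(y, y, y)
    y = ln2(x)
    # The bug was cross_attn(x, memory, memory): ln2's output y was discarded.
    x = x + cross_attn(y, memory, memory)
    y = ln3(x)
    return x + mlp(y)
```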
Awni Hannun | 477397bc98 | 2023-12-18 10:07:00 -08:00
Citation + Contributor acknowledgment section (#207)
* cite
* nits
* nits
* comment

Jagrit Digani | 266c4e3df6 | 2023-11-30 11:33:23 -08:00
Add metal cpp license to acknowledgements (#3)

Awni Hannun | 46a39e5b1f | 2023-11-30 11:12:53 -08:00
copyright + ack