Fix cross-attention (#210)

* Fix cross-attention

With the current code, ln2 is a no-op: its output is computed but never used. It should be passed to the cross-attention layer as the query input (a sketch of the corrected flow follows the diff below).

* Add name to contributors
Author: Juarez Bochi
Date: 2023-12-18 15:27:27 -05:00
Committed by: GitHub
Parent: 4d4af12c6f
Commit: f4f6e17d45
2 changed files with 2 additions and 1 deletion


@@ -7,6 +7,7 @@ with a short description of your contribution(s) below. For example:
 MLX was developed with contributions from the following individuals:
+- Juarez Bochi: Fixed bug in cross attention.
 # Third-Party Software
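
The hunk above covers only the acknowledgments update; the second changed file holds the actual one-line fix in the decoder layer. Below is a minimal sketch of a pre-norm transformer decoder layer's forward pass, not the verbatim MLX source: the class name `DecoderLayerSketch`, the constructor arguments, and the parameter names in `__call__` are illustrative assumptions. It demonstrates the substance of the fix: the query passed to `cross_attention` must be the output of `ln2`, otherwise `ln2` is dead code.

```python
import mlx.core as mx
import mlx.nn as nn


class DecoderLayerSketch(nn.Module):
    """Simplified pre-norm decoder layer (illustrative, not the actual MLX source)."""

    def __init__(self, dims: int, num_heads: int, mlp_dims: int):
        super().__init__()
        self.self_attention = nn.MultiHeadAttention(dims, num_heads)
        self.cross_attention = nn.MultiHeadAttention(dims, num_heads)
        self.ln1 = nn.LayerNorm(dims)
        self.ln2 = nn.LayerNorm(dims)
        self.ln3 = nn.LayerNorm(dims)
        self.linear1 = nn.Linear(dims, mlp_dims)
        self.linear2 = nn.Linear(mlp_dims, dims)

    def __call__(self, x, memory, x_mask=None, memory_mask=None):
        # Self-attention block
        y = self.ln1(x)
        y = self.self_attention(y, y, y, x_mask)
        x = x + y

        # Cross-attention block. The bug passed the raw residual stream `x`
        # as the query, so the result of `y = self.ln2(x)` was never used.
        y = self.ln2(x)
        y = self.cross_attention(y, memory, memory, memory_mask)  # fixed: query is ln2(x)
        x = x + y

        # Feed-forward block
        y = self.ln3(x)
        y = self.linear2(mx.maximum(self.linear1(y), 0))
        return x + y
```

Note that both the buggy and the fixed versions run without error, which is why the regression is easy to miss: the layer still trains, but the cross-attention query bypasses its normalization.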