Individual Contributors

If you wish to be acknowledged for your contributions, please list your name with a short description of your contribution(s) below. For example:

  • Jane Smith: Added the foo example.

MLX Examples was developed with contributions from the following individuals:

  • Juarez Bochi: Added support for T5 models.
  • Sarthak Yadav: Added the cifar and speechcommands examples.
  • Shunta Saito: Added support for PLaMo models.
  • Gabrijel Boduljak: Implemented CLIP.
  • Markus Enzweiler: Added the cvae examples.
  • Prince Canuma: Helped add support for Starcoder2 models.
  • Shiyu Li: Added the Segment Anything Model.
  • Gökdeniz Gülmez: Added support for MiniCPM and Mamba.