Individual Contributors

If you wish to be acknowledged for your contributions, please list your name with a short description of your contribution(s) below. For example:

  • Jane Smith: Added the foo example.

MLX Examples was developed with contributions from the following individuals:

  • Juarez Bochi: Added support for T5 models.
  • Sarthak Yadav: Added the cifar and speechcommands examples.
  • Shunta Saito: Added support for PLaMo models.
  • Gabrijel Boduljak: Implemented CLIP.
  • Markus Enzweiler: Added the cvae examples.
  • Prince Canuma: Helped add support for Starcoder2 models.
  • Shiyu Li: Added the Segment Anything Model.
  • Gökdeniz Gülmez: Added support for the MiniCPM, Helium, Mamba version 1, and OLMoE architectures, and support for full fine-tuning.