Mirror of https://github.com/ml-explore/mlx-examples.git, synced 2025-06-24 01:17:28 +08:00.
# Segment Anything

An implementation of the Segment Anything Model (SAM) in MLX. See the original repo by Meta AI for more details.[^1]
## Installation

```shell
pip install -r requirements.txt
```
## Convert

```shell
python convert.py --hf-path facebook/sam-vit-base --mlx-path sam-vit-base
```

The `safetensors` weight file and configs are downloaded from Hugging Face, converted, and saved in the directory specified by `--mlx-path`.

The available model sizes are:

- `facebook/sam-vit-base`
- `facebook/sam-vit-large`
- `facebook/sam-vit-huge`
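Conversion scripts like this one typically download the Hugging Face checkpoint, rename its parameter keys to match the MLX module layout, and re-save the weights. A minimal sketch of the key-renaming step (the rules below are illustrative stand-ins, not the actual mapping used by `convert.py`):

```python
# Illustrative key remapping; these rules are hypothetical,
# not the real mapping applied by convert.py.
RULES = [
    ("vision_encoder.", "image_encoder."),
    (".layer_norm1.", ".ln1."),
]

def remap_key(key):
    # Apply each (old, new) substring rule in order.
    for old, new in RULES:
        key = key.replace(old, new)
    return key

weights = {"vision_encoder.blocks.0.layer_norm1.weight": [1.0, 2.0]}
converted = {remap_key(k): v for k, v in weights.items()}
print(list(converted))  # ['image_encoder.blocks.0.ln1.weight']
```

The real script applies a full mapping over every tensor in the checkpoint before writing the converted `safetensors` file.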
## Run

See the example notebooks `notebooks/predictor_example.ipynb` and `notebooks/automatic_mask_generator_example.ipynb` to try the Segment Anything Model with MLX.
You can also generate masks from the command line:

```shell
python main.py --model <path/to/model> --input <image_or_folder> --output <path/to/output>
```
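The automatic mask generator proposes many overlapping candidate masks and prunes duplicates with non-maximum suppression (NMS) over their bounding boxes, keeping the highest-scoring box in each overlapping group. A simplified NumPy sketch of greedy box NMS (a stand-in for illustration, not the exact MLX implementation):

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.7):
    """Greedy NMS on [x1, y1, x2, y2] boxes; returns kept indices, best first."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        # Intersection of box i with all remaining boxes
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        # Drop boxes that overlap box i too much; keep the rest for the next round
        order = rest[iou <= iou_thresh]
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 10, 10], [20, 20, 30, 30]], dtype=float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))  # [0, 2]: the two near-duplicate boxes collapse to one
```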
[^1]: The original Segment Anything GitHub repo.