# Segment Anything

An implementation of the Segment Anything Model (SAM) in MLX. See the original
repo by Meta AI for more details.[^1]

## Installation

```bash
pip install -r requirements.txt
```

## Convert

```bash
python convert.py --hf-path facebook/sam-vit-base --mlx-path sam-vit-base
```

The `safetensors` weight file and configs are downloaded from Hugging Face,
converted, and saved in the directory specified by `--mlx-path`.

The model sizes are:

- `facebook/sam-vit-base`
- `facebook/sam-vit-large`
- `facebook/sam-vit-huge`
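
Any of these can be passed to `--hf-path`. For example, to convert and save
the large variant:

```bash
python convert.py --hf-path facebook/sam-vit-large --mlx-path sam-vit-large
```
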
## Run

See the example notebooks `notebooks/predictor_example.ipynb` and
`notebooks/automatic_mask_generator_example.ipynb` to try the Segment Anything
Model with MLX.
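
The predictor workflow embeds an image once and then produces masks from point
or box prompts. Below is a minimal sketch assuming this port mirrors the
original Meta AI `SamPredictor` interface; the class, registry, and
checkpoint-path names are assumptions, and the notebooks show the exact MLX
entry points:

```python
import numpy as np
from PIL import Image

# Assumed imports mirroring the original Meta AI API; see
# notebooks/predictor_example.ipynb for the exact MLX entry points.
from segment_anything import SamPredictor, sam_model_registry

# Load the converted weights ("sam-vit-base" is the --mlx-path
# directory produced by the convert step above).
sam = sam_model_registry["vit_b"](checkpoint="sam-vit-base")
predictor = SamPredictor(sam)

# Embed the image once; the embedding is reused across prompts.
image = np.array(Image.open("example.jpg").convert("RGB"))
predictor.set_image(image)

# Predict a mask from a single foreground point (label 1 = foreground).
masks, scores, logits = predictor.predict(
    point_coords=np.array([[500, 375]]),
    point_labels=np.array([1]),
)
```
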
You can also generate masks from the command line:
```bash
python main.py --model <path/to/model> --input <image_or_folder> --output <path/to/output>
```
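
For fully automatic mask generation (the same task `main.py` performs), the
original repo exposes a `SamAutomaticMaskGenerator`. A minimal sketch, again
assuming this port keeps that interface:

```python
import numpy as np
from PIL import Image

# Assumed names mirroring the original Meta AI interface; see
# notebooks/automatic_mask_generator_example.ipynb for the exact MLX API.
from segment_anything import SamAutomaticMaskGenerator, sam_model_registry

sam = sam_model_registry["vit_b"](checkpoint="sam-vit-base")
mask_generator = SamAutomaticMaskGenerator(sam)

image = np.array(Image.open("example.jpg").convert("RGB"))
masks = mask_generator.generate(image)

# In the original API each entry is a dict with a boolean "segmentation"
# mask and metadata such as "area" and "bbox".
print(len(masks), masks[0]["area"])
```
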
[^1]: The original Segment Anything [GitHub repo](https://github.com/facebookresearch/segment-anything/tree/main).