Add mobile-sam auto-annotation to segmentation datasets docs (#18654)
Signed-off-by: fatih akyon <34196005+fcakyon@users.noreply.github.com>
Signed-off-by: Muhammad Rizwan Munawar <muhammadrizwanmunawar123@gmail.com>
Co-authored-by: UltralyticsAssistant <web@ultralytics.com>
Co-authored-by: fatih akyon <34196005+fcakyon@users.noreply.github.com>
parent 590b9ad655
commit 9045d8fecc
2 changed files with 16 additions and 22 deletions
@@ -140,15 +140,9 @@ To auto-annotate your dataset using the Ultralytics framework, you can use the `
 auto_annotate(data="path/to/images", det_model="yolo11x.pt", sam_model="sam_b.pt")
 ```
 
-| Argument     | Type                    | Description                                                                                                  | Default        |
-| ------------ | ----------------------- | ------------------------------------------------------------------------------------------------------------ | -------------- |
-| `data`       | `str`                   | Path to a folder containing images to be annotated.                                                         | `None`         |
-| `det_model`  | `str, optional`         | Pre-trained YOLO detection model. Defaults to `'yolo11x.pt'`.                                                | `'yolo11x.pt'` |
-| `sam_model`  | `str, optional`         | Pre-trained SAM segmentation model. Defaults to `'sam_b.pt'`.                                                | `'sam_b.pt'`   |
-| `device`     | `str, optional`         | Device to run the models on. Defaults to an empty string (CPU or GPU, if available).                        | `''`           |
-| `output_dir` | `str or None, optional` | Directory to save the annotated results. Defaults to a `'labels'` folder in the same directory as `'data'`. | `None`         |
+{% include "macros/sam-auto-annotate.md" %}
 
-The `auto_annotate` function takes the path to your images, along with optional arguments for specifying the pre-trained detection and [SAM segmentation models](../../models/sam.md), the device to run the models on, and the output directory for saving the annotated results.
+The `auto_annotate` function takes the path to your images, along with optional arguments for specifying the pre-trained detection models, i.e. [YOLO11](../../models/yolo11.md), [YOLOv8](../../models/yolov8.md) or other [models](../../models/index.md), and the segmentation models, i.e. [SAM](../../models/sam.md), [SAM2](../../models/sam-2.md) or [MobileSAM](../../models/mobile-sam.md), the device to run the models on, and the output directory for saving the annotated results.
 
 By leveraging the power of pre-trained models, auto-annotation can significantly reduce the time and effort required for creating high-quality segmentation datasets. This feature is particularly useful for researchers and developers working with large image collections, as it allows them to focus on model development and evaluation rather than manual annotation.
@@ -195,7 +189,7 @@ Auto-annotation in Ultralytics YOLO allows you to generate segmentation annotati
 ```python
 from ultralytics.data.annotator import auto_annotate
 
-auto_annotate(data="path/to/images", det_model="yolo11x.pt", sam_model="sam_b.pt")
+auto_annotate(data="path/to/images", det_model="yolo11x.pt", sam_model="sam_b.pt")  # or sam_model="mobile_sam.pt"
 ```
 
-This function automates the annotation process, making it faster and more efficient. For more details, explore the [Auto-Annotation](#auto-annotation) section.
+This function automates the annotation process, making it faster and more efficient. For more details, explore the [Auto-Annotate Reference](https://docs.ultralytics.com/reference/data/annotator/#ultralytics.data.annotator.auto_annotate).