Update YOLO11 Actions and Docs (#16596)

Signed-off-by: UltralyticsAssistant <web@ultralytics.com>
Ultralytics Assistant 2024-10-01 16:58:12 +02:00 committed by GitHub
parent 51e93d6111
commit 97f38409fb
124 changed files with 1948 additions and 1948 deletions

@@ -20,7 +20,7 @@ The output of an object detector is a set of bounding boxes that enclose the obj
allowfullscreen>
</iframe>
<br>
-<strong>Watch:</strong> Object Detection with Pre-trained Ultralytics YOLOv8 Model.
+<strong>Watch:</strong> Object Detection with Pre-trained Ultralytics YOLO Model.
</p>
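
The detector output described above can be read directly from a prediction result. A minimal sketch, assuming the pretrained `yolo11n.pt` weights and a sample image URL (neither appears in this diff):

```python
from ultralytics import YOLO

# Load a pretrained YOLO11 detection model (weights file name is an assumption)
model = YOLO("yolo11n.pt")

# Run inference on a sample image
results = model("https://ultralytics.com/images/bus.jpg")

# Each result exposes the detected bounding boxes with class IDs and confidences
for box in results[0].boxes:
    print(box.xyxy, box.cls, box.conf)
```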
!!! tip
@@ -215,7 +215,7 @@ Ultralytics YOLO11 offers various pretrained models for object detection, segmen
For a detailed list and performance metrics, refer to the [Models](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/cfg/models/11) section.
-### How can I validate the accuracy of my trained YOLOv8 model?
+### How can I validate the accuracy of my trained YOLO model?
To validate the accuracy of your trained YOLO11 model, you can use the `.val()` method in Python or the `yolo detect val` command in CLI. This will provide metrics like mAP50-95, mAP50, and more.
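
A minimal sketch of that validation call, assuming the official `yolo11n.pt` weights (for which the dataset and settings are remembered from training):

```python
from ultralytics import YOLO

# Load the weights to evaluate (replace with path/to/best.pt for a custom model)
model = YOLO("yolo11n.pt")

# Validate; returns detection metrics
metrics = model.val()
print(metrics.box.map)    # mAP50-95
print(metrics.box.map50)  # mAP50
```

The CLI equivalent is `yolo detect val model=yolo11n.pt`.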

@@ -27,7 +27,7 @@ The output of an oriented object detector is a set of rotated bounding boxes tha
allowfullscreen>
</iframe>
<br>
-<strong>Watch:</strong> Object Detection using Ultralytics YOLOv8 Oriented Bounding Boxes (YOLOv8-OBB)
+<strong>Watch:</strong> Object Detection using Ultralytics YOLO Oriented Bounding Boxes (YOLO-OBB)
</p>
## Visual Samples
@@ -94,7 +94,7 @@ Train YOLO11n-obb on the `dota8.yaml` dataset for 100 [epochs](https://www.ultra
allowfullscreen>
</iframe>
<br>
-<strong>Watch:</strong> How to Train Ultralytics YOLOv8-OBB (Oriented Bounding Boxes) Models on DOTA Dataset using Ultralytics HUB
+<strong>Watch:</strong> How to Train Ultralytics YOLO-OBB (Oriented Bounding Boxes) Models on DOTA Dataset using Ultralytics HUB
</p>
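
A minimal sketch of the training run referenced above, assuming the pretrained `yolo11n-obb.pt` weights and `imgsz=1024` (a common choice for DOTA imagery, not stated in this hunk):

```python
from ultralytics import YOLO

# Start from the pretrained OBB weights
model = YOLO("yolo11n-obb.pt")

# Train on the small DOTA8 sample dataset for 100 epochs
results = model.train(data="dota8.yaml", epochs=100, imgsz=1024)
```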
### Dataset format
@@ -165,7 +165,7 @@ Use a trained YOLO11n-obb model to run predictions on images.
allowfullscreen>
</iframe>
<br>
-<strong>Watch:</strong> How to Detect and Track Storage Tanks using Ultralytics YOLOv8-OBB | Oriented Bounding Boxes | DOTA
+<strong>Watch:</strong> How to Detect and Track Storage Tanks using Ultralytics YOLO-OBB | Oriented Bounding Boxes | DOTA
</p>
See full `predict` mode details in the [Predict](../modes/predict.md) page.
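
A minimal sketch of such a prediction, assuming the official `yolo11n-obb.pt` weights and a sample image URL (both assumptions):

```python
from ultralytics import YOLO

# Load a pretrained OBB model
model = YOLO("yolo11n-obb.pt")

# Predict on an image; rotated boxes live on the `obb` attribute of each result
results = model("https://ultralytics.com/images/boats.jpg")
print(results[0].obb.xywhr)  # center-x, center-y, width, height, rotation per box
```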

@@ -22,7 +22,7 @@ The output of a pose estimation model is a set of points that represent the keyp
allowfullscreen>
</iframe>
<br>
-<strong>Watch:</strong> Pose Estimation with Ultralytics YOLOv8.
+<strong>Watch:</strong> Pose Estimation with Ultralytics YOLO.
</td>
<td align="center">
<iframe loading="lazy" width="720" height="405" src="https://www.youtube.com/embed/aeAX6vWpfR0"
@@ -235,7 +235,7 @@ Validation of a YOLO11-pose model involves assessing its accuracy using the same
from ultralytics import YOLO
# Load a model
model = YOLO("yolov8n-pose.pt") # load an official model
model = YOLO("yolo11n-pose.pt") # load an official model
model = YOLO("path/to/best.pt") # load a custom model
# Validate the model
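
The validation call itself falls outside this hunk; a minimal sketch of how it typically continues, using the same weights loaded above:

```python
from ultralytics import YOLO

# Load the official pose weights, as in the snippet above
model = YOLO("yolo11n-pose.pt")

# Validate; no arguments needed, the dataset and settings are remembered
metrics = model.val()
print(metrics.box.map)   # detection mAP50-95
print(metrics.pose.map)  # keypoint mAP50-95
```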
@@ -252,7 +252,7 @@ Yes, you can export a YOLO11-pose model to various formats like ONNX, CoreML, Te
from ultralytics import YOLO
# Load a model
model = YOLO("yolov8n-pose.pt") # load an official model
model = YOLO("yolo11n-pose.pt") # load an official model
model = YOLO("path/to/best.pt") # load a custom trained model
# Export the model
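
The export call is likewise outside the hunk; a minimal sketch, assuming ONNX as the target format:

```python
from ultralytics import YOLO

# Load the pose model to export, as in the snippet above
model = YOLO("yolo11n-pose.pt")

# Export to ONNX; other formats (e.g. "coreml", "engine") follow the same pattern
model.export(format="onnx")
```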

@@ -21,7 +21,7 @@ The output of an instance segmentation model is a set of masks or contours that
allowfullscreen>
</iframe>
<br>
-<strong>Watch:</strong> Run Segmentation with Pre-Trained Ultralytics YOLOv8 Model in Python.
+<strong>Watch:</strong> Run Segmentation with Pre-Trained Ultralytics YOLO Model in Python.
</p>
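
Those per-instance masks and contours can be read from a prediction result. A minimal sketch, assuming the pretrained `yolo11n-seg.pt` weights and a sample image (both assumptions):

```python
from ultralytics import YOLO

# Load a pretrained segmentation model
model = YOLO("yolo11n-seg.pt")

# Run inference; each result carries per-instance masks alongside the boxes
results = model("https://ultralytics.com/images/bus.jpg")
masks = results[0].masks
print(masks.data.shape)  # (num_instances, H, W) binary masks
print(len(masks.xy))     # polygon contours, one array of points per instance
```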
!!! tip
@@ -210,11 +210,11 @@ Object detection identifies and localizes objects within an image by drawing bou
### Why use YOLO11 for instance segmentation?
-Ultralytics YOLO11 is a state-of-the-art model recognized for its high accuracy and real-time performance, making it ideal for instance segmentation tasks. YOLO11 Segment models come pretrained on the [COCO dataset](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/coco.yaml), ensuring robust performance across a variety of objects. Additionally, YOLOv8 supports training, validation, prediction, and export functionalities with seamless integration, making it highly versatile for both research and industry applications.
+Ultralytics YOLO11 is a state-of-the-art model recognized for its high accuracy and real-time performance, making it ideal for instance segmentation tasks. YOLO11 Segment models come pretrained on the [COCO dataset](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/coco.yaml), ensuring robust performance across a variety of objects. Additionally, YOLO supports training, validation, prediction, and export functionalities with seamless integration, making it highly versatile for both research and industry applications.
-### How do I load and validate a pretrained YOLOv8 segmentation model?
+### How do I load and validate a pretrained YOLO segmentation model?
-Loading and validating a pretrained YOLOv8 segmentation model is straightforward. Here's how you can do it using both Python and CLI:
+Loading and validating a pretrained YOLO segmentation model is straightforward. Here's how you can do it using both Python and CLI:
!!! example
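
The example body is not shown in this hunk; a minimal sketch of the Python side of that workflow, assuming the official `yolo11n-seg.pt` weights:

```python
from ultralytics import YOLO

# Load the pretrained segmentation weights (or path/to/best.pt for a custom model)
model = YOLO("yolo11n-seg.pt")

# Validate on the dataset remembered from training (COCO for the official weights)
metrics = model.val()
print(metrics.box.map)  # box mAP50-95
print(metrics.seg.map)  # mask mAP50-95
```

The CLI equivalent is `yolo segment val model=yolo11n-seg.pt`.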
@@ -240,9 +240,9 @@ Loading and validating a pretrained YOLOv8 segmentation model is straightforward
These steps will provide you with validation metrics like [Mean Average Precision](https://www.ultralytics.com/glossary/mean-average-precision-map) (mAP), crucial for assessing model performance.
-### How can I export a YOLOv8 segmentation model to ONNX format?
+### How can I export a YOLO segmentation model to ONNX format?
-Exporting a YOLOv8 segmentation model to ONNX format is simple and can be done using Python or CLI commands:
+Exporting a YOLO segmentation model to ONNX format is simple and can be done using Python or CLI commands:
!!! example
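
The export example itself is not included in this hunk; a minimal sketch, assuming the official `yolo11n-seg.pt` weights:

```python
from ultralytics import YOLO

# Load the segmentation model to export
model = YOLO("yolo11n-seg.pt")

# Export to ONNX; the .onnx file is saved alongside the weights
model.export(format="onnx")
```

The CLI equivalent is `yolo export model=yolo11n-seg.pt format=onnx`.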