Add FAQs to Docs Datasets and Help sections (#14211)

Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Co-authored-by: UltralyticsAssistant <web@ultralytics.com>

parent 64862f1b69
commit d5db9c916f

73 changed files with 3296 additions and 110 deletions

@@ -96,3 +96,69 @@ The example showcases the variety and complexity of the images in the Tiger-Pose

## Citations and Acknowledgments

The dataset has been released under the [AGPL-3.0 License](https://github.com/ultralytics/ultralytics/blob/main/LICENSE).

## FAQ

### What is the Ultralytics Tiger-Pose dataset used for?

The Ultralytics Tiger-Pose dataset is designed for pose estimation tasks, consisting of 263 images sourced from a [YouTube video](https://www.youtube.com/watch?v=MIBAT6BGE6U&pp=ygUbVGlnZXIgd2Fsa2luZyByZWZlcmVuY2UubXA0). The dataset is divided into 210 training images and 53 validation images. It is particularly useful for testing, training, and refining pose estimation algorithms using [Ultralytics HUB](https://hub.ultralytics.com) and [YOLOv8](https://github.com/ultralytics/ultralytics).

### How do I train a YOLOv8 model on the Tiger-Pose dataset?

To train a YOLOv8n-pose model on the Tiger-Pose dataset for 100 epochs with an image size of 640, use the following code snippets. For more details, visit the [Training](../../modes/train.md) page:

!!! Example "Train Example"

    === "Python"

        ```python
        from ultralytics import YOLO

        # Load a model
        model = YOLO("yolov8n-pose.pt")  # load a pretrained model (recommended for training)

        # Train the model
        results = model.train(data="tiger-pose.yaml", epochs=100, imgsz=640)
        ```

    === "CLI"

        ```bash
        # Start training from a pretrained *.pt model
        yolo task=pose mode=train data=tiger-pose.yaml model=yolov8n-pose.pt epochs=100 imgsz=640
        ```

### What configurations does the `tiger-pose.yaml` file include?

The `tiger-pose.yaml` file is used to specify the configuration details of the Tiger-Pose dataset. It includes crucial data such as file paths and class definitions. To see the exact configuration, you can check out the [Ultralytics Tiger-Pose Dataset Configuration File](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/tiger-pose.yaml).
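
As a rough illustration, the sketch below loads such a dataset YAML with Python and prints the fields typically found in Ultralytics pose dataset configurations; the local `tiger-pose.yaml` path and the availability of PyYAML are assumptions here, not part of the original example:

```python
import yaml  # PyYAML

# Load the dataset configuration (local file path assumed for illustration)
with open("tiger-pose.yaml") as f:
    cfg = yaml.safe_load(f)

# Dataset root and train/val image locations
print(cfg.get("path"), cfg.get("train"), cfg.get("val"))

# Keypoint layout and class names typically present in pose dataset YAMLs
print(cfg.get("kpt_shape"))  # e.g. [num_keypoints, dims]
print(cfg.get("names"))  # class index -> name mapping
```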

### How can I run inference using a YOLOv8 model trained on the Tiger-Pose dataset?

To perform inference using a YOLOv8 model trained on the Tiger-Pose dataset, you can use the following code snippets. For a detailed guide, visit the [Prediction](../../modes/predict.md) page:

!!! Example "Inference Example"

    === "Python"

        ```python
        from ultralytics import YOLO

        # Load a model
        model = YOLO("path/to/best.pt")  # load a tiger-pose trained model

        # Run inference
        results = model.predict(source="https://youtu.be/MIBAT6BGE6U", show=True)
        ```

    === "CLI"

        ```bash
        # Run inference using a tiger-pose trained model
        yolo task=pose mode=predict source="https://youtu.be/MIBAT6BGE6U" show=True model="path/to/best.pt"
        ```
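
As a follow-up usage sketch, the predicted keypoints can be read from the returned `Results` objects; the snippet below assumes a trained weights file at the placeholder `path/to/best.pt` and uses the `Results.keypoints` attribute of the Ultralytics Python API:

```python
from ultralytics import YOLO

model = YOLO("path/to/best.pt")  # placeholder path to tiger-pose trained weights
results = model.predict(source="https://youtu.be/MIBAT6BGE6U")

# Each Results object corresponds to one processed frame or image
for r in results:
    if r.keypoints is not None:
        # Keypoint pixel coordinates with shape (num_instances, num_keypoints, 2)
        print(r.keypoints.xy.shape)
```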

### What are the benefits of using the Tiger-Pose dataset for pose estimation?

The Tiger-Pose dataset, despite its manageable size of 210 images for training, provides a diverse collection of images that are ideal for testing pose estimation pipelines. The dataset helps identify potential errors and acts as a preliminary step before working with larger datasets. Additionally, the dataset supports the training and refinement of pose estimation algorithms using advanced tools like [Ultralytics HUB](https://hub.ultralytics.com) and [YOLOv8](https://github.com/ultralytics/ultralytics), enhancing model performance and accuracy.
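
As one way to use the dataset as a quick pipeline check, a minimal validation sketch is shown below; it assumes a model has already been trained on `tiger-pose.yaml` as in the training example above, with `path/to/best.pt` as a placeholder for the resulting weights:

```python
from ultralytics import YOLO

# Validate a tiger-pose trained model as a fast sanity check before scaling up
model = YOLO("path/to/best.pt")  # placeholder path to trained weights
metrics = model.val(data="tiger-pose.yaml", imgsz=640)

# Pose mAP@0.5 from the returned metrics object
print(metrics.pose.map50)
```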