Add NVIDIA Jetpack4 and Jetpack5 Docker Images (#13100)
Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Co-authored-by: Lakshantha <lakshantha@ultralytics.com>
Co-authored-by: UltralyticsAssistant <web@ultralytics.com>
parent 05be0c54e5
commit ca82d41ec8
5 changed files with 109 additions and 23 deletions

.github/workflows/docker.yaml (15 changes, vendored)

@@ -23,9 +23,13 @@ on:
         type: boolean
         description: Use Dockerfile-arm64
         default: true
-      Dockerfile-jetson:
+      Dockerfile-jetson-jetpack5:
         type: boolean
-        description: Use Dockerfile-jetson
+        description: Use Dockerfile-jetson-jetpack5
         default: true
+      Dockerfile-jetson-jetpack4:
+        type: boolean
+        description: Use Dockerfile-jetson-jetpack4
+        default: true
       Dockerfile-python:
         type: boolean
@@ -58,8 +62,11 @@ jobs:
           - dockerfile: "Dockerfile-arm64"
             tags: "latest-arm64"
             platforms: "linux/arm64"
-          - dockerfile: "Dockerfile-jetson"
-            tags: "latest-jetson"
+          - dockerfile: "Dockerfile-jetson-jetpack5"
+            tags: "latest-jetson-jetpack5"
             platforms: "linux/arm64"
+          - dockerfile: "Dockerfile-jetson-jetpack4"
+            tags: "latest-jetson-jetpack4"
+            platforms: "linux/arm64"
           - dockerfile: "Dockerfile-python"
             tags: "latest-python"
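
These boolean inputs belong to the workflow's `workflow_dispatch` trigger, so the new images can also be built on demand. A hedged sketch of a manual run using the GitHub CLI (only the input names shown in this diff are assumed to exist):

```bash
# Manually trigger docker.yaml and enable only the two new Jetson builds
gh workflow run docker.yaml \
  -f Dockerfile-jetson-jetpack5=true \
  -f Dockerfile-jetson-jetpack4=true
```
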

docker/Dockerfile-jetson-jetpack4 (new file, 64 lines)

@@ -0,0 +1,64 @@
# Ultralytics YOLO 🚀, AGPL-3.0 license
# Builds ultralytics/ultralytics:jetson-jetpack4 image on DockerHub https://hub.docker.com/r/ultralytics/ultralytics
# Supports JetPack4.x for YOLOv8 on Jetson Nano, TX2, Xavier NX, AGX Xavier

# Start FROM https://catalog.ngc.nvidia.com/orgs/nvidia/containers/l4t-cuda
FROM nvcr.io/nvidia/l4t-cuda:10.2.460-runtime

# Set environment variables
ENV APP_HOME /usr/src/ultralytics

# Downloads to user config dir
ADD https://github.com/ultralytics/assets/releases/download/v0.0.0/Arial.ttf \
    https://github.com/ultralytics/assets/releases/download/v0.0.0/Arial.Unicode.ttf \
    /root/.config/Ultralytics/

# Add NVIDIA repositories for TensorRT dependencies
RUN wget -q -O - https://repo.download.nvidia.com/jetson/jetson-ota-public.asc | apt-key add - && \
    echo "deb https://repo.download.nvidia.com/jetson/common r32.7 main" > /etc/apt/sources.list.d/nvidia-l4t-apt-source.list && \
    echo "deb https://repo.download.nvidia.com/jetson/t194 r32.7 main" >> /etc/apt/sources.list.d/nvidia-l4t-apt-source.list

# Install dependencies
RUN apt update && \
    apt install --no-install-recommends -y git python3.8 python3.8-dev python3-pip python3-libnvinfer libopenmpi-dev libopenblas-base libomp-dev gcc

# Create symbolic links for python3.8 and pip3
RUN ln -sf /usr/bin/python3.8 /usr/bin/python3
RUN ln -s /usr/bin/pip3 /usr/bin/pip

# Create working directory
WORKDIR $APP_HOME

# Copy contents and assign permissions
COPY . $APP_HOME
RUN chown -R root:root $APP_HOME
ADD https://github.com/ultralytics/assets/releases/download/v8.2.0/yolov8n.pt $APP_HOME

# Download onnxruntime-gpu, TensorRT, PyTorch and Torchvision
# Other versions can be seen in https://elinux.org/Jetson_Zoo and https://forums.developer.nvidia.com/t/pytorch-for-jetson/72048
ADD https://nvidia.box.com/shared/static/gjqofg7rkg97z3gc8jeyup6t8n9j8xjw.whl onnxruntime_gpu-1.8.0-cp38-cp38-linux_aarch64.whl
ADD https://forums.developer.nvidia.com/uploads/short-url/hASzFOm9YsJx6VVFrDW1g44CMmv.whl tensorrt-8.2.0.6-cp38-none-linux_aarch64.whl
ADD https://github.com/ultralytics/yolov5/releases/download/v1.0/torch-1.11.0a0+gitbc2c6ed-cp38-cp38-linux_aarch64.whl \
    torch-1.11.0a0+gitbc2c6ed-cp38-cp38-linux_aarch64.whl
ADD https://github.com/ultralytics/yolov5/releases/download/v1.0/torchvision-0.12.0a0+9b5a3fe-cp38-cp38-linux_aarch64.whl \
    torchvision-0.12.0a0+9b5a3fe-cp38-cp38-linux_aarch64.whl

# Install pip packages
RUN python3 -m pip install --upgrade pip wheel
RUN pip install onnxruntime_gpu-1.8.0-cp38-cp38-linux_aarch64.whl tensorrt-8.2.0.6-cp38-none-linux_aarch64.whl \
    torch-1.11.0a0+gitbc2c6ed-cp38-cp38-linux_aarch64.whl torchvision-0.12.0a0+9b5a3fe-cp38-cp38-linux_aarch64.whl
RUN pip install --no-cache-dir -e ".[export]"

# Usage Examples -------------------------------------------------------------------------------------------------------

# Build and Push
# t=ultralytics/ultralytics:latest-jetson-jetpack4 && sudo docker build --platform linux/arm64 -f docker/Dockerfile-jetson-jetpack4 -t $t . && sudo docker push $t

# Run
# t=ultralytics/ultralytics:latest-jetson-jetpack4 && sudo docker run -it --ipc=host $t

# Pull and Run
# t=ultralytics/ultralytics:latest-jetson-jetpack4 && sudo docker pull $t && sudo docker run -it --ipc=host $t

# Pull and Run with NVIDIA runtime
# t=ultralytics/ultralytics:latest-jetson-jetpack4 && sudo docker pull $t && sudo docker run -it --ipc=host --runtime=nvidia $t
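
As a quick, hedged smoke test of the resulting image (not part of the Dockerfile; it relies on the `yolo` CLI installed by the editable install above and the yolov8n.pt weights copied into the working directory):

```bash
# Pull the JetPack 4 image and run a one-off prediction inside it to verify the stack
t=ultralytics/ultralytics:latest-jetson-jetpack4
sudo docker pull $t
sudo docker run -it --ipc=host --runtime=nvidia $t \
    yolo predict model=yolov8n.pt source='https://ultralytics.com/images/bus.jpg'
```
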

docker/Dockerfile-jetson → docker/Dockerfile-jetson-jetpack5

@@ -1,6 +1,6 @@
 # Ultralytics YOLO 🚀, AGPL-3.0 license
-# Builds ultralytics/ultralytics:jetson image on DockerHub https://hub.docker.com/r/ultralytics/ultralytics
-# Supports JetPack for YOLOv8 on Jetson Nano, TX1/TX2, Xavier NX, AGX Xavier, AGX Orin, and Orin NX
+# Builds ultralytics/ultralytics:jetson-jetson-jetpack5 image on DockerHub https://hub.docker.com/r/ultralytics/ultralytics
+# Supports JetPack5.x for YOLOv8 on Jetson Xavier NX, AGX Xavier, AGX Orin, Orin Nano and Orin NX
 
 # Start FROM https://catalog.ngc.nvidia.com/orgs/nvidia/containers/l4t-pytorch
 FROM nvcr.io/nvidia/l4t-pytorch:r35.2.1-pth2.0-py3
@@ -43,13 +43,13 @@ RUN pip install --no-cache-dir -e ".[export]"
 # Usage Examples -------------------------------------------------------------------------------------------------------
 
 # Build and Push
-# t=ultralytics/ultralytics:latest-jetson && sudo docker build --platform linux/arm64 -f docker/Dockerfile-jetson -t $t . && sudo docker push $t
+# t=ultralytics/ultralytics:latest-jetson-jetpack5 && sudo docker build --platform linux/arm64 -f docker/Dockerfile-jetson-jetpack5 -t $t . && sudo docker push $t
 
 # Run
-# t=ultralytics/ultralytics:latest-jetson && sudo docker run -it --ipc=host $t
+# t=ultralytics/ultralytics:latest-jetson-jetpack5 && sudo docker run -it --ipc=host $t
 
 # Pull and Run
-# t=ultralytics/ultralytics:latest-jetson && sudo docker pull $t && sudo docker run -it --ipc=host $t
+# t=ultralytics/ultralytics:latest-jetson-jetpack5 && sudo docker pull $t && sudo docker run -it --ipc=host $t
 
 # Pull and Run with NVIDIA runtime
-# t=ultralytics/ultralytics:latest-jetson && sudo docker pull $t && sudo docker run -it --ipc=host --runtime=nvidia $t
+# t=ultralytics/ultralytics:latest-jetson-jetpack5 && sudo docker pull $t && sudo docker run -it --ipc=host --runtime=nvidia $t

docs/en/guides/nvidia-jetson.md

@@ -12,7 +12,7 @@ This comprehensive guide provides a detailed walkthrough for deploying Ultralyti
 
 !!! Note
 
-    This guide has been tested with [Seeed Studio reComputer J4012](https://www.seeedstudio.com/reComputer-J4012-p-5586.html) which is based on NVIDIA Jetson Orin NX 16GB running the latest stable JetPack release of [JP5.1.3](https://developer.nvidia.com/embedded/jetpack-sdk-513). Using this guide for older Jetson devices such as the Jetson Nano (this only supports until JP4.6.4) may not be guaranteed to work. However this is expected to work on all Jetson Orin, Xavier NX, AGX Xavier devices running JP5.1.3.
+    This guide has been tested with both [Seeed Studio reComputer J4012](https://www.seeedstudio.com/reComputer-J4012-p-5586.html) which is based on NVIDIA Jetson Orin NX 16GB running the latest stable JetPack release of [JP5.1.3](https://developer.nvidia.com/embedded/jetpack-sdk-513) and [Seeed Studio reComputer J1020 v2](https://www.seeedstudio.com/reComputer-J1020-v2-p-5498.html) which is based on NVIDIA Jetson Nano 4GB running JetPack release of [JP4.6.1](https://developer.nvidia.com/embedded/jetpack-sdk-461). It is expected to work across all the NVIDIA Jetson hardware lineup including latest and legacy.
 
 ## What is NVIDIA Jetson?
 
@@ -41,37 +41,41 @@ For a more detailed comparison table, please visit the **Technical Specification
 
 The first step after getting your hands on an NVIDIA Jetson device is to flash NVIDIA JetPack to the device. There are several different way of flashing NVIDIA Jetson devices.
 
-1. If you own an official NVIDIA Development Kit such as the Jetson Orin Nano Developer Kit, you can visit [this link](https://developer.nvidia.com/embedded/learn/get-started-jetson-orin-nano-devkit) to download an image and prepare an SD card with JetPack for booting the device.
-2. If you own any other NVIDIA Development Kit, you can visit [this link](https://docs.nvidia.com/sdk-manager/install-with-sdkm-jetson/index.html) to flash JetPack to the device using [SDK Manager](https://developer.nvidia.com/sdk-manager).
-3. If you own a Seeed Studio reComputer J4012 device, you can visit [this link](https://wiki.seeedstudio.com/reComputer_J4012_Flash_Jetpack) to flash JetPack to the included SSD.
-4. If you own any other third party device powered by the NVIDIA Jetson module, it is recommended to follow command-line flashing by visiting [this link](https://docs.nvidia.com/jetson/archives/r35.5.0/DeveloperGuide/IN/QuickStart.html).
+1. If you own an official NVIDIA Development Kit such as the Jetson Orin Nano Developer Kit, you can [download an image and prepare an SD card with JetPack for booting the device](https://developer.nvidia.com/embedded/learn/get-started-jetson-orin-nano-devkit).
+2. If you own any other NVIDIA Development Kit, you can [flash JetPack to the device using SDK Manager](https://docs.nvidia.com/sdk-manager/install-with-sdkm-jetson/index.html).
+3. If you own a Seeed Studio reComputer J4012 device, you can [flash JetPack to the included SSD](https://wiki.seeedstudio.com/reComputer_J4012_Flash_Jetpack) and if you own a Seeed Studio reComputer J1020 v2 device, you can [flash JetPack to the eMMC/ SSD](https://wiki.seeedstudio.com/reComputer_J2021_J202_Flash_Jetpack).
+4. If you own any other third party device powered by the NVIDIA Jetson module, it is recommended to follow [command-line flashing](https://docs.nvidia.com/jetson/archives/r35.5.0/DeveloperGuide/IN/QuickStart.html).
 
 !!! Note
 
     For methods 3 and 4 above, after flashing the system and booting the device, please enter "sudo apt update && sudo apt install nvidia-jetpack -y" on the device terminal to install all the remaining JetPack components needed.
 
-## Set Up Ultralytics
+## Run on JetPack 5.x
+
+If you own a Jetson Xavier NX, AGX Xavier, AGX Orin, Orin Nano or Orin NX which supports JetPack 5.x, you can continue to follow this guide. However, if you have a legacy device such as Jetson Nano, please skip to [Run on JetPack 4.x](#run-on-jetpack-4x).
+
+### Set Up Ultralytics
 
 There are two ways of setting up Ultralytics package on NVIDIA Jetson to build your next Computer Vision project. You can use either of them.
 
 - [Start with Docker](#start-with-docker)
 - [Start without Docker](#start-without-docker)
 
-### Start with Docker
+#### Start with Docker
 
 The fastest way to get started with Ultralytics YOLOv8 on NVIDIA Jetson is to run with pre-built docker image for Jetson.
 
 Execute the below command to pull the Docker container and run on Jetson. This is based on [l4t-pytorch](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/l4t-pytorch) docker image which contains PyTorch and Torchvision in a Python3 environment.
 
 ```bash
-t=ultralytics/ultralytics:latest-jetson && sudo docker pull $t && sudo docker run -it --ipc=host --runtime=nvidia $t
+t=ultralytics/ultralytics:latest-jetson-jp5 && sudo docker pull $t && sudo docker run -it --ipc=host --runtime=nvidia $t
 ```
 
 After this is done, skip to [Use TensorRT on NVIDIA Jetson section](#use-tensorrt-on-nvidia-jetson).
 
-### Start without Docker
+#### Start without Docker
 
-#### Install Ultralytics Package
+##### Install Ultralytics Package
 
 Here we will install Ultralytics package on the Jetson with optional dependencies so that we can export the PyTorch models to other different formats. We will mainly focus on [NVIDIA TensorRT exports](../integrations/tensorrt.md) because TensorRT will make sure we can get the maximum performance out of the Jetson devices.
 
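
The install command itself falls outside this hunk; a minimal sketch of what that step amounts to (assuming pip is already set up on the device):

```bash
# Install the Ultralytics package together with its optional export dependencies
pip install ultralytics[export]
```
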
@@ -95,7 +99,7 @@ Here we will install Ultralytics package on the Jetson with optional dependencie
 sudo reboot
 ```
 
-#### Install PyTorch and Torchvision
+##### Install PyTorch and Torchvision
 
 The above ultralytics installation will install Torch and Torchvision. However, these 2 packages installed via pip are not compatible to run on Jetson platform which is based on ARM64 architecture. Therefore, we need to manually install pre-built PyTorch pip wheel and compile/ install Torchvision from source.
 
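
A hedged sketch of that replacement pattern follows; the wheel file name is a placeholder, not a real release, so substitute the wheel matching your JetPack and Python versions from the PyTorch for Jetson page:

```bash
# Remove the incompatible PyPI builds pulled in by the ultralytics install
pip uninstall -y torch torchvision
# Install a JetPack-matched PyTorch wheel (illustrative file name)
pip install torch-<version>-cp38-cp38-linux_aarch64.whl
# Torchvision is then compiled from source against this PyTorch build
```
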
@@ -125,7 +129,7 @@ The above ultralytics installation will install Torch and Torchvision. However,
 
 Visit the [PyTorch for Jetson page](https://forums.developer.nvidia.com/t/pytorch-for-jetson/72048) to access all different versions of PyTorch for different JetPack versions. For a more detailed list on the PyTorch, Torchvision compatibility, visit the [PyTorch and Torchvision compatibility page](https://github.com/pytorch/vision).
 
-#### Install `onnxruntime-gpu`
+##### Install `onnxruntime-gpu`
 
 The [onnxruntime-gpu](https://pypi.org/project/onnxruntime-gpu/) package hosted in PyPI does not have `aarch64` binaries for the Jetson. So we need to manually install this package. This package is needed for some of the exports.
 
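
A hedged sketch of the manual install; the wheel name matches the one referenced in the next hunk, while the download source (for example the Jetson Zoo) is an assumption:

```bash
# Install a pre-built aarch64 onnxruntime-gpu wheel downloaded beforehand
pip install onnxruntime_gpu-1.17.0-cp38-cp38-linux_aarch64.whl
# onnxruntime-gpu overwrites numpy; the guide pins it back afterwards
pip install numpy==1.23.5
```
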
@@ -142,6 +146,16 @@ pip install onnxruntime_gpu-1.17.0-cp38-cp38-linux_aarch64.whl
 
 `pip install numpy==1.23.5`
 
+## Run on JetPack 4.x
+
+Here we support to run Ultralytics on legacy hardware such as the Jetson Nano. Currently we use Docker to achieve this.
+
+Execute the below command to pull the Docker container and run on Jetson. This is based on [l4t-cuda](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/l4t-cuda) docker image which contains CUDA in a L4T environment.
+
+```bash
+t=ultralytics/ultralytics:jetson-jp4 && sudo docker pull $t && sudo docker run -it --ipc=host --runtime=nvidia $t
+```
+
 ## Use TensorRT on NVIDIA Jetson
 
 Out of all the model export formats supported by Ultralytics, TensorRT delivers the best inference performance when working with NVIDIA Jetson devices and our recommendation is to use TensorRT with Jetson. We also have a detailed document on TensorRT [here](../integrations/tensorrt.md).
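
A minimal sketch of that TensorRT workflow with the `yolo` CLI (model and source are illustrative):

```bash
# Export the PyTorch model to a TensorRT engine, then run inference with it
yolo export model=yolov8n.pt format=engine
yolo predict model=yolov8n.engine source='https://ultralytics.com/images/bus.jpg'
```
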
@@ -276,7 +290,7 @@ The below table represents the benchmark results for five different models (YOLO
 | PaddlePaddle | ✅ | 520.8 | 0.7479 | 10619.53 |
 | NCNN | ✅ | 260.4 | 0.7646 | 376.38 |
 
-Visit [this link](https://www.seeedstudio.com/blog/2023/03/30/yolov8-performance-benchmarks-on-nvidia-jetson-devices) to explore more benchmarking efforts by Seeed Studio running on different versions of NVIDIA Jetson hardware.
+[Explore more benchmarking efforts by Seeed Studio](https://www.seeedstudio.com/blog/2023/03/30/yolov8-performance-benchmarks-on-nvidia-jetson-devices) running on different versions of NVIDIA Jetson hardware.
 
 ## Reproduce Our Results
 
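
The reproduction steps live further down in the guide and are outside this hunk; as a rough, hedged sketch, they come down to running the Ultralytics benchmark utility on the device along these lines (model, dataset and image size are illustrative):

```bash
# Benchmark YOLOv8n across the supported export formats on the Jetson
yolo benchmark model=yolov8n.pt data=coco8.yaml imgsz=640
```
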

docs/mkdocs_github_authors.yaml

@@ -22,6 +22,7 @@ ayush.chaurarsia@gmail.com: AyushExel
 chr043416@gmail.com: RizwanMunawar
 glenn.jocher@ultralytics.com: glenn-jocher
 jpedrofonseca_94@hotmail.com: null
+lakshantha@ultralytics.com: lakshanthad
 lakshanthad@yahoo.com: lakshanthad
 muhammadrizwanmunawar123@gmail.com: RizwanMunawar
 not.committed.yet: null