# TensorRT Export

## Environment Setup

### Prerequisites
- OS: Linux, or Windows via WSL2 (WSL2 is required on Windows)
- Python >= 3.8 and <= 3.12
- A virtual environment, such as conda or virtualenvwrapper, is recommended
- CUDA >= 11.8
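The Python version constraint above can be checked programmatically before installing anything. The helper below is a minimal illustrative sketch (it is not part of the Datature tooling):

```python
# Illustrative pre-flight check for the Python prerequisite above.
# `python_version_supported` is a hypothetical helper, not a Datature API.
import sys


def python_version_supported(version_info=None):
    """Return True when the interpreter satisfies 3.8 <= Python <= 3.12."""
    major, minor = (version_info or sys.version_info)[:2]
    return (3, 8) <= (major, minor) <= (3, 12)


if __name__ == "__main__":
    print(f"Python {sys.version_info.major}.{sys.version_info.minor} "
          f"supported: {python_version_supported()}")
```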
### Install Packages

Two main apt packages need to be installed: Docker and the NVIDIA Container Toolkit.
#### Install and Configure Docker

```shell
sudo apt-get install docker.io
sudo groupadd docker
sudo usermod -aG docker $USER  # you may need to restart your shell session after this step
# Test with this command; if there are no permission errors, Docker has been set up correctly
docker image list
```
#### Install and Configure nvidia-container-toolkit
```shell
curl -s -L https://nvidia.github.io/nvidia-container-runtime/gpgkey | \
  sudo apt-key add -
distribution=$(. /etc/os-release; echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/nvidia-container-runtime/$distribution/nvidia-container-runtime.list | \
  sudo tee /etc/apt/sources.list.d/nvidia-container-runtime.list
sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```
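After restarting Docker, you can confirm that the `nvidia` runtime was registered with the daemon. One way to check is via the Docker SDK for Python (`docker==7.0.0`, pinned in the Python requirements below); `nvidia_runtime_available` is a hypothetical helper sketched here for illustration, not part of the Datature tooling:

```python
# Hypothetical helper (not part of the Datature tooling) that checks whether
# the "nvidia" runtime is registered with the Docker daemon, using the Docker
# SDK for Python (docker==7.0.0).
def nvidia_runtime_available():
    """Return True/False when Docker is reachable; None when it is not."""
    try:
        import docker  # Docker SDK for Python
        info = docker.from_env().info()
    except Exception:
        return None  # the Docker daemon (or the SDK) is unavailable
    # `docker info` reports configured runtimes under the "Runtimes" key
    return "nvidia" in info.get("Runtimes", {})
```

Alternatively, running any CUDA-enabled container with `docker run --rm --gpus all <image> nvidia-smi` is a common end-to-end check.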
### Install Python Packages

Save the following dependencies to a `requirements.txt` file:

```
colored==1.4.4
docker==7.0.0
numpy>=1.23.2,<2.0.0
nvidia-ml-py==12.560.30
opencv-python-headless==4.7.0.72
Pillow>=9.5.0
pywin32; platform_system=="Windows"
requests==2.31.0
tritonclient[all]==2.43.0
```

Then install them with pip:

```shell
pip3 install -r requirements.txt
```
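After installation, a quick import smoke test can catch a broken environment early. This sketch is illustrative only and checks just a couple of representative packages; extend the mapping as needed:

```python
# Illustrative post-install smoke test; not part of the Datature tooling.
import importlib

# Maps pip package name -> importable module name (they differ for some
# distributions, e.g. opencv-python-headless imports as cv2).
MODULES = {
    "numpy": "numpy",
    "requests": "requests",
}


def smoke_test(modules=MODULES):
    """Try importing each module; return the package names that failed."""
    failures = []
    for package, module in modules.items():
        try:
            importlib.import_module(module)
        except ImportError:
            failures.append(package)
    return failures
```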
## TensorRT

### TensorRTConverter
Main TensorRT module for conversion and inference.
#### Arguments

| Attribute | Type | Description |
|---|---|---|
| docker_params | DockerParams | Optional Docker parameters for conversion and inference. If not provided, default settings will be used. |
#### Examples

Initialize the TensorRT converter with specified Docker image tags:

```python
from datature.utils.experimental.convert import DockerParams, TensorRTConverter

docker_params = {
    "conversion_docker_image": "nvcr.io/nvidia/tensorflow:24.07-tf2-py3",
    "inference_docker_image": "nvcr.io/nvidia/tritonserver:24.07-py3",
}

trt = TensorRTConverter(docker_params=DockerParams(**docker_params))
```
### Conversion

Note: this is an experimental tool and is not guaranteed to support conversion of all model architectures. For maximum compatibility, use the conversion tool to export and convert the model directly from Nexus instead of providing a local exported model path.
#### convert
Converts an ONNX model into a TensorRT plan file using the experimental conversion tool.
##### Arguments

| Attribute | Type | Description |
|---|---|---|
| project | Project | Datature Project instance. Provide this if you want to export and convert a specific model from your Nexus project. |
| artifact_id | str | Artifact ID of the model to export. Provide this if you want to export and convert a specific model from your Nexus project. |
| model_path | str | The path to the saved model directory. Provide this if you already have an exported model from Nexus stored on your local filesystem. |
| output_path | str | The path to save the TensorRT plan file. This should have the structure `<MODEL_FOLDER>/<MODEL_NAME>/<VERSION_TAG>/model.plan`. |
| conversion_params | ConversionParams | Experimental conversion settings. This is for advanced users and will likely break the conversion. If not provided, default settings will be used. |
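Since `output_path` must follow the `<MODEL_FOLDER>/<MODEL_NAME>/<VERSION_TAG>/model.plan` layout (the directory structure a Triton model repository expects), a small helper can make that explicit. `build_plan_path` below is a hypothetical sketch, not part of the datature SDK:

```python
# Hypothetical helper (not part of the datature SDK) that builds an
# output_path in the <MODEL_FOLDER>/<MODEL_NAME>/<VERSION_TAG>/model.plan
# layout that convert() expects.
from pathlib import PurePosixPath


def build_plan_path(model_folder: str, model_name: str, version_tag: str) -> str:
    """Return the plan file path in the required repository layout."""
    return str(PurePosixPath(model_folder) / model_name / version_tag / "model.plan")
```

For example, `build_plan_path("trt_models", "football", "1")` yields `trt_models/football/1/model.plan`, matching the examples below.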
##### Examples

Export and convert a model from a Nexus project:
```python
from datature.nexus import Client
from datature.utils.experimental.convert import TensorRTConverter

secret_key = "<YOUR_SECRET_KEY>"
project_key = "proj_<YOUR_PROJECT_KEY>"
run_id = "run_<YOUR_RUN_ID>"

client = Client(secret_key)
project = client.get_project(project_key)
artifact_id = project.artifacts.list(
    filters={"run_ids": [run_id]}
)[0]["id"]

trt = TensorRTConverter()
trt.convert(
    project,
    artifact_id,
    output_path="./trt_models/football/1/model.plan",
)
```
Convert a downloaded model on your local filesystem:

```python
from datature.utils.experimental.convert import TensorRTConverter

trt = TensorRTConverter()
trt.convert(
    model_path="./models/model.onnx",
    output_path="./trt_models/football/1/model.plan",
)
```
### Supported Model Architectures

| Object Detection | Instance Segmentation | Semantic Segmentation | Keypoint Detection | Classification |
|---|---|---|---|---|
| YOLO11 [new!] | YOLO11-SEG [new!] | Segformer | YOLO11-POSE [new!] | YOLO11-CLS [new!] |
| YOLOv9 | YOLOv8-SEG | DeepLabV3 | YOLOv8-POSE | MoViNet |
| YOLOv8 | MaskRCNN | FCN | | YOLOv8-CLS |
| FasterRCNN | | UNet | | |
| RetinaNet | | | | |
| EfficientDet | | | | |
| MobileNet | | | | |