isaac_ros_triton#
Source code available on GitHub.
Quickstart#
Note
This quickstart demonstrates setting up isaac_ros_triton. In a typical application, it is paired with an encoder node for pre-processing and a decoder node for post-processing the inference output.
To see these packages used in complete applications, refer here.
Set Up Development Environment#
Set up your development environment by following the instructions in getting started.
(Optional) Install dependencies for any sensors you want to use by following the sensor-specific guides.
Note
We strongly recommend installing all sensor dependencies before starting any quickstarts. Some sensor dependencies require restarting the development environment during installation, which will interrupt the quickstart process.
Build isaac_ros_triton#
Activate the Isaac ROS environment:
isaac-ros activate
Install the prebuilt Debian package:
sudo apt-get update
sudo apt-get install -y ros-jazzy-isaac-ros-triton
Alternatively, build the package from source instead of installing the Debian package:
Install Git LFS:
sudo apt-get install -y git-lfs && git lfs install
Clone this repository under ${ISAAC_ROS_WS}/src:
cd ${ISAAC_ROS_WS}/src && \
   git clone -b release-4.0 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_dnn_inference.git isaac_ros_dnn_inference
Activate the Isaac ROS environment:
isaac-ros activate
Use rosdep to install the package’s dependencies:
sudo apt-get update
rosdep update && rosdep install --from-paths ${ISAAC_ROS_WS}/src/isaac_ros_dnn_inference/isaac_ros_triton --ignore-src -y
Build the package from source:
cd ${ISAAC_ROS_WS}/ && \
   colcon build --symlink-install --packages-up-to isaac_ros_triton --base-paths ${ISAAC_ROS_WS}/src/isaac_ros_dnn_inference/isaac_ros_triton
Source the ROS workspace:
Note
Make sure to repeat this step in every terminal created inside the Isaac ROS environment.
Because this package was built from source, the enclosing workspace must be sourced for ROS to be able to find the package’s contents.
source install/setup.bash
Troubleshooting#
Isaac ROS Troubleshooting#
For solutions to problems with Isaac ROS, see here.
Deep Learning Troubleshooting#
For solutions to problems with using DNN models, see here.
API#
Usage#
This package contains a launch file that solely launches isaac_ros_triton.
Note
For your specific application, this launch file may need to be modified. Consult the available components to see the configurable parameters.
Additionally, most applications require an encoder node to pre-process your data source and a decoder node to post-process the inference output; a sketch of a custom launch file follows the table below.
| Launch File | Components Used |
|---|---|
| isaac_ros_triton.launch.py | TritonNode |
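The following is a minimal sketch of a custom launch file that loads TritonNode into a composable node container. The plugin string, parameter values, topic remappings, and names used here are illustrative assumptions; adapt them to your own model repository, bindings, and pipeline.

```python
# Minimal sketch: launch TritonNode in a composable node container.
# Plugin name, parameter values, and remappings are assumptions for
# illustration; adjust them to match your model and application.
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    triton_node = ComposableNode(
        name='triton_node',
        package='isaac_ros_triton',
        plugin='nvidia::isaac_ros::dnn_inference::TritonNode',
        parameters=[{
            # Hypothetical model repository and model name.
            'model_repository_paths': ['/tmp/models'],
            'model_name': 'my_model',
            'max_batch_size': 8,
            'num_concurrent_requests': 10,
            # Tensor names, bindings, and formats must match your model configuration.
            'input_tensor_names': ['input_tensor'],
            'input_binding_names': ['input'],
            'input_tensor_formats': ['nitros_tensor_list_nchw_rgb_f32'],
            'output_tensor_names': ['output_tensor'],
            'output_binding_names': ['output'],
            'output_tensor_formats': ['nitros_tensor_list_nhwc_rgb_f32'],
        }],
        # Optional: remap the default topics to fit your pipeline
        # (e.g. wire an encoder's output into the inference input).
        remappings=[
            ('tensor_pub', 'encoder/encoded_tensor'),
            ('tensor_sub', 'triton/tensor_output'),
        ],
    )

    container = ComposableNodeContainer(
        name='triton_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container_mt',
        composable_node_descriptions=[triton_node],
        output='screen',
    )

    return LaunchDescription([container])
```

In a full application, the encoder component would publish on the topic remapped to the inference input, and the decoder component would subscribe to the inference output topic, all typically loaded into the same container.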
TritonNode#
ROS Parameters#
| ROS Parameter | Type | Default | Description |
|---|---|---|---|
| model_repository_paths | string list | [''] | The absolute paths to your model repositories in your local file system. The structure should follow Triton requirements (see the layout sketch after this table), e.g. ['/tmp/models'] |
| model_name | string | "" | The name of your model. Under model_repository_paths, there should be a directory with this name, and it should align with the model name in the model configuration under that directory |
| max_batch_size | uint16_t | 8 | The maximum batch size allowed for the model. It should align with the model configuration |
| num_concurrent_requests | uint16_t | 10 | The number of requests the Triton server can take at a time. This should be set according to the tensor publisher frequency |
| input_tensor_names | string list | ['input_tensor'] | A list of tensor names to be bound to the specified input binding names. Bindings occur in sequential order, so the first name here will be mapped to the first name in input_binding_names |
| input_binding_names | string list | [''] | A list of input tensor binding names specified by the model, e.g. ['input_2:0'] |
| input_tensor_formats | string list | [''] | A list of input tensor NITROS formats. This should be given in sequential order, e.g. ['nitros_tensor_list_nchw_rgb_f32'] |
| output_tensor_names | string list | ['output_tensor'] | A list of tensor names to be bound to the specified output binding names |
| output_binding_names | string list | [''] | A list of output tensor binding names specified by the model, e.g. ['argmax_1'] |
| output_tensor_formats | string list | [''] | A list of output tensor NITROS formats. This should be given in sequential order, e.g. ['nitros_tensor_list_nhwc_rgb_f32'] |
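As a reference for model_repository_paths and model_name, the sketch below checks a local directory against the layout Triton conventionally expects: a model directory containing a config.pbtxt and at least one numbered version directory holding the model file. The repository path and model name used here are hypothetical.

```python
# Sketch: verify that a local model repository follows the layout Triton expects:
#   <model_repository_path>/<model_name>/config.pbtxt
#   <model_repository_path>/<model_name>/1/<model file, e.g. model.plan or model.onnx>
# The repository path and model name below are hypothetical placeholders.
from pathlib import Path

model_repository_path = Path('/tmp/models')   # corresponds to model_repository_paths
model_name = 'my_model'                       # corresponds to model_name

model_dir = model_repository_path / model_name
config = model_dir / 'config.pbtxt'
version_dirs = [p for p in model_dir.glob('[0-9]*') if p.is_dir()]

print(f'config.pbtxt found: {config.is_file()}')
print(f'version directories: {[p.name for p in version_dirs]}')
for version in version_dirs:
    print(f'  files under {version.name}: {[f.name for f in version.iterdir()]}')
```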
ROS Topics Subscribed#
| ROS Topic | Type | Description |
|---|---|---|
| tensor_pub | isaac_ros_tensor_list_interfaces/TensorList | The input tensor stream |
ROS Topics Published#
| ROS Topic | Type | Description |
|---|---|---|
| tensor_sub | isaac_ros_tensor_list_interfaces/TensorList | The tensor list of output tensors from the model inference |
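To inspect the inference output, a minimal rclpy subscriber such as the sketch below can listen on the published topic. The topic name assumes the default (unremapped) output topic, and the message fields accessed here are based on the isaac_ros_tensor_list_interfaces/TensorList definition; verify both against your installed interface version.

```python
# Minimal sketch: subscribe to TritonNode's output tensor list and log basic
# metadata for each tensor. The topic name and message fields are assumptions
# based on isaac_ros_tensor_list_interfaces/TensorList; check them with
# `ros2 interface show isaac_ros_tensor_list_interfaces/msg/TensorList`.
import rclpy
from rclpy.node import Node
from isaac_ros_tensor_list_interfaces.msg import TensorList


class TensorListLogger(Node):
    def __init__(self):
        super().__init__('tensor_list_logger')
        self.subscription = self.create_subscription(
            TensorList, 'tensor_sub', self.callback, 10)

    def callback(self, msg):
        # Each TensorList carries one or more named tensors from the model.
        self.get_logger().info(f'received {len(msg.tensors)} tensor(s)')
        for tensor in msg.tensors:
            self.get_logger().info(
                f'  name={tensor.name} data_type={tensor.data_type} '
                f'bytes={len(tensor.data)}')


def main():
    rclpy.init()
    node = TensorListLogger()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```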