==============
|package_name|
==============

:ir_github:` `

Quickstart
----------

.. note::

   This quickstart demonstrates setting up |package_name|. It is often used with an encoder node and a decoder node, which perform pre-processing and post-processing respectively, to form an application. To use this package in useful contexts, please refer :doc:`here `.

Set Up Development Environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. include:: /_snippets/set_up_dev_env.rst

Build |package_name|
^^^^^^^^^^^^^^^^^^^^^^

.. tabs::

   .. tab:: Binary Package

      1. Launch the Docker container using the ``run_dev.sh`` script:

         .. code:: bash

            cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
            ./scripts/run_dev.sh

      2. Install the prebuilt Debian package:

         :ir_apt:

         .. code:: bash

            sudo apt-get install -y ros-humble-isaac-ros-tensor-rt

   .. tab:: Build from Source

      1. Clone this repository under ``${ISAAC_ROS_WS}/src``:

         .. code:: bash

            cd ${ISAAC_ROS_WS}/src && \
               git clone :ir_clone:``

      2. Launch the Docker container using the ``run_dev.sh`` script:

         .. code:: bash

            cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
            ./scripts/run_dev.sh

      3. Use ``rosdep`` to install the package's dependencies:

         :ir_apt:

         .. code:: bash

            rosdep update && rosdep install --from-paths ${ISAAC_ROS_WS}/src/isaac_ros_dnn_inference/isaac_ros_tensor_rt --ignore-src -y

      4. Build the package from source:

         .. code:: bash

            cd ${ISAAC_ROS_WS}/ && \
               colcon build --symlink-install --packages-up-to isaac_ros_tensor_rt --base-paths ${ISAAC_ROS_WS}/src/isaac_ros_dnn_inference/isaac_ros_tensor_rt

      5. Source the ROS workspace:

         .. note::

            Make sure to repeat this step in **every** terminal created inside the Docker container.

            Because this package was built from source, the enclosing workspace must be sourced for ROS to be able to find the package's contents.

         .. code:: bash

            source install/setup.bash

Troubleshooting
---------------

Isaac ROS Troubleshooting
^^^^^^^^^^^^^^^^^^^^^^^^^

For solutions to problems with Isaac ROS, please check :doc:`here `.

Deep Learning Troubleshooting
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

For solutions to problems with using DNN models, please check :doc:`here `.

API
----

Usage
^^^^^

This package contains a launch file that solely launches ``isaac_ros_tensor_rt``.

.. warning::

   For your specific application, these launch files may need to be modified. Please consult the available components to see the configurable parameters.

   Additionally, for most applications, an encoder node for pre-processing your data source and a decoder node for post-processing the inference output are required. A sketch of a custom launch file is shown below.

====================================== ====================
Launch File                            Components Used
====================================== ====================
``isaac_ros_tensor_rt.launch.py``      :doc:`TensorRTNode `
====================================== ====================
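If the provided launch file does not fit your pipeline, the node can also be composed from a custom launch file. The following is a minimal sketch, not the packaged launch file: it assumes the component is registered as ``nvidia::isaac_ros::dnn_inference::TensorRTNode``, and the model and engine paths are placeholders to replace with your own.

.. code:: python

   # custom_tensor_rt.launch.py -- a minimal sketch, not the packaged launch file.
   # The plugin name below is an assumption; verify it against the installed package.
   from launch import LaunchDescription
   from launch_ros.actions import ComposableNodeContainer
   from launch_ros.descriptions import ComposableNode


   def generate_launch_description():
       tensor_rt_node = ComposableNode(
           name='tensor_rt',
           package='isaac_ros_tensor_rt',
           plugin='nvidia::isaac_ros::dnn_inference::TensorRTNode',
           parameters=[{
               # Placeholder paths: point these at your ONNX model and at the
               # location where the generated engine plan should be written.
               'model_file_path': '/path/to/model.onnx',
               'engine_file_path': '/tmp/trt_engine.plan',
           }])

       container = ComposableNodeContainer(
           name='tensor_rt_container',
           namespace='',
           package='rclcpp_components',
           executable='component_container_mt',
           composable_node_descriptions=[tensor_rt_node],
           output='screen')

       return LaunchDescription([container])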
TensorRTNode
^^^^^^^^^^^^

ROS Parameters
~~~~~~~~~~~~~~

=========================== =============== ======================== ========================================================================================================================================================================================
ROS Parameter               Type            Default                  Description
=========================== =============== ======================== ========================================================================================================================================================================================
``model_file_path``         ``string``      ``model.onnx``           The absolute path to your model file in the local file system (the model file must be ``.onnx``) E.g. ``model.onnx``
``engine_file_path``        ``string``      ``/tmp/trt_engine.plan`` The absolute path to either where you want your TensorRT engine plan to be generated (from your model file) or where your pre-generated engine plan file is located E.g. ``model.plan``
``force_engine_update``     ``bool``        ``true``                 If set to true, the node will always try to generate a TensorRT engine plan from your model file; this must be set to false to use a pre-generated TensorRT engine plan
``input_tensor_names``      ``string list`` ``['input_tensor']``     A list of tensor names to be bound to specified input binding names. Bindings occur in sequential order, so the first name here will be mapped to the first name in ``input_binding_names``
``input_binding_names``     ``string list`` ``['']``                 A list of input tensor binding names specified by the model E.g. ``['input_2:0']``
``input_tensor_formats``    ``string list`` ``['']``                 A list of input tensor NITROS formats. This should be given in sequential order E.g. ``['nitros_tensor_list_nchw_rgb_f32']``
``output_tensor_names``     ``string list`` ``['output_tensor']``    A list of tensor names to be bound to specified output binding names
``output_binding_names``    ``string list`` ``['']``                 A list of output tensor binding names specified by the model E.g. ``['argmax_1']``
``output_tensor_formats``   ``string list`` ``['']``                 A list of output tensor NITROS formats. This should be given in sequential order E.g. ``['nitros_tensor_list_nchw_rgb_f32']``
``verbose``                 ``bool``        ``true``                 If set to true, the node will enable verbose logging to console from the internal TensorRT execution
``max_workspace_size``      ``int64_t``     ``67108864l``            The size of the working space in bytes
``max_batch_size``          ``int32_t``     ``1``                    The maximum possible batch size in case the first dimension is dynamic and used as the batch size
``dla_core``                ``int64_t``     ``-1``                   The DLA Core to use. Fallback to GPU is always enabled. The default setting is GPU only
``enable_fp16``             ``bool``        ``true``                 Enables building a TensorRT engine plan file which uses FP16 precision for inference. If this setting is false, the plan file will use FP32 precision
``relaxed_dimension_check`` ``bool``        ``true``                 Ignores dimensions of 1 for the input-tensor dimension check
``num_blocks``              ``int``         ``40``                   The number of pre-allocated memory output blocks; this should not be less than ``40``.
=========================== =============== ======================== ========================================================================================================================================================================================
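As an illustration of how tensor names pair with binding names in sequential order, the following hypothetical parameter block (as it might appear in a launch file's ``parameters`` list) uses the example binding names from the table above; your model's binding names and file paths will differ.

.. code:: python

   # Hypothetical parameter block for the TensorRT node, e.g. passed through a
   # launch file's `parameters` list. Binding names are the examples from the
   # table above; replace them with the bindings of your own model.
   tensor_rt_parameters = {
       'model_file_path': '/path/to/model.onnx',   # placeholder path
       'engine_file_path': '/tmp/trt_engine.plan',
       'force_engine_update': True,                 # regenerate the engine plan on startup
       # Tensor names map to binding names in sequential order:
       # 'input_tensor' -> 'input_2:0' and 'output_tensor' -> 'argmax_1'.
       'input_tensor_names': ['input_tensor'],
       'input_binding_names': ['input_2:0'],
       'output_tensor_names': ['output_tensor'],
       'output_binding_names': ['argmax_1'],
       'enable_fp16': True,
       'verbose': False,
   }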
ROS Topics Subscribed
~~~~~~~~~~~~~~~~~~~~~

============== ======================================================= =======================
ROS Topic      Type                                                    Description
============== ======================================================= =======================
``tensor_pub`` :ir_repo:`isaac_ros_tensor_list_interfaces/TensorList ` The input tensor stream
============== ======================================================= =======================

ROS Topics Published
~~~~~~~~~~~~~~~~~~~~

============== ======================================================= ===========================================================
ROS Topic      Type                                                    Description
============== ======================================================= ===========================================================
``tensor_sub`` :ir_repo:`isaac_ros_tensor_list_interfaces/TensorList ` The tensor list of output tensors from the model inference
============== ======================================================= ===========================================================

.. |package_name| replace:: ``isaac_ros_tensor_rt``
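For reference, a downstream consumer of the inference output could look like the following minimal sketch. It assumes the ``TensorList`` message exposes a ``tensors`` field; confirm this against the ``isaac_ros_tensor_list_interfaces`` message definition.

.. code:: python

   # Minimal sketch of a node that consumes the inference output.
   # The `tensors` field access is an assumption about the TensorList message;
   # confirm it against the isaac_ros_tensor_list_interfaces definition.
   import rclpy
   from rclpy.node import Node

   from isaac_ros_tensor_list_interfaces.msg import TensorList


   class TensorListener(Node):
       def __init__(self):
           super().__init__('tensor_listener')
           self.subscription = self.create_subscription(
               TensorList, 'tensor_sub', self.callback, 10)

       def callback(self, msg):
           # Report how many output tensors arrived in this message.
           self.get_logger().info(f'Received {len(msg.tensors)} tensor(s)')


   def main():
       rclpy.init()
       node = TensorListener()
       rclpy.spin(node)
       node.destroy_node()
       rclpy.shutdown()


   if __name__ == '__main__':
       main()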