Tutorial to Run NITROS-Accelerated Graph with Argus Camera

    graph LR;
      argus_node("ArgusMonoNode (Raw Image)") --> rectify_node("RectifyNode (Rectified Image)");
      rectify_node --> encoder_node("DnnImageEncoderNode (DNN Pre-Processed Tensors)");
      encoder_node --> triton_node("TritonNode (DNN Inference)");
      triton_node --> unet_decoder_node("UNetDecoderNode (Segmentation Image)");

If you have an Argus-compatible camera, you can also use the launch file provided in this package to start a fully NITROS-accelerated image segmentation graph.
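
As an illustration of how such a graph fits together, the sketch below composes the nodes from the diagram into a single ROS 2 container so NITROS can pass images and tensors between them without copies. The package and plugin names are assumptions inferred from the node names in the diagram, and the topic remappings and per-node parameters are omitted for brevity; the isaac_ros_argus_unet_triton.launch.py launch file used in the steps below is the authoritative wiring.

    # Illustrative sketch only: composing the diagram's nodes in one container.
    # Package/plugin names below are assumptions inferred from the diagram;
    # remappings and parameters required by each node are omitted.
    from launch import LaunchDescription
    from launch_ros.actions import ComposableNodeContainer
    from launch_ros.descriptions import ComposableNode


    def generate_launch_description():
        container = ComposableNodeContainer(
            name='segmentation_container',
            namespace='',
            package='rclcpp_components',
            executable='component_container_mt',
            composable_node_descriptions=[
                ComposableNode(package='isaac_ros_argus_camera',
                               plugin='nvidia::isaac_ros::argus::ArgusMonoNode',
                               name='argus_mono'),
                ComposableNode(package='isaac_ros_image_proc',
                               plugin='nvidia::isaac_ros::image_proc::RectifyNode',
                               name='rectify'),
                ComposableNode(package='isaac_ros_dnn_image_encoder',
                               plugin='nvidia::isaac_ros::dnn_image_encoder::DnnImageEncoderNode',
                               name='dnn_image_encoder'),
                ComposableNode(package='isaac_ros_triton',
                               plugin='nvidia::isaac_ros::dnn_inference::TritonNode',
                               name='triton'),
                ComposableNode(package='isaac_ros_unet',
                               plugin='nvidia::isaac_ros::unet::UNetDecoderNode',
                               name='unet_decoder'),
            ],
            output='screen',
        )
        return LaunchDescription([container])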

To start the graph:

  1. Follow the quickstart up to step 7.

  2. Inside the container, install the isaac_ros_argus_camera package.

    sudo apt-get install -y ros-humble-isaac-ros-argus-camera
    
  3. Run the following launch file to start the graph (a sketch showing how to include this launch file from your own launch description follows this list):

    ros2 launch isaac_ros_unet isaac_ros_argus_unet_triton.launch.py model_name:=peoplesemsegnet_shuffleseg model_repository_paths:=['/tmp/models'] input_binding_names:=['input_2:0'] output_binding_names:=['argmax_1'] network_output_type:='argmax'
    
  4. In another terminal, visualize and validate the output of the package with rqt_image_view. First, attach another terminal to the running container:

    cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
      ./scripts/run_dev.sh
    

    Then launch rqt_image_view:

    ros2 run rqt_image_view rqt_image_view
    

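If you prefer to start the graph from your own launch description rather than from the command line, a minimal sketch follows. It assumes the packaged launch file is installed under the isaac_ros_unet package's launch directory and simply forwards the same arguments used in step 3.

    # Sketch: include the packaged launch file from a custom launch description.
    # Assumes isaac_ros_argus_unet_triton.launch.py is installed in the
    # isaac_ros_unet share directory under launch/.
    from launch import LaunchDescription
    from launch.actions import IncludeLaunchDescription
    from launch.launch_description_sources import PythonLaunchDescriptionSource
    from launch.substitutions import PathJoinSubstitution
    from launch_ros.substitutions import FindPackageShare


    def generate_launch_description():
        argus_unet_launch = IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                PathJoinSubstitution([
                    FindPackageShare('isaac_ros_unet'),
                    'launch',
                    'isaac_ros_argus_unet_triton.launch.py',
                ])
            ),
            # Same arguments as the command-line invocation in step 3.
            launch_arguments={
                'model_name': 'peoplesemsegnet_shuffleseg',
                'model_repository_paths': "['/tmp/models']",
                'input_binding_names': "['input_2:0']",
                'output_binding_names': "['argmax_1']",
                'network_output_type': 'argmax',
            }.items(),
        )
        return LaunchDescription([argus_unet_launch])
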
Inside the rqt_image_view GUI, change the topic to /unet/colored_segmentation_mask to view a colorized segmentation mask.
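
If you prefer a scripted check over a GUI, a minimal rclpy sketch such as the following can confirm that the colorized mask is being published (the node name is arbitrary; the topic name is the one above):

    # Minimal validation sketch: subscribe to the colorized segmentation mask
    # and log its resolution and encoding for each received image.
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import Image


    class MaskChecker(Node):
        def __init__(self):
            super().__init__('mask_checker')
            self.create_subscription(
                Image, '/unet/colored_segmentation_mask', self.callback, 10)

        def callback(self, msg: Image):
            self.get_logger().info(
                f'Received {msg.width}x{msg.height} mask, encoding={msg.encoding}')


    def main():
        rclpy.init()
        rclpy.spin(MaskChecker())


    if __name__ == '__main__':
        main()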

You can also view the raw segmentation mask, which is published on /unet/raw_segmentation_mask. Its pixel values correspond directly to class labels, so it is not suitable for human visual inspection.
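
Because its pixel values are class labels, the raw mask is well suited to programmatic use. The sketch below counts pixels per class with numpy, assuming the mask uses a single-channel 8-bit (mono8) encoding:

    # Sketch: count pixels per class label in the raw segmentation mask.
    # Assumes a single-channel 8-bit (mono8) encoding; adjust the dtype if not.
    import numpy as np
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import Image


    class RawMaskStats(Node):
        def __init__(self):
            super().__init__('raw_mask_stats')
            self.create_subscription(
                Image, '/unet/raw_segmentation_mask', self.callback, 10)

        def callback(self, msg: Image):
            # Reshape by row stride (step), then drop any row padding.
            labels = np.frombuffer(msg.data, dtype=np.uint8)
            labels = labels.reshape(msg.height, msg.step)[:, :msg.width]
            classes, counts = np.unique(labels, return_counts=True)
            self.get_logger().info(
                f'Class pixel counts: {dict(zip(classes.tolist(), counts.tolist()))}')


    def main():
        rclpy.init()
        rclpy.spin(RawMaskStats())


    if __name__ == '__main__':
        main()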