Camera-based Perception with Isaac Perceptor in Isaac Sim

This tutorial enables you to run autonomous navigation in Isaac Sim, using the Isaac Perceptor stack for local camera-based perception on the simulated sensor outputs.

Hardware Requirements

Simulating the scene and the robot sensors requires an RTX-enabled GPU with sufficient compute capability and memory. In particular, we recommend a machine that meets the “ideal” specification in the Isaac Sim requirements.
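
As a quick sanity check of your GPU (a minimal sketch; compare the reported model and memory against the Isaac Sim requirements page):

    # Report the GPU model, driver version, and total VRAM.
    nvidia-smi --query-gpu=name,driver_version,memory.total --format=csv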

Note

Presently, we only support running Perceptor in simulation with a single forward-facing Hawk stereo camera. We expect future releases to enable multi-camera simulation.

Note

This tutorial has been tested with Isaac Sim 4.0.

Prerequisites

To complete this tutorial you need to:

  1. Set up your development environment by following the instructions in getting started.

  2. Clone isaac_ros_common under ${ISAAC_ROS_WS}/src:

    cd ${ISAAC_ROS_WS}/src && \
       git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common.git
    
  3. Complete the Isaac Perceptor Prerequisites.

  4. Complete the Isaac Sim Setup.

Install

  1. Make sure you followed the Prerequisites and you are inside the Isaac ROS docker container.

  2. Install the required Debian packages:

    sudo apt-get update
    sudo apt-get install -y ros-humble-nova-carter-bringup
    source /opt/ros/humble/setup.bash

  3. Install the required assets:

    sudo apt-get install -y ros-humble-isaac-ros-ess-models-install ros-humble-isaac-ros-peoplesemseg-models-install
    source /opt/ros/humble/setup.bash

    ros2 run isaac_ros_ess_models_install install_ess_models.sh --eula
    ros2 run isaac_ros_peoplesemseg_models_install install_peoplesemsegnet_vanilla.sh --eula
    ros2 run isaac_ros_peoplesemseg_models_install install_peoplesemsegnet_shuffleseg.sh --eula
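
Optionally, you can verify that the model assets were downloaded. This is a minimal sketch: the install location is an assumption based on the default workspace layout used by the model-install scripts; adjust the path if your workspace differs.

    # Assumed default asset location; adjust if your workspace differs.
    ls ${ISAAC_ROS_WS}/isaac_ros_assets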

Instructions

  • Terminal #1:

    1. Open a terminal from the Isaac Sim launcher GUI, as described in Isaac Sim Setup.

    2. Start the simulation by running:

      ./isaac-sim.sh
      
    3. Open the scene at the path localhost/NVIDIA/Assets/Isaac/4.0/Isaac/Samples/NvBlox/perceptor_sample_scene.usd.

    4. Play the scene to start ROS communication from the simulation (a sketch for verifying the topic stream follows this list).

  • Terminal #2:

    1. Inside the container, launch the example:

      ros2 launch nova_carter_bringup navigation.launch.py \
      mode:=simulation run_rviz:=True
      
  • In RViz

    1. Click on the 2D Goal Pose button. You should see the robot move towards the goal location, using the costmap produced by Isaac Perceptor’s camera-based 3D perception.
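
To confirm that data is streaming from the simulation, or to send a goal without the RViz GUI, you can inspect the ROS graph from a shell inside the container. This is a minimal sketch: the image topic name is an example and may differ in your setup, and it assumes the navigation stack accepts PoseStamped goals on the /goal_pose topic in the map frame, as published by the RViz 2D Goal Pose tool.

    # List the topics coming from the simulation and the Perceptor stack.
    ros2 topic list

    # Check that sensor data is flowing (topic name is an example; pick one
    # from the list above).
    ros2 topic hz /front_stereo_camera/left/image_raw

    # Send a navigation goal from the command line instead of RViz
    # (assumes goals are accepted on /goal_pose in the map frame).
    ros2 topic pub --once /goal_pose geometry_msgs/msg/PoseStamped \
      "{header: {frame_id: map}, pose: {position: {x: 2.0, y: 0.0}, orientation: {w: 1.0}}}"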

Visualizing the Outputs

In RViz, you can see the camera images streaming from the Nova Carter and the Isaac nvblox mesh visualization of the surrounding environment. You should expect a visualization similar to the one below.

  • The colored voxels are reconstructed at a uniform resolution of 5 cm.

  • The distance map computed from the Isaac nvblox outputs. The rainbow color spectrum reflects each region’s proximity to the nearest obstacle: regions closer to obstacle surfaces are marked in warmer colors (red, orange), while regions farther away are marked in cooler colors (blue, violet).

To learn more about the topics published by Isaac nvblox, refer to nvblox ROS messages.

https://media.githubusercontent.com/media/NVIDIA-ISAAC-ROS/.github/main/resources/isaac_ros_docs/robots/nova_carter/perceptor_in_isaac_sim.png
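
To explore these outputs directly, you can list and inspect the nvblox topics from a shell inside the container. This is a minimal sketch: the topic name below is an assumption and depends on the nvblox node’s name and namespace; substitute whatever names appear in your own topic list.

    # Find the topics published by the nvblox node.
    ros2 topic list | grep -i nvblox

    # Inspect one of them (name assumed; pick one from the list above).
    ros2 topic info /nvblox_node/static_map_slice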

By default, as specified in the launch file navigation.launch.py, the front camera of the Nova Carter is used by the Isaac Perceptor algorithms. You can use the stereo_camera_configuration launch argument to customize the camera configuration when running this tutorial.
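
For example, to launch with an explicitly selected camera configuration (a sketch: the value front_configuration is an assumption based on the default front-camera setup; the valid values are listed in the tutorial referenced below):

    ros2 launch nova_carter_bringup navigation.launch.py \
        mode:=simulation run_rviz:=True \
        stereo_camera_configuration:=front_configuration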

For a detailed description of the other available configurations, refer to Tutorial: Stereo Camera Configurations for Isaac Perceptor.