Camera-based Perception with Isaac Perceptor in Isaac Sim

This tutorial will enable you to run autonomous navigation in Isaac Sim. It uses the Isaac Perceptor stack for local camera-based perception, operating on the simulated sensor outputs.

Hardware Requirements

Simulating the scene and robot sensors requires an RTX-enabled GPU with sufficient compute capability and memory. In particular, we recommend a machine meeting the “ideal” specification in the Isaac Sim requirements.


Presently, we only support running Perceptor in simulation with a single forward-facing Hawk stereo camera. We expect future releases to enable multi-camera simulation.


This tutorial has been tested with Isaac Sim 4.0.


To complete this tutorial you need to:

  1. Set up your development environment by following the instructions in getting started.

  2. Clone isaac_ros_common under ${ISAAC_ROS_WS}/src:

    cd ${ISAAC_ROS_WS}/src && \
       git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common.git
  3. Complete the Isaac Perceptor Prerequisites.

  4. Complete the Isaac Sim Setup.


Build/install the required packages:

  1. Make sure you followed the Prerequisites and you are inside the Isaac ROS docker container.

  2. Install the required Debian packages:

sudo apt-get update
sudo apt-get install -y ros-humble-nova-carter-bringup
source /opt/ros/humble/setup.bash
  3. Install the required assets:

sudo apt-get install -y ros-humble-isaac-ros-peoplesemseg-models-install ros-humble-isaac-ros-ess-models-install
source /opt/ros/humble/setup.bash
ros2 run isaac_ros_ess_models_install install_ess_models.sh --eula
ros2 run isaac_ros_peoplesemseg_models_install install_peoplesemseg_vanilla.sh --eula
ros2 run isaac_ros_peoplesemseg_models_install install_peoplesemseg_shuffleseg.sh --eula
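Before running the install scripts, you can sanity-check that the installer packages are visible to your ROS 2 environment by querying the package index. This check is a suggestion, not part of the original tutorial:

```shell
# Verify the model-installer packages are on the ROS 2 package path
# (requires that /opt/ros/humble/setup.bash has been sourced).
ros2 pkg list | grep -E 'isaac_ros_ess_models_install|isaac_ros_peoplesemseg_models_install'
```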


  • Terminal #1:

    1. Open a terminal from the Isaac Sim launcher GUI, as described in Isaac Sim Setup.

    2. Start Isaac Sim from this terminal.

    3. Open the scene at the path localhost/NVIDIA/Assets/Isaac/4.0/Isaac/Samples/NvBlox/perceptor_sample_scene.usd.

    4. Play the scene to start the ROS communication from sim.

  • Terminal #2:

    1. Inside the container, launch the example:

      ros2 launch nova_carter_bringup perceptor.launch.py \
      mode:=simulation run_rviz:=True
  • In RViz:

    1. Click on the 2D Goal Pose button and select a goal location. You should see the robot move toward the goal, using the costmap produced by Isaac Perceptor's camera-based 3D perception.
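If you prefer to send the goal from the command line instead of clicking in RViz, you can publish a PoseStamped message directly. Note that the topic name /goal_pose and the map frame are assumed Nav2 defaults, not values taken from this tutorial; adjust them if your configuration differs.

```shell
# Publish a single navigation goal 2 m ahead of the map origin.
# NOTE: /goal_pose and frame_id 'map' are assumed Nav2 defaults.
ros2 topic pub --once /goal_pose geometry_msgs/msg/PoseStamped \
  "{header: {frame_id: 'map'}, pose: {position: {x: 2.0, y: 0.0, z: 0.0}, orientation: {w: 1.0}}}"
```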

Visualizing the Outputs

RViz shows the camera images streamed from the Nova Carter and the Isaac nvblox mesh visualization of the surrounding environment. You should expect a visualization similar to the one below.

  • The colored voxels are reconstructed at a uniform resolution of 5 cm.

  • The distance map is computed from the Isaac nvblox outputs. The rainbow color spectrum reflects each region's proximity to the nearest obstacle: regions close to obstacle surfaces are shown in warmer colors (red, orange), while regions farther away are shown in cooler colors (blue, violet).

To learn more about the topics published by Isaac nvblox, refer to the nvblox ROS messages.
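To inspect what nvblox is publishing in your running session, you can use the standard ROS 2 CLI. The grep pattern below assumes the topic names contain "nvblox"; exact names may differ between releases, so treat this as a starting point rather than an authoritative list.

```shell
# List the topics advertised by the nvblox node (name pattern is an assumption).
ros2 topic list | grep nvblox

# Then inspect the type and publishers of any listed topic, e.g.:
# ros2 topic info <topic_name>
```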

By default, as specified in the launch file, the Isaac Perceptor algorithms use the front camera on the Nova Carter. You can use the stereo_camera_configuration launch argument to customize the camera configuration when running this tutorial.
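As a sketch of how the argument is passed, the invocation below selects a camera configuration explicitly on the command line. The launch-file name and the value front_configuration are assumptions based on other Isaac Perceptor tutorials, not values stated here; consult the configurations tutorial referenced below for the values valid in your release.

```shell
# Hypothetical example: explicitly select the front stereo camera configuration.
ros2 launch nova_carter_bringup perceptor.launch.py \
   mode:=simulation run_rviz:=True \
   stereo_camera_configuration:=front_configuration
```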

For a detailed description of other available configurations refer to Tutorial: Stereo Camera Configurations for Isaac Perceptor.