Camera-based Perception with Isaac Perceptor in Isaac Sim
This tutorial shows how to run autonomous navigation in Isaac Sim, using the Isaac Perceptor stack for local camera-based perception on the simulated sensor outputs.
Hardware Requirements
Simulating the scene and robot sensors requires an RTX-enabled GPU with sufficient compute capability and memory. In particular, we recommend a machine meeting the "ideal" specification in the Isaac Sim requirements.
Note
Presently, we only support running Perceptor in simulation with a single forward-facing Hawk stereo camera. We expect future releases to enable multi-camera simulation.
Note
This tutorial is only supported up to Isaac Sim 4.0. Note that other Isaac ROS packages may support Isaac Sim 4.1.
Prerequisites
To complete this tutorial, you need to:
Set up your development environment by following the instructions in getting started.
Clone isaac_ros_common under ${ISAAC_ROS_WS}/src:

cd ${ISAAC_ROS_WS}/src && \
   git clone -b release-3.1 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common.git isaac_ros_common
Complete the Isaac Perceptor Prerequisites.
Install Isaac Sim 4.0. To do so, complete the Isaac Sim Setup, installing Isaac Sim version 4.0. Note that you do not need to perform the fix in the Note beginning with “Due to a known issue in Isaac Sim 4.1, please run the following command…”.
Install
Build/install the required packages using one of the two options below.

Option 1: Install the Debian packages

1. Make sure you followed the Prerequisites and you are inside the Isaac ROS Docker container.

2. Install the required Debian packages:

sudo apt update
sudo apt-get install -y ros-humble-nova-carter-bringup
source /opt/ros/humble/setup.bash

3. Install the required assets:

sudo apt-get install -y ros-humble-isaac-ros-ess-models-install ros-humble-isaac-ros-peoplesemseg-models-install
source /opt/ros/humble/setup.bash
ros2 run isaac_ros_ess_models_install install_ess_models.sh --eula
ros2 run isaac_ros_peoplesemseg_models_install install_peoplesemsegnet_vanilla.sh --eula
ros2 run isaac_ros_peoplesemseg_models_install install_peoplesemsegnet_shuffleseg.sh --eula

Option 2: Build from source

1. Make sure you followed the Prerequisites and you are inside the Isaac ROS Docker container.

2. Use rosdep to install the package's dependencies:

sudo apt update
rosdep update
rosdep install -i -r --from-paths ${ISAAC_ROS_WS}/src/nova_carter/nova_carter_bringup/ \
   --rosdistro humble -y

3. Install the required assets:

sudo apt-get install -y ros-humble-isaac-ros-ess-models-install ros-humble-isaac-ros-peoplesemseg-models-install
source /opt/ros/humble/setup.bash
ros2 run isaac_ros_ess_models_install install_ess_models.sh --eula
ros2 run isaac_ros_peoplesemseg_models_install install_peoplesemsegnet_vanilla.sh --eula
ros2 run isaac_ros_peoplesemseg_models_install install_peoplesemsegnet_shuffleseg.sh --eula

4. Build the ROS package in the Docker container:

colcon build --symlink-install --packages-up-to nova_carter_bringup
source install/setup.bash
Instructions
Terminal #1:
Open a terminal from the Isaac Sim launcher GUI, as described in Isaac Sim Setup.
Start the simulation by running:
./isaac-sim.sh
Open the scene at the path localhost/NVIDIA/Assets/Isaac/4.0/Isaac/Samples/NvBlox/perceptor_sample_scene.usd.

Play the scene to start the ROS communication from sim.
Terminal #2:
Inside the container, launch the example:
ros2 launch nova_carter_bringup navigation.launch.py \
   mode:=simulation run_rviz:=True
In RViz
Click on the 2D Goal Pose button, then click a goal location on the map. You should see the robot move towards the goal, using the costmap produced by Isaac Perceptor's camera-based 3D perception.
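Under the hood, the 2D Goal Pose tool publishes the clicked position and heading as a pose whose yaw angle is encoded as a quaternion. A minimal sketch of that conversion, in plain Python with no ROS dependencies (the function name here is our own, for illustration only):

```python
import math

def goal_pose_2d(x, y, yaw):
    """Convert a 2D goal (x, y, yaw in radians) to a position plus
    quaternion, as RViz's 2D Goal Pose tool does when publishing.
    The rotation is about the z-axis only, so qx = qy = 0."""
    half = yaw / 2.0
    return {
        "position": (x, y, 0.0),
        # (qx, qy, qz, qw)
        "orientation": (0.0, 0.0, math.sin(half), math.cos(half)),
    }

# A goal 2 m ahead of the map origin, facing 90 degrees to the left:
pose = goal_pose_2d(2.0, 0.0, math.pi / 2)
```

In a real system this pose would be stamped with the map frame and fed to the navigation stack; the sketch only shows the geometry.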
Visualizing the Outputs
RViz shows the camera images streaming from the Nova Carter and the Isaac nvblox mesh visualization of the surrounding environment. You should expect a visualization similar to the one below.
The colored voxels are reconstructed at a uniform resolution of 5 cm.
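At a fixed 5 cm resolution, every 3D point maps to the voxel containing it by flooring each coordinate divided by the voxel size. A small sketch of that mapping in plain Python (illustrative only, not nvblox's actual implementation):

```python
import math

VOXEL_SIZE_M = 0.05  # 5 cm, the reconstruction resolution in this tutorial

def voxel_index(x, y, z, voxel_size=VOXEL_SIZE_M):
    """Map a 3D point (in meters) to the integer index of the voxel
    containing it. floor() keeps negative coordinates consistent."""
    return (math.floor(x / voxel_size),
            math.floor(y / voxel_size),
            math.floor(z / voxel_size))

# Points less than 5 cm apart can land in the same voxel:
a = voxel_index(1.02, 0.51, 0.26)  # -> (20, 10, 5)
b = voxel_index(1.04, 0.54, 0.29)  # -> (20, 10, 5)
```

Smaller voxels give finer geometry at the cost of memory and compute, which is why a mobile-robot reconstruction settles on a mid-range value like 5 cm.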
Isaac nvblox also outputs a computed distance map. Its rainbow color spectrum reflects each region's proximity to the nearest obstacle: regions close to obstacle surfaces are marked in warm colors (red, orange), while regions farther away are marked in cool colors (blue, violet).
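The distance-to-color mapping can be sketched as a simple hue sweep (plain Python; a hypothetical colormap with an assumed 2 m saturation distance, not the exact palette nvblox/RViz uses):

```python
import colorsys

def distance_to_rgb(distance_m, max_distance_m=2.0):
    """Map distance-to-nearest-obstacle to a rainbow color:
    0 m -> red (hue 0), >= max_distance_m -> violet (hue ~0.8).
    Warmer colors therefore mean closer obstacles."""
    frac = min(max(distance_m / max_distance_m, 0.0), 1.0)
    hue = 0.8 * frac  # sweep red -> orange -> green -> blue -> violet
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

near = distance_to_rgb(0.0)  # red: (1.0, 0.0, 0.0)
far = distance_to_rgb(2.0)   # violet (high blue, no green)
```

Clamping at a maximum distance keeps far-from-obstacle free space a single cool color instead of wasting the spectrum on it.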
To learn more about topics published by Isaac nvblox, you can refer to nvblox ROS messages.
By default, as specified in the launch file navigation.launch.py, the front camera on the Nova Carter is used in the Isaac Perceptor algorithms. You may use the stereo_camera_configuration launch argument to customize the camera configuration when running this tutorial.
For a detailed description of other available configurations refer to Tutorial: Stereo Camera Configurations for Isaac Perceptor.