Tutorial: Recording and Playing Back Data for Isaac Perceptor
This tutorial guides you through the steps to record and play back data for Isaac Perceptor.
Note
You must run on a hardware platform equipped with a Nova Orin Developer Kit to record data compatible with Isaac Perceptor.
Recording Data for Isaac Perceptor
To record data for use in the Running from a ROSbag tutorial below, we use the isaac_ros_nova_recorder package, which is part of the Isaac ROS Nova repository.
To generate the required data:
1. Follow the installation instructions in Isaac ROS Nova Recorder Quickstart.
2. Use the hawk-3.yaml configuration for recording data with the Nova Orin Developer Kit. This configures the recorder to record only the front, left, and right Hawk stereo cameras.
ros2 launch isaac_ros_nova_recorder nova_recorder.launch.py config:=hawk-3
Use the nova-carter_hawk-4.yaml configuration for recording data with the Nova Carter. This configures the recorder to record the front, left, right, and back Hawk stereo cameras.
ros2 launch isaac_ros_nova_recorder nova_recorder.launch.py config:=nova-carter_hawk-4
When you have finished recording, stop the recorder with Ctrl+C in the terminal.
By default, the resulting data is saved under /mnt/nova_ssd/recordings. It can be used to generate a reconstruction using the Running from a ROSbag tutorial described below.
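Before using a recording, you can optionally verify it with ros2 bag info (assuming the recording is a standard rosbag2 directory; the folder name below is a placeholder, since each recording is written to its own directory):
ros2 bag info /mnt/nova_ssd/recordings/<YOUR_RECORDING_FOLDER>
This prints the duration, message counts, and topics of the recording, which is a quick sanity check before playback.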
Running from a ROSbag
Downloading Pre-recorded Data (Optional)
You can run Isaac Perceptor on a pre-recorded ROSbag from a Nova Carter.
Download the r2b_galileo dataset from the r2b 2024 dataset on NGC. Place the dataset at $ISAAC_ROS_WS/isaac_ros_assets/r2b_2024/r2b_galileo.
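To confirm that the dataset is in the expected location before launching (assuming $ISAAC_ROS_WS is set in your environment), you can list its contents:
ls $ISAAC_ROS_WS/isaac_ros_assets/r2b_2024/r2b_galileo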
Launching Isaac Perceptor
Note
Complete either Tutorial: Running Camera-based 3D Perception with Isaac Perceptor on Nova Orin Developer Kit or Tutorial: Running Camera-based 3D Perception with Isaac Perceptor on Nova Carter. You must have set up the prerequisites, launched the Isaac Perceptor application, and obtained object detection and visual odometry visualizations.
If you want to run Isaac Perceptor from a ROSbag (rather than streaming live from sensors), use the rosbag launch argument.
If your ROSbag was recorded on a Nova Orin Developer Kit, you can launch the app as:
ros2 launch nova_developer_kit_bringup perceptor.launch.py mode:=rosbag rosbag:=<YOUR_ROSBAG_PATH>
If your ROSbag was recorded on a Nova Carter, you can launch the app as:
ros2 launch nova_carter_bringup perceptor.launch.py mode:=rosbag rosbag:=<YOUR_ROSBAG_PATH>
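For example, to replay a recording captured on a Nova Carter in the first part of this tutorial, point the rosbag argument at its directory (the folder name is a placeholder; by default recordings are written under /mnt/nova_ssd/recordings):
ros2 launch nova_carter_bringup perceptor.launch.py mode:=rosbag rosbag:=/mnt/nova_ssd/recordings/<YOUR_RECORDING_FOLDER>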
If you are using the pre-recorded ROSbag, you can run:
ros2 launch nova_carter_bringup perceptor.launch.py \
stereo_camera_configuration:=front_left_right_configuration \
mode:=rosbag \
rosbag:=$ISAAC_ROS_WS/isaac_ros_assets/r2b_2024/r2b_galileo
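Once the application is running, you can check from a second terminal that data is flowing from the bag. ros2 topic list shows the topics currently advertised, and ros2 topic hz reports the publishing rate of a topic of your choice (replace the placeholder with a topic from the list):
ros2 topic list
ros2 topic hz <TOPIC_OF_INTEREST>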
Visualizing the Outputs
Proceed to Foxglove Studio to visualize the sensor outputs and the mesh of the surrounding environment. If your ROSbag was recorded on a Nova Orin Developer Kit, follow the steps in Visualizing the Outputs from Isaac Perceptor on Nova Orin Developer Kit. If your ROSbag was recorded on a Nova Carter, follow Visualizing the Outputs from Isaac Perceptor on Nova Carter.