Dynamic Reconstruction with RealSense

https://media.githubusercontent.com/media/NVIDIA-ISAAC-ROS/.github/main/resources/isaac_ros_docs/repositories_and_packages/isaac_ros_nvblox/realsense_dynamic_example.gif

This tutorial demonstrates how to reconstruct a scene containing moving elements with nvblox, using a RealSense camera.

The algorithm used for nvblox dynamic reconstruction is based on the following paper:

  • Lukas Schmid, Olov Andersson, Aurelio Sulser, Patrick Pfreundschuh, and Roland Siegwart.
    Dynablox: Real-time Detection of Diverse Dynamic Objects in Complex Environments
    in IEEE Robotics and Automation Letters (RA-L), Vol. 8, No. 10, pp. 6259-6266, October 2023.

For more information about dynamic reconstruction in nvblox, see Technical Details.

Note

Dynamic reconstruction requires accurate pose estimation. Objects that move more slowly than the odometry drifts cannot be detected as dynamic.

Example with RealSense Live Data

Before continuing with this example, you must have successfully completed the Static Reconstruction with RealSense tutorial.

  1. Connect the RealSense device to your machine using a USB 3 cable/port.

  2. Run the ROS Docker container using the run_dev.sh script:

    cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
    ./scripts/run_dev.sh
    
  3. Source the workspace:

    source /workspaces/isaac_ros-dev/install/setup.bash
    
  4. Verify that the RealSense camera is connected by running realsense-viewer (a command-line alternative is sketched after this list):

    realsense-viewer
    
  5. If successful, run the launch file to spin up the example:

    ros2 launch nvblox_examples_bringup realsense_dynamics_example.launch.py
    
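
If realsense-viewer is not available, you can also verify the connection from the command line. This is a minimal sketch assuming the librealsense command-line utilities are installed inside the container:

    # List connected RealSense devices; the full output also reports the
    # USB type descriptor, which should be 3.x for this example.
    rs-enumerate-devices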

Note

If you want to restrict odometry to a 2D plane (for example, to run a robot in a flat environment), you can use the flatten_odometry_to_2d argument.
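
For example, a minimal sketch of passing this argument to the launch file above (assuming it accepts a boolean value):

    ros2 launch nvblox_examples_bringup realsense_dynamics_example.launch.py \
        flatten_odometry_to_2d:=True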

Example with RealSense Recorded Data

If you want to run the example on recorded data, see RealSense Data Recording for Nvblox.
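
As a rough sketch, replaying a recording amounts to passing the bag to the same launch file. This assumes the dynamics example accepts a rosbag argument in the same way as the static RealSense example; <path-to-rosbag> is a placeholder for your recording:

    # Assumption: the launch file exposes a rosbag argument like the static example.
    ros2 launch nvblox_examples_bringup realsense_dynamics_example.launch.py \
        rosbag:=<path-to-rosbag>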

Troubleshooting

See RealSense Issues.