Ouster OS1 LiDAR Examples#
This page contains tutorials for running nvblox with the Ouster OS1 3D LiDAR sensor.
Note
This tutorial only supports running from rosbags, specifically the DOALS dataset. For live OS1 sensor integration, you will need to modify the provided launch file to include a sensor driver.
Prerequisites#
These are the steps common to running all examples on the Ouster OS1.
Complete the Getting Started guide.
Complete the nvblox quickstart.
OS1 Example#
This example runs nvblox-based reconstruction from Ouster OS1 3D LiDAR data recorded in a rosbag.
Download and convert the DOALS dataset:
Follow the instructions on the DOALS dataset page to download hauptgebaeude.zip.
Unzip the dataset in the assets folder:
cd $ISAAC_ROS_WS/isaac_ros_assets
unzip <DOWNLOAD_FOLDER>/hauptgebaeude.zip
Convert the ROS1 bag to ROS2 format:
pip install rosbags --break-system-packages
rosbags-convert --src hauptgebaeude/sequence_1/2020-02-20-11-58-45.bag --dst hauptgebaeude_ros2
Note
The --break-system-packages option is required on recent versions of pip if you are not using a virtual environment.
Note
For more details, see the Rosbags Conversion Guide.
Note
If you find the DOALS dataset useful for your research, please consider citing the corresponding paper in your publications.
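If you want to sanity-check the conversion before running the example, the rosbags library installed above can also read the converted bag directly from Python. The following is a minimal sketch, not part of nvblox; the bag directory name matches the --dst argument used during conversion:

from pathlib import Path
from rosbags.highlevel import AnyReader

# Open the converted ROS 2 bag and list its topics, message types, and message counts
with AnyReader([Path('hauptgebaeude_ros2')]) as reader:
    for connection in reader.connections:
        print(connection.topic, connection.msgtype, connection.msgcount)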
Activate the Isaac ROS environment:
isaac-ros activate
Run the OS1 example with your converted rosbag:
ros2 launch nvblox_examples_bringup os1_example.launch.py \
    rosbag:=$ISAAC_ROS_WS/isaac_ros_assets/hauptgebaeude_ros2 mode:=dynamic
Note
The OS1 example supports the mode:=static (default) and mode:=dynamic arguments.
Because the DOALS dataset contains groups of people walking around, dynamic mode is recommended for reconstructing the scene.
Unlike the camera-based examples, the LiDAR example does not support people segmentation or detection modes.
Note
The DOALS dataset contains motion-distorted LiDAR data (i.e., the sensor was moving significantly while scans were recorded).
To improve the quality of the reconstruction, nvblox runs CUDA-accelerated motion compensation.
Each point in the pointcloud is motion compensated by approximating the sensor frame at the time of the
point's acquisition with an interpolation of the sensor pose between scan start and scan end.
The use_lidar_motion_compensation parameter (default is true) can be used to enable or disable motion compensation.
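For intuition, the following is an illustrative NumPy/SciPy sketch of this per-point interpolation scheme. It is not nvblox's CUDA implementation, and all names in it are hypothetical:

import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def motion_compensate(points, point_times, rot_start, trans_start,
                      rot_end, trans_end, t_start, t_end):
    # points: (N, 3) LiDAR points in the sensor frame
    # point_times: (N,) per-point acquisition timestamps
    # rot_*/trans_*: sensor orientation (scipy Rotation) and translation
    #                at scan start and scan end
    # Normalized time of each point within the scan, in [0, 1]
    alpha = (point_times - t_start) / (t_end - t_start)
    # Interpolate orientation with SLERP and translation linearly
    slerp = Slerp([0.0, 1.0], Rotation.concatenate([rot_start, rot_end]))
    rotations = slerp(alpha)
    translations = (1.0 - alpha)[:, None] * trans_start + alpha[:, None] * trans_end
    # Express every point in the common fixed frame using its own interpolated pose
    return rotations.apply(points) + translations

Setting use_lidar_motion_compensation to false skips this step and treats the whole scan as if it were captured at a single sensor pose.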