Human Reconstruction in Isaac Sim
This tutorial demonstrates how to perform human reconstruction in nvblox with Isaac Sim. The detected humans are then visible in a separate dynamic costmap that can be used for navigation with Nav2. If you want to know more about how human reconstruction works, see Technical Details.
Note
Due to a known issue, there is no animation of humans in the scene.
Tutorial Walkthrough
Use Isaac Sim for human reconstruction in nvblox. nvblox uses the ground truth semantic segmentation coming from the simulation as input to detect humans. This demonstration relies on the extensions omni.anim.people and omni.anim.navigation to make humans navigate in the environment while the robot is moving.
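Conceptually, nvblox uses the segmentation mask to split each incoming depth frame into a human part and a static part, which feed the dynamic and static reconstructions respectively. The following is a minimal numpy sketch of that masking step; the array shapes, the function name, and the label value are illustrative assumptions, not nvblox's actual API:

```python
import numpy as np

PERSON_LABEL = 1  # illustrative label id for the "person" class


def split_depth_by_semantics(depth, semantics, person_label=PERSON_LABEL):
    """Split a depth image into a human and a static depth image.

    Pixels that do not belong to the selected class are set to 0
    (treated as invalid) in the corresponding output.
    """
    human_mask = semantics == person_label
    human_depth = np.where(human_mask, depth, 0.0)
    static_depth = np.where(human_mask, 0.0, depth)
    return human_depth, static_depth


# Tiny example: a 2x2 depth frame with one "person" pixel.
depth = np.array([[1.5, 2.0], [2.5, 3.0]], dtype=np.float32)
semantics = np.array([[0, 1], [0, 0]], dtype=np.int32)
human_depth, static_depth = split_depth_by_semantics(depth, semantics)
print(human_depth)   # only the person pixel keeps its depth
print(static_depth)  # the complement: person pixel zeroed out
```

In the real pipeline this split happens per frame on the GPU; the sketch only illustrates why a correct semantic class on the human prims is essential for the dynamic costmap.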
Running the Demonstration Scene
This section uses the default human paths.
Before continuing this example, you must have successfully completed the Static Reconstruction in Isaac Sim.
Open the Isaac Sim terminal and export the necessary environment variables as explained in steps 2-3 of the Isaac Sim Setup Guide.
Start the simulation by running the following command in the terminal:
./python.sh ${ISAAC_ROS_WS}/src/isaac_ros_nvblox/nvblox_examples/nvblox_isaac_sim/omniverse_scripts/start_isaac_sim.py --with_people --gpu_physics_enabled --scenario_path=/Isaac/Samples/NvBlox/carter_warehouse_navigation_with_people.usd
Note
For more information on the start_isaac_sim script, see the README.

Note

Because the animation requires execution of Python scripts, running the scene with the UI asks you to confirm that you want to enable script execution. Click Yes to start the scene and the human animation.
In another terminal, run the ROS Docker container using the run_dev.sh script:
cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
./scripts/run_dev.sh
Inside the container, build and source the workspace:
cd /workspaces/isaac_ros-dev && \
colcon build --symlink-install && \
source install/setup.bash
Launch nvblox configured for human mapping:
ros2 launch nvblox_examples_bringup isaac_sim_humans_example.launch.py
Running with Custom Human Paths
To make the humans follow non-default paths, you can either:
Change the default human animation file directly on the server
Use the randomization options of the start_isaac_sim script. To do so:
./python.sh ${ISAAC_ROS_WS}/src/isaac_ros_nvblox/nvblox_examples/nvblox_isaac_sim/omniverse_scripts/start_isaac_sim.py \
    --with_people --scenario_path=/Isaac/Samples/NvBlox/carter_warehouse_navigation_with_people.usd \
    --gpu_physics_enabled --random_command_generation --anim_people_waypoint_dir <path_to_folder>
This launches the scene headless (without visualization) and generates a new human_cmd_file.txt in <path_to_folder>. By default, it generates 5 waypoints per human, but you can change this with the option --num_waypoints=<number_of_waypoints>. You can then upload the generated file to replace the default one, or rerun the command from step 2, adding --use_generated_command_file --anim_people_waypoint_dir <path_to_folder> to automatically use the generated command file.
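If you want to inspect or post-process the generated waypoints, the command file is plain text. Below is a small parsing sketch, assuming each line follows the omni.anim.people pattern "CharacterName GoTo x y z _"; the character names and the exact format of your generated file are assumptions, so adapt the parser to what the script actually produces:

```python
def parse_goto_commands(text):
    """Parse 'CharacterName GoTo x y z _' lines into waypoints per character.

    Lines that do not match the GoTo pattern are skipped.
    """
    waypoints = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 5 and parts[1] == "GoTo":
            name = parts[0]
            x, y, z = (float(v) for v in parts[2:5])
            waypoints.setdefault(name, []).append((x, y, z))
    return waypoints


# Illustrative file contents (not output from an actual run).
sample = """Character_01 GoTo 10.0 4.0 0.0 _
Character_01 GoTo 12.5 6.0 0.0 _
Character_02 GoTo -3.0 1.0 0.0 _
"""
cmds = parse_goto_commands(sample)
print(cmds["Character_01"])  # list of (x, y, z) waypoints
```

This can help you verify that the number of waypoints per human matches your --num_waypoints setting before uploading the file.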
Running on a Custom Scene
To test the reconstruction on another scene:
Complete the Running the Demonstration Scene section.
Make sure you use the same robot USD so that the topic names and the Isaac Sim ROS bridge are correctly set up.
Make sure that any humans you add to the scene have the person semantic segmentation class. To do so, you can use the Semantics Schema Editor on the top prim of the additional humans.
Make sure that all humans are under /World/Humans so that they are picked up by the randomization.
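The two requirements above (the person semantic class and the /World/Humans parent prim) can be checked mechanically. The following is a hypothetical helper that operates on plain (prim_path, semantic_class) pairs you would extract from the stage yourself; it is not an Isaac Sim API:

```python
HUMANS_ROOT = "/World/Humans"


def check_human_prims(prims):
    """Return a list of problems for (prim_path, semantic_class) pairs."""
    problems = []
    for path, semantic_class in prims:
        if not path.startswith(HUMANS_ROOT + "/"):
            problems.append(f"{path}: not under {HUMANS_ROOT}")
        if semantic_class != "person":
            problems.append(
                f"{path}: semantic class is {semantic_class!r}, expected 'person'"
            )
    return problems


# Illustrative prim paths, not assets from the actual scene.
prims = [
    ("/World/Humans/female_adult_01", "person"),
    ("/World/extra_human", "person"),          # wrong parent prim
    ("/World/Humans/male_adult_02", "human"),  # wrong semantic class
]
for problem in check_human_prims(prims):
    print(problem)
```

A human that fails either check is silently ignored by the randomization or by nvblox's human detection, so checks like these can save a debugging round trip.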
Troubleshooting
Refer to the Isaac Sim Issues.