Tutorial for cuMotion with Perception

Overview

This tutorial walks through planning trajectories for a real robot using the cuMotion plugin for MoveIt 2, provided by Isaac ROS cuMotion. It demonstrates obstacle-aware planning by alternating the robot end effector between two predetermined poses while avoiding obstacles detected in the workspace.

The tutorial uses:

  * A Universal Robots UR5e arm
  * A Hawk stereo camera or an Intel RealSense camera for perception
  * The cuMotion plugin for MoveIt 2 for obstacle-aware planning
  * nvblox for 3D reconstruction of obstacles in the workspace

Warning

The obstacle avoidance behavior demonstrated in this tutorial is not a safety function and does not comply with any national or international functional safety standards. When testing obstacle avoidance behavior, do not use human limbs or other living entities.

Requirements

Please ensure that you have the following available:

  * A Universal Robots UR5e arm
  * A Hawk stereo camera or an Intel RealSense camera
  * An NVIDIA Jetson or x86_64 system with an NVIDIA GPU, set up as described in the getting started guide

Tutorial

Set Up Development Environment

  1. Set up your development environment by following the instructions in the getting started guide.

  2. Complete the Hawk setup tutorial if using a Hawk stereo camera. Alternatively, complete the RealSense setup tutorial if using a RealSense camera.

  3. It is recommended to use a PREEMPT_RT kernel on Jetson. Follow the PREEMPT_RT for Jetson guide outside of the container (a quick way to check which kernel is running is shown at the end of this list).

  4. Clone isaac_ros_common under ${ISAAC_ROS_WS}/src.

    cd ${ISAAC_ROS_WS}/src && \
      git clone -b release-3.1 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common.git isaac_ros_common
    
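If you followed step 3 on a Jetson, you can check whether a PREEMPT_RT kernel is actually running (outside of the container). This is a generic Linux check; the exact version string varies between Jetson releases.

    # A real-time kernel typically reports "PREEMPT_RT" (or an "rt" suffix)
    # in the kernel version strings printed below.
    uname -r
    uname -v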

Build Isaac ROS cuMotion

  1. Clone isaac_manipulator under ${ISAAC_ROS_WS}/src.

    cd ${ISAAC_ROS_WS}/src && \
      git clone --recursive -b release-3.1 https://github.com/NVIDIA-ISAAC-ROS/isaac_manipulator.git isaac_manipulator
    
  2. Launch the Docker container using the run_dev.sh script:

    cd $ISAAC_ROS_WS && ./src/isaac_ros_common/scripts/run_dev.sh
    
  3. There are two options for installing the packages needed by this tutorial: from Debian or from source. The steps below build from source:

    1. Use rosdep to install the package’s dependencies:

      rosdep update
      rosdep install -i -r --from-paths ${ISAAC_ROS_WS}/src/isaac_manipulator/isaac_manipulator_bringup/ --rosdistro humble -y
      
    2. Build and source the ROS workspace:

      cd ${ISAAC_ROS_WS}
      colcon build --symlink-install --packages-up-to isaac_manipulator_bringup
      source install/setup.bash
      
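Optionally, confirm that the package built and is visible in the sourced workspace. This is a standard ROS 2 check, not specific to this tutorial:

    # Should print an install prefix under ${ISAAC_ROS_WS}/install
    ros2 pkg prefix isaac_manipulator_bringup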

Set Up UR5e Robot

  1. Install the UR Robot Driver:

    sudo apt install ros-humble-ur ros-humble-joint-trajectory-controller
    

    Note

    To avoid having to manually install the apt package each time the container is run, consider installing it in a custom layer by following the instructions here.

  2. Set up the UR robot by following the instructions here.

  3. Create a program for external control by following the instructions here.

    Warning

    Please perform this step carefully and extract the robot's calibration as explained here; otherwise, the TCP pose will not be correct inside the ROS ecosystem. A sketch of the extraction command is shown after this list.

  4. Note down the IP address of the robot and substitute it for <ROBOT_IP_ADDRESS> in the commands below.

  5. Use the moveit_calibration package to perform hand-eye calibration of the camera with respect to the robot, then update the corresponding transform values in static_transforms.launch.py, which can be found here.

  6. Update the values for world_pose_target1_frame and world_pose_target2_frame in static_transforms.launch.py, located here, so that they define two distinct poses that are reachable by the robot (one way to find suitable values is sketched below).
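As referenced in the warning under step 3, the UR ROS 2 driver ships a calibration extraction utility. The sketch below shows its typical invocation; the output path is an arbitrary example and the launch arguments may differ between driver releases, so treat the linked instructions as authoritative.

    # Extract the robot-specific kinematics calibration from the controller
    # and save it to a YAML file (example path).
    ros2 launch ur_calibration calibration_correction.launch.py \
      robot_ip:=<ROBOT_IP_ADDRESS> \
      target_filename:=${HOME}/ur5e_calibration.yaml

The resulting file can then be passed to the UR driver through its kinematics parameters launch argument; check the driver documentation for the exact argument name in your release.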
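For step 6, one way to choose target poses is to jog the robot to a desired end-effector pose with the teach pendant and read back the transform with tf2 while the UR driver is running. The frame names below (base_link and tool0) are the UR driver defaults and are assumptions; adjust them and the reference frame to match your static_transforms.launch.py.

    # Print the current end-effector pose relative to the robot base;
    # copy the translation and rotation into static_transforms.launch.py
    # for world_pose_target1_frame, then repeat for the second target.
    ros2 run tf2_ros tf2_echo base_link tool0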

Set Up Perception Deep Learning Models

Note

The ESS model is only required when using a Hawk stereo camera.

  1. Prepare the Light ESS model to run depth estimation:

    ros2 run isaac_ros_ess_models_install install_ess_models.sh --eula
    
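The script above prepares the Light ESS model, including the TensorRT engine file used later in this tutorial. Optionally, confirm that the engine exists at the expected path (the versioned directory name matches the launch command in the final section and may change in newer releases):

    ls ${ISAAC_ROS_WS}/isaac_ros_assets/models/dnn_stereo_disparity/dnn_stereo_disparity_v4.0.0/light_ess.engine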

Run Launch Files and Deploy to Robot

Note

We recommend setting a ROS_DOMAIN_ID via export ROS_DOMAIN_ID=<ID_NUMBER> in every new terminal in which you run ROS commands, to avoid interference with other computers on the same network (ROS Guide).
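For example (the ID below is arbitrary; pick any value that is unique on your network):

    # Run in every terminal before issuing ROS 2 commands
    export ROS_DOMAIN_ID=42
    printenv ROS_DOMAIN_ID   # verify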

  1. On the UR teach pendant, ensure that the robot’s remote program is loaded and that the robot is paused or stopped for safety purposes.

  2. Launch the UR Robot Driver:

    ros2 launch ur_robot_driver ur_control.launch.py ur_type:=ur5e robot_ip:=<ROBOT_IP_ADDRESS> launch_rviz:=false
    
  3. Open a new terminal inside the Docker container:

    cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
      ./scripts/run_dev.sh
    
  4. Launch MoveIt with cuMotion for the UR5e robot:

    ros2 launch isaac_ros_cumotion_examples ur.launch.py ur_type:=ur5e robot_ip:=<ROBOT_IP_ADDRESS> launch_rviz:=true

Warning

Please add any obstacles that are not visible to the camera to the planning scene to prevent potential collisions. Consult the documentation here. One way to add an obstacle from the command line is sketched below.
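As one possible approach, a collision object can be added to the MoveIt planning scene from the command line once move_group is running. The sketch below publishes a box named table just below the robot base; the frame name, dimensions, and pose are assumptions to adapt to your workspace, and obstacles can alternatively be added interactively from the RViz MotionPlanning panel.

    # Publish a one-off planning scene diff containing a 2 m x 2 m x 5 cm box
    # centered 3 cm below the base_link origin (example values only).
    ros2 topic pub --once /planning_scene moveit_msgs/msg/PlanningScene \
      "{is_diff: true,
        world: {collision_objects: [{
          header: {frame_id: 'base_link'},
          id: 'table',
          pose: {orientation: {w: 1.0}},
          primitives: [{type: 1, dimensions: [2.0, 2.0, 0.05]}],
          primitive_poses: [{position: {z: -0.03}, orientation: {w: 1.0}}]}]}}"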

  5. Open another terminal inside the Docker container:

    cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
      ./scripts/run_dev.sh
    
  6. Launch the pose-to-pose example. The steps below assume a Hawk stereo camera:

    1. Install the Isaac ROS Hawk package:

      sudo apt-get install -y ros-humble-isaac-ros-hawk
      
    2. Launch the pose-to-pose example:

      ros2 launch isaac_manipulator_bringup cumotion_nvblox_pose_to_pose.launch.py \
       ess_engine_file_path:=${ISAAC_ROS_WS}/isaac_ros_assets/models/dnn_stereo_disparity/dnn_stereo_disparity_v4.0.0/light_ess.engine \
       camera_type:=hawk ur_type:=ur5e robot_ip:=<ROBOT_IP_ADDRESS>
      
  7. On the UR teach pendant, press the play button to enable the robot.

https://media.githubusercontent.com/media/NVIDIA-ISAAC-ROS/.github/main/resources/isaac_ros_docs/reference_workflows/isaac_manipulator/ur5e_pose_to_pose.gif

“Pose-to-pose” example with obstacle avoidance via cuMotion and nvblox on a UR5e robot.