Set Up Hardware and Software for Real Robot#
Overview#
This tutorial walks through the process of setting up the hardware and software for a real robot with Isaac for Manipulation.
Set Up UR Robot#
Refer to the Set Up UR Robot section.
Set Up Cameras for Robot#
Refer to the Set Up Cameras for Robot section.
Note
Multiple cameras can help reduce occlusion and noise in the scene and therefore increase the quality and completeness of the 3D reconstruction used for collision avoidance.
While the Pick and Place workflow with multiple cameras runs scene reconstruction for obstacle-aware planning on all cameras, object detection and pose estimation are only enabled on the camera with the lowest index.
Reflective or smooth, featureless surfaces in the environment may increase noise in the depth estimation.
Use of multiple cameras is recommended.
Mixing stereo camera types is untested but may work with modifications to the launch files.
Warning
The obstacle avoidance behavior demonstrated in this tutorial is not a safety function and does not comply with any national or international functional safety standards. When testing obstacle avoidance behavior, do not use human limbs or other living entities.
Set Up Development Environment#
Set up your development environment by following the instructions in getting started.
(Optional) Install dependencies for any sensors you want to use by following the sensor-specific guides.
Note
We strongly recommend installing all sensor dependencies before starting any quickstarts. Some sensor dependencies require restarting the development environment during installation, which will interrupt the quickstart process.
Build Isaac for Manipulation Packages#
Activate the Isaac ROS environment:
isaac-ros activate

Install the prebuilt Debian package:
sudo apt-get update
sudo apt-get install -y ros-jazzy-isaac-manipulator-bringup
Clone this repository under ${ISAAC_ROS_WS}/src:

cd ${ISAAC_ROS_WS}/src && git clone --recursive -b release-4.0 https://github.com/NVIDIA-ISAAC-ROS/isaac_manipulator.git isaac_manipulator
Activate the Isaac ROS environment:
isaac-ros activate

Use rosdep to install the package’s dependencies:

sudo apt-get update
rosdep update && rosdep install --from-paths ${ISAAC_ROS_WS}/src/isaac_manipulator/isaac_manipulator_bringup --ignore-src -y
Install the segment_anything package via pip:

pip install --no-deps --break-system-packages git+https://github.com/facebookresearch/segment-anything.git
Build the package from source:
cd ${ISAAC_ROS_WS}
export MANIPULATOR_INSTALL_ASSETS=1
colcon build --symlink-install --packages-up-to isaac_manipulator_bringup
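The build above assumes ISAAC_ROS_WS and MANIPULATOR_INSTALL_ASSETS are set in the current shell. As a minimal sketch (require_env is a hypothetical helper, not part of Isaac ROS), you can guard against unset variables before invoking colcon:

```shell
# Hypothetical helper (not part of Isaac ROS): fail fast if a required
# environment variable is unset or empty.
require_env() {
    eval "val=\"\${$1}\""
    if [ -z "$val" ]; then
        echo "ERROR: $1 is not set" >&2
        return 1
    fi
    echo "OK: $1=$val"
}

# Example usage before building:
# require_env ISAAC_ROS_WS && require_env MANIPULATOR_INSTALL_ASSETS
```

Running the guard first makes a missing variable an explicit error instead of a confusing mid-build failure.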
Source the ROS workspace:
Note
Make sure to repeat this step in every terminal created inside the Isaac ROS environment.
Because this package was built from source, the enclosing workspace must be sourced for ROS to be able to find the package’s contents.
source install/setup.bash
Set Up Perception Deep Learning Models#
Prepare the ESS model to run depth estimation:
ros2 run isaac_ros_ess_models_install install_ess_models.sh --eula
The FoundationStereo model is also available for stereo depth estimation. If you run into any issues, refer to the FoundationStereo documentation for model setup.
export FOUNDATIONSTEREO_MODEL_RES=low_res
ros2 run isaac_ros_foundationstereo_models_install install_foundationstereo_models.sh --eula \
    --model_res low_res
Set up the FoundationPose model. If you run into any issues, refer to the FoundationPose documentation for model setup.
ros2 run isaac_ros_foundationpose_models_install install_foundationpose_models.sh --eula
Set up the SyntheticaDETR model. If you run into any issues, refer to the SyntheticaDETR documentation for model setup.
ros2 run isaac_ros_rtdetr_models_install install_rtdetr_models.sh --eula
Set up the Grounding DINO model. If you run into any issues, refer to the GroundingDINO documentation for model setup.
ros2 run isaac_ros_grounding_dino_models_install install_grounding_dino_models.sh --eula
Install the segment_anything package via pip:

pip install --no-deps --break-system-packages git+https://github.com/facebookresearch/segment-anything.git
Please follow the GroundingDINO, SegmentAnything, and SegmentAnything2 documentation to set up and build the models for the Jetson Thor. You will need to convert the models from the pth format to the onnx format so that Isaac ROS nodes can pick them up.

Note

Converting the models is only supported on x86 machines. You will have to copy the converted onnx files to the right locations on the Jetson machine.

Build the asset bringup package:

export MANIPULATOR_INSTALL_ASSETS=1
export FOUNDATIONSTEREO_MODEL_RES=low_res
colcon build --packages-up-to isaac_manipulator_asset_bringup

To set up perception models such as FoundationPose and SyntheticaDETR, and to download sample object assets, run the following command. This command also gives users a final verification that all the models are installed correctly:

export MANIPULATOR_INSTALL_ASSETS=1
export FOUNDATIONSTEREO_MODEL_RES=low_res
ros2 run isaac_manipulator_asset_bringup setup_perception_models.py --models all
Note
Running this command can take up to 15 minutes on Jetson AGX Thor.

NVLabs has provided a DOPE pre-trained model using the HOPE dataset. To train your own DOPE model, see here.
Note
Please refer to the isaac_manipulator_asset_bringup package for more information. For the SEGMENT_ANYTHING and SEGMENT_ANYTHING2 models, you are expected to convert the model from the pth format to onnx. This conversion only works on x86 machines, so you will have to copy the converted onnx files to the right locations on the Jetson machine.

Note

If you see the message ERROR: segment_anything package not found, please install it via: pip install --no-deps --break-system-packages git+https://github.com/facebookresearch/segment-anything.git, you need to install the segment_anything package and build the package again. This installation is only supported on x86 machines; you will have to copy the converted onnx files to the right locations on the Jetson machine.

As a sanity check, run this command to verify that the assets are set up correctly:
ros2 run isaac_manipulator_asset_bringup setup_perception_models.py --models all
You should see the following output:
INFO: === Setting up FoundationPose assets ===
INFO: Mac and Cheese assets already exist at /workspaces/isaac_ros-dev/ros_ws/isaac_ros_assets/isaac_ros_foundationpose/Mac_and_cheese_0_1 - Skipping download
INFO: === Setting up DOPE model ===
INFO: DOPE model setup completed successfully
INFO: === Setting up Segment Anything assets ===
INFO: Segment Anything assets already exist at /workspaces/isaac_ros-dev/ros_ws/isaac_ros_assets/isaac_ros_segment_anything - Skipping download
INFO: === Setting up SAM model ===
INFO: SAM model already exists at /workspaces/isaac_ros-dev/ros_ws/isaac_ros_assets/isaac_ros_segment_anything/vit_b.pth - Skipping download
INFO: ONNX model already exists at /workspaces/isaac_ros-dev/ros_ws/isaac_ros_assets/models/segment_anything/1/model.onnx - Skipping conversion
INFO: Config file already exists at /workspaces/isaac_ros-dev/ros_ws/isaac_ros_assets/models/segment_anything/config.pbtxt - Skipping copy
INFO: SAM model setup completed successfully
INFO: === Setting up UR DNN Policy assets ===
INFO: UR DNN Policy assets already exist at /workspaces/isaac_ros-dev/ros_ws/isaac_ros_assets/isaac_manipulator_ur_dnn_policy - Skipping download
INFO: Setup results:
INFO: All requested models were set up successfully!
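If the sanity check reports missing assets, a quick way to spot-check individual files is a small helper like the following. This is a sketch: check_model is a hypothetical name, and the example paths are assumptions derived from the sample output above (they correspond to ${ISAAC_ROS_WS}/isaac_ros_assets in the development container).

```shell
# Hypothetical helper: report whether a given model or asset file exists.
check_model() {
    if [ -f "$1" ]; then
        echo "FOUND: $1"
    else
        echo "MISSING: $1"
    fi
}

# Example usage (paths are assumptions mirroring the sample output above):
check_model "${ISAAC_ROS_WS}/isaac_ros_assets/models/segment_anything/1/model.onnx"
check_model "${ISAAC_ROS_WS}/isaac_ros_assets/isaac_ros_segment_anything/vit_b.pth"
```

Any MISSING line points you back at the corresponding model-install step in this section.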
Build Robotiq Gripper Dependencies#
Clone the Isaac ROS fork of ros2_robotiq_gripper and tylerjw/serial under ${ISAAC_ROS_WS}/src:

cd ${ISAAC_ROS_WS}/src && \
    git clone --recursive https://github.com/NVIDIA-ISAAC-ROS/ros2_robotiq_gripper && \
    git clone -b ros2 https://github.com/tylerjw/serial
Build the gripper dependencies:
cd ${ISAAC_ROS_WS}
colcon build --symlink-install --packages-select-regex robotiq* serial --cmake-args "-DBUILD_TESTING=OFF" && \
    source install/setup.bash # Source the workspace after building gripper dependencies
Configure Robotiq Gripper#
Before running the gear assembly or pick and place workflow, make sure to follow the instructions for setting up the UR robot and the gripper.

You will have to set the Tool I/O settings to User mode, especially if you see this error message: Failed to communicate with the Robotiq gripper
Note
If there are any issues with communication between the robot and the Jetson unit, take a look at this section. Run the Driver and Hardware Tests to make sure your robot drivers are in a good state.
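When communication fails, a useful first step is confirming basic network reachability of the robot controller from the Jetson before digging into driver issues. This is a minimal sketch: check_host is a hypothetical helper, and ROBOT_IP is a placeholder you must set to your robot's actual address.

```shell
# Minimal sketch: confirm the UR controller answers pings from the Jetson.
# ROBOT_IP is a placeholder; set it to your robot's actual address first.
check_host() {
    if ping -c 1 -W 2 "$1" >/dev/null 2>&1; then
        echo "reachable: $1"
    else
        echo "unreachable: $1"
    fi
}

# Example usage:
# check_host "$ROBOT_IP"
```

An "unreachable" result suggests a cabling, subnet, or firewall problem rather than a ROS driver problem, which narrows the search considerably.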
Install Python Dependencies#
Install RSL-RL:
sudo apt-get install -y python3-git \
&& pip install --break-system-packages tensordict \
&& pip install --break-system-packages --no-deps rsl-rl-lib==3.1.1
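After installation, a quick import check confirms the Python packages landed in the interpreter you expect. This is a sketch: check_py_module is a hypothetical helper, and the importable module name rsl_rl for the rsl-rl-lib package is an assumption.

```shell
# Hypothetical helper: verify a Python module can be imported.
check_py_module() {
    if python3 -c "import $1" 2>/dev/null; then
        echo "OK: $1"
    else
        echo "MISSING: $1"
    fi
}

check_py_module tensordict
check_py_module rsl_rl   # module name for rsl-rl-lib (assumption)
```

A MISSING line usually means pip installed into a different Python environment than the one python3 resolves to.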
Run Pre-Flight Tests on Real Robot (Optional but highly recommended)#
To verify that you have set up Isaac for Manipulation correctly, run the following commands:
export ENABLE_MANIPULATOR_TESTING=on_robot
export ISAAC_MANIPULATOR_TEST_CONFIG=<config_file_path_for_your_robot>
python -m pytest ${ISAAC_ROS_WS}/src/isaac_manipulator/isaac_manipulator_bringup/test
Note
One can also use colcon test --packages-select isaac_manipulator_bringup to run the tests.
However, pytest has a better output and makes it easier to view status and progress of the tests.
Running the command above executes a series of tests that verify Isaac for Manipulation is working correctly.
If these tests fail, refer to the Isaac for Manipulation Testing Guide for more information. In that case, it is recommended to run the failing tests manually using launch_test and then inspect the results yourself.