Manipulation#
Overview#
In robotics, the term “manipulator” has traditionally had a very broad definition, referring not only to robots used for repositioning or modifying physical objects but also to robot arms generally, even when used in non-contact applications such as automated optical inspection. Increasingly, robotic manipulators are being used for tasks where their motion must adapt to changing conditions as perceived via cameras or other sensors.
An example of perception-driven manipulation is unstructured picking, where the robot might be tasked with picking a variety of objects whose positions are not known in advance. This requires modules to detect a given object, determine its pose, compute a suitable grasp (dependent on the installed gripper), plan a trajectory to bring the gripper to the desired pose while avoiding collisions, execute the planned trajectory, grasp the object, and finally plan a trajectory to the desired place point, all in real time. More challenging scenarios might require the planner to adapt to a changing environment where the presence and positions of obstacles also vary.
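The stages above can be sketched as a simple pipeline. The function names here (`detect_objects`, `estimate_pose`, `compute_grasp`, `plan_trajectory`) are hypothetical stand-ins for the modules described, not Isaac ROS APIs:

```python
# Hypothetical sketch of an unstructured-picking pipeline; each function
# stands in for one of the modules described above.

def detect_objects(image):
    """Stand-in detector: returns labels of objects seen in the image."""
    return ["box", "cylinder"]

def estimate_pose(obj, depth):
    """Stand-in pose estimator: returns a 6-DoF pose (position, quaternion)."""
    return ((0.4, 0.0, 0.1), (0.0, 0.0, 0.0, 1.0))

def compute_grasp(pose, gripper="parallel_jaw"):
    """Stand-in grasp computation; a real one depends on the installed gripper."""
    position, orientation = pose
    return {"position": position, "orientation": orientation, "gripper": gripper}

def plan_trajectory(start, goal):
    """Stand-in planner: returns a list of waypoints, or None if unreachable."""
    return [start, goal]

def pick_first_reachable(image, depth, home=(0.0, 0.0, 0.5)):
    """Try each detected object in turn; return the grasp and trajectory for
    the first object that yields a collision-free plan."""
    for obj in detect_objects(image):
        pose = estimate_pose(obj, depth)
        grasp = compute_grasp(pose)
        traj = plan_trajectory(home, grasp["position"])
        if traj is not None:
            return obj, grasp, traj
    return None

result = pick_first_reachable(image=None, depth=None)
print(result[0])  # → box
```

A real system runs these stages continuously and in real time; the point of the sketch is the data flow, where each stage consumes the previous stage's output and any stage may fail, prompting a retry on the next candidate object.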
Isaac for Manipulation consists of a set of components and reference workflows for advanced perception-driven manipulation. These components include state-of-the-art packages for object detection and object pose estimation, as well as obstacle-aware motion generation, described in more detail below.
NVIDIA cuMotion#
NVIDIA cuMotion is a software package for computing time-optimal, minimal-jerk trajectories for serial robot arms. It is capable of avoiding collisions with obstacles represented as a set of cuboids, meshes, signed distance fields (computed from one or more depth image streams using nvblox), or any combination of the three. cuMotion leverages NVIDIA hardware acceleration to compute such trajectories in a fraction of a second on Jetson Thor, or in tens of milliseconds on a discrete GPU such as an RTX 6000 (Ada Generation).
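To make the obstacle representation concrete, the following is a minimal sketch (not cuMotion's API) of how a planner can query a world of obstacles through signed distances, the same quantity a signed distance field encodes: negative inside an obstacle, positive outside, with the minimum over all obstacles giving the clearance at a point.

```python
from dataclasses import dataclass

@dataclass
class Cuboid:
    """Axis-aligned box obstacle: center (x, y, z) and full dimensions."""
    center: tuple
    dims: tuple

    def signed_distance(self, p):
        """Signed distance from point p to the box surface:
        negative inside, positive outside."""
        # Per-axis offset of p beyond the box half-extents
        q = [abs(p[i] - self.center[i]) - self.dims[i] / 2 for i in range(3)]
        outside = sum(max(c, 0.0) ** 2 for c in q) ** 0.5
        inside = min(max(q), 0.0)
        return outside + inside

def clearance(world, p):
    """Clearance at point p: distance to the closest obstacle in the world."""
    return min(obs.signed_distance(p) for obs in world)

# A 20 x 20 x 40 cm box 50 cm in front of the robot base
world = [Cuboid(center=(0.5, 0.0, 0.2), dims=(0.2, 0.2, 0.4))]
print(round(clearance(world, (1.0, 0.0, 0.2)), 2))  # → 0.4
```

In practice the field is evaluated in parallel on the GPU for every collision sphere of the robot at every candidate waypoint, which is where hardware acceleration pays off.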
In the current release, the planning capabilities of cuMotion are exposed via a plugin for MoveIt 2. In addition, a ROS 2 node is provided that segments the robot out of a depth image based on its current joint configuration, as needed to reconstruct obstacles in the environment without spurious contributions from the robot itself.
cuMotion incorporates technology developed by NVIDIA Research and leverages the cuRobo library internally.
Refer to Isaac ROS cuMotion for more on cuMotion and instructions for getting started.
Manipulation Orchestration#
Isaac for Manipulation uses behavior trees to orchestrate complex manipulation workflows. This framework allows you to build robust, modular manipulation systems by combining pre-built behavior components that handle perception, motion planning, and gripper control. The behavior tree approach provides built-in error handling, retry mechanisms, and the ability to run multiple operations in parallel.
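The core behavior-tree ideas mentioned above (sequencing, error handling, retries) can be illustrated with a minimal sketch. This is a generic toy implementation, not the `isaac_manipulator_orchestration` API:

```python
# Minimal behavior-tree sketch: each node "ticks" and reports SUCCESS or
# FAILURE; composite nodes combine the results of their children.

SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Action:
    """Leaf node wrapping a callable that returns True (success) or False."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn
    def tick(self):
        return SUCCESS if self.fn() else FAILURE

class Sequence:
    """Succeeds only if every child succeeds, in order."""
    def __init__(self, *children):
        self.children = children
    def tick(self):
        for child in self.children:
            if child.tick() == FAILURE:
                return FAILURE
        return SUCCESS

class Retry:
    """Re-ticks its child up to `attempts` times -- built-in error handling."""
    def __init__(self, child, attempts):
        self.child, self.attempts = child, attempts
    def tick(self):
        for _ in range(self.attempts):
            if self.child.tick() == SUCCESS:
                return SUCCESS
        return FAILURE

# A pick task: perception must succeed, then planning (retried on failure),
# then execution. The toy planner fails once before succeeding.
flaky_planner = iter([False, True])
tree = Sequence(
    Action("detect", lambda: True),
    Retry(Action("plan", lambda: next(flaky_planner)), attempts=3),
    Action("execute", lambda: True),
)
result = tree.tick()
print(result)  # → SUCCESS
```

Because each node exposes the same `tick` interface, perception, planning, and gripper behaviors compose into arbitrarily nested trees, which is what makes the approach modular.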
Refer to Manipulation Orchestration for detailed information about the behavior tree framework, and to the isaac_manipulator_orchestration package for the behaviors and implementation details.
Robot Configuration#
In order to generate motion or perform segmentation for a given robot, cuMotion requires two files:
- A Unified Robot Description Format (URDF) file, describing basic kinematics.
- An Extended Robot Description Format (XRDF) file, supplementing the URDF file with collision geometry (as a set of collision spheres), a definition of the configuration space (c-space) used for planning, potential modifiers to the URDF file, and other such data.
Refer to the following specification for details on XRDF.
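For orientation, an XRDF file is a YAML document along the following lines. This fragment is illustrative only; the joint names, link names, and values are placeholders, and the specification above is the authoritative reference for the schema.

```yaml
# Illustrative XRDF fragment (placeholder values; see the XRDF specification)
format: xrdf
format_version: 1.0

cspace:
  joint_names: ["joint_1", "joint_2", "joint_3"]

collision:
  geometry: robot_collision_spheres

geometry:
  robot_collision_spheres:
    spheres:
      link_1:
        - center: [0.0, 0.0, 0.1]
          radius: 0.05
```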
For convenience, a visual Robot Description Editor with XRDF support is available in Isaac Sim 4.0 and later.
Warning
In the current release, isaac_ros_cumotion also accepts robot description files in a legacy cuRobo format, but this support will be discontinued in a future release. Please use XRDF instead.
Pick and Place#
Isaac for Manipulation includes a multi-object Pick and Place workflow that demonstrates perception-driven manipulation capabilities. This reference implementation showcases the integration of multiple Isaac ROS packages working together through the manipulation orchestration framework.
The workflow supports both single-bin collection and multi-bin class-based sorting scenarios, with configurable parameters for different operational modes. It leverages behavior trees for robust coordination between perception and motion systems, enabling parallel object detection while executing sequential manipulation tasks.
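The distinction between the two scenarios can be sketched as a simple routing decision. The parameter names below are illustrative, not the workflow's actual configuration:

```python
# Hypothetical sketch of the two operational modes described above:
# single-bin collection vs. multi-bin class-based sorting.

def destination_bin(obj_class, class_to_bin=None, default_bin="bin_0"):
    """With no mapping, every object goes to the default bin (single-bin
    collection); with a mapping, objects are sorted by detected class."""
    if class_to_bin is None:
        return default_bin                          # single-bin collection
    return class_to_bin.get(obj_class, default_bin)  # class-based sorting

sorting = {"soup_can": "bin_1", "mustard_bottle": "bin_2"}
print(destination_bin("soup_can"))           # → bin_0
print(destination_bin("soup_can", sorting))  # → bin_1
```

In the actual workflow this decision is driven by the detected object class from the perception pipeline, while the behavior tree coordinates when the pick and place motions run.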
For detailed workflow architecture and implementation, refer to the Multi-Object Pick and Place section. For hands-on experience, refer to the tutorial.
Tutorials#
To continue your exploration, check out the following suggested tutorials:
More advanced tutorials accompany the Isaac for Manipulation reference workflows: