isaac_ros_unitree_g1_teleop_bringup#
Source code available on GitHub.
Overview#
Top-level launch file for Unitree G1 teleoperation combining AGILE locomotion, bimanual inverse kinematics, and finger control. Supports XR teleoperation via Isaac Teleop and RViz interactive markers.
Both simulation and real hardware are supported.
Tutorial: Unitree G1 XR Teleop#
This tutorial walks through running whole-body XR teleoperation on the Unitree G1 humanoid robot. The application combines AGILE locomotion, bimanual inverse kinematics, and finger control, all driven by an XR headset.
You will first run the application in MuJoCo simulation, then deploy on real hardware.
Prerequisites#
Note
This tutorial has been tested and qualified on Jetson AGX Thor for both simulation and real robot deployment. MuJoCo simulation is also supported on x86_64.
PICO 4 Ultra headset (if no XR headset is available, the emulator provided by Isaac Teleop Core can be used instead)
Unitree G1 robot powered on and connected to the host machine via Ethernet
Set Up Development Environment#
Set up your development environment by following the instructions in getting started.
(Optional) Install dependencies for any sensors you want to use by following the sensor-specific guides.
Note
We strongly recommend installing all sensor dependencies before starting any quickstarts. Some sensor dependencies require restarting the development environment during installation, which will interrupt the quickstart process.
Set up the Unitree G1 with a Jetson AGX Thor connected to it by following these guides:
Build isaac_ros_unitree_g1_teleop_bringup#
If using the Isaac ROS Environment in Docker mode, open a new terminal and mount ~/.cloudxr:

mkdir -p ~/.cloudxr && grep -qxF -- '-v `realpath ~/.cloudxr`:/home/admin/.cloudxr' ~/.isaac_ros_dev-dockerargs 2>/dev/null || echo "-v `realpath ~/.cloudxr`:/home/admin/.cloudxr" >> ~/.isaac_ros_dev-dockerargs
Install and build isaac_ros_unitree_g1_teleop_bringup:

Activate the Isaac ROS environment:

isaac-ros activate

Install the prebuilt Debian package:
sudo apt-get update
sudo apt-get install -y ros-jazzy-isaac-ros-unitree-g1-teleop-bringup
Install mujoco_ros2_control:

sudo apt-get install -y ros-jazzy-mujoco-ros2-control
source /opt/ros/jazzy/setup.bash
Install Git LFS:
sudo apt-get install -y git-lfs && git lfs install
Clone this repository under ${ISAAC_ROS_WS}/src:

cd ${ISAAC_ROS_WS}/src && \
  git clone -b release-4.4 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_physical_ai.git isaac_ros_physical_ai
Activate the Isaac ROS environment:

isaac-ros activate

Use rosdep to install the package’s dependencies:

sudo apt-get update
rosdep update && rosdep install --from-paths ${ISAAC_ROS_WS}/src/isaac_ros_physical_ai/isaac_ros_unitree_g1_teleop_bringup --ignore-src -y
Build the package from source:
cd ${ISAAC_ROS_WS}/ && \
  colcon build --symlink-install --packages-up-to isaac_ros_unitree_g1_teleop_bringup --base-paths ${ISAAC_ROS_WS}/src/isaac_ros_physical_ai/isaac_ros_unitree_g1_teleop_bringup
Source the ROS workspace:
Note
Make sure to repeat this step in every terminal created inside the Isaac ROS environment. Because this package was built from source, the enclosing workspace must be sourced for ROS to be able to find the package’s contents.
source install/setup.bash
Run CloudXR#
In a new terminal, clone isaac_ros_teleop:

cd ${ISAAC_ROS_WS}/src && \
  git clone --recurse-submodules -b release-4.4 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_teleop.git isaac_ros_teleop
Run CloudXR Runtime via the CloudXR Docker Container:
cd ${ISAAC_ROS_WS}/src/isaac_ros_teleop/isaac_teleop_core/IsaacTeleop && \
  ./scripts/run_cloudxr_via_docker.sh
Connect the XR headset to the teleop server. Follow the headset connection guide.
Note

If you are running this on Thor, make sure to set the Video Codec to H.264; otherwise, the headset will fail to connect.

Warning

The world frame of the headset is defined by the position of the headset and controllers at the moment of connection. Stand still and face the robot before connecting to establish a consistent world frame. To reset the world frame, disconnect and reconnect the headset while stationary. On real hardware, ensure the robot is stopped (blend_ratio set to 0.0) before disconnecting.

In the original terminal, launch the teleop application:
ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py \
  hardware_type:=mujoco input_mode:=teleop
This opens the MuJoCo viewer with the G1 robot. The robot will fall to the ground initially because there is no gantry in simulation. Press Backspace in the MuJoCo viewer to reset (you may need to press it multiple times).
Note

In simulation, blend_ratio defaults to 1.0, so the policy is active immediately.

With the controllers in your hands, start moving them. You should see the robot’s arms track your movements in the MuJoCo viewer.
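Conceptually, the arms track the controllers by applying each controller's motion since the world frame was established to the corresponding wrist target. The sketch below is a position-only illustration of that idea; the actual retargeting also tracks orientation and feeds the result to the IK solver:

```python
def retarget_wrist(controller_now, controller_at_connect, wrist_at_connect):
    """Apply the controller's displacement since connection to the wrist target.

    All arguments are [x, y, z] positions in the shared world frame.
    Position-only and unscaled for illustration; the real pipeline also
    handles orientation and solves inverse kinematics on the result.
    """
    delta = [c - c0 for c, c0 in zip(controller_now, controller_at_connect)]
    return [w + d for w, d in zip(wrist_at_connect, delta)]

# Moving the controller 10 cm forward shifts the wrist target 10 cm forward:
target = retarget_wrist([0.1, 0.0, 0.0], [0.0, 0.0, 0.0], [0.5, 0.2, 1.0])
print(target)  # → [0.6, 0.2, 1.0]
```

This is also why the world-frame warning above matters: if the headset's world frame is re-anchored while the robot is moving, the stored connection poses no longer match, and the targets jump.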
Warning
Before operating on real hardware:

Ensure the working area is free of any persons or other potential hazards.

Always start with blend_ratio at 0.0. You can increase from 0.0 to 1.0 in a single step since the ratio is smoothed internally.

Ensure the waist yaw joint is close to zero before launching. It is uncontrolled and will be held at its current position, so a rotated torso can degrade balance.

Have the disable command ready (refer to the disable step below).
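The internal smoothing mentioned above means a step change in blend_ratio takes effect gradually rather than instantly. A minimal sketch of such a ramp, assuming a simple first-order filter (the actual smoothing scheme is not specified here):

```python
def smooth_blend(current: float, target: float, dt: float,
                 time_constant: float = 0.5) -> float:
    """Move `current` toward `target` with a first-order low-pass filter.

    Illustrative model of smoothing a step change in blend_ratio; the
    real safety controller may use a different scheme and constants.
    """
    alpha = min(1.0, dt / time_constant)
    return current + alpha * (target - current)

# Stepping the target from 0.0 to 1.0 is applied gradually over many ticks:
blend = 0.0
for _ in range(100):               # 100 control ticks of 10 ms each
    blend = smooth_blend(blend, 1.0, dt=0.01)
print(round(blend, 3))             # → 0.867
```

Because of this ramp, jumping the parameter straight from 0.0 to 1.0 still produces a gentle hand-off rather than an abrupt torque change.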
Set up the network: clone isaac_ros_robots and run the setup script on the host machine, outside the Docker container:

cd ${ISAAC_ROS_WS}/src && \
  git clone -b release-4.4 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_robots.git isaac_ros_robots
Run the network setup script:
${ISAAC_ROS_WS}/src/isaac_ros_robots/isaac_ros_robots_tools/scripts/setup_network.py
The script will interactively guide you through the network setup. Make sure to select the network interface that is physically connected to the G1 robot.
Launch the application:
ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py \
  hardware_type:=real \
  input_mode:=teleop \
  network_interface:=<your_interface>
Replace <your_interface> with the network interface you selected in the previous step.

Note

The application starts, but the robot will not move because blend_ratio defaults to 0.0 on real hardware.

Tip

To verify that XR commands are reaching the controller:

ros2 topic echo /xr_teleop/ee_poses
ros2 topic echo /xr_teleop/root_twist
To disable the robot, set the blend ratio back to zero:
ros2 param set /safety_controller blend_ratio 0.0
Tip
Keep this command in your shell history so you can execute it quickly if something goes wrong.
Enable the robot by setting the blend ratio:
ros2 param set /safety_controller blend_ratio 1.0
The robot will start tracking your hand movements.
Note

After several minutes of operation, the G1 hands may lower due to temperature limits. Allow the robot to cool down before resuming.
Controller Reference#
The PICO 4 Ultra headset includes two handheld controllers. The following table summarizes what each input does during teleoperation:
| Input | Action |
|---|---|
| Left joystick | Move the robot: up = forward, down = backward, left = strafe left, right = strafe right |
| Right joystick (left/right) | Rotate the robot in place (yaw) |
| Controller motion (6-DOF) | The end-effector pose tracks the physical controller; moving and rotating the controller moves the robot’s hand correspondingly |
| Triggers (each controller has two) | Open and close the finger joints of the tri-finger hand |
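The joystick rows of the table can be sketched as a mapping from stick deflections to a planar velocity command. The gains, deadzone, and sign conventions below are illustrative assumptions, not values taken from the teleop stack:

```python
from dataclasses import dataclass

@dataclass
class Twist:
    """Simplified stand-in for a geometry_msgs/msg/Twist velocity command."""
    linear_x: float = 0.0   # forward (+) / backward (-), m/s
    linear_y: float = 0.0   # strafe left (+) / right (-), m/s
    angular_z: float = 0.0  # yaw rate, rad/s (+ = counterclockwise)

def joysticks_to_twist(left_x, left_y, right_x,
                       max_speed=0.5, max_yaw_rate=1.0, deadzone=0.1):
    """Map stick deflections in [-1, 1] to a velocity command.

    left_y drives forward/backward, left_x strafes, right_x yaws in place.
    All constants here are hypothetical, chosen only for illustration.
    """
    def shaped(v):
        return 0.0 if abs(v) < deadzone else v

    return Twist(
        linear_x=shaped(left_y) * max_speed,
        linear_y=-shaped(left_x) * max_speed,       # stick right -> strafe right
        angular_z=-shaped(right_x) * max_yaw_rate,  # stick right -> yaw clockwise
    )

# Full forward deflection on the left stick commands 0.5 m/s forward:
print(joysticks_to_twist(0.0, 1.0, 0.0).linear_x)  # → 0.5
```

A deadzone of this kind is what keeps the robot stationary when the sticks are merely resting near center.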
API#
Usage#
ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py
Launch Arguments#
| Launch Argument | Type | Default | Description |
|---|---|---|---|
| `hardware_type` | | | Hardware platform. Options: `mujoco`, `real` |
| `input_mode` | | | Input source. Options: `teleop`, `markers` |
| `network_interface` | | | Network interface for G1 communication. Only used when `hardware_type:=real` |
| | | | Enable MuJoCo GUI viewer. Only used when `hardware_type:=mujoco` |
| | | | Enable RViz visualization. Automatically set to `true` in markers mode |
| `use_foxglove` | | | Start Foxglove bridge for remote monitoring. |
ROS Topics#
Topics depend on the input_mode launch argument. In teleop mode:
| ROS Topic | Interface | Description |
|---|---|---|
| `/xr_teleop/ee_poses` | | End-effector (wrist) poses from XR headset |
| `/xr_teleop/root_twist` | | Root velocity command from XR headset |
| | | Retargeted finger joint angles from XR hand tracking |
In markers mode:
| ROS Topic | Interface | Description |
|---|---|---|
| `/ik_controller/reference_pose` | | End-effector poses published by the RViz interactive marker node |
ROS Parameters#
| Parameter | Node | Type | Default | Description |
|---|---|---|---|---|
| `blend_ratio` | `/safety_controller` | | `0.0` (real), `1.0` (simulation) | Policy activation level (0.0–1.0). Dynamically adjustable at runtime. |
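blend_ratio acts as an activation level between a safe hold behavior (0.0) and the full policy output (1.0). One plausible reading of such a parameter is a per-joint linear blend; the sketch below is illustrative only, not the controller's actual implementation:

```python
def blend_commands(policy_cmd, hold_cmd, blend_ratio):
    """Linearly blend policy joint targets with safe hold targets.

    blend_ratio=0.0 keeps the robot at the hold command (disabled);
    blend_ratio=1.0 applies the policy output fully. This per-joint
    linear mix is a hypothetical interpretation for illustration.
    """
    r = min(1.0, max(0.0, blend_ratio))  # clamp to the documented 0.0-1.0 range
    return [r * p + (1.0 - r) * h for p, h in zip(policy_cmd, hold_cmd)]

# At blend_ratio 0.25, commands sit a quarter of the way toward the policy:
print(blend_commands([1.0, -1.0], [0.0, 0.0], 0.25))  # → [0.25, -0.25]
```

Under this reading, setting the parameter back to 0.0 smoothly returns authority to the hold behavior, which is why the disable step in the tutorial uses it as the stop mechanism.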
Troubleshooting#
Test Without an XR Headset (Interactive Markers Mode)#
If the XR headset is unavailable or you want to isolate whether an issue is
with XR or the robot itself, launch with input_mode:=markers:
ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py \
input_mode:=markers
RViz opens automatically with 6-DOF interactive markers for each wrist.
The /ik_controller/reference_pose topic
replaces the /xr_teleop/ee_poses topic in this mode.
Publish to /cmd_vel to start the controller:

ros2 topic pub --rate 10 /cmd_vel geometry_msgs/msg/Twist \
  "{linear: {x: 0.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}"
In the RViz Displays panel, find the IK Target Marker display and set its Interactive Markers Namespace to
/ik_controller_marker. You can then drag the wrist markers to command the arms.
Remote Monitoring with Foxglove#
If visualization via Foxglove is desired, add
use_foxglove:=true to any launch command to start the Foxglove bridge:
ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py \
use_foxglove:=true
Refer to Foxglove Setup for instructions on connecting Foxglove Studio.