isaac_ros_unitree_g1_teleop_bringup#

Source code available on GitHub.

Overview#

Top-level launch file for Unitree G1 teleoperation combining AGILE locomotion, bimanual inverse kinematics, and finger control. Supports XR teleoperation via Isaac Teleop and RViz interactive markers.

Both simulation and real hardware are supported.

Tutorial: Unitree G1 XR Teleop#

This tutorial walks through running whole-body XR teleoperation on the Unitree G1 humanoid robot. The application combines AGILE locomotion, bimanual inverse kinematics, and finger control, all driven by an XR headset.

You will first run the application in MuJoCo simulation, then deploy on real hardware.

Prerequisites#

Note

This tutorial has been tested and qualified on Jetson AGX Thor for both simulation and real robot deployment. MuJoCo simulation is also supported on x86_64.

  • PICO 4 Ultra headset (if no XR headset is available, the emulator provided by Isaac Teleop Core can be used instead)

  • Unitree G1 robot powered on and connected to the host machine via Ethernet

Set Up Development Environment#

  1. Set up your development environment by following the instructions in getting started.

  2. (Optional) Install dependencies for any sensors you want to use by following the sensor-specific guides.

    Note

    We strongly recommend installing all sensor dependencies before starting any quickstarts. Some sensor dependencies require restarting the development environment during installation, which will interrupt the quickstart process.

  3. Set up the Unitree G1 with a Jetson AGX Thor connected to it by following these guides:

Build isaac_ros_unitree_g1_teleop_bringup#

  1. If using the Isaac ROS Environment in Docker mode, open a new terminal and mount ~/.cloudxr:

    mkdir -p ~/.cloudxr && grep -qxF -- "-v $(realpath ~/.cloudxr):/home/admin/.cloudxr" ~/.isaac_ros_dev-dockerargs 2>/dev/null || echo "-v $(realpath ~/.cloudxr):/home/admin/.cloudxr" >> ~/.isaac_ros_dev-dockerargs
    
  2. Install and build isaac_ros_unitree_g1_teleop_bringup:

    1. Activate the Isaac ROS environment:

      isaac-ros activate
      
    2. Install the prebuilt Debian package:

      sudo apt-get update
      
      sudo apt-get install -y ros-jazzy-isaac-ros-unitree-g1-teleop-bringup
      
    3. Install mujoco_ros2_control:

      sudo apt-get install -y ros-jazzy-mujoco-ros2-control
      source /opt/ros/jazzy/setup.bash
      

Run CloudXR#

  1. In a new terminal, clone isaac_ros_teleop:

    cd ${ISAAC_ROS_WS}/src && \
       git clone --recurse-submodules -b release-4.4 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_teleop.git isaac_ros_teleop
    
  2. Run CloudXR Runtime via the CloudXR Docker Container:

    cd ${ISAAC_ROS_WS}/src/isaac_ros_teleop/isaac_teleop_core/IsaacTeleop && \
       ./scripts/run_cloudxr_via_docker.sh
    
  3. Connect the XR headset to the teleop server. Follow the headset connection guide.

    Note

    If you are running this on Thor, make sure to set the Video Codec to H.264, otherwise the headset will fail to connect.

    Warning

    The world frame of the headset is defined as the position of the headset and controllers at the moment of connection. Stand still and face the robot before connecting to establish a consistent world frame. To reset the world frame, disconnect and reconnect the headset while stationary. On real hardware, ensure the robot is stopped (blend_ratio set to 0.0) before disconnecting.

  4. In the original terminal, launch the application:

    1. Launch the teleop application:

      ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py \
          hardware_type:=mujoco input_mode:=teleop
      

      This opens the MuJoCo viewer with the G1 robot. The robot will fall to the ground initially because there is no gantry in simulation. Press Backspace in the MuJoCo viewer to reset (you may need to press it multiple times).

      Note

      In simulation, blend_ratio defaults to 1.0 so the policy is active immediately.

    2. With the controllers in your hands, start moving them. You should see the robot’s arms track your movements in the MuJoCo viewer.

Controller Reference#

The PICO 4 Ultra headset includes two handheld controllers. The following table summarizes what each input does during teleoperation:

| Input | Action |
|---|---|
| Left joystick | Move the robot: up = forward, down = backward, left = strafe left, right = strafe right |
| Right joystick (left / right) | Rotate the robot in place (yaw) |
| Controller motion (6-DOF) | The end-effector pose tracks the physical controller; moving and rotating the controller moves the robot’s hand correspondingly |
| Triggers (each controller has two) | Open and close the finger joints of the tri-finger hand |
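The joystick-to-velocity mapping above can be sketched as a small function. This is an illustration only: the actual mapping lives inside the teleop application, and the axis conventions, signs, and scale factors below are assumptions, not the application's real values.

```python
# Illustrative sketch: map normalized joystick axes ([-1, 1]) to a base
# velocity command, following the table above. Scale factors are assumed.

def joysticks_to_twist(left_xy, right_xy, max_lin=0.5, max_yaw=1.0):
    """left_xy:  (x, y) of the left joystick; y drives forward/backward,
                 x strafes left/right.
       right_xy: (x, y) of the right joystick; x yaws the robot in place."""
    lx, ly = left_xy
    rx, _ = right_xy
    clamp = lambda v: max(-1.0, min(1.0, v))
    return {
        "linear_x": max_lin * clamp(ly),    # up = forward, down = backward
        "linear_y": max_lin * clamp(-lx),   # left = strafe left (ROS +y is left)
        "angular_z": max_yaw * clamp(-rx),  # right joystick left/right = yaw
    }
```

For example, pushing the left joystick fully forward, `joysticks_to_twist((0.0, 1.0), (0.0, 0.0))`, yields a pure forward command with zero strafe and yaw.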

API#

Usage#

ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py

Launch Arguments#

| Launch Argument | Type | Default | Description |
|---|---|---|---|
| hardware_type | string | mujoco | Hardware platform. Options: mujoco (simulation), real (physical robot). |
| input_mode | string | teleop | Input source. Options: teleop (XR device via CloudXR), markers (RViz interactive markers). |
| network_interface | string | eno1 | Network interface for G1 communication. Only used when hardware_type:=real. |
| enable_viewer | bool | true | Enable the MuJoCo GUI viewer. Only used when hardware_type:=mujoco. |
| use_rviz | bool | false | Enable RViz visualization. Automatically set to true when input_mode:=markers. |
| use_foxglove | bool | false | Start the Foxglove bridge for remote monitoring. |

ROS Topics#

Topics depend on the input_mode launch argument. In teleop mode:

| ROS Topic | Interface | Description |
|---|---|---|
| /xr_teleop/ee_poses | geometry_msgs/PoseArray | End-effector (wrist) poses from the XR headset |
| /xr_teleop/root_twist | geometry_msgs/TwistStamped | Root velocity command from the XR headset |
| /xr_teleop/finger_joints | sensor_msgs/JointState | Retargeted finger joint angles from XR hand tracking |
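As a rough illustration of what "retargeted finger joint angles" means, the sketch below linearly interpolates each finger joint between an open and a closed pose from a single normalized closure value. The joint names and limits are hypothetical placeholders, not the tri-finger hand's actual joints, and the real retargeting from XR hand tracking is more involved.

```python
# Conceptual sketch of finger retargeting: map a normalized closure value
# (e.g. a trigger reading) to joint angles, as might be published in a
# JointState-style message. Names and limits below are invented for
# illustration only.

FINGER_JOINTS = {
    # name: (open_angle_rad, closed_angle_rad) -- hypothetical limits
    "thumb_proximal": (0.0, 1.2),
    "index_proximal": (0.0, 1.4),
    "middle_proximal": (0.0, 1.4),
}

def retarget_fingers(closure):
    """Interpolate each joint: closure 0.0 = fully open, 1.0 = fully closed."""
    c = max(0.0, min(1.0, closure))
    return {name: lo + c * (hi - lo) for name, (lo, hi) in FINGER_JOINTS.items()}
```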

In markers mode:

| ROS Topic | Interface | Description |
|---|---|---|
| /ik_controller/reference_pose | geometry_msgs/PoseArray | End-effector poses published by the RViz interactive marker node |

ROS Parameters#

| Parameter | Node | Type | Default | Description |
|---|---|---|---|---|
| blend_ratio | /safety_controller | double | 0.0 (real), 1.0 (sim) | Policy activation level (0.0–1.0). Dynamically adjustable at runtime. |
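The idea behind blend_ratio can be sketched as a per-joint mix between a safe hold command and the policy output. This is a conceptual illustration of the parameter's meaning, not the safety controller's actual implementation.

```python
# Sketch of the blending concept: at blend_ratio 0.0 the robot holds its
# safe posture; at 1.0 the policy output passes through unmodified.
# Illustration only -- not the real safety controller code.

def blend_command(hold_cmd, policy_cmd, blend_ratio):
    """Mix per-joint commands, clamping blend_ratio to [0.0, 1.0]."""
    b = max(0.0, min(1.0, blend_ratio))
    return [(1.0 - b) * h + b * p for h, p in zip(hold_cmd, policy_cmd)]
```

Because the parameter is dynamically adjustable, it can be changed at runtime with the standard parameter CLI, for example `ros2 param set /safety_controller blend_ratio 0.5`.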

Troubleshooting#

Test Without an XR Headset (Interactive Markers Mode)#

If the XR headset is unavailable or you want to isolate whether an issue is with XR or the robot itself, launch with input_mode:=markers:

ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py \
    input_mode:=markers

RViz opens automatically with 6-DOF interactive markers for each wrist. The /ik_controller/reference_pose topic replaces the /xr_teleop/ee_poses topic in this mode.

  1. Publish to /cmd_vel to start the controller:

    ros2 topic pub --rate 10 /cmd_vel geometry_msgs/msg/Twist \
        "{linear: {x: 0.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}"
    
  2. In the RViz Displays panel, find the IK Target Marker display and set its Interactive Markers Namespace to /ik_controller_marker. You can then drag the wrist markers to command the arms.

Remote Monitoring with Foxglove#

To monitor the application remotely with Foxglove, add use_foxglove:=true to any launch command to start the Foxglove bridge:

ros2 launch isaac_ros_unitree_g1_teleop_bringup unitree_g1_teleop.launch.py \
    use_foxglove:=true

Refer to Foxglove Setup for instructions on connecting Foxglove Studio.