upload real-robot deployment code

This commit is contained in:
yuchen-x
2025-09-23 15:13:22 +08:00
parent 5dcd1ca503
commit f12b478265
130 changed files with 10434 additions and 5 deletions

View File

@@ -13,7 +13,7 @@
<b>UnifoLM-WMA-0</b> is Unitree's open-source world-model-action architecture spanning multiple types of robotic embodiments, designed specifically for general-purpose robot learning. Its core component is a world model capable of understanding the physical interactions between robots and their environments. This world model provides two key functions: (a) <b>Simulation Engine</b> operates as an interactive simulator to generate synthetic data for robot learning; (b) <b>Policy Enhancement</b> connects with an action head and, by predicting future interaction processes with the world model, further optimizes decision-making performance.
</div>
## 🦾 Real Robot Deployment
## 🦾 Real-Robot Demonstrations
| <img src="assets/gifs/real_z1_stackbox.gif" style="border:none;box-shadow:none;margin:0;padding:0;" /> | <img src="assets/gifs/real_dual_stackbox.gif" style="border:none;box-shadow:none;margin:0;padding:0;" /> |
|:---:|:---:|
| <img src="assets/gifs/real_cleanup_pencils.gif" style="border:none;box-shadow:none;margin:0;padding:0;" /> | <img src="assets/gifs/real_g1_pack_camera.gif" style="border:none;box-shadow:none;margin:0;padding:0;" /> |
@@ -22,13 +22,14 @@
## 🔥 News
* Sep 15, 2025: 🚀 We released the training and inference code along with the model weights of **UnifoLM-WMA-0**.
* Sep 22, 2025: 🚀 We released the deployment code for assisting experiments with [Unitree](https://www.unitree.com/) robots.
* Sep 15, 2025: 🚀 We released the training and inference code along with the model weights of [**UnifoLM-WMA-0**](https://huggingface.co/collections/unitreerobotics/unifolm-wma-0-68ca23027310c0ca0f34959c).
## 📑 Open-source Plan
- [x] Training
- [x] Inference
- [x] Checkpoints
- [ ] Deployment
- [x] Deployment
## ⚙️ Installation
```
@@ -140,7 +141,7 @@ B. To conduct training on a single or multiple datasets, please follow the steps
```
bash scripts/train.sh
```
## 🌏 Inference under the Interactive Simulation Mode
## 🌏 Inference under Interactive Simulation Mode
To run the world model in an interactive simulation mode, follow these steps:
- **Step 1**: (Skip this step if you just want to test with the provided examples) Prepare your own prompt following the format used in [examples/world_model_interaction_prompts](https://github.com/unitreerobotics/unitree-world-model/tree/main/examples/world_model_interaction_prompts):
```
@@ -166,7 +167,33 @@ To run the world model in an interactive simulation mode, follow these steps:
```
bash scripts/run_world_model_interaction.sh
```
## 🧠 Inference and Deployment under Decision-Making Mode
In this setup, inference is performed on a server, while a robot client gathers observations from the real robot and sends them to the server to query actions. The process involves the following steps:
### Server Setup:
- **Step-1**: Specify ```ckpt```, ```res_dir```, ```datasets``` in ```scripts/run_real_eval_server.sh```;
- **Step-2**: Configure ```data_dir``` and ```dataset_and_weights``` in ```config/inference/world_model_decision_making.yaml```;
- **Step-3**: Launch the server:
```
conda activate unifolm-wma
cd unifolm-world-model-action
bash scripts/run_real_eval_server.sh
```
### Client Setup
- **Step-1**: Follow the instructions in [unitree_deploy/README.md](https://github.com/unitreerobotics/unitree-world-model/blob/main/unitree_deploy/README.md) to create the ```unitree_deploy``` conda environment, install the required packages, and launch the controllers or services on the real robot.
- **Step-2**: Open a new terminal and establish a tunnel connection from the client to the server:
```
ssh user_name@remote_server_IP -CNg -L 8000:127.0.0.1:8000
```
- **Step-3**: Run the ```robot_client.py``` script to start inference:
```
cd unitree_deploy
python scripts/robot_client.py --robot_type "g1_dex1" --action_horizon 16 --exe_steps 16 --observation_horizon 2 --language_instruction "pack black camera into box" --output_dir ./results --control_freq 15
```
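For reference, the request that the robot client sends to the server can be sketched as below. Only the payload key names and the zero-filled ```action``` entry follow the server's ```predict_action``` handler; ```build_payload```, ```query_server```, the URL, and the exact array shapes are illustrative assumptions, not the actual ```robot_client.py``` implementation.

```python
import json
from urllib import request

# Hypothetical sketch of the client-server exchange; only the payload key
# names and the zero-filled "action" entry come from the server's
# predict_action handler. Helper names and shapes are assumptions.
SERVER_URL = "http://127.0.0.1:8000/predict_action"  # reached via the ssh tunnel

def build_payload(images, states, language_instruction, action_dim=16):
    """Assemble the JSON payload expected by the inference server."""
    return {
        "observation.images.top": images,     # recent camera frames as nested lists
        "observation.state": states,          # recent joint states
        "action": [[0.0] * action_dim],       # placeholder, filled in by the server
        "language_instruction": language_instruction,
    }

def query_server(payload):
    """POST the payload and return the predicted action chunk."""
    req = request.Request(
        SERVER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["action"]

payload = build_payload(images=[[[[0, 0, 0]]]],
                        states=[[0.0] * 16],
                        language_instruction="pack black camera into box")
```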
## 📝 Codebase Architecture
Here's a high-level overview of the project's code structure and core components:
```
@@ -185,6 +212,7 @@ unitree-world-model/
│ │ ├── models # Model architectures and backbone definitions
│ │ ├── modules # Custom model modules and components
│ │ └── utils # Utility functions and common helpers
└── unitree_deploy # Deployment code
```
## 🙏 Acknowledgement

View File

@@ -0,0 +1,240 @@
model:
target: unifolm_wma.models.ddpms.LatentVisualDiffusion
params:
rescale_betas_zero_snr: True
parameterization: "v"
linear_start: 0.00085
linear_end: 0.012
num_timesteps_cond: 1
timesteps: 1000
first_stage_key: video
cond_stage_key: instruction
cond_stage_trainable: False
conditioning_key: hybrid
image_size: [40, 64]
channels: 4
scale_by_std: False
scale_factor: 0.18215
use_ema: False
uncond_type: 'empty_seq'
use_dynamic_rescale: true
base_scale: 0.7
fps_condition_type: 'fps'
perframe_ae: True
freeze_embedder: True
n_obs_steps_imagen: 2
n_obs_steps_acting: 2
agent_state_dim: 16
agent_action_dim: 16
decision_making_only: True
###################### DP Related
input_pertub: 0.1
lr_scheduler: cosine
lr_warmup_steps: 2000
num_epochs: 30000
gradient_accumulate_every: 1
use_scheduler: True
dp_use_ema: True
dp_ema_config:
target: unifolm_wma.models.diffusion_head.ema_model.EMAModel
params:
update_after_step: 0
inv_gamma: 1.0
power: 0.75
min_value: 0.0
max_value: 0.9999
noise_scheduler_config:
target: diffusers.DDIMScheduler
params:
num_train_timesteps: 1000
beta_start: 0.0001
beta_end: 0.02
beta_schedule: squaredcos_cap_v2
clip_sample: True
set_alpha_to_one: True
steps_offset: 0
prediction_type: epsilon
dp_optimizer_config:
target: torch.optim.AdamW
params:
lr: 1.0e-4
betas: [0.95, 0.999]
eps: 1.0e-8
weight_decay: 1.0e-6
wma_config:
target: unifolm_wma.modules.networks.wma_model.WMAModel
params:
in_channels: 8
out_channels: 4
model_channels: 320
attention_resolutions:
- 4
- 2
- 1
num_res_blocks: 2
channel_mult:
- 1
- 2
- 4
- 4
dropout: 0.1
num_head_channels: 64
transformer_depth: 1
context_dim: 1024
use_linear: true
use_checkpoint: True
temporal_conv: True
temporal_attention: True
temporal_selfatt_only: True
use_relative_position: False
use_causal_attention: False
temporal_length: 16
addition_attention: True
image_cross_attention: True
default_fs: 10
fs_condition: True
cross_attention_scale_learnable: False
n_obs_steps: ${model.params.n_obs_steps_imagen}
num_stem_token: 16
base_model_gen_only: False
unet_head_config:
target: unifolm_wma.models.diffusion_head.conditional_unet1d.ConditionalUnet1D
params:
input_dim: ${model.params.agent_action_dim}
n_obs_steps: ${model.params.n_obs_steps_acting}
diffusion_step_embed_dim: 128
down_dims: [256, 512, 1024, 2048]
kernel_size: 5
n_groups: 8
cond_predict_scale: True
num_head_channels: ${model.params.wma_config.params.num_head_channels}
horizon: ${model.params.wma_config.params.temporal_length}
use_linear_attn: ${model.params.wma_config.params.use_linear}
use_linear_act_proj: True
act_proj_dim: 32
cond_cross_attention: False
context_dims: []
image_size: ${model.params.image_size}
imagen_cond_gradient: True
last_frame_only: False
use_imagen_mid_only: False
use_z_only: False
obs_encoder_config:
target: unifolm_wma.models.diffusion_head.vision.multi_image_obs_encoder.MultiImageObsEncoder
params:
rgb_model_config:
target: unifolm_wma.models.diffusion_head.vision.model_getter.get_resnet
params:
name: resnet18
weights: null
resize_shape: null
crop_shape: null
random_crop: False
use_group_norm: True
share_rgb_model: False
imagenet_norm: True
use_spatial_softmax: True
spatial_softmax_kp: 128
###################### Action Tokenization
stem_process_config:
target: unifolm_wma.modules.encoders.condition.SATokenProjector
params:
dim: 1024
depth: 1
dim_head: 64
heads: 16
num_queries: ${model.params.wma_config.params.num_stem_token}
output_dim: 1024
ff_mult: 4
chunk_size: ${model.params.wma_config.params.temporal_length}
first_stage_config:
target: unifolm_wma.models.autoencoder.AutoencoderKL
params:
embed_dim: 4
monitor: val/rec_loss
ddconfig:
double_z: True
z_channels: 4
resolution: 256
in_channels: 3
out_ch: 3
ch: 128
ch_mult:
- 1
- 2
- 4
- 4
num_res_blocks: 2
attn_resolutions: []
dropout: 0.0
lossconfig:
target: torch.nn.Identity
cond_stage_config:
target: unifolm_wma.modules.encoders.condition.FrozenOpenCLIPEmbedder
params:
freeze: True
layer: "penultimate"
img_cond_stage_config:
target: unifolm_wma.modules.encoders.condition.FrozenOpenCLIPImageEmbedderV2
params:
freeze: true
image_proj_stage_config:
target: unifolm_wma.modules.encoders.resampler.Resampler
params:
dim: 1024
depth: 4
dim_head: 64
heads: 12
num_queries: 16
embedding_dim: 1280
output_dim: 1024
ff_mult: 4
video_length: ${model.params.wma_config.params.temporal_length}
normalization_config:
input_shapes:
observation.state: ${model.params.wma_config.params.action_unet_config.params.input_dim}
input_normalization_modes:
observation.state: 'min_max'
output_shapes:
action: ${model.params.wma_config.params.action_unet_config.params.input_dim}
output_normalization_modes:
action: 'min_max'
data:
target: unifolm_wma.utils.data.DataModuleFromConfig
params:
batch_size: 6
num_workers: 12
wrap: False
test:
target: unifolm_wma.data.wma_data.WMAData
params:
data_dir: '/path/to/the/dataset/directory/that/contains/the/meta/folder/of/the/testing/case/under/a/transitions/folder' # e.g., /path/to/unifolm-world-model-action/examples/world_model_interaction_prompts
video_length: ${model.params.wma_config.params.temporal_length}
frame_stride: 2
load_raw_resolution: True
resolution: [320, 512]
spatial_transform: resize_center_crop
crop_resolution: [320, 512]
random_fs: False
cond_robot_label_prob: 0.0
normalization_mode: 'min_max'
individual_normalization: True
n_obs_steps: ${model.params.n_obs_steps_imagen}
max_action_dim: ${model.params.agent_action_dim}
max_state_dim: ${model.params.agent_state_dim}
dataset_and_weights:
unitree_g1_pack_camera: 1.0

View File

@@ -0,0 +1,463 @@
import argparse, os, sys
import torch
import torchvision
import warnings
import imageio
import logging
import matplotlib.pyplot as plt
plt.switch_backend('agg')
import traceback
import uvicorn
from omegaconf import OmegaConf
from einops import rearrange, repeat
from collections import OrderedDict
from pytorch_lightning import seed_everything
from torch import nn
from fastapi import FastAPI
from fastapi.responses import JSONResponse
from typing import Any, Dict, Optional, Tuple, List
from datetime import datetime
from unifolm_wma.utils.utils import instantiate_from_config
from unifolm_wma.models.samplers.ddim import DDIMSampler
def get_device_from_parameters(module: nn.Module) -> torch.device:
"""Get a module's device by checking one of its parameters.
Args:
module (nn.Module): PyTorch module.
Returns:
torch.device: The device where the module's parameters are stored.
"""
return next(iter(module.parameters())).device
def load_model_checkpoint(model: nn.Module, ckpt: str) -> nn.Module:
"""Load model weights from checkpoint file.
Args:
model (nn.Module): Model to load weights into.
ckpt (str): Path to checkpoint file.
Returns:
nn.Module: Model with loaded weights.
"""
state_dict = torch.load(ckpt, map_location="cpu")
if "state_dict" in list(state_dict.keys()):
state_dict = state_dict["state_dict"]
try:
model.load_state_dict(state_dict, strict=False)
except Exception:
new_pl_sd = OrderedDict()
for k, v in state_dict.items():
new_pl_sd[k] = v
for k in list(new_pl_sd.keys()):
if "framestride_embed" in k:
new_key = k.replace("framestride_embed", "fps_embedding")
new_pl_sd[new_key] = new_pl_sd[k]
del new_pl_sd[k]
model.load_state_dict(new_pl_sd, strict=False)
else:
new_pl_sd = OrderedDict()
for key in state_dict['module'].keys():
new_pl_sd[key[16:]] = state_dict['module'][key]
model.load_state_dict(new_pl_sd)
print('>>> model checkpoint loaded.')
return model
def write_video(video_path: str, stacked_frames: List[Any], fps: int) -> None:
"""Write a video to disk using imageio.
Args:
video_path (str): Path to save the video.
stacked_frames (List[Any]): Frames to write.
fps (int): Frames per second.
"""
with warnings.catch_warnings():
warnings.filterwarnings("ignore",
"pkg_resources is deprecated as an API",
category=DeprecationWarning)
imageio.mimsave(video_path, stacked_frames, fps=fps)
def save_results(video: torch.Tensor, filename: str, fps: int = 8) -> None:
"""Save a video tensor as an MP4 file.
Args:
video (torch.Tensor): Video tensor of shape (B, C, T, H, W).
filename (str): Path to save video.
fps (int, optional): Frame rate. Defaults to 8.
"""
video = video.detach().cpu()
video = torch.clamp(video.float(), -1., 1.)
n = video.shape[0]
video = video.permute(2, 0, 1, 3, 4)
frame_grids = [
torchvision.utils.make_grid(framesheet, nrow=int(n), padding=0)
for framesheet in video
]
grid = torch.stack(frame_grids, dim=0)
grid = (grid + 1.0) / 2.0
grid = (grid * 255).to(torch.uint8).permute(0, 2, 3, 1)
torchvision.io.write_video(filename,
grid,
fps=fps,
video_codec='h264',
options={'crf': '10'})
def get_latent_z(model: nn.Module, videos: torch.Tensor) -> torch.Tensor:
"""Encode videos into latent space.
Args:
model (nn.Module): Model with `encode_first_stage` method.
videos (torch.Tensor): Input videos (B, C, T, H, W).
Returns:
torch.Tensor: Latent representation (B, C, T, H, W).
"""
b, c, t, h, w = videos.shape
x = rearrange(videos, 'b c t h w -> (b t) c h w')
z = model.encode_first_stage(x)
z = rearrange(z, '(b t) c h w -> b c t h w', b=b, t=t)
return z
def image_guided_synthesis(
model: torch.nn.Module,
prompts: list[str],
observation: Dict[str, torch.Tensor],
noise_shape: tuple[int, int, int, int, int],
ddim_steps: int = 50,
ddim_eta: float = 1.0,
unconditional_guidance_scale: float = 1.0,
fs: int | None = None,
timestep_spacing: str = 'uniform',
guidance_rescale: float = 0.0,
**kwargs) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]:
"""Run inference with DDIM sampling.
Args:
model (nn.Module): Diffusion model.
prompts (Any): Conditioning text prompts.
observation (Dict[str, torch.Tensor]): Observation dictionary.
noise_shape (List[int]): Shape of noise tensor.
ddim_steps (int, optional): Number of DDIM steps. Defaults to 50.
ddim_eta (float, optional): Sampling eta. Defaults to 1.0.
unconditional_guidance_scale (float, optional): Guidance scale. Defaults to 1.0.
fs (Optional[int], optional): Frame stride or FPS. Defaults to None.
timestep_spacing (str, optional): Spacing strategy. Defaults to "uniform".
guidance_rescale (float, optional): Guidance rescale. Defaults to 0.0.
**kwargs (Any): Additional arguments.
Returns:
Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: Generated videos, predicted actions, and predicted states.
"""
b, _, t, _, _ = noise_shape
ddim_sampler = DDIMSampler(model)
batch_size = noise_shape[0]
fs = torch.tensor([fs] * batch_size, dtype=torch.long, device=model.device)
img = observation['observation.images.top']
cond_img = img[:, -1, ...]
cond_img_emb = model.embedder(cond_img)
cond_img_emb = model.image_proj_model(cond_img_emb)
if model.model.conditioning_key == 'hybrid':
z = get_latent_z(model, img.permute(0, 2, 1, 3, 4))
img_cat_cond = z[:, :, -1:, :, :]
img_cat_cond = repeat(img_cat_cond,
'b c t h w -> b c (repeat t) h w',
repeat=noise_shape[2])
cond = {"c_concat": [img_cat_cond]}
cond_ins_emb = model.get_learned_conditioning(prompts)
cond_state = model.state_projector(observation['observation.state'])
cond_state_emb = model.agent_state_pos_emb + cond_state
cond_action = model.action_projector(observation['action'])
cond_action_emb = model.agent_action_pos_emb + cond_action
cond_action_emb = torch.zeros_like(cond_action_emb)
cond["c_crossattn"] = [
torch.cat([cond_state_emb, cond_ins_emb, cond_img_emb], dim=1)
]
cond["c_crossattn_action"] = [
observation['observation.images.top'].permute(
0, 2, 1, 3, 4)[:, :, -model.n_obs_steps_acting:],
observation['observation.state'][:, -model.n_obs_steps_acting:]
]
uc = None
kwargs.update({"unconditional_conditioning_img_nonetext": None})
cond_mask = None
cond_z0 = None
if ddim_sampler is not None:
samples, actions, states, intermedia = ddim_sampler.sample(
S=ddim_steps,
conditioning=cond,
batch_size=batch_size,
shape=noise_shape[1:],
verbose=False,
unconditional_guidance_scale=unconditional_guidance_scale,
unconditional_conditioning=uc,
eta=ddim_eta,
cfg_img=None,
mask=cond_mask,
x0=cond_z0,
fs=fs,
timestep_spacing=timestep_spacing,
guidance_rescale=guidance_rescale,
**kwargs)
# Reconstruct from latent to pixel space
batch_images = model.decode_first_stage(samples)
batch_variants = batch_images
return batch_variants, actions, states
def run_inference(args: argparse.Namespace, gpu_num: int,
gpu_no: int) -> Tuple[nn.Module, List[int], Any]:
"""
Load the model, compute the latent noise shape, and set up the data module.
Args:
args (argparse.Namespace): Parsed command-line arguments.
gpu_num (int): Number of GPUs.
gpu_no (int): Index of the current GPU.
Returns:
Tuple[nn.Module, List[int], Any]: Loaded model, latent noise shape, and the data module.
"""
# Load config
config = OmegaConf.load(args.config)
# Set use_checkpoint to False: with DeepSpeed enabled it otherwise raises "deepspeed backend not set"
config['model']['params']['wma_config']['params']['use_checkpoint'] = False
model = instantiate_from_config(config.model)
model.perframe_ae = args.perframe_ae
assert os.path.exists(args.ckpt_path), "Error: checkpoint Not Found!"
model = load_model_checkpoint(model, args.ckpt_path)
model = model.cuda(gpu_no)
model.eval()
print(">>> Model is successfully loaded ...")
# Build unnormalizer
logging.info("***** Configuring Data *****")
data = instantiate_from_config(config.data)
data.setup()
print(">>> Dataset is successfully loaded ...")
## Run over data
assert (args.height % 16 == 0) and (
args.width % 16
== 0), "Error: image size [h,w] should be multiples of 16!"
assert args.bs == 1, "Current implementation only supports [batch size = 1]!"
## Get latent noise shape
h, w = args.height // 8, args.width // 8
channels = model.model.diffusion_model.out_channels
n_frames = args.video_length
print(f'>>> Generate {n_frames} frames under each generation ...')
noise_shape = [args.bs, channels, n_frames, h, w]
return model, noise_shape, data
def get_parser() -> argparse.ArgumentParser:
parser = argparse.ArgumentParser()
parser.add_argument("--savedir",
type=str,
default=None,
help="Path to save the results.")
parser.add_argument("--ckpt_path",
type=str,
default=None,
help="Path to the model checkpoint.")
parser.add_argument("--config", type=str, help="Path to the config file.")
parser.add_argument(
"--ddim_steps",
type=int,
default=50,
help="Number of DDIM steps. If non-positive, DDPM is used instead.")
parser.add_argument(
"--ddim_eta",
type=float,
default=1.0,
help="Eta for DDIM sampling. Set to 0.0 for deterministic results.")
parser.add_argument("--bs",
type=int,
default=1,
help="Batch size for inference. Must be 1.")
parser.add_argument("--height",
type=int,
default=320,
help="Height of the generated images in pixels.")
parser.add_argument("--width",
type=int,
default=512,
help="Width of the generated images in pixels.")
parser.add_argument(
"--frame_stride",
type=int,
default=3,
help=
"frame stride control for 256 model (larger->larger motion), FPS control for 512 or 1024 model (smaller->larger motion)"
)
parser.add_argument(
"--unconditional_guidance_scale",
type=float,
default=1.0,
help="Scale for classifier-free guidance during sampling.")
parser.add_argument("--seed",
type=int,
default=123,
help="Random seed for reproducibility.")
parser.add_argument("--video_length",
type=int,
default=16,
help="Number of frames in the generated video.")
parser.add_argument(
"--timestep_spacing",
type=str,
default="uniform",
help=
"Strategy for timestep scaling. See Table 2 in the paper: 'Common Diffusion Noise Schedules and Sample Steps are Flawed' (https://huggingface.co/papers/2305.08891)."
)
parser.add_argument(
"--guidance_rescale",
type=float,
default=0.0,
help=
"Rescale factor for guidance as discussed in 'Common Diffusion Noise Schedules and Sample Steps are Flawed' (https://huggingface.co/papers/2305.08891)."
)
parser.add_argument(
"--perframe_ae",
action='store_true',
default=False,
help=
"Use per-frame autoencoder decoding to reduce GPU memory usage. Recommended for models with resolutions like 576x1024."
)
return parser
class Server:
def __init__(self, args: argparse.Namespace) -> None:
self.model_, self.noise_shape_, self.data_ = run_inference(args, 1, 0)
self.args_ = args
self.dataset_name = self.data_.dataset_configs['test']['params'][
'dataset_name']
self.device_ = get_device_from_parameters(self.model_)
def normalize_image(self, image: torch.Tensor) -> torch.Tensor:
return (image / 255 - 0.5) * 2
def predict_action(self, payload: Dict[str, Any]) -> Any:
try:
images = payload['observation.images.top']
states = payload['observation.state']
actions = payload['action'] # Should be all zeros
language_instruction = payload['language_instruction']
images = torch.tensor(images).cuda()
images = self.data_.test_datasets[
self.dataset_name].spatial_transform(images).unsqueeze(0)
images = self.normalize_image(images)
print(f"images shape: {images.shape} ...")
states = torch.tensor(states)
states = self.data_.test_datasets[self.dataset_name].normalizer(
{'observation.state': states})['observation.state']
states, _ = self.data_.test_datasets[
self.dataset_name]._map_to_uni_state(states, "joint position")
print(f"states shape: {states.shape} ...")
actions = torch.tensor(actions)
actions, action_mask = self.data_.test_datasets[
self.dataset_name]._map_to_uni_action(actions,
"joint position")
print(f"actions shape: {actions.shape} ...")
print("=" * 20)
states = states.unsqueeze(0).cuda()
actions = actions.unsqueeze(0).cuda()
observation = {
'observation.images.top': images,
'observation.state': states,
'action': actions
}
observation = {
key: observation[key].to(self.device_, non_blocking=True)
for key in observation
}
args = self.args_
pred_videos, pred_action, _ = image_guided_synthesis(
self.model_,
language_instruction,
observation,
self.noise_shape_,
ddim_steps=args.ddim_steps,
ddim_eta=args.ddim_eta,
unconditional_guidance_scale=args.unconditional_guidance_scale,
fs=30 / args.frame_stride,
timestep_spacing=args.timestep_spacing,
guidance_rescale=args.guidance_rescale)
pred_action = pred_action[..., action_mask[0] == 1.0][0].cpu()
pred_action = self.data_.test_datasets[
self.dataset_name].unnormalizer({'action':
pred_action})['action']
os.makedirs(args.savedir, exist_ok=True)
current_time = datetime.now().strftime("%H:%M:%S")
video_file = f'{args.savedir}/{current_time}.mp4'
save_results(pred_videos.cpu(), video_file)
response = {
'result': 'ok',
'action': pred_action.tolist(),
'desc': 'success'
}
return JSONResponse(response)
except Exception:
logging.error(traceback.format_exc())
logging.warning(
"Your request threw an error; make sure it complies with the expected format:\n"
"{'observation.images.top': np.ndarray, 'observation.state': np.ndarray, "
"'action': np.ndarray, 'language_instruction': str}")
return {'result': 'error', 'desc': traceback.format_exc()}
def run(self, host: str = "127.0.0.1", port: int = 8000) -> None:
self.app = FastAPI()
self.app.post("/predict_action")(self.predict_action)
print(">>> Inference server is ready ... ")
uvicorn.run(self.app, host=host, port=port)
print(">>> Inference server stops ... ")
return
if __name__ == '__main__':
parser = get_parser()
args = parser.parse_args()
seed = args.seed
seed_everything(seed)
rank, gpu_num = 0, 1
print(">>> Launch inference server ... ")
server = Server(args)
server.run()

View File

@@ -0,0 +1,26 @@
model_name=testing
ckpt=/path/to/model/checkpoint
config=configs/inference/world_model_decision_making.yaml
seed=123
res_dir="/path/to/results/directory"
datasets=(
"unitree_g1_pack_camera"
)
for dataset in "${datasets[@]}"; do
CUDA_VISIBLE_DEVICES=0 python3 scripts/evaluation/real_eval_server.py \
--seed ${seed} \
--ckpt_path $ckpt \
--config $config \
--savedir "${res_dir}/${dataset}/${model_name}/videos" \
--bs 1 --height 320 --width 512 \
--unconditional_guidance_scale 1.0 \
--ddim_steps 16 \
--ddim_eta 1.0 \
--video_length 16 \
--frame_stride 2 \
--timestep_spacing 'uniform_trailing' \
--guidance_rescale 0.7 \
--perframe_ae
done

unitree_deploy/README.md Normal file
View File

@@ -0,0 +1,207 @@
# Unitree Deploy
This document provides instructions for setting up the deployment environment for Unitree G1 (with gripper) and Z1 platforms, including dependency installation, image service startup, and gripper control.
# 0. 📖 Introduction
This repository is used for model deployment with Unitree robots.
---
# 1. 🛠️ Environment Setup
```bash
conda create -n unitree_deploy python=3.10 && conda activate unitree_deploy
conda install pinocchio -c conda-forge
pip install -e .
# Optional: Install lerobot dependencies
pip install -e ".[lerobot]"
git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python && pip install -e . && cd ..
```
---
# 2. 🚀 Start
**Tip: Keep all devices on the same LAN**
## 2.1 🤖 Run G1 with Dex_1 Gripper
### 2.1.1 📷 Image Capture Service Setup (G1 Board)
[To open the image_server, follow these steps](https://github.com/unitreerobotics/xr_teleoperate?tab=readme-ov-file#31-%EF%B8%8F-image-service)
1. Connect to the G1 board:
```bash
ssh unitree@192.168.123.164 # Password: 123
```
2. Activate the environment and start the image server:
```bash
conda activate tv
cd ~/image_server
python image_server.py
```
---
### 2.1.2 🤏 Dex_1 Gripper Service Setup (Development PC2)
Refer to the [Dex_1 Gripper Installation Guide](https://github.com/unitreerobotics/dex1_1_service?tab=readme-ov-file#1--installation) for detailed setup instructions.
1. Navigate to the service directory:
```bash
cd ~/dex1_1_service/build
```
2. Start the gripper service (run `ifconfig` to find your DDS network interface and replace `eth0` accordingly):
```bash
sudo ./dex1_1_gripper_server --network eth0 -l -r
```
3. Verify communication with the gripper service:
```bash
./test_dex1_1_gripper_server --network eth0 -l -r
```
---
### 2.1.3 ✅ Testing
Perform the following tests to ensure proper functionality:
- **Dex1 Gripper Test**:
```bash
python test/endeffector/test_dex1.py
```
- **G1 Arm Test**:
```bash
python test/arm/g1/test_g1_arm.py
```
- **Image Client Camera Test**:
```bash
python test/camera/test_image_client_camera.py
```
- **G1 Datasets Replay**:
```bash
python test/test_replay.py --repo-id unitreerobotics/G1_CameraPackaging_NewDataset --robot_type g1_dex1
```
---
## 2.2 🦿 Run Z1
### 2.2.1 🦿 Z1 Setup
Clone and build the required repositories:
1. Download [z1_controller](https://github.com/unitreerobotics/z1_controller.git) and [z1_sdk](https://github.com/unitreerobotics/z1_sdk.git).
2. Build the repositories:
```bash
mkdir build && cd build
cmake .. && make -j
```
3. Copy the `unitree_arm_interface` library (adjust the path to your setup):
```bash
cp z1_sdk/lib/unitree_arm_interface.cpython-310-x86_64-linux-gnu.so ./unitree_deploy/robot_devices/arm
```
4. Start the Z1 controller (adjust the path to your setup):
```bash
cd z1_controller/build
./z1_ctrl
```
---
### 2.2.2 Testing ✅
Run the following tests:
- **Realsense Camera Test**:
```bash
python test/camera/test_realsense_camera.py # Set the serial number to match your RealSense camera
```
- **Z1 Arm Test**:
```bash
python test/arm/z1/test_z1_arm.py
```
- **Z1 Environment Test**:
```bash
python test/arm/z1/test_z1_env.py
```
- **Z1 Datasets Replay**:
```bash
python test/test_replay.py --repo-id unitreerobotics/Z1_StackBox_Dataset --robot_type z1_realsense
```
---
## 2.3 🦿 Run Z1_Dual
### 2.3.1 🦿 Z1 Setup and Dex1 Setup
Clone and build the required repositories:
1. Download and compile the code following the Z1 steps above, and download the gripper program to run locally.
2. [Configure multi-machine control following the official documentation](https://support.unitree.com/home/zh/Z1_developer/sdk_operation).
3. [Download and compile the modified z1_sdk (z1_dual branch)](https://github.com/unitreerobotics/z1_sdk/tree/z1_dual), then copy the `unitree_arm_interface` library (adjust the path to your setup):
```bash
cp z1_sdk/lib/unitree_arm_interface.cpython-310-x86_64-linux-gnu.so ./unitree_deploy/robot_devices/arm
```
4. Start the Z1 controllers (adjust the paths to your setup):
```bash
cd z1_controller/build && ./z1_ctrl
cd z1_controller_1/build && ./z1_ctrl
```
5. Start the gripper service (run `ifconfig` to find your DDS network interface and replace `eth0` accordingly):
```
sudo ./dex1_1_gripper_server --network eth0 -l -r
```
---
### 2.3.2 Testing ✅
Run the following tests:
- **Z1_Dual Arm Test**:
```bash
python test/arm/z1/test_z1_arm_dual.py
```
- **Z1_Dual Datasets Replay**:
```bash
python test/test_replay.py --repo-id unitreerobotics/Z1_Dual_Dex1_StackBox_Dataset_V2 --robot_type z1_dual_dex1_realsense
```
---
# 3. 🧠 Inference and Deployment
1. [Adjust the parameters in `robot_configs.py` to match your configuration](./unitree_deploy/robot/robot_configs.py)
2. Go back to **Step-2 of Client Setup** under [Inference and Deployment under Decision-Making Mode](https://github.com/unitreerobotics/unitree-world-model/blob/main/README.md).
# 4. 🏗️ Code Structure
[To add your own robot hardware, follow this guide](./docs/GettingStarted.md)
# 5. 🤔 Troubleshooting
For assistance, contact the project maintainer or refer to the respective GitHub repository documentation. 📖
# 6. 🙏 Acknowledgement
This code builds upon the following open-source codebases. Please visit the URLs to see their respective licenses (if you find these projects valuable, consider giving them a star):
1. https://github.com/huggingface/lerobot
2. https://github.com/unitreerobotics/unitree_sdk2_python

View File

@@ -0,0 +1,70 @@
# Getting Started
### Code framework
| Module Name | Documentation Link |
| ------------------------- | -------------------------------------------------- |
| robots | [build_robot](./build_robot.md) |
| robot_devices/arm | [add_robot_arm](./add_robot_arm.md) |
| robot_devices/cameras | [add_robot_camera](./add_robot_camera.md) |
| robot_devices/endeffector | [add_robot_endeffector](./add_robot_endeffector.md)|
### Simple Usage (Example code, not executable)
```python
import time
import math
import torch
from unitree_deploy.robot.robot_utils import make_robot
from unitree_deploy.robot_devices.robots_devices_utils import precise_wait
class YourPolicy:
    def predict_action(self, observation):
        # Logic for predicting an action from the observation
        pass

class UnitreeEnv:
    def __init__(self, robot_type="z1", control_dt=1 / 30):
        self.robot_type = robot_type
        self.control_dt = control_dt
        self.robot = make_robot(self.robot_type)
        if not self.robot.is_connected:
            self.robot.connect()
        # If disconnection is needed, call disconnect() here
        # self.robot.disconnect()

    def get_obs(self):
        # Get observation
        observation = self.robot.capture_observation()
        return observation

    def step(self, pred_action, t_command_target):
        # Execute action
        t_cycle_end = time.monotonic() + self.control_dt
        action = self.robot.send_action(torch.from_numpy(pred_action), t_command_target)
        precise_wait(t_cycle_end)
        return action

if __name__ == "__main__":
    policy = YourPolicy()  # Create policy instance
    env = UnitreeEnv()  # Create environment instance
    t_start = time.monotonic()  # Get start time
    iter_idx = 0  # Initialize iteration index
    control_dt = 1 / 30  # Control loop interval (30 Hz)
    try:
        while True:
            t_cycle_end = t_start + (iter_idx + 1) * control_dt  # End time of the current cycle
            t_command_target = t_cycle_end + control_dt  # Command target time
            observation = env.get_obs()  # Get environment observation
            pred_action = policy.predict_action(observation)  # Predict action
            env.step(pred_action, t_command_target)  # Execute action
            precise_wait(t_cycle_end)  # Wait until cycle end
            iter_idx += 1  # Update iteration index
    finally:
        # Perform cleanup on exit (e.g., disconnect the robot)
        env.robot.disconnect()
```
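The loop above holds its rate with `precise_wait`. A minimal sketch of such a helper (the actual `unitree_deploy` implementation may differ):

```python
import time

def precise_wait(t_end: float, slack: float = 0.001):
    """Sleep until time.monotonic() reaches t_end.

    Sleep most of the interval, then busy-wait the last `slack` seconds
    so the wakeup lands close to the deadline.
    """
    remaining = t_end - time.monotonic()
    if remaining > slack:
        time.sleep(remaining - slack)
    while time.monotonic() < t_end:
        pass
```

Sleeping slightly short of the deadline and spinning for the remainder trades a little CPU for much tighter loop timing than `time.sleep` alone.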
# How to Build Your Own Arm
### Define your own config for the robot arm (unitree_deploy/robot_devices/arm/config.py)
```python
@ArmConfig.register_subclass("z1")  # Register your custom arm config. The matching arm class receives it as def __init__(self, config: Z1ArmConfig):
@dataclass
class Z1ArmConfig(ArmConfig):
    port: str
    motors: dict[str, tuple[int, str]]
    mock: bool = False
    init_pose_left: list | None = None
    init_pose_right: list | None = None
    control_dt: float = 1 / 500.0
    # Fields without defaults (required, fixed parameters) must come first;
    # fields with defaults (parameters you may want to customize) come after.
### Description of methods in your arm class (unitree_deploy/robot_devices/arm/utils.py)
```python
# Base class for Arm, extensible with required methods
class Arm(Protocol):
def connect(self): ...
def disconnect(self): ...
def motor_names(self): ...
def read_current_motor_q(self): ...
def read_current_arm_q(self): ...
def read_current_arm_dq(self): ...
def write_arm(self): ...
def arm_ik(self): ...
```
How are these called externally? `make_arm_motors_buses_from_configs` (driven by the config file) builds the arm buses used when constructing the `UnitreeRobot` class; `make_arm_motors_bus` (driven by `arm_type`) is generally used for external module loading.
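The register-then-construct pattern behind these factories can be sketched with stand-in classes (`MockArmConfig`, `MockArmBus`, and the `ARM_CLASSES` table are illustrative, not the actual unitree_deploy code):

```python
from dataclasses import dataclass

class ArmConfig:
    """Sketch of a base config carrying a registry of subclasses (illustrative)."""
    _registry: dict = {}

    @classmethod
    def register_subclass(cls, arm_type: str):
        def wrapper(subclass):
            cls._registry[arm_type] = subclass
            return subclass
        return wrapper

@ArmConfig.register_subclass("mock")
@dataclass
class MockArmConfig(ArmConfig):
    control_dt: float = 1 / 500.0

class MockArmBus:
    """Stand-in for a concrete arm driver."""
    def __init__(self, config):
        self.config = config

ARM_CLASSES = {"mock": MockArmBus}  # arm_type -> driver class

def make_arm_motors_bus(arm_type: str, **kwargs):
    # Look up the registered config class by arm_type, then build the driver.
    config = ArmConfig._registry[arm_type](**kwargs)
    return ARM_CLASSES[arm_type](config)

def make_arm_motors_buses_from_configs(configs: dict):
    # One driver per named config entry, as used when building UnitreeRobot.
    return {name: MockArmBus(cfg) for name, cfg in configs.items()}
```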
### Implementation of the arm class (unitree_deploy/robot_devices/arm/.../....py)
```python
# These methods need to be implemented and completed
def connect(self): ...
def disconnect(self): ...
def motor_names(self): ...
# connect() and disconnect() should handle initialization and homing respectively
def read_current_motor_q(self): ...
def read_current_arm_q(self): ...
def read_current_arm_dq(self): ...
# Outputs should be unified as np.ndarray
def write_arm(self): ...
# Write control commands here
def arm_ik(self): ...
# Wrap IK into your own arm class for external calling
# Private/protected properties [for reading motor names, IDs, etc.]
@property
def motor_names(self) -> list[str]:
return list(self.motors.keys())
@property
def motor_models(self) -> list[str]:
return [model for _, model in self.motors.values()]
@property
def motor_indices(self) -> list[int]:
return [idx for idx, _ in self.motors.values()]
```
All arm classes use threads internally: a `_subscribe_motor_state` thread for reading motor states and a `_ctrl_motor_state` thread for writing control commands.
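A minimal mock of this two-thread pattern (the names, and the fake first-order motor tracking, are illustrative, not the actual driver):

```python
import threading
import time

import numpy as np

class MockArm:
    """One thread subscribes to motor state, one thread streams commands."""

    def __init__(self, num_joints=6, control_dt=1 / 500.0):
        self.control_dt = control_dt
        self._q = np.zeros(num_joints)          # latest motor positions
        self._q_target = np.zeros(num_joints)   # latest commanded positions
        self._lock = threading.Lock()
        self._running = False

    def connect(self):
        self._running = True
        self._sub_thread = threading.Thread(target=self._subscribe_motor_state, daemon=True)
        self._ctrl_thread = threading.Thread(target=self._ctrl_motor_state, daemon=True)
        self._sub_thread.start()
        self._ctrl_thread.start()

    def disconnect(self):
        self._running = False
        self._sub_thread.join()
        self._ctrl_thread.join()

    def _subscribe_motor_state(self):
        while self._running:
            with self._lock:
                # A real driver reads from the bus here; we fake
                # first-order tracking of the commanded target.
                self._q += 0.5 * (self._q_target - self._q)
            time.sleep(self.control_dt)

    def _ctrl_motor_state(self):
        while self._running:
            with self._lock:
                q_target = self._q_target.copy()
            # A real driver would publish q_target to the motors here.
            time.sleep(self.control_dt)

    def write_arm(self, q_target):
        with self._lock:
            self._q_target = np.asarray(q_target, dtype=float)

    def read_current_arm_q(self):
        with self._lock:
            return self._q.copy()
```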

# How to Build Your Own Cameras
### Define your own config for cameras (unitree_deploy/robot_devices/cameras/config.py)
```python
@CameraConfig.register_subclass("opencv") # Register your custom camera config. The matching camera class receives it as def __init__(self, config: OpenCVCameraConfig):
@dataclass
class OpenCVCameraConfig(CameraConfig):
"""
    Example of tested options for Intel RealSense D405:
OpenCVCameraConfig(0, 30, 640, 480)
OpenCVCameraConfig(0, 60, 640, 480)
OpenCVCameraConfig(0, 90, 640, 480)
OpenCVCameraConfig(0, 30, 1280, 720)
"""
# Define the required camera parameters
camera_index: int
fps: int | None = None
width: int | None = None
height: int | None = None
color_mode: str = "rgb"
channels: int | None = None
rotation: int | None = None
mock: bool = False
def __post_init__(self):
if self.color_mode not in ["rgb", "bgr"]:
raise ValueError(
f"`color_mode` is expected to be 'rgb' or 'bgr', but {self.color_mode} is provided."
)
self.channels = 3
if self.rotation not in [-90, None, 90, 180]:
raise ValueError(f"`rotation` must be in [-90, None, 90, 180] (got {self.rotation})")
    # Fields without defaults (required, must be customized) come first;
    # fields with defaults (fixed or optional parameters) come after.
```
### Description of methods in your cameras class (unitree_deploy/robot_devices/cameras/utils.py)
```python
# Base class for cameras, extensible with required methods
class Camera(Protocol):
def connect(self): ...
def read(self, temporary_color: str | None = None) -> np.ndarray: ... # Single-threaded reading
def async_read(self) -> np.ndarray: ... # Multi-threaded
def disconnect(self): ...
```
How can external modules call this? **make_cameras_from_configs** (driven by the config file) is used to construct the `UnitreeRobot` class; **make_camera** (driven by `camera_type`) is generally used for external module loading.
### Implementation of the `camera` class (unitree_deploy/robot_devices/cameras/.../....py)
```python
# These need to be completed, focusing on implementing these two parts
def read(self, temporary_color: str | None = None) -> np.ndarray: ... # Single-threaded reading
def async_read(self) -> np.ndarray: ... # Multi-threaded
```
All camera classes use an internal thread so that `async_read` always returns the most recent frame.
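The `async_read` pattern can be sketched as follows (a zero-filled mock frame source stands in for the real capture device):

```python
import threading
import time

import numpy as np

class MockCamera:
    """Background thread keeps grabbing frames; async_read returns the latest."""

    def __init__(self, fps=30, width=640, height=480):
        self.dt = 1 / fps
        self.width, self.height = width, height
        self._frame = None
        self._lock = threading.Lock()
        self._running = False

    def connect(self):
        self._running = True
        self._thread = threading.Thread(target=self._capture_loop, daemon=True)
        self._thread.start()

    def _capture_loop(self):
        while self._running:
            # Real code would call e.g. cap.read() here.
            frame = np.zeros((self.height, self.width, 3), dtype=np.uint8)
            with self._lock:
                self._frame = frame
            time.sleep(self.dt)

    def read(self):
        # Single-shot, blocking read (real code grabs from the device here).
        return np.zeros((self.height, self.width, 3), dtype=np.uint8)

    def async_read(self):
        # Non-blocking: return the most recent frame captured by the thread.
        while True:
            with self._lock:
                if self._frame is not None:
                    return self._frame
            time.sleep(0.001)

    def disconnect(self):
        self._running = False
        self._thread.join()
```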

# How to Build Your Own End-Effector [Currently dex_1 and dex_3 are available]
### Define your own config for the end-effector (unitree_deploy/robot_devices/endeffector/config.py)
```python
@EndEffectorConfig.register_subclass("gripper") # Register your custom end-effector wrapper. Here it uses def __init__(self, config: GripperConfig):
@dataclass
class GripperConfig(EndEffectorConfig):
motors: dict[str, tuple[int, str]]
unit_test: bool = False
control_dt: float = 1/200
mock: bool = False
def __post_init__(self):
        if self.control_dt < 0.002:
            raise ValueError(f"`control_dt` must be >= 1/500 s (got {self.control_dt})")
    # Fields without defaults (required parameters) must come first;
    # fields with defaults (fixed or less important parameters) come after.
```
### Description of methods in your end-effector class (unitree_deploy/robot_devices/endeffector/utils.py)
```python
# Base class for EndEffector, extend with required methods
class EndEffector(Protocol):
def connect(self): ...
def disconnect(self): ...
def motor_names(self): ...
def read_current_endeffector_q(self): ...
def read_current_endeffector_dq(self): ...
def write_endeffector(self): ...
def endeffector_ik(self): ...
```
How to call externally? `make_endeffector_motors_buses_from_configs` (driven by the config file) constructs the end-effectors used in the `UnitreeRobot` class; `make_endeffector_motors_bus` (driven by `endeffector_type`) is typically used for external module loading.
### Implementation of your end-effector class (unitree_deploy/robot_devices/endeffector/.../....py)
```python
# These methods need to be implemented and completed
def connect(self): ...
def disconnect(self): ...
def motor_names(self): ...
# connect() and disconnect() should handle initialization and homing respectively
def read_current_endeffector_q(self): ...
def read_current_endeffector_dq(self): ...
# Outputs should be unified as np.ndarray
def write_endeffector(self): ...
# Write control commands here
def endeffector_ik(self): ...
    # Wrap IK into your own end-effector class, to be called externally
# Private/protected properties
# (for reading motor names, IDs, etc. These will be used in UnitreeRobot class for dataset encapsulation)
@property
def motor_names(self) -> list[str]:
return list(self.motors.keys())
@property
def motor_models(self) -> list[str]:
return [model for _, model in self.motors.values()]
@property
def motor_indices(self) -> list[int]:
return [idx for idx, _ in self.motors.values()]
```
As with the arms, use threading to implement `_subscribe_gripper_motor_state` (a thread for reading motor states) and `_ctrl_gripper_motor` (a thread for motor control); both threads run internally within the class.

# How to Build Your Own Robot
### Add your own config (unitree_deploy/robot/robot_configs.py)
The base class of robot config is defined as **UnitreeRobotConfig**
```python
class UnitreeRobotConfig(RobotConfig):
cameras: dict[str, CameraConfig] = field(default_factory=lambda: {}) # Corresponding to your own camera
arm: dict[str, ArmConfig] = field(default_factory=lambda: {}) # Corresponding to your own arm
endeffector: dict[str, EndEffectorConfig] = field(default_factory=lambda: {}) # Corresponding to your own end-effector
mock: bool = False # Simulation [To be implemented, for debugging, to check some class definitions and message type formats]
```
A concrete example: fill in each `[name]: robot_devices` entry separately for cameras, arm, and endeffector.
Entries that are not provided default to empty and do not affect the system.
(In principle, different robot_devices, in different quantities, can be combined.)
```python
class Z1dual_Dex1_Opencv_RobotConfig(UnitreeRobotConfig):
# Troubleshooting: If one of your IntelRealSense cameras freezes during
# data recording due to bandwidth limit, you might need to plug the camera
# into another USB hub or PCIe card.
cameras: dict[str, CameraConfig] = field(
default_factory=lambda: { # Add corresponding configs for different cameras [name]:OpenCVCameraConfig + required parameters
"cam_high": OpenCVCameraConfig(
camera_index="/dev/video0",
fps=30,
width=640,
height=480,
),
"cam_left_wrist": OpenCVCameraConfig(
camera_index="/dev/video2",
fps=30,
width=640,
height=480,
),
"cam_right_wrist": OpenCVCameraConfig(
camera_index="/dev/video4",
fps=30,
width=640,
height=480,
),
}
)
arm: dict[str, ArmConfig] = field(
default_factory=lambda: {
"z1_dual": Z1DualArmConfig( # Add corresponding configs for different arms [name]:Z1DualArmConfig + required parameters
unit_test = False,
init_pose_left = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
init_pose_right = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
control_dt = 1/500.0,
motors={
# name: (index, model)
"kLeftWaist": [0, "z1-joint"],
"kLeftShoulder": [1, "z1-joint"],
"kLeftElbow": [2, "z1-joint"],
"kLeftForearmRoll": [3, "z1-joint"],
"kLeftWristAngle": [4, "z1-joint"],
"kLeftWristRotate": [5, "z1-joint"],
"kRightWaist": [7, "z1-joint"],
"kRightShoulder": [8, "z1-joint"],
"kRightElbow": [9, "z1-joint"],
"kRightForearmRoll": [10, "z1-joint"],
"kRightWristAngle": [11, "z1-joint"],
"kRightWristRotate": [12, "z1-joint"],
},
),
}
)
endeffector: dict[str, EndEffectorConfig] = field(
default_factory=lambda: {
"gripper": GripperConfig( # Add corresponding configs for different end-effectors [name]:GripperConfig + required parameters
                unit_test = False,
control_dt = 1/250,
motors={
# name: (index, model)
"kLeftGripper": [0, "z1_gripper-joint"],
"kRightGripper": [1, "z1_gripper-joint"],
},
),
}
)
mock: bool = False
```
---
### robot utils (unitree_deploy/robot/utils.py)
```python
# Implementation of the Robot base class
class Robot(Protocol):
    robot_type: str
    features: dict
    def connect(self): ...  # Connect devices (the cameras, arms, and end-effectors from robot_devices)
    def capture_observation(self): ...  # Get the current state, including camera + arm + end-effector data
    def send_action(self, action): ...  # Send an action to the arm + end-effector actuators (used for model inference and data replay)
    def disconnect(self): ...  # Disconnect devices
```
Externally, **make_robot_from_config** and **make_robot** are called in **control_robot** to initialize the robot and implement the specific functions.
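A stand-in object satisfying the `Robot` protocol makes the connect → observe → act → disconnect lifecycle concrete (everything here is illustrative; numpy replaces torch to keep the sketch dependency-light):

```python
import numpy as np

class MockRobot:
    """Minimal object satisfying the Robot protocol (illustrative only)."""
    robot_type = "mock"
    features: dict = {}

    def __init__(self):
        self.is_connected = False

    def connect(self):
        self.is_connected = True

    def capture_observation(self):
        # A real robot returns camera images plus joint states.
        return {"observation.state": np.zeros(7), "observation.images": {}}

    def send_action(self, action, t_command_target=None):
        # A real robot converts/clips the action and forwards it to the actuators.
        return action

    def disconnect(self):
        self.is_connected = False

def make_robot_from_config(config):
    # Sketch: the real factory dispatches on the config's type.
    return MockRobot()

robot = make_robot_from_config(None)
robot.connect()
obs = robot.capture_observation()
action = robot.send_action(np.zeros(7))
robot.disconnect()
```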
---
### manipulator (unitree_deploy/robot/manipulator.py)
UnitreeRobot is initialized from a **UnitreeRobotConfig**.
```python
# Several important parts of the implementation
def capture_observation(self): ...  # Get the current observation; returns {observation.state, observation.images}
def send_action(  # Model inference and data replay; receives an action plus a target time
    self, action: torch.Tensor, t_command_target: float | None = None
) -> torch.Tensor:
    # Input: device data.
    # Output: (arm + end-effector) joint angle positions, end-effector poses, or other converted data (IK is implemented here!).
    # The output is uniformly converted into joint angle positions:
    # {"left": arm_joint_points, "right": arm_joint_points} + {"left": endeffector_joint_points, "right": endeffector_joint_points}
    # Why distinguish left and right? It separates the single-arm case and lets different
    # arms pair with different end-effectors, so one implementation works for all of them.
def convert_data_based_on_robot_type(
    self, robot_type: str, leader_pos: dict[str, np.ndarray]
) -> None | tuple[dict[str, np.ndarray], dict[str, np.ndarray]]: ...
```
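The left/right split that `send_action` produces can be illustrated for a dual-arm-plus-gripper layout (the 6+6+1+1 ordering below is an assumption for illustration, not the actual conversion code):

```python
import numpy as np

def split_dual_arm_action(action: np.ndarray):
    """Split a flat 14-D action into the left/right dictionaries described above.

    Assumed layout (illustrative): 6 left joints, 6 right joints,
    then left gripper, right gripper.
    """
    arm = {"left": action[0:6], "right": action[6:12]}
    endeffector = {"left": action[12:13], "right": action[13:14]}
    return arm, endeffector

arm, endeffector = split_dual_arm_action(np.arange(14.0))
```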

[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
[project]
name = "unitree_deploy"
version = "0.0.3"
description = "unitree deploy"
readme = "README.md"
requires-python = ">=3.10"
license = { text = "BSD-3-Clause" }
authors = [
{ name = "hengguo", email = "rd_gh@unitree.com" }
]
keywords = ["unitree", "robotics", "deployment"]
dependencies = [
"tyro",
"draccus",
"datasets==3.3.0",
"meshcat",
"pyrealsense2",
"numpy",
"opencv-python",
"mujoco",
"matplotlib",
"dm_env",
"torch>=2.2.1,<2.8.0",
"rerun-sdk>=0.21.0,<0.23.0",
]
[tool.setuptools]
packages = ["unitree_deploy"]
[project.optional-dependencies]
lerobot = [
"lerobot @ git+https://github.com/huggingface/lerobot.git@0878c68"
]
[tool.ruff]
line-length = 110
target-version = "py310"
exclude = ["build", "venv", "__pycache__"]
fix = true
show-fixes = true
[tool.ruff.lint]
select = ["E4", "E7", "E9", "F", "I", "N", "B", "C4", "SIM"]
ignore = ["N801"]
[tool.ruff.lint.per-file-ignores]
"arm_indexs.py" = ["N815"]

import argparse
import os
import time
import cv2
import numpy as np
import torch
import tqdm
from typing import Any, Deque, MutableMapping, OrderedDict
from collections import deque
from pathlib import Path
from unitree_deploy.real_unitree_env import make_real_env
from unitree_deploy.utils.eval_utils import (
ACTTemporalEnsembler,
LongConnectionClient,
populate_queues,
)
# -----------------------------------------------------------------------------
# Network & environment defaults
# -----------------------------------------------------------------------------
os.environ["http_proxy"] = ""
os.environ["https_proxy"] = ""
HOST = "127.0.0.1"
PORT = 8000
BASE_URL = f"http://{HOST}:{PORT}"
# fmt: off
INIT_POSE = {
'g1_dex1': np.array([0.10559805, 0.02726714, -0.01210221, -0.33341318, -0.22513399, -0.02627627, -0.15437093, 0.1273793 , -0.1674708 , -0.11544029, -0.40095493, 0.44332668, 0.11566751, 0.3936641, 5.4, 5.4], dtype=np.float32),
'z1_dual_dex1_realsense': np.array([-1.0262332, 1.4281361, -1.2149128, 0.6473399, -0.12425245, 0.44945636, 0.89584476, 1.2593982, -1.0737865, 0.6672816, 0.39730102, -0.47400007, 0.9894176, 0.9817477 ], dtype=np.float32),
'z1_realsense': np.array([-0.06940782, 1.4751548, -0.7554075, 1.0501366, 0.02931615, -0.02810347, -0.99238837], dtype=np.float32),
}
ZERO_ACTION = {
'g1_dex1': torch.zeros(16, dtype=torch.float32),
'z1_dual_dex1_realsense': torch.zeros(14, dtype=torch.float32),
'z1_realsense': torch.zeros(7, dtype=torch.float32),
}
CAM_KEY = {
'g1_dex1': 'cam_right_high',
'z1_dual_dex1_realsense': 'cam_high',
'z1_realsense': 'cam_high',
}
# fmt: on
def prepare_observation(args: argparse.Namespace, obs: Any) -> OrderedDict:
"""
Convert a raw env observation into the model's expected input dict.
"""
rgb_image = cv2.cvtColor(
obs.observation["images"][CAM_KEY[args.robot_type]], cv2.COLOR_BGR2RGB)
observation = {
"observation.images.top":
torch.from_numpy(rgb_image).permute(2, 0, 1),
"observation.state":
torch.from_numpy(obs.observation["qpos"]),
"action": ZERO_ACTION[args.robot_type],
}
return OrderedDict(observation)
def run_policy(
args: argparse.Namespace,
env: Any,
client: LongConnectionClient,
temporal_ensembler: ACTTemporalEnsembler,
cond_obs_queues: MutableMapping[str, Deque[torch.Tensor]],
output_dir: Path,
) -> None:
"""
Single rollout loop:
1) warm start the robot,
2) stream observations,
3) fetch actions from the policy server,
4) execute with temporal ensembling for smoother control.
"""
_ = env.step(INIT_POSE[args.robot_type])
time.sleep(2.0)
t = 0
while True:
        # Capture observation
obs = env.get_observation(t)
# Format observation
obs = prepare_observation(args, obs)
cond_obs_queues = populate_queues(cond_obs_queues, obs)
# Call server to get actions
pred_actions = client.predict_action(args.language_instruction,
cond_obs_queues).unsqueeze(0)
# Keep only the next horizon of actions and apply temporal ensemble smoothing
actions = temporal_ensembler.update(
pred_actions[:, :args.action_horizon])[0]
# Execute the actions
for n in range(args.exe_steps):
action = actions[n].cpu().numpy()
print(f">>> Exec => step {n} action: {action}", flush=True)
print("---------------------------------------------")
# Maintain real-time loop at `control_freq` Hz
t1 = time.time()
obs = env.step(action)
time.sleep(max(0, 1 / args.control_freq - time.time() + t1))
t += 1
# Prime the queue for the next action step (except after the last one in this chunk)
if n < args.exe_steps - 1:
obs = prepare_observation(args, obs)
cond_obs_queues = populate_queues(cond_obs_queues, obs)
def run_eval(args: argparse.Namespace) -> None:
client = LongConnectionClient(BASE_URL)
    # Initialize the ACT temporal moving-average smoother
temporal_ensembler = ACTTemporalEnsembler(temporal_ensemble_coeff=0.01,
chunk_size=args.action_horizon,
exe_steps=args.exe_steps)
temporal_ensembler.reset()
# Initialize observation and action horizon queue
cond_obs_queues = {
"observation.images.top": deque(maxlen=args.observation_horizon),
"observation.state": deque(maxlen=args.observation_horizon),
        "action": deque(maxlen=16),  # NOTE: hard-coded, since the model predicts 16 future steps
}
env = make_real_env(
robot_type=args.robot_type,
dt=1 / args.control_freq,
)
env.connect()
try:
for episode_idx in tqdm.tqdm(range(0, args.num_rollouts_planned)):
output_dir = Path(args.output_dir) / f"episode_{episode_idx:03d}"
output_dir.mkdir(parents=True, exist_ok=True)
run_policy(args, env, client, temporal_ensembler, cond_obs_queues,
output_dir)
    finally:
        env.close()
def get_parser() -> argparse.ArgumentParser:
parser = argparse.ArgumentParser()
parser.add_argument("--robot_type",
type=str,
default="g1_dex1",
help="The type of the robot embodiment.")
parser.add_argument(
"--action_horizon",
type=int,
default=16,
help="Number of future actions, predicted by the policy, to keep",
)
parser.add_argument(
"--exe_steps",
type=int,
default=16,
help=
"Number of future actions to execute, which must be less than the above action horizon.",
)
parser.add_argument(
"--observation_horizon",
type=int,
default=2,
help="Number of most recent frames/states to consider.",
)
parser.add_argument(
"--language_instruction",
type=str,
default="Pack black camera into box",
help="The language instruction provided to the policy server.",
)
parser.add_argument("--num_rollouts_planned",
type=int,
default=10,
help="The number of rollouts to run.")
parser.add_argument("--output_dir",
type=str,
default="./results",
help="The directory for saving results.")
parser.add_argument("--control_freq",
type=float,
default=30,
                        help="The low-level control frequency in Hz.")
return parser
if __name__ == "__main__":
parser = get_parser()
args = parser.parse_args()
run_eval(args)

import time
import numpy as np
import pinocchio as pin
from unitree_deploy.robot.robot_configs import g1_motors
from unitree_deploy.robot_devices.arm.configs import G1ArmConfig
from unitree_deploy.robot_devices.arm.utils import make_arm_motors_buses_from_configs
from unitree_deploy.robot_devices.robots_devices_utils import precise_wait
if __name__ == "__main__":
# ============== Arm Configuration ==============
def g1_dual_arm_default_factory():
return {
"g1": G1ArmConfig(
init_pose=np.zeros(14),
motors=g1_motors,
mock=False,
),
}
# ==============================================
# Initialize and connect to the robotic arm
arm = make_arm_motors_buses_from_configs(g1_dual_arm_default_factory())
for name in arm:
arm[name].connect()
time.sleep(1.5)
print("✅ Arm connected. Waiting to start...")
# Define initial target poses for left and right arms
L_tf_target = pin.SE3(
pin.Quaternion(1, 0, 0, 0),
np.array([0.25, +0.25, 0.1]),
)
R_tf_target = pin.SE3(
pin.Quaternion(1, 0, 0, 0),
np.array([0.25, -0.25, 0.1]),
)
rotation_speed = 0.005 # Rotation speed in radians per iteration
# Motion parameters
control_dt = 1 / 50 # Control cycle duration (20ms)
step = 0
max_step = 240
initial_data_received = True # Used to switch from drive to schedule mode
# Wait for user input to start the motion loop
user_input = input("Please enter the start signal (enter 's' to start the subsequent program): \n")
if user_input.lower() == "s":
try:
while True:
# Define timing for the control cycle
t_cycle_end = time.monotonic() + control_dt
t_command_target = t_cycle_end + control_dt
direction = 1 if step <= 120 else -1
angle = rotation_speed * (step if step <= 120 else (240 - step))
cos_half_angle = np.cos(angle / 2)
sin_half_angle = np.sin(angle / 2)
                L_quat = pin.Quaternion(cos_half_angle, 0, sin_half_angle, 0)  # Rotation about the Y axis
                R_quat = pin.Quaternion(cos_half_angle, 0, 0, sin_half_angle)  # Rotation about the Z axis
delta_l = np.array([0.001, 0.001, 0.001]) * direction
delta_r = np.array([0.001, -0.001, 0.001]) * direction
L_tf_target.translation += delta_l
R_tf_target.translation += delta_r
L_tf_target.rotation = L_quat.toRotationMatrix()
R_tf_target.rotation = R_quat.toRotationMatrix()
# Solve inverse kinematics for the arm
for name in arm:
sol_q, sol_tauff = arm[name].arm_ik(L_tf_target.homogeneous, R_tf_target.homogeneous)
print(f"Arm {name} solution: q={sol_q}, tauff={sol_tauff}")
# Determine command mode
cmd_target = "drive_to_waypoint" if initial_data_received else "schedule_waypoint"
# Send joint target command to arm
                    arm[name].write_arm(
                        q_target=sol_q,
                        tauff_target=sol_tauff,  # Optional: send torque feedforward
                        time_target=t_command_target - time.monotonic() + time.perf_counter(),
                        cmd_target=cmd_target,
                    )
# Update step and reset after full cycle
step = (step + 1) % (max_step + 1)
initial_data_received = False
# Wait until end of control cycle
precise_wait(t_cycle_end)
except KeyboardInterrupt:
# Handle Ctrl+C to safely disconnect
print("\n🛑 Ctrl+C detected. Disconnecting arm...")
for name in arm:
arm[name].disconnect()
print("✅ Arm disconnected. Exiting.")

import math
import time
import numpy as np
import pinocchio as pin
from unitree_deploy.real_unitree_env import make_real_env
from unitree_deploy.utils.rerun_visualizer import RerunLogger, flatten_images, visualization_data
from unitree_deploy.utils.rich_logger import log_info
from unitree_deploy.utils.trajectory_generator import sinusoidal_gripper_motion
if __name__ == "__main__":
period = 2.0
motion_period = 2.0
motion_amplitude = 0.99
rerun_logger = RerunLogger()
env = make_real_env(robot_type="g1_dex1", dt=1 / 30)
env.connect()
# Define initial target poses for left and right arms
L_tf_target = pin.SE3(
pin.Quaternion(1, 0, 0, 0),
np.array([0.25, +0.25, 0.1]),
)
R_tf_target = pin.SE3(
pin.Quaternion(1, 0, 0, 0),
np.array([0.25, -0.25, 0.1]),
)
rotation_speed = 0.005 # Rotation speed in radians per iteration
# Motion parameters
control_dt = 1 / 50 # Control cycle duration (20ms)
step = 0
max_step = 240
initial_data_received = True # Used to switch from drive to schedule mode
# Wait for user input to start the motion loop
user_input = input("Please enter the start signal (enter 's' to start the subsequent program): \n")
if user_input.lower() == "s":
try:
current_time = math.pi / 2
idx = 0 # Initialize index for logging
while True:
# Define timing for the control cycle
t_cycle_end = time.monotonic() + control_dt
t_command_target = t_cycle_end + control_dt
direction = 1 if step <= 120 else -1
angle = rotation_speed * (step if step <= 120 else (240 - step))
cos_half_angle = np.cos(angle / 2)
sin_half_angle = np.sin(angle / 2)
                L_quat = pin.Quaternion(cos_half_angle, 0, sin_half_angle, 0)  # Rotation about the Y axis
                R_quat = pin.Quaternion(cos_half_angle, 0, 0, sin_half_angle)  # Rotation about the Z axis
delta_l = np.array([0.001, 0.001, 0.001]) * direction
delta_r = np.array([0.001, -0.001, 0.001]) * direction
L_tf_target.translation += delta_l
R_tf_target.translation += delta_r
L_tf_target.rotation = L_quat.toRotationMatrix()
R_tf_target.rotation = R_quat.toRotationMatrix()
                # Solve inverse kinematics for both arms
for arm_name in env.robot.arm:
arm_sol_q, arm_sol_tauff = env.robot.arm[arm_name].arm_ik(
L_tf_target.homogeneous, R_tf_target.homogeneous
)
gripper_target_q = sinusoidal_gripper_motion(
period=motion_period, amplitude=motion_amplitude, current_time=time.perf_counter()
)
action = np.concatenate([arm_sol_q, gripper_target_q], axis=0)
step_type, reward, _, observation = env.step(action)
idx += 1
visualization_data(idx, flatten_images(observation), observation["qpos"], arm_sol_q, rerun_logger)
# Update step and reset after full cycle
current_time += control_dt
step = (step + 1) % (max_step + 1)
except KeyboardInterrupt:
# Handle Ctrl+C to safely disconnect
log_info("\n🛑 Ctrl+C detected. Disconnecting arm...")
env.close()

import math
import time
import numpy as np
import pinocchio as pin
from unitree_deploy.robot.robot_configs import z1_motors
from unitree_deploy.robot_devices.arm.utils import make_arm_motors_bus
from unitree_deploy.robot_devices.robots_devices_utils import precise_wait
from unitree_deploy.utils.trajectory_generator import generate_rotation, sinusoidal_gripper_motion
if __name__ == "__main__":
# ============== Arm Configuration ==============
arm_type = "z1"
arm_kwargs = {
"arm_type": arm_type,
"init_pose": [0.00623, 1.11164, -0.77531, -0.32167, -0.005, 0.0, 0.0], # Initial joint pose
"motors": z1_motors,
}
# ==============================================
# Initialize and connect to the robotic arm
arm = make_arm_motors_bus(**arm_kwargs)
arm.connect()
time.sleep(2)
print("✅ Arm connected. Waiting to start...")
# Define arm initial target poses
arm_tf_target = pin.SE3(pin.Quaternion(1, 0, 0, 0), np.array([0.2, 0, 0.4]))
# Motion parameters
rotation_speed = 0.01 # Rotation speed (rad per step)
    control_dt = 1 / 30  # Control cycle duration (~33 ms)
step = 0
max_step = 240 # Full motion cycle
# Wait for user input to start the motion loop
user_input = input("Please enter the start signal (enter 's' to start the subsequent program): \n")
if user_input.lower() == "s":
try:
current_time = math.pi / 2
while True:
# Define timing for the control cycle
t_cycle_end = time.monotonic() + control_dt
t_command_target = t_cycle_end + control_dt
# Generate target rotation and translation
L_quat, R_quat, delta_l, delta_r = generate_rotation(step, rotation_speed, max_step)
arm_tf_target.translation += delta_l
# delta_r is not used in this context
arm_tf_target.rotation = L_quat.toRotationMatrix()
                # Solve inverse kinematics for the arm
arm_sol_q, arm_sol_tauff = arm.arm_ik(arm_tf_target.homogeneous)
                # Generate sinusoidal motion for the gripper
                target_gripper = (
                    sinusoidal_gripper_motion(period=4.0, amplitude=0.99, current_time=current_time) - 1
                )  # Shift target_q by subtracting 1
                target_arm = np.concatenate((arm_sol_q, target_gripper), axis=0)  # Append the gripper target
arm.write_arm(
q_target=target_arm,
# tauff_target=left_sol_tauff, # Optional: send torque feedforward
time_target=t_command_target - time.monotonic() + time.perf_counter(),
cmd_target="schedule_waypoint",
)
# Update step and reset after full cycle
step = (step + 1) % (max_step + 1)
current_time += control_dt
# Wait until end of control cycle
precise_wait(t_cycle_end)
except KeyboardInterrupt:
# Handle Ctrl+C to safely disconnect
print("\n🛑 Ctrl+C detected. Disconnecting arm...")
arm.disconnect()
print("✅ Arm disconnected. Exiting.")

import time
import numpy as np
import pinocchio as pin
from unitree_deploy.robot_devices.arm.configs import Z1DualArmConfig
from unitree_deploy.robot_devices.arm.utils import make_arm_motors_buses_from_configs
from unitree_deploy.robot_devices.robots_devices_utils import precise_wait
from unitree_deploy.utils.trajectory_generator import generate_rotation
if __name__ == "__main__":
# ============== Arm Configuration ==============
def z1_dual_arm_single_config_factory():
return {
"z1_dual": Z1DualArmConfig(
left_robot_ip="127.0.0.1",
left_robot_port1=8073,
left_robot_port2=8074,
right_robot_ip="127.0.0.1",
right_robot_port1=8071,
right_robot_port2=8072,
init_pose_left=[0, 0, 0, 0, 0, 0],
init_pose_right=[0, 0, 0, 0, 0, 0],
control_dt=1 / 250.0,
motors={
# name: (index, model)
"kLeftWaist": [0, "z1-joint"],
"kLeftShoulder": [1, "z1-joint"],
"kLeftElbow": [2, "z1-joint"],
"kLeftForearmRoll": [3, "z1-joint"],
"kLeftWristAngle": [4, "z1-joint"],
"kLeftWristRotate": [5, "z1-joint"],
"kRightWaist": [7, "z1-joint"],
"kRightShoulder": [8, "z1-joint"],
"kRightElbow": [9, "z1-joint"],
"kRightForearmRoll": [10, "z1-joint"],
"kRightWristAngle": [11, "z1-joint"],
"kRightWristRotate": [12, "z1-joint"],
},
),
}
# ==============================================
# Initialize and connect to the robotic arm
arm = make_arm_motors_buses_from_configs(z1_dual_arm_single_config_factory())
for name in arm:
arm[name].connect()
time.sleep(1.5)
print("✅ Arm connected. Waiting to start...")
# Define initial target poses for left and right arms
L_tf_target = pin.SE3(pin.Quaternion(1, 0, 0, 0), np.array([0.2, 0, 0.4]))
R_tf_target = pin.SE3(pin.Quaternion(1, 0, 0, 0), np.array([0.2, 0, 0.3]))
# Motion parameters
rotation_speed = 0.01 # Rotation speed (rad per step)
    control_dt = 1 / 30  # Control cycle duration (~33 ms)
step = 0
max_step = 240 # Full motion cycle
initial_data_received = True # Used to switch from drive to schedule mode
# Wait for user input to start the motion loop
user_input = input("Please enter the start signal (enter 's' to start the subsequent program): \n")
if user_input.lower() == "s":
try:
while True:
# Define timing for the control cycle
t_cycle_end = time.monotonic() + control_dt
t_command_target = t_cycle_end + control_dt
# Generate target rotation and translation
L_quat, R_quat, delta_l, delta_r = generate_rotation(step, rotation_speed, max_step)
# Apply translation deltas to target pose
L_tf_target.translation += delta_l
R_tf_target.translation += delta_r
# Apply rotation to target pose
L_tf_target.rotation = L_quat.toRotationMatrix()
R_tf_target.rotation = R_quat.toRotationMatrix()
                # Solve inverse kinematics for both arms
for name in arm:
sol_q, sol_tauff = arm[name].arm_ik(L_tf_target.homogeneous, R_tf_target.homogeneous)
# Determine command mode
cmd_target = "drive_to_waypoint" if initial_data_received else "schedule_waypoint"
# Send joint target command to arm
for name in arm:
arm[name].write_arm(
q_target=sol_q,
# tauff_target=sol_tauff, # Optional: send torque feedforward
time_target=t_command_target - time.monotonic() + time.perf_counter(),
cmd_target=cmd_target,
)
# Update step and reset after full cycle
step = (step + 1) % (max_step + 1)
initial_data_received = False
# Wait until end of control cycle
precise_wait(t_cycle_end)
except KeyboardInterrupt:
# Handle Ctrl+C to safely disconnect
print("\n🛑 Ctrl+C detected. Disconnecting arm...")
for name in arm:
arm[name].disconnect()
print("✅ Arm disconnected. Exiting.")

import math
import time
import numpy as np
import pinocchio as pin
from unitree_deploy.real_unitree_env import make_real_env
from unitree_deploy.utils.rerun_visualizer import RerunLogger, flatten_images, visualization_data
from unitree_deploy.utils.rich_logger import log_info
from unitree_deploy.utils.trajectory_generator import generate_rotation, sinusoidal_gripper_motion
if __name__ == "__main__":
rerun_logger = RerunLogger()
env = make_real_env(robot_type="z1_realsense", dt=1 / 30)
env.connect()
    # Define the initial target pose for the arm
arm_tf_target = pin.SE3(pin.Quaternion(1, 0, 0, 0), np.array([0.2, 0, 0.4]))
# Motion parameters
rotation_speed = 0.01 # Rotation speed (rad per step)
    control_dt = 1 / 30  # Control cycle duration (~33 ms)
step = 0
max_step = 240 # Full motion cycle
# Wait for user input to start the motion loop
user_input = input("Please enter the start signal (enter 's' to start the subsequent program): \n")
if user_input.lower() == "s":
try:
current_time = math.pi / 2
idx = 0 # Initialize index for logging
while True:
# Define timing for the control cycle
t_cycle_end = time.monotonic() + control_dt
t_command_target = t_cycle_end + control_dt
# Generate target rotation and translation
L_quat, R_quat, delta_l, delta_r = generate_rotation(step, rotation_speed, max_step)
arm_tf_target.translation += delta_l
# delta_r is not used in this context
arm_tf_target.rotation = L_quat.toRotationMatrix()
# Solve inverse kinematics for the left arm
for arm_name in env.robot.arm:
arm_sol_q, arm_sol_tauff = env.robot.arm[arm_name].arm_ik(arm_tf_target.homogeneous)
# Generate sinusoidal motion for the gripper
                target_gripper = (
                    sinusoidal_gripper_motion(period=4.0, amplitude=0.99, current_time=current_time) - 1
                )  # Shift the sinusoid so the gripper target lies in [-1, 0]
                target_arm = np.concatenate((arm_sol_q, target_gripper), axis=0)  # Append the gripper target to the arm joints
step_type, reward, _, observation = env.step(target_arm)
idx += 1
visualization_data(idx, flatten_images(observation), observation["qpos"], target_arm, rerun_logger)
# Update step and reset after full cycle
current_time += control_dt
step = (step + 1) % (max_step + 1)
except KeyboardInterrupt:
# Handle Ctrl+C to safely disconnect
log_info("\n🛑 Ctrl+C detected. Disconnecting arm...")
env.close()


@@ -0,0 +1,64 @@
import time
import cv2
import numpy as np
import torch
from tqdm import tqdm
from unitree_deploy.robot_devices.cameras.configs import ImageClientCameraConfig
from unitree_deploy.robot_devices.cameras.utils import make_cameras_from_configs
from unitree_deploy.utils.rich_logger import log_success
# ============================From configs============================
def run_camera():
def image_client_default_factory():
return {
"imageclient": ImageClientCameraConfig(
head_camera_type="opencv",
head_camera_id_numbers=[4],
head_camera_image_shape=[480, 1280], # Head camera resolution
wrist_camera_type="opencv",
wrist_camera_id_numbers=[0, 2],
wrist_camera_image_shape=[480, 640], # Wrist camera resolution
aspect_ratio_threshold=2.0,
fps=30,
mock=False,
),
}
# ===========================================
cameras = make_cameras_from_configs(image_client_default_factory())
print(cameras)
for name in cameras:
cameras[name].connect()
log_success(f"Connecting {name} cameras.")
for _ in tqdm(range(20), desc="Camera warming up"):
for name in cameras:
cameras[name].async_read()
time.sleep(1 / 30)
while True:
images = {}
for name in cameras:
output = cameras[name].async_read()
if isinstance(output, dict):
for k, v in output.items():
images[k] = torch.from_numpy(v)
else:
images[name] = torch.from_numpy(output)
image_list = [np.stack([img.numpy()] * 3, axis=-1) if img.ndim == 2 else img.numpy() for img in images.values()]
stacked_image = np.hstack(image_list)
cv2.imshow("Stacked Image", stacked_image)
if (cv2.waitKey(1) & 0xFF) == ord("q"):
cv2.destroyAllWindows()
break
if __name__ == "__main__":
run_camera()


@@ -0,0 +1,66 @@
import time
import cv2
import numpy as np
from unitree_deploy.robot_devices.cameras.configs import IntelRealSenseCameraConfig
from unitree_deploy.robot_devices.cameras.utils import make_cameras_from_configs
from unitree_deploy.utils.rich_logger import log_success
def run_camera():
# ===========================================
def intelrealsense_camera_default_factory():
return {
"cam_high": IntelRealSenseCameraConfig(
serial_number="044122071036",
fps=30,
width=640,
height=480,
),
"cam_wrist": IntelRealSenseCameraConfig(
serial_number="419122270615",
fps=30,
width=640,
height=480,
),
}
# ===========================================
cameras = make_cameras_from_configs(intelrealsense_camera_default_factory())
for name in cameras:
cameras[name].connect()
log_success(f"Connecting {name} cameras.")
for _ in range(20):
for name in cameras:
cameras[name].async_read()
time.sleep(1 / 30)
while True:
images = []
for name in cameras:
frame = cameras[name].async_read()
if frame is not None:
frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
cv2.putText(frame, name, (10, 25), cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
images.append(frame)
if images:
rows = []
for i in range(0, len(images), 2):
row = np.hstack(images[i : i + 2])
rows.append(row)
canvas = np.vstack(rows)
cv2.imshow("All Cameras", canvas)
if cv2.waitKey(1) & 0xFF == ord("q"):
break
cv2.destroyAllWindows()
if __name__ == "__main__":
run_camera()


@@ -0,0 +1,95 @@
import time
import cv2
import numpy as np
import tyro
from tqdm import tqdm
from unitree_deploy.robot_devices.cameras.configs import OpenCVCameraConfig
from unitree_deploy.robot_devices.cameras.utils import make_camera, make_cameras_from_configs
from unitree_deploy.utils.rich_logger import log_success
def usb_camera_default_factory():
return {
"cam_high": OpenCVCameraConfig(
camera_index="/dev/video1",
fps=30,
width=640,
height=480,
),
"cam_left_wrist": OpenCVCameraConfig(
camera_index="/dev/video3",
fps=30,
width=640,
height=480,
),
"cam_right_wrist": OpenCVCameraConfig(
camera_index="/dev/video5",
fps=30,
width=640,
height=480,
),
}
def run_cameras(camera_style: int = 0):
"""
Runs camera(s) based on the specified style.
Args:
camera_style (int):
0 - Single camera (OpenCV).
1 - Multiple cameras from config.
"""
if camera_style == 0:
# ========== Single camera ==========
camera_kwargs = {"camera_type": "opencv", "camera_index": "/dev/video5", "mock": False}
camera = make_camera(**camera_kwargs)
camera.connect()
log_success("Connecting camera.")
while True:
color_image = camera.read()
color_image = cv2.cvtColor(color_image, cv2.COLOR_BGR2RGB)
cv2.imshow("Camera", color_image)
if cv2.waitKey(1) & 0xFF == ord("q"):
break
elif camera_style == 1:
# ========== Multi-camera from configs ==========
cameras = make_cameras_from_configs(usb_camera_default_factory())
for name in cameras:
cameras[name].connect()
log_success(f"Connecting {name} camera.")
# Camera warm-up
for _ in tqdm(range(20), desc="Camera warming up"):
for name in cameras:
cameras[name].async_read()
time.sleep(1 / 30)
while True:
images = {}
for name in cameras:
images[name] = cameras[name].async_read()
            # async_read returns NumPy arrays here, so no .numpy() conversion is needed
            image_list = [
                np.stack([img] * 3, axis=-1) if img.ndim == 2 else img for img in images.values()
            ]
stacked_image = np.hstack(image_list)
cv2.imshow("Multi-Camera View", stacked_image)
if (cv2.waitKey(1) & 0xFF) == ord("q"):
cv2.destroyAllWindows()
break
else:
raise ValueError(f"Unsupported camera_style: {camera_style}")
if __name__ == "__main__":
tyro.cli(run_cameras)


@@ -0,0 +1,60 @@
import time
import tyro
from unitree_deploy.robot_devices.endeffector.utils import (
Dex1_GripperConfig,
make_endeffector_motors_buses_from_configs,
)
from unitree_deploy.robot_devices.robots_devices_utils import precise_wait
from unitree_deploy.utils.rich_logger import log_success
from unitree_deploy.utils.trajectory_generator import sinusoidal_single_gripper_motion
period = 2.0
motion_period = 2.0
motion_amplitude = 0.99
def gripper_default_factory():
return {
"left": Dex1_GripperConfig(
unit_test=True,
motors={
"kLeftGripper": [0, "z1_gripper-joint"],
},
topic_gripper_state="rt/dex1/left/state",
topic_gripper_command="rt/dex1/left/cmd",
),
"right": Dex1_GripperConfig(
unit_test=True,
motors={
"kRightGripper": [1, "z1_gripper-joint"],
},
topic_gripper_state="rt/dex1/right/state",
topic_gripper_command="rt/dex1/right/cmd",
),
}
def run_gripper():
control_dt = 1 / 30
log_success("Running gripper in style 1 (multi-bus from config)")
endeffectors = make_endeffector_motors_buses_from_configs(gripper_default_factory())
for name in endeffectors:
endeffectors[name].connect()
log_success(f"Connected endeffector '{name}'.")
while True:
t_cycle_end = time.monotonic() + control_dt
target_q = sinusoidal_single_gripper_motion(
period=motion_period, amplitude=motion_amplitude, current_time=time.perf_counter()
)
for name in endeffectors:
endeffectors[name].write_endeffector(q_target=target_q)
precise_wait(t_cycle_end)
if __name__ == "__main__":
tyro.cli(run_gripper)
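`sinusoidal_single_gripper_motion` comes from the repo's trajectory utilities and its exact definition is not shown here; a plausible stand-in that smoothly oscillates between 0 and `amplitude`, starting at 0, looks like:

```python
import math

def sinusoidal_gripper_target(period: float, amplitude: float, current_time: float) -> float:
    # Smoothly oscillate between 0 and `amplitude` with the given period,
    # starting fully closed (0) at current_time == 0.
    return amplitude * 0.5 * (1.0 - math.cos(2.0 * math.pi * current_time / period))
```

This is a sketch of the idea only; the actual utility may differ in phase or range.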


@@ -0,0 +1,44 @@
"""
python test/test_replay.py --repo-id unitreerobotics/G1_CameraPackaging_NewDataset --robot_type g1_dex1
python test/test_replay.py --repo-id unitreerobotics/Z1_StackBox_Dataset --robot_type z1_realsense
python test/test_replay.py --repo-id unitreerobotics/Z1_Dual_Dex1_StackBox_Dataset_V2 --robot_type z1_dual_dex1_realsense
"""
import tyro
from lerobot.datasets.lerobot_dataset import LeRobotDataset
from unitree_deploy.real_unitree_env import make_real_env
from unitree_deploy.utils.rerun_visualizer import RerunLogger, flatten_images, visualization_data
from unitree_deploy.utils.rich_logger import log_info
# Replay a specific episode from a LeRobot dataset on the real robot.
# robot_type selects the environment (e.g., g1_dex1, z1_realsense, z1_dual_dex1_realsense).
def replay_lerobot_data(repo_id: str, robot_type: str, root: str | None = None, episode: int = 145):
dataset = LeRobotDataset(repo_id, root=root, episodes=[episode])
actions = dataset.hf_dataset.select_columns("action")
    init_pose_arm = actions[0]["action"].numpy()[:14] if robot_type.startswith("g1") else actions[0]["action"].numpy()
rerun_logger = RerunLogger()
env = make_real_env(robot_type=robot_type, dt=1 / 30, init_pose_arm=init_pose_arm)
env.connect()
try:
# Wait for user input to start the motion loop
user_input = input("Please enter the start signal (enter 's' to start the subsequent program): \n")
if user_input.lower() == "s":
log_info("Replaying episode")
for idx in range(dataset.num_frames):
action = actions[idx]["action"].numpy()
if robot_type == "z1_realsense":
action[-1] = -action[-1]
step_type, reward, _, observation = env.step(action)
visualization_data(idx, flatten_images(observation), observation["qpos"], action, rerun_logger)
env.close()
except KeyboardInterrupt:
# Handle Ctrl+C to safely disconnect
log_info("\n🛑 Ctrl+C detected. Disconnecting arm...")
env.close()
if __name__ == "__main__":
tyro.cli(replay_lerobot_data)


@@ -0,0 +1,105 @@
import collections
import time
import matplotlib.pyplot as plt
import numpy as np
from lerobot.datasets.lerobot_dataset import LeRobotDataset
from unitree_deploy.utils.rerun_visualizer import RerunLogger, visualization_data
def extract_observation(step: dict):
observation = {}
for key, value in step.items():
if key.startswith("observation.images."):
if isinstance(value, np.ndarray) and value.ndim == 3 and value.shape[-1] in [1, 3]:
value = np.transpose(value, (2, 0, 1))
observation[key] = value
elif key == "observation.state":
observation[key] = value
return observation
class DatasetEvalEnv:
def __init__(self, repo_id: str, episode_index: int = 0, visualization: bool = True):
self.dataset = LeRobotDataset(repo_id=repo_id)
self.visualization = visualization
if self.visualization:
self.rerun_logger = RerunLogger()
self.from_idx = self.dataset.episode_data_index["from"][episode_index].item()
self.to_idx = self.dataset.episode_data_index["to"][episode_index].item()
self.step_idx = self.from_idx
self.ground_truth_actions = []
self.predicted_actions = []
def get_observation(self):
step = self.dataset[self.step_idx]
observation = extract_observation(step)
state = step["observation.state"].numpy()
self.ground_truth_actions.append(step["action"].numpy())
if self.visualization:
visualization_data(
self.step_idx,
observation,
observation["observation.state"],
step["action"].numpy(),
self.rerun_logger,
)
images_observation = {
key: value.numpy() for key, value in observation.items() if key.startswith("observation.images.")
}
obs = collections.OrderedDict()
obs["qpos"] = state
obs["images"] = images_observation
self.step_idx += 1
return obs
def step(self, action):
self.predicted_actions.append(action)
if self.step_idx == self.to_idx:
self._plot_results()
exit()
def _plot_results(self):
ground_truth_actions = np.array(self.ground_truth_actions)
predicted_actions = np.array(self.predicted_actions)
n_timesteps, n_dims = ground_truth_actions.shape
fig, axes = plt.subplots(n_dims, 1, figsize=(12, 4 * n_dims), sharex=True)
fig.suptitle("Ground Truth vs Predicted Actions")
for i in range(n_dims):
ax = axes[i] if n_dims > 1 else axes
ax.plot(ground_truth_actions[:, i], label="Ground Truth", color="blue")
ax.plot(predicted_actions[:, i], label="Predicted", color="red", linestyle="--")
ax.set_ylabel(f"Dim {i + 1}")
ax.legend()
axes[-1].set_xlabel("Timestep")
plt.tight_layout()
plt.savefig("figure.png")
time.sleep(1)
def make_dataset_eval_env(repo_id: str, episode_index: int = 0) -> DatasetEvalEnv:
    return DatasetEvalEnv(repo_id, episode_index)
if __name__ == "__main__":
eval_dataset = DatasetEvalEnv(repo_id="unitreerobotics/G1_Brainco_PickApple_Dataset")
while True:
observation = eval_dataset.get_observation()
eval_dataset.step(observation["qpos"])
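Beyond the ground-truth-vs-predicted plot, a quick numeric summary of tracking quality can be computed per action dimension (a hypothetical helper, not part of the repo):

```python
import numpy as np

def action_mse_per_dim(ground_truth: np.ndarray, predicted: np.ndarray) -> np.ndarray:
    # Mean-squared error for each action dimension over the episode.
    assert ground_truth.shape == predicted.shape
    return ((ground_truth - predicted) ** 2).mean(axis=0)

gt = np.zeros((100, 7))        # e.g. 100 timesteps, 7-DoF action
pred = np.full((100, 7), 0.1)  # a constantly-offset prediction
errors = action_mse_per_dim(gt, pred)
```

A per-dimension breakdown makes it easy to spot which joint (or the gripper) the policy tracks worst.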


@@ -0,0 +1,81 @@
import collections
import time
from typing import List, Optional
import cv2
import dm_env
import numpy as np
import torch
from unitree_deploy.robot.robot_utils import make_robot
from unitree_deploy.robot_devices.robots_devices_utils import precise_wait
from unitree_deploy.utils.rich_logger import log_success
class UnitreeEnv:
def __init__(
self,
robot_type: str = "z1_realsense",
dt: float = 1 / 30,
init_pose_arm: np.ndarray | List[float] | None = None,
):
self.control_dt = dt
self.init_pose_arm = init_pose_arm
self.state: Optional[np.ndarray] = None
self.robot_type = robot_type
self.robot = make_robot(self.robot_type)
def connect(self):
self.robot.connect()
def _get_obs(self):
observation = self.robot.capture_observation()
# Process images
image_dict = {
key.split("observation.images.")[-1]: cv2.cvtColor(value.numpy(), cv2.COLOR_BGR2RGB)
for key, value in observation.items()
if key.startswith("observation.images.")
}
# for image_key, image in image_dict.items():
# cv2.imwrite(f"{image_key}.png", image)
# Process state
self.state = observation["observation.state"].numpy()
# Construct observation dictionary
obs = collections.OrderedDict(
qpos=self.state,
qvel=np.zeros_like(self.state),
effort=np.zeros_like(self.state),
images=image_dict,
)
return obs
def get_observation(self, t=0):
step_type = dm_env.StepType.FIRST if t == 0 else dm_env.StepType.MID
return dm_env.TimeStep(step_type=step_type, reward=0, discount=None, observation=self._get_obs())
def step(self, action) -> dm_env.TimeStep:
t_cycle_end = time.monotonic() + self.control_dt
t_command_target = t_cycle_end + self.control_dt
self.robot.send_action(torch.from_numpy(action), t_command_target)
precise_wait(t_cycle_end)
return dm_env.TimeStep(
step_type=dm_env.StepType.MID,
reward=0,
discount=None,
observation=self._get_obs(),
)
def close(self) -> None:
self.robot.disconnect()
log_success("Robot disconnected successfully! 🎉")
def make_real_env(
robot_type: str, dt: float | None, init_pose_arm: np.ndarray | List[float] | None = None
) -> UnitreeEnv:
return UnitreeEnv(robot_type, dt, init_pose_arm)


@@ -0,0 +1,147 @@
import time
import torch
from unitree_deploy.robot.robot_configs import UnitreeRobotConfig
from unitree_deploy.robot_devices.arm.utils import make_arm_motors_buses_from_configs
from unitree_deploy.robot_devices.cameras.utils import make_cameras_from_configs
from unitree_deploy.robot_devices.endeffector.utils import (
make_endeffector_motors_buses_from_configs,
)
from unitree_deploy.utils.rich_logger import log_success
class UnitreeRobot:
def __init__(
self,
config: UnitreeRobotConfig,
):
self.config = config
self.robot_type = self.config.type
self.cameras = make_cameras_from_configs(self.config.cameras)
self.arm = make_arm_motors_buses_from_configs(self.config.arm)
self.endeffector = make_endeffector_motors_buses_from_configs(self.config.endeffector)
self.initial_data_received = True
def connect(self):
        if not self.arm and not self.endeffector and not self.cameras:
raise ValueError(
"UnitreeRobot doesn't have any device to connect. See example of usage in docstring of the class."
)
# Connect the cameras
for name in self.cameras:
self.cameras[name].connect()
log_success(f"Connecting {name} cameras.")
for _ in range(20):
for name in self.cameras:
self.cameras[name].async_read()
time.sleep(1 / 30)
for name in self.arm:
self.arm[name].connect()
log_success(f"Connecting {name} arm.")
for name in self.endeffector:
self.endeffector[name].connect()
log_success(f"Connecting {name} endeffector.")
time.sleep(2)
        log_success("All devices connected successfully! ✅")
def capture_observation(self):
"""The returned observations do not have a batch dimension."""
# Create state by concatenating follower current position
state = []
arm_state_list = []
endeffector_state_list = []
for arm_name in self.arm:
arm_state = self.arm[arm_name].read_current_arm_q()
arm_state_list.append(torch.from_numpy(arm_state))
for endeffector_name in self.endeffector:
endeffector_state = self.endeffector[endeffector_name].read_current_endeffector_q()
endeffector_state_list.append(torch.from_numpy(endeffector_state))
state = (
torch.cat(arm_state_list + endeffector_state_list, dim=0)
if arm_state_list or endeffector_state_list
else torch.tensor([])
)
# Capture images from cameras
images = {}
for name in self.cameras:
output = self.cameras[name].async_read()
if isinstance(output, dict):
images.update({k: torch.from_numpy(v) for k, v in output.items()})
else:
images[name] = torch.from_numpy(output)
        # Populate output dictionaries and format to pytorch
obs_dict = {}
obs_dict["observation.state"] = state
for name, value in images.items():
obs_dict[f"observation.images.{name}"] = value
return obs_dict
def send_action(self, action: torch.Tensor, t_command_target: float | None = None) -> torch.Tensor:
from_idx_arm = 0
to_idx_arm = 0
action_sent_arm = []
cmd_target = "drive_to_waypoint" if self.initial_data_received else "schedule_waypoint"
for arm_name in self.arm:
to_idx_arm += len(self.arm[arm_name].motor_names)
action_arm = action[from_idx_arm:to_idx_arm].numpy()
from_idx_arm = to_idx_arm
action_sent_arm.append(torch.from_numpy(action_arm))
self.arm[arm_name].write_arm(
action_arm,
time_target=t_command_target - time.monotonic() + time.perf_counter(),
cmd_target=cmd_target,
)
from_idx_endeffector = to_idx_arm
to_idx_endeffector = to_idx_arm
action_endeffector_set = []
for endeffector_name in self.endeffector:
to_idx_endeffector += len(self.endeffector[endeffector_name].motor_names)
action_endeffector = action[from_idx_endeffector:to_idx_endeffector].numpy()
from_idx_endeffector = to_idx_endeffector
action_endeffector_set.append(torch.from_numpy(action_endeffector))
self.endeffector[endeffector_name].write_endeffector(
action_endeffector,
time_target=t_command_target - time.monotonic() + time.perf_counter(),
cmd_target=cmd_target,
)
self.initial_data_received = False
return torch.cat(action_sent_arm + action_endeffector_set, dim=0)
def disconnect(self):
# disconnect the arms
for name in self.arm:
self.arm[name].disconnect()
log_success(f"disconnect {name} arm.")
for name in self.endeffector:
self.endeffector[name].disconnect()
log_success(f"disconnect {name} endeffector.")
# disconnect the cameras
for name in self.cameras:
self.cameras[name].disconnect()
log_success(f"disconnect {name} cameras.")
def __del__(self):
if getattr(self, "is_connected", False):
self.disconnect()
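`send_action` walks the flat action tensor device by device, slicing off `len(motor_names)` entries first for each arm and then for each end-effector. That index bookkeeping can be isolated as a standalone sketch (the helper name is illustrative):

```python
import numpy as np

def split_action(action: np.ndarray, motor_counts: list[int]) -> list[np.ndarray]:
    # Slice a flat action vector into consecutive per-device chunks,
    # mirroring the from_idx/to_idx walk in send_action.
    chunks, start = [], 0
    for count in motor_counts:
        chunks.append(action[start:start + count])
        start += count
    return chunks

# e.g. a dual arm (6 + 6 joints) followed by two single-motor grippers
parts = split_action(np.arange(14.0), [6, 6, 1, 1])
```

The device order therefore defines the action layout: arms in dictionary order, then end-effectors, which must match the dataset's action convention.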


@@ -0,0 +1,270 @@
import abc
from dataclasses import dataclass, field
import draccus
import numpy as np
from unitree_deploy.robot_devices.arm.configs import (
ArmConfig,
G1ArmConfig,
Z1ArmConfig,
Z1DualArmConfig,
)
from unitree_deploy.robot_devices.cameras.configs import (
CameraConfig,
ImageClientCameraConfig,
IntelRealSenseCameraConfig,
OpenCVCameraConfig,
)
from unitree_deploy.robot_devices.endeffector.configs import (
Dex1_GripperConfig,
EndEffectorConfig,
)
# ======================== arm motors =================================
# name: (index, model)
g1_motors = {
"kLeftShoulderPitch": [0, "g1-joint"],
"kLeftShoulderRoll": [1, "g1-joint"],
"kLeftShoulderYaw": [2, "g1-joint"],
"kLeftElbow": [3, "g1-joint"],
"kLeftWristRoll": [4, "g1-joint"],
"kLeftWristPitch": [5, "g1-joint"],
"kLeftWristyaw": [6, "g1-joint"],
"kRightShoulderPitch": [7, "g1-joint"],
"kRightShoulderRoll": [8, "g1-joint"],
"kRightShoulderYaw": [9, "g1-joint"],
"kRightElbow": [10, "g1-joint"],
"kRightWristRoll": [11, "g1-joint"],
"kRightWristPitch": [12, "g1-joint"],
"kRightWristYaw": [13, "g1-joint"],
}
z1_motors = {
"kWaist": [0, "z1-joint"],
"kShoulder": [1, "z1-joint"],
"kElbow": [2, "z1-joint"],
"kForearmRoll": [3, "z1-joint"],
"kWristAngle": [4, "z1-joint"],
"kWristRotate": [5, "z1-joint"],
"kGripper": [6, "z1-joint"],
}
z1_dual_motors = {
"kLeftWaist": [0, "z1-joint"],
"kLeftShoulder": [1, "z1-joint"],
"kLeftElbow": [2, "z1-joint"],
"kLeftForearmRoll": [3, "z1-joint"],
"kLeftWristAngle": [4, "z1-joint"],
"kLeftWristRotate": [5, "z1-joint"],
"kRightWaist": [7, "z1-joint"],
"kRightShoulder": [8, "z1-joint"],
"kRightElbow": [9, "z1-joint"],
"kRightForearmRoll": [10, "z1-joint"],
"kRightWristAngle": [11, "z1-joint"],
"kRightWristRotate": [12, "z1-joint"],
}
# =========================================================
# ======================== camera =================================
def z1_intelrealsense_camera_default_factory():
return {
"cam_high": IntelRealSenseCameraConfig(
serial_number="044122071036",
fps=30,
width=640,
height=480,
),
# "cam_wrist": IntelRealSenseCameraConfig(
# serial_number="419122270615",
# fps=30,
# width=640,
# height=480,
# ),
}
def z1_dual_intelrealsense_camera_default_factory():
return {
# "cam_left_wrist": IntelRealSenseCameraConfig(
# serial_number="218722271166",
# fps=30,
# width=640,
# height=480,
# ),
# "cam_right_wrist": IntelRealSenseCameraConfig(
# serial_number="419122270677",
# fps=30,
# width=640,
# height=480,
# ),
"cam_high": IntelRealSenseCameraConfig(
serial_number="947522071393",
fps=30,
width=640,
height=480,
),
}
def g1_image_client_default_factory():
return {
"imageclient": ImageClientCameraConfig(
head_camera_type="opencv",
head_camera_id_numbers=[4],
head_camera_image_shape=[480, 1280], # Head camera resolution
wrist_camera_type="opencv",
wrist_camera_id_numbers=[0, 2],
wrist_camera_image_shape=[480, 640], # Wrist camera resolution
aspect_ratio_threshold=2.0,
fps=30,
mock=False,
),
}
def usb_camera_default_factory():
return {
"cam_high": OpenCVCameraConfig(
camera_index="/dev/video1",
fps=30,
width=640,
height=480,
),
"cam_left_wrist": OpenCVCameraConfig(
camera_index="/dev/video5",
fps=30,
width=640,
height=480,
),
"cam_right_wrist": OpenCVCameraConfig(
camera_index="/dev/video3",
fps=30,
width=640,
height=480,
),
}
# =========================================================
# ======================== endeffector =================================
def dex1_default_factory():
return {
"left": Dex1_GripperConfig(
unit_test=True,
motors={
"kLeftGripper": [0, "z1_gripper-joint"],
},
topic_gripper_state="rt/dex1/left/state",
topic_gripper_command="rt/dex1/left/cmd",
),
"right": Dex1_GripperConfig(
unit_test=True,
motors={
"kRightGripper": [1, "z1_gripper-joint"],
},
topic_gripper_state="rt/dex1/right/state",
topic_gripper_command="rt/dex1/right/cmd",
),
}
# =========================================================
# ======================== arm =================================
def z1_arm_default_factory(init_pose=None):
return {
"z1": Z1ArmConfig(
init_pose=np.zeros(7) if init_pose is None else init_pose,
motors=z1_motors,
),
}
def z1_dual_arm_single_config_factory(init_pose=None):
return {
"z1_dual": Z1DualArmConfig(
left_robot_ip="127.0.0.1",
left_robot_port1=8073,
left_robot_port2=8074,
right_robot_ip="127.0.0.1",
right_robot_port1=8071,
right_robot_port2=8072,
init_pose_left=np.zeros(6) if init_pose is None else init_pose[:6],
init_pose_right=np.zeros(6) if init_pose is None else init_pose[6:],
control_dt=1 / 250.0,
motors=z1_dual_motors,
),
}
def g1_dual_arm_default_factory(init_pose=None):
return {
"g1": G1ArmConfig(
init_pose=np.zeros(14) if init_pose is None else init_pose,
motors=g1_motors,
mock=False,
),
}
# =========================================================
# robot_type naming convention: <arm devices>_<endeffector devices>_<camera devices>
@dataclass
class RobotConfig(draccus.ChoiceRegistry, abc.ABC):
@property
def type(self) -> str:
return self.get_choice_name(self.__class__)
@dataclass
class UnitreeRobotConfig(RobotConfig):
cameras: dict[str, CameraConfig] = field(default_factory=lambda: {})
arm: dict[str, ArmConfig] = field(default_factory=lambda: {})
endeffector: dict[str, EndEffectorConfig] = field(default_factory=lambda: {})
# =============================== Single-arm:z1, Camera:Realsense ========================================
@RobotConfig.register_subclass("z1_realsense")
@dataclass
class Z1_Realsense_RobotConfig(UnitreeRobotConfig):
cameras: dict[str, CameraConfig] = field(default_factory=z1_intelrealsense_camera_default_factory)
arm: dict[str, ArmConfig] = field(default_factory=z1_arm_default_factory)
# =============================== Dual-arm:z1, Endeffector:dex1, Camera:Realsense ========================================
@RobotConfig.register_subclass("z1_dual_dex1_realsense")
@dataclass
class Z1dual_Dex1_Realsense_RobotConfig(UnitreeRobotConfig):
cameras: dict[str, CameraConfig] = field(default_factory=z1_dual_intelrealsense_camera_default_factory)
arm: dict[str, ArmConfig] = field(default_factory=z1_dual_arm_single_config_factory)
endeffector: dict[str, EndEffectorConfig] = field(default_factory=dex1_default_factory)
# =============================== Dual-arm:z1, Endeffector:dex1, Camera:OpenCV ========================================
@RobotConfig.register_subclass("z1_dual_dex1_opencv")
@dataclass
class Z1dual_Dex1_Opencv_RobotConfig(UnitreeRobotConfig):
cameras: dict[str, CameraConfig] = field(default_factory=usb_camera_default_factory)
arm: dict[str, ArmConfig] = field(default_factory=z1_dual_arm_single_config_factory)
endeffector: dict[str, EndEffectorConfig] = field(default_factory=dex1_default_factory)
# =============================== Arm:g1, Endeffector:dex1, Camera:image client ========================================
@RobotConfig.register_subclass("g1_dex1")
@dataclass
class G1_Dex1_Imageclint_RobotConfig(UnitreeRobotConfig):
cameras: dict[str, CameraConfig] = field(default_factory=g1_image_client_default_factory)
arm: dict[str, ArmConfig] = field(default_factory=g1_dual_arm_default_factory)
endeffector: dict[str, EndEffectorConfig] = field(default_factory=dex1_default_factory)


@@ -0,0 +1,47 @@
from typing import Protocol
from unitree_deploy.robot.robot_configs import (
G1_Dex1_Imageclint_RobotConfig,
RobotConfig,
Z1_Realsense_RobotConfig,
Z1dual_Dex1_Opencv_RobotConfig,
Z1dual_Dex1_Realsense_RobotConfig,
)
def get_arm_id(name, arm_type):
return f"{name}_{arm_type}"
class Robot(Protocol):
robot_type: str
features: dict
def connect(self): ...
def capture_observation(self): ...
def send_action(self, action): ...
def disconnect(self): ...
def make_robot_config(robot_type: str, **kwargs) -> RobotConfig:
if robot_type == "z1_realsense":
return Z1_Realsense_RobotConfig(**kwargs)
elif robot_type == "z1_dual_dex1_realsense":
return Z1dual_Dex1_Realsense_RobotConfig(**kwargs)
elif robot_type == "z1_dual_dex1_opencv":
return Z1dual_Dex1_Opencv_RobotConfig(**kwargs)
elif robot_type == "g1_dex1":
return G1_Dex1_Imageclint_RobotConfig(**kwargs)
else:
raise ValueError(f"Robot type '{robot_type}' is not available.")
def make_robot_from_config(config: RobotConfig):
from unitree_deploy.robot.robot import UnitreeRobot
return UnitreeRobot(config)
def make_robot(robot_type: str, **kwargs) -> Robot:
config = make_robot_config(robot_type, **kwargs)
return make_robot_from_config(config)
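The `if`/`elif` chain in `make_robot_config` can equivalently be written as a registration table, which scales better as robot types are added. A self-contained sketch (names are illustrative, not the repo's API):

```python
from typing import Callable

_CONFIG_REGISTRY: dict[str, Callable[..., object]] = {}

def register_robot(name: str):
    # Decorator that records a config class under its robot_type name.
    def decorator(cls):
        _CONFIG_REGISTRY[name] = cls
        return cls
    return decorator

@register_robot("mock_robot")
class MockRobotConfig:
    type = "mock_robot"

def make_config(robot_type: str):
    try:
        return _CONFIG_REGISTRY[robot_type]()
    except KeyError:
        raise ValueError(f"Robot type '{robot_type}' is not available.") from None
```

This mirrors what `draccus.ChoiceRegistry` already does for the config dataclasses via `register_subclass`.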


@@ -0,0 +1,119 @@
# noqa: N815
from enum import IntEnum
# ==================g1========================
class G1_29_JointArmIndex(IntEnum):
# Left arm
kLeftShoulderPitch = 15
kLeftShoulderRoll = 16
kLeftShoulderYaw = 17
kLeftElbow = 18
kLeftWristRoll = 19
kLeftWristPitch = 20
kLeftWristyaw = 21
# Right arm
kRightShoulderPitch = 22
kRightShoulderRoll = 23
kRightShoulderYaw = 24
kRightElbow = 25
kRightWristRoll = 26
kRightWristPitch = 27
kRightWristYaw = 28
class G1_29_JointIndex(IntEnum):
# Left leg
kLeftHipPitch = 0
kLeftHipRoll = 1
kLeftHipYaw = 2
kLeftKnee = 3
kLeftAnklePitch = 4
kLeftAnkleRoll = 5
# Right leg
kRightHipPitch = 6
kRightHipRoll = 7
kRightHipYaw = 8
kRightKnee = 9
kRightAnklePitch = 10
kRightAnkleRoll = 11
kWaistYaw = 12
kWaistRoll = 13
kWaistPitch = 14
# Left arm
kLeftShoulderPitch = 15
kLeftShoulderRoll = 16
kLeftShoulderYaw = 17
kLeftElbow = 18
kLeftWristRoll = 19
kLeftWristPitch = 20
kLeftWristyaw = 21
# Right arm
kRightShoulderPitch = 22
kRightShoulderRoll = 23
kRightShoulderYaw = 24
kRightElbow = 25
kRightWristRoll = 26
kRightWristPitch = 27
kRightWristYaw = 28
# not used
kNotUsedJoint0 = 29
kNotUsedJoint1 = 30
kNotUsedJoint2 = 31
kNotUsedJoint3 = 32
kNotUsedJoint4 = 33
kNotUsedJoint5 = 34
# ==========================================
# ==================z1========================
class Z1ArmJointIndex(IntEnum):
WAIST = 0
SHOULDER = 1
ELBOW = 2
FOREARM_ROLL = 3
WRIST_ANGLE = 4
WRIST_ROTATE = 5
class Z1_12_JointArmIndex(IntEnum):
# Left arm
kLeftWaist = 0
kLeftShoulder = 1
kLeftElbow = 2
kLeftForearmRoll = 3
kLeftWristAngle = 4
kLeftWristRotate = 5
# Right arm
kRightWaist = 6
kRightShoulder = 7
kRightElbow = 8
kRightForearmRoll = 9
kRightWristAngle = 10
kRightWristRotate = 11
class Z1GripperArmJointIndex(IntEnum):
WAIST = 0
SHOULDER = 1
ELBOW = 2
FOREARM_ROLL = 3
WRIST_ANGLE = 4
WRIST_ROTATE = 5
GRIPPER = 6
class Gripper_Sigle_JointIndex(IntEnum):
kGripper = 0
# ==========================================
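Because these classes derive from `IntEnum`, their members behave like plain integers and can index joint arrays directly, which is how controllers map named joints to motor slots. For example (abbreviated stand-in enum, not the repo's):

```python
import numpy as np
from enum import IntEnum

class DemoJointIndex(IntEnum):  # abbreviated stand-in for Z1ArmJointIndex
    WAIST = 0
    SHOULDER = 1
    ELBOW = 2

q = np.zeros(len(DemoJointIndex))
q[DemoJointIndex.ELBOW] = 0.5  # IntEnum members index arrays like ints
```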


@@ -0,0 +1,82 @@
import abc
from dataclasses import dataclass, field
import draccus
import numpy as np
@dataclass
class ArmConfig(draccus.ChoiceRegistry, abc.ABC):
    @property
    def type(self) -> str:
        return self.get_choice_name(self.__class__)
@ArmConfig.register_subclass("z1")
@dataclass
class Z1ArmConfig(ArmConfig):
    motors: dict[str, tuple[int, str]]
    init_pose: np.ndarray | list | None = None
    unit_test: bool = False
    control_dt: float = 1 / 500.0
    # Mutable array defaults go through default_factory so each instance
    # gets its own copy (a bare np.array default is rejected by dataclasses
    # on recent Python versions and would otherwise be shared).
    robot_kp: np.ndarray = field(default_factory=lambda: np.array([4, 6, 6, 6, 6, 6]))
    robot_kd: np.ndarray = field(default_factory=lambda: np.array([350, 300, 300, 200, 200, 200]))
    max_pos_speed: float = 180 * (np.pi / 180) * 2
    log_level: str | int = "ERROR"
    def __post_init__(self):
        if self.control_dt < 0.002:
            raise ValueError(f"`control_dt` must be >= 1/500 s (got {self.control_dt})")
@ArmConfig.register_subclass("z1_dual")
@dataclass
class Z1DualArmConfig(ArmConfig):
    left_robot_ip: str
    left_robot_port1: int
    left_robot_port2: int
    right_robot_ip: str
    right_robot_port1: int
    right_robot_port2: int
    motors: dict[str, tuple[int, str]]
    robot_kp: np.ndarray = field(default_factory=lambda: np.array([4, 6, 6, 6, 6, 6]))
    robot_kd: np.ndarray = field(default_factory=lambda: np.array([350, 300, 300, 200, 200, 200]))
    mock: bool = False
    unit_test: bool = False
    init_pose_left: list | None = None
    init_pose_right: list | None = None
    max_pos_speed: float = 180 * (np.pi / 180) * 2
    control_dt: float = 1 / 500.0
    def __post_init__(self):
        if self.control_dt < 0.002:
            raise ValueError(f"`control_dt` must be >= 1/500 s (got {self.control_dt})")
@ArmConfig.register_subclass("g1")
@dataclass
class G1ArmConfig(ArmConfig):
    motors: dict[str, tuple[int, str]]
    mock: bool = False
    unit_test: bool = False
    init_pose: np.ndarray | list = field(default_factory=lambda: np.zeros(14))
    control_dt: float = 1 / 500.0
    max_pos_speed: float = 180 * (np.pi / 180) * 2
    topic_low_command: str = "rt/lowcmd"
    topic_low_state: str = "rt/lowstate"
    kp_high: float = 300.0
    kd_high: float = 3.0
    kp_low: float = 80.0  # 140.0
    kd_low: float = 3.0  # 3.0
    kp_wrist: float = 40.0
    kd_wrist: float = 1.5
    def __post_init__(self):
        if self.control_dt < 0.002:
            raise ValueError(f"`control_dt` must be >= 1/500 s (got {self.control_dt})")
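These configs keep NumPy arrays as dataclass fields. A bare `np.ndarray` default would be a single object shared by every instance, and recent Python versions reject unhashable dataclass defaults outright, which is why `default_factory` is the safe pattern. A minimal sketch of the fix:

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class GainsSafe:
    # default_factory gives each instance its own gains array; a plain
    # `kp: np.ndarray = np.array(...)` default would be one shared object
    # (and raises ValueError under dataclasses on Python 3.11+ because
    # ndarrays are unhashable).
    kp: np.ndarray = field(default_factory=lambda: np.array([4.0, 6.0, 6.0, 6.0, 6.0, 6.0]))

a, b = GainsSafe(), GainsSafe()
a.kp[0] = 99.0  # does not leak into b
```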


@@ -0,0 +1,385 @@
import threading
import time
from typing import Callable
import numpy as np
from unitree_sdk2py.core.channel import ChannelFactoryInitialize, ChannelPublisher, ChannelSubscriber # dds
from unitree_sdk2py.idl.default import unitree_hg_msg_dds__LowCmd_
from unitree_sdk2py.idl.unitree_hg.msg.dds_ import LowCmd_, LowState_ # idl
from unitree_sdk2py.utils.crc import CRC
from unitree_deploy.robot_devices.arm.arm_indexs import G1_29_JointArmIndex, G1_29_JointIndex
from unitree_deploy.robot_devices.arm.configs import G1ArmConfig
from unitree_deploy.robot_devices.arm.g1_arm_ik import G1_29_ArmIK
from unitree_deploy.robot_devices.robots_devices_utils import (
DataBuffer,
MotorState,
Robot_Num_Motors,
RobotDeviceAlreadyConnectedError,
)
from unitree_deploy.utils.joint_trajcetory_inter import JointTrajectoryInterpolator
from unitree_deploy.utils.rich_logger import log_error, log_info, log_success, log_warning
from unitree_deploy.utils.run_simulation import MujicoSimulation, get_mujoco_sim_config
class G1_29_LowState:
def __init__(self):
self.motor_state = [MotorState() for _ in range(Robot_Num_Motors.G1_29_Num_Motors)]
class G1_29_ArmController:
def __init__(self, config: G1ArmConfig):
self.motors = config.motors
self.mock = config.mock
self.unit_test = config.unit_test
self.init_pose = config.init_pose
self.control_dt = config.control_dt
self.max_pos_speed = config.max_pos_speed
self.topic_low_command = config.topic_low_command
self.topic_low_state = config.topic_low_state
self.kp_high = config.kp_high
self.kd_high = config.kd_high
self.kp_low = config.kp_low
self.kd_low = config.kd_low
self.kp_wrist = config.kp_wrist
self.kd_wrist = config.kd_wrist
self.all_motor_q = None
self.q_target = np.zeros(14)
self.dq_target = np.zeros(14)
self.tauff_target = np.zeros(14)
self.time_target = time.monotonic()
self.arm_cmd = "schedule_waypoint"
self.lowstate_buffer = DataBuffer()
self.g1_arm_ik = G1_29_ArmIK(unit_test=self.unit_test, visualization=False)
self.stop_event = threading.Event()
self.ctrl_lock = threading.Lock()
self.is_connected = False
@property
def motor_names(self) -> list[str]:
return list(self.motors.keys())
@property
def motor_models(self) -> list[str]:
return [model for _, model in self.motors.values()]
@property
def motor_indices(self) -> list[int]:
return [idx for idx, _ in self.motors.values()]
def _start_daemon_thread(self, target_fn: Callable[[], None], name: str | None = None) -> threading.Thread:
thread = threading.Thread(target=target_fn, name=name)
thread.daemon = True
thread.start()
return thread
def connect(self):
try:
if self.is_connected:
raise RobotDeviceAlreadyConnectedError(
"G1_Arm is already connected. Do not call `robot.connect()` twice."
)
if self.mock:
config = get_mujoco_sim_config(robot_type="g1")
self.g1 = MujicoSimulation(config)
time.sleep(1)
else:
# initialize lowcmd publisher and lowstate subscriber
ChannelFactoryInitialize(0)
self.lowcmd_publisher = ChannelPublisher(self.topic_low_command, LowCmd_)
self.lowcmd_publisher.Init()
self.lowstate_subscriber = ChannelSubscriber(self.topic_low_state, LowState_)
self.lowstate_subscriber.Init()
# initialize subscribe thread
self.subscribe_thread = self._start_daemon_thread(
self._subscribe_motor_state, name="g1._subscribe_motor_state"
)
while not self.lowstate_buffer.get_data():
time.sleep(0.01)
log_warning("[G1_29_ArmController] Waiting to subscribe dds...")
if not self.mock:
# initialize hg's lowcmd msg
self.crc = CRC()
self.msg = unitree_hg_msg_dds__LowCmd_()
self.msg.mode_pr = 0
self.msg.mode_machine = self._read_mode_machine()
self.all_motor_q = self._read_current_motor_q()
log_info(f"Current all body motor state q:\n{self.all_motor_q} \n")
log_info(f"Current two arms motor state q:\n{self.read_current_arm_q()}\n")
log_info("Lock all joints except two arms...\n")
arm_indices = {member.value for member in G1_29_JointArmIndex}
for id in G1_29_JointIndex:
self.msg.motor_cmd[id].mode = 1
if id.value in arm_indices:
if self._is_wrist_motor(id):
self.msg.motor_cmd[id].kp = self.kp_wrist
self.msg.motor_cmd[id].kd = self.kd_wrist
else:
self.msg.motor_cmd[id].kp = self.kp_low
self.msg.motor_cmd[id].kd = self.kd_low
else:
if self._is_weak_motor(id):
self.msg.motor_cmd[id].kp = self.kp_low
self.msg.motor_cmd[id].kd = self.kd_low
else:
self.msg.motor_cmd[id].kp = self.kp_high
self.msg.motor_cmd[id].kd = self.kd_high
self.msg.motor_cmd[id].q = self.all_motor_q[id]
log_info("Lock OK!\n")
# initialize publish thread
self.publish_thread = self._start_daemon_thread(self._ctrl_motor_state, name="g1._ctrl_motor_state")
self.is_connected = True
except Exception as e:
self.disconnect()
log_error(f"❌ Error in G1_29_ArmController.connect: {e}")
def _subscribe_motor_state(self):
try:
while not self.stop_event.is_set():
lowstate = G1_29_LowState()
                if self.mock:
                    positions = self.g1.get_current_positions()
                    if positions is not None and len(positions) != 0:
                        for motor_id in range(Robot_Num_Motors.G1_29_Num_Motors):
                            lowstate.motor_state[motor_id].q = positions[motor_id]
                            lowstate.motor_state[motor_id].dq = 0.0
                    else:
                        log_warning("get_current_positions() failed: queue is empty.")
else:
msg = self.lowstate_subscriber.Read()
if msg is not None:
for id in range(Robot_Num_Motors.G1_29_Num_Motors):
lowstate.motor_state[id].q = msg.motor_state[id].q
lowstate.motor_state[id].dq = msg.motor_state[id].dq
self.lowstate_buffer.set_data(lowstate)
time.sleep(self.control_dt)
except Exception as e:
self.disconnect()
log_error(f"❌ Error in G1_29_ArmController._subscribe_motor_state: {e}")
def _update_g1_arm(
self,
arm_q_target: np.ndarray,
arm_dq_target: np.ndarray | None = None,
arm_tauff_target: np.ndarray | None = None,
):
if self.mock:
self.g1.set_positions(arm_q_target)
else:
for idx, id in enumerate(G1_29_JointArmIndex):
self.msg.motor_cmd[id].q = arm_q_target[idx]
self.msg.motor_cmd[id].dq = arm_dq_target[idx]
self.msg.motor_cmd[id].tau = arm_tauff_target[idx]
self.msg.crc = self.crc.Crc(self.msg)
self.lowcmd_publisher.Write(self.msg)
def _drive_to_waypoint(self, target_pose: np.ndarray, t_insert_time: float):
curr_time = time.monotonic() + self.control_dt
t_insert = curr_time + t_insert_time
self.pose_interp = self.pose_interp.drive_to_waypoint(
pose=target_pose,
time=t_insert,
curr_time=curr_time,
max_pos_speed=self.max_pos_speed,
)
while time.monotonic() < t_insert:
start_time = time.perf_counter()
            clipped_arm_q_target = self.pose_interp(time.monotonic())
            self._update_g1_arm(clipped_arm_q_target, self.dq_target, self.tauff_target)
time.sleep(max(0, (self.control_dt - (time.perf_counter() - start_time))))
def _schedule_waypoint(
self,
arm_q_target: np.ndarray,
arm_time_target: float,
t_now: float,
start_time: float,
last_waypoint_time: float,
arm_tauff_target: np.ndarray | None = None,
) -> float:
target_time = time.monotonic() - time.perf_counter() + arm_time_target
curr_time = t_now + self.control_dt
target_time = max(target_time, curr_time + self.control_dt)
self.pose_interp = self.pose_interp.schedule_waypoint(
pose=arm_q_target,
time=target_time,
max_pos_speed=self.max_pos_speed,
curr_time=curr_time,
last_waypoint_time=last_waypoint_time,
)
last_waypoint_time = target_time
        clipped_arm_q_target = self.pose_interp(t_now)
        self._update_g1_arm(clipped_arm_q_target, self.dq_target, arm_tauff_target)
        time.sleep(max(0, (self.control_dt - (time.perf_counter() - start_time))))
        return last_waypoint_time
def _ctrl_motor_state(self):
        # wait for DDS initialization to finish
time.sleep(2)
self.pose_interp = JointTrajectoryInterpolator(
times=[time.monotonic()], joint_positions=[self.read_current_arm_q()]
)
self.go_start()
arm_q_target = self.read_current_arm_q()
arm_tauff_target = self.tauff_target
arm_time_target = time.monotonic()
arm_cmd = "schedule_waypoint"
last_waypoint_time = time.monotonic()
while not self.stop_event.is_set():
start_time = time.perf_counter()
t_now = time.monotonic()
with self.ctrl_lock:
arm_q_target = self.q_target
arm_tauff_target = self.tauff_target
arm_time_target = self.time_target
arm_cmd = self.arm_cmd
if arm_cmd == "drive_to_waypoint":
self._drive_to_waypoint(target_pose=arm_q_target, t_insert_time=0.8)
elif arm_cmd == "schedule_waypoint":
self._schedule_waypoint(
arm_q_target=arm_q_target,
arm_time_target=arm_time_target,
t_now=t_now,
start_time=start_time,
last_waypoint_time=last_waypoint_time,
arm_tauff_target=arm_tauff_target,
)
def _read_mode_machine(self):
"""Return current dds mode machine."""
return self.lowstate_subscriber.Read().mode_machine
def _read_current_motor_q(self) -> np.ndarray | None:
"""Return current state q of all body motors."""
return np.array([self.lowstate_buffer.get_data().motor_state[id].q for id in G1_29_JointIndex])
def read_current_arm_q(self) -> np.ndarray | None:
"""Return current state q of the left and right arm motors."""
return np.array([self.lowstate_buffer.get_data().motor_state[id].q for id in G1_29_JointArmIndex])
def read_current_arm_dq(self) -> np.ndarray | None:
"""Return current state dq of the left and right arm motors."""
return np.array([self.lowstate_buffer.get_data().motor_state[id].dq for id in G1_29_JointArmIndex])
def write_arm(
self,
q_target: list[float] | np.ndarray,
        tauff_target: list[float] | np.ndarray | None = None,
time_target: float | None = None,
cmd_target: str | None = None,
):
"""Set control target values q & tau of the left and right arm motors."""
with self.ctrl_lock:
self.q_target = q_target
self.tauff_target = tauff_target if tauff_target is not None else self.arm_tau(self.q_target)
self.time_target = time_target
self.arm_cmd = cmd_target
def arm_ik(
self, l_ee_target: list[float] | np.ndarray, r_ee_target: list[float] | np.ndarray
) -> tuple[np.ndarray, np.ndarray] | None:
return self.g1_arm_ik.solve_ik(l_ee_target, r_ee_target, self.read_current_arm_q(), self.read_current_arm_dq())
def arm_tau(
self, current_arm_q: np.ndarray | None = None, current_arm_dq: np.ndarray | None = None
) -> np.ndarray | None:
return self.g1_arm_ik.solve_tau(current_arm_q, current_arm_dq)
def arm_fk(self, q: np.ndarray | None = None) -> np.ndarray | None:
pass
def go_start(self):
self._drive_to_waypoint(target_pose=self.init_pose, t_insert_time=2.0)
log_success("[G1_29_ArmController] Go Start OK!\n")
def go_home(self):
if self.mock:
self.stop_event.set()
# self.subscribe_thread.join()
# self.publish_thread.join()
time.sleep(1)
# self.g1.stop()
else:
self.stop_event.set()
self.publish_thread.join()
self._drive_to_waypoint(target_pose=self.init_pose, t_insert_time=2.0)
log_success("[G1_29_ArmController] Go Home OK!\n")
def disconnect(self):
self.is_connected = False
self.go_home()
def _is_weak_motor(self, motor_index):
weak_motors = [
G1_29_JointIndex.kLeftAnklePitch.value,
G1_29_JointIndex.kRightAnklePitch.value,
# Left arm
G1_29_JointIndex.kLeftShoulderPitch.value,
G1_29_JointIndex.kLeftShoulderRoll.value,
G1_29_JointIndex.kLeftShoulderYaw.value,
G1_29_JointIndex.kLeftElbow.value,
# Right arm
G1_29_JointIndex.kRightShoulderPitch.value,
G1_29_JointIndex.kRightShoulderRoll.value,
G1_29_JointIndex.kRightShoulderYaw.value,
G1_29_JointIndex.kRightElbow.value,
]
return motor_index.value in weak_motors
def _is_wrist_motor(self, motor_index):
wrist_motors = [
G1_29_JointIndex.kLeftWristRoll.value,
G1_29_JointIndex.kLeftWristPitch.value,
G1_29_JointIndex.kLeftWristyaw.value,
G1_29_JointIndex.kRightWristRoll.value,
G1_29_JointIndex.kRightWristPitch.value,
G1_29_JointIndex.kRightWristYaw.value,
]
return motor_index.value in wrist_motors
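Both `_drive_to_waypoint` and `_schedule_waypoint` lean on `JointTrajectoryInterpolator` (defined in `unitree_deploy/utils/joint_trajcetory_inter.py`, not shown in this diff) to stretch a requested arrival time when it would violate `max_pos_speed`. A small numpy sketch of that arrival-time clamp, under the assumption that the interpolator enforces a worst-joint speed limit (the function name and exact rule are ours):

```python
import numpy as np


def waypoint_arrival_time(curr_q, target_q, t_insert, curr_time, max_pos_speed):
    """Push the arrival time back if the requested insertion time
    would exceed the per-joint position-speed limit."""
    dist = np.max(np.abs(np.asarray(target_q) - np.asarray(curr_q)))
    min_duration = dist / max_pos_speed  # fastest legal move, seconds
    return max(t_insert, curr_time + min_duration)


# the controllers' default limit: 180 deg/s * 2, in rad/s
max_pos_speed = 180 * (np.pi / 180) * 2

# a 90-degree move requested in 0.1 s gets stretched to 0.25 s
t = waypoint_arrival_time(np.zeros(2), np.array([np.pi / 2, 0.0]), 0.1, 0.0, max_pos_speed)
```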


@@ -0,0 +1,280 @@
import casadi
import meshcat.geometry as mg
import numpy as np
import pinocchio as pin
from pinocchio import casadi as cpin
from pinocchio.visualize import MeshcatVisualizer
from unitree_deploy.utils.weighted_moving_filter import WeightedMovingFilter
class G1_29_ArmIK:
def __init__(self, unit_test=False, visualization=False):
np.set_printoptions(precision=5, suppress=True, linewidth=200)
self.unit_test = unit_test
self.visualization = visualization
        # the same URDF is used whether or not unit_test is set
        self.robot = pin.RobotWrapper.BuildFromURDF(
            "unitree_deploy/robot_devices/assets/g1/g1_body29_hand14.urdf",
            "unitree_deploy/robot_devices/assets/g1/",
        )
self.mixed_jointsToLockIDs = [
"left_hip_pitch_joint",
"left_hip_roll_joint",
"left_hip_yaw_joint",
"left_knee_joint",
"left_ankle_pitch_joint",
"left_ankle_roll_joint",
"right_hip_pitch_joint",
"right_hip_roll_joint",
"right_hip_yaw_joint",
"right_knee_joint",
"right_ankle_pitch_joint",
"right_ankle_roll_joint",
"waist_yaw_joint",
"waist_roll_joint",
"waist_pitch_joint",
"left_hand_thumb_0_joint",
"left_hand_thumb_1_joint",
"left_hand_thumb_2_joint",
"left_hand_middle_0_joint",
"left_hand_middle_1_joint",
"left_hand_index_0_joint",
"left_hand_index_1_joint",
"right_hand_thumb_0_joint",
"right_hand_thumb_1_joint",
"right_hand_thumb_2_joint",
"right_hand_index_0_joint",
"right_hand_index_1_joint",
"right_hand_middle_0_joint",
"right_hand_middle_1_joint",
]
self.reduced_robot = self.robot.buildReducedRobot(
list_of_joints_to_lock=self.mixed_jointsToLockIDs,
reference_configuration=np.array([0.0] * self.robot.model.nq),
)
self.reduced_robot.model.addFrame(
pin.Frame(
"L_ee",
self.reduced_robot.model.getJointId("left_wrist_yaw_joint"),
pin.SE3(np.eye(3), np.array([0.05, 0, 0]).T),
pin.FrameType.OP_FRAME,
)
)
self.reduced_robot.model.addFrame(
pin.Frame(
"R_ee",
self.reduced_robot.model.getJointId("right_wrist_yaw_joint"),
pin.SE3(np.eye(3), np.array([0.05, 0, 0]).T),
pin.FrameType.OP_FRAME,
)
)
# Creating Casadi models and data for symbolic computing
self.cmodel = cpin.Model(self.reduced_robot.model)
self.cdata = self.cmodel.createData()
# Creating symbolic variables
self.cq = casadi.SX.sym("q", self.reduced_robot.model.nq, 1)
self.cTf_l = casadi.SX.sym("tf_l", 4, 4)
self.cTf_r = casadi.SX.sym("tf_r", 4, 4)
cpin.framesForwardKinematics(self.cmodel, self.cdata, self.cq)
# Get the hand joint ID and define the error function
self.L_hand_id = self.reduced_robot.model.getFrameId("L_ee")
self.R_hand_id = self.reduced_robot.model.getFrameId("R_ee")
self.translational_error = casadi.Function(
"translational_error",
[self.cq, self.cTf_l, self.cTf_r],
[
casadi.vertcat(
self.cdata.oMf[self.L_hand_id].translation - self.cTf_l[:3, 3],
self.cdata.oMf[self.R_hand_id].translation - self.cTf_r[:3, 3],
)
],
)
self.rotational_error = casadi.Function(
"rotational_error",
[self.cq, self.cTf_l, self.cTf_r],
[
casadi.vertcat(
cpin.log3(self.cdata.oMf[self.L_hand_id].rotation @ self.cTf_l[:3, :3].T),
cpin.log3(self.cdata.oMf[self.R_hand_id].rotation @ self.cTf_r[:3, :3].T),
)
],
)
# Defining the optimization problem
self.opti = casadi.Opti()
self.var_q = self.opti.variable(self.reduced_robot.model.nq)
self.var_q_last = self.opti.parameter(self.reduced_robot.model.nq) # for smooth
self.param_tf_l = self.opti.parameter(4, 4)
self.param_tf_r = self.opti.parameter(4, 4)
self.translational_cost = casadi.sumsqr(self.translational_error(self.var_q, self.param_tf_l, self.param_tf_r))
self.rotation_cost = casadi.sumsqr(self.rotational_error(self.var_q, self.param_tf_l, self.param_tf_r))
self.regularization_cost = casadi.sumsqr(self.var_q)
self.smooth_cost = casadi.sumsqr(self.var_q - self.var_q_last)
# Setting optimization constraints and goals
self.opti.subject_to(
self.opti.bounded(
self.reduced_robot.model.lowerPositionLimit,
self.var_q,
self.reduced_robot.model.upperPositionLimit,
)
)
self.opti.minimize(
50 * self.translational_cost + self.rotation_cost + 0.02 * self.regularization_cost + 0.1 * self.smooth_cost
)
opts = {
"ipopt": {"print_level": 0, "max_iter": 50, "tol": 1e-6},
"print_time": False, # print or not
"calc_lam_p": False, # https://github.com/casadi/casadi/wiki/FAQ:-Why-am-I-getting-%22NaN-detected%22in-my-optimization%3F
}
self.opti.solver("ipopt", opts)
self.init_data = np.zeros(self.reduced_robot.model.nq)
self.smooth_filter = WeightedMovingFilter(np.array([0.4, 0.3, 0.2, 0.1]), 14)
self.vis = None
if self.visualization:
# Initialize the Meshcat visualizer for visualization
self.vis = MeshcatVisualizer(
self.reduced_robot.model, self.reduced_robot.collision_model, self.reduced_robot.visual_model
)
self.vis.initViewer(open=True)
self.vis.loadViewerModel("pinocchio")
self.vis.displayFrames(True, frame_ids=[101, 102], axis_length=0.15, axis_width=5)
self.vis.display(pin.neutral(self.reduced_robot.model))
# Enable the display of end effector target frames with short axis lengths and greater width.
frame_viz_names = ["L_ee_target", "R_ee_target"]
frame_axis_positions = (
np.array([[0, 0, 0], [1, 0, 0], [0, 0, 0], [0, 1, 0], [0, 0, 0], [0, 0, 1]]).astype(np.float32).T
)
frame_axis_colors = (
np.array([[1, 0, 0], [1, 0.6, 0], [0, 1, 0], [0.6, 1, 0], [0, 0, 1], [0, 0.6, 1]]).astype(np.float32).T
)
axis_length = 0.1
axis_width = 10
for frame_viz_name in frame_viz_names:
self.vis.viewer[frame_viz_name].set_object(
mg.LineSegments(
mg.PointsGeometry(
position=axis_length * frame_axis_positions,
color=frame_axis_colors,
),
mg.LineBasicMaterial(
linewidth=axis_width,
vertexColors=True,
),
)
)
# If the robot arm is not the same size as your arm :)
def scale_arms(self, human_left_pose, human_right_pose, human_arm_length=0.60, robot_arm_length=0.75):
scale_factor = robot_arm_length / human_arm_length
robot_left_pose = human_left_pose.copy()
robot_right_pose = human_right_pose.copy()
robot_left_pose[:3, 3] *= scale_factor
robot_right_pose[:3, 3] *= scale_factor
return robot_left_pose, robot_right_pose
def solve_ik(self, left_wrist, right_wrist, current_lr_arm_motor_q=None, current_lr_arm_motor_dq=None):
if current_lr_arm_motor_q is not None:
self.init_data = current_lr_arm_motor_q
self.opti.set_initial(self.var_q, self.init_data)
# left_wrist, right_wrist = self.scale_arms(left_wrist, right_wrist)
if self.visualization:
self.vis.viewer["L_ee_target"].set_transform(left_wrist) # for visualization
self.vis.viewer["R_ee_target"].set_transform(right_wrist) # for visualization
self.opti.set_value(self.param_tf_l, left_wrist)
self.opti.set_value(self.param_tf_r, right_wrist)
self.opti.set_value(self.var_q_last, self.init_data) # for smooth
try:
self.opti.solve()
# sol = self.opti.solve_limited()
sol_q = self.opti.value(self.var_q)
self.smooth_filter.add_data(sol_q)
sol_q = self.smooth_filter.filtered_data
v = current_lr_arm_motor_dq * 0.0 if current_lr_arm_motor_dq is not None else (sol_q - self.init_data) * 0.0
self.init_data = sol_q
sol_tauff = pin.rnea(
self.reduced_robot.model,
self.reduced_robot.data,
sol_q,
v,
np.zeros(self.reduced_robot.model.nv),
)
if self.visualization:
self.vis.display(sol_q) # for visualization
return sol_q, sol_tauff
except Exception as e:
            print(f"ERROR in convergence, printing debug info: {e}")
sol_q = self.opti.debug.value(self.var_q)
self.smooth_filter.add_data(sol_q)
sol_q = self.smooth_filter.filtered_data
v = current_lr_arm_motor_dq * 0.0 if current_lr_arm_motor_dq is not None else (sol_q - self.init_data) * 0.0
self.init_data = sol_q
sol_tauff = pin.rnea(
self.reduced_robot.model,
self.reduced_robot.data,
sol_q,
v,
np.zeros(self.reduced_robot.model.nv),
)
print(
f"sol_q:{sol_q} \nmotorstate: \n{current_lr_arm_motor_q} \nleft_pose: \n{left_wrist} \nright_pose: \n{right_wrist}"
)
if self.visualization:
self.vis.display(sol_q) # for visualization
# return sol_q, sol_tauff
return current_lr_arm_motor_q, np.zeros(self.reduced_robot.model.nv)
def solve_tau(self, current_lr_arm_motor_q=None, current_lr_arm_motor_dq=None):
try:
sol_tauff = pin.rnea(
self.reduced_robot.model,
self.reduced_robot.data,
current_lr_arm_motor_q,
np.zeros(14),
np.zeros(self.reduced_robot.model.nv),
)
return sol_tauff
except Exception as e:
            print(f"ERROR in inverse dynamics, returning zero torques: {e}")
return np.zeros(self.reduced_robot.model.nv)
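Both IK classes smooth their solutions with `WeightedMovingFilter(np.array([0.4, 0.3, 0.2, 0.1]), 14)` from `unitree_deploy.utils.weighted_moving_filter`, whose source is not in this diff. A minimal stand-in inferred from the call sites (`add_data`, `filtered_data`, constructor taking weights and a dimension) — the real class's internals are an assumption:

```python
import numpy as np


class WeightedMovingFilterSketch:
    """Weighted moving average over the most recent samples,
    newest sample weighted most heavily."""

    def __init__(self, weights, dim):
        self.weights = np.asarray(weights, dtype=float)
        self.weights /= self.weights.sum()  # normalize defensively
        self.buffer = np.zeros((len(self.weights), dim))
        self.count = 0  # how many real samples the buffer holds

    def add_data(self, sample):
        self.buffer = np.roll(self.buffer, 1, axis=0)
        self.buffer[0] = sample  # newest sample lives at index 0
        self.count = min(self.count + 1, len(self.weights))

    @property
    def filtered_data(self):
        w = self.weights[: self.count]
        return (w[:, None] * self.buffer[: self.count]).sum(axis=0) / w.sum()


f = WeightedMovingFilterSketch([0.4, 0.3, 0.2, 0.1], 2)
f.add_data([1.0, 0.0])
f.add_data([2.0, 0.0])
# first component is pulled toward the newer sample: (0.4*2 + 0.3*1) / 0.7
```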


@@ -0,0 +1,63 @@
from typing import Protocol
from unitree_deploy.robot_devices.arm.configs import ArmConfig, G1ArmConfig, Z1ArmConfig, Z1DualArmConfig
class Arm(Protocol):
def connect(self): ...
def disconnect(self): ...
def motor_names(self): ...
def read_current_motor_q(self): ...
def read_current_arm_q(self): ...
def read_current_arm_dq(self): ...
def write_arm(self): ...
def arm_ik(self): ...
def arm_fk(self): ...
def go_start(self): ...
def go_home(self): ...
def make_arm_motors_buses_from_configs(armconfig: dict[str, ArmConfig]) -> dict[str, Arm]:
arm_motors_buses = {}
for key, cfg in armconfig.items():
if cfg.type == "z1":
from unitree_deploy.robot_devices.arm.z1_arm import Z1ArmController
arm_motors_buses[key] = Z1ArmController(cfg)
elif cfg.type == "g1":
from unitree_deploy.robot_devices.arm.g1_arm import G1_29_ArmController
arm_motors_buses[key] = G1_29_ArmController(cfg)
elif cfg.type == "z1_dual":
from unitree_deploy.robot_devices.arm.z1_dual_arm import Z1_12_ArmController
arm_motors_buses[key] = Z1_12_ArmController(cfg)
else:
raise ValueError(f"The motor type '{cfg.type}' is not valid.")
return arm_motors_buses
def make_arm_motors_bus(arm_type: str, **kwargs) -> Arm:
if arm_type == "z1":
from unitree_deploy.robot_devices.arm.z1_arm import Z1ArmController
config = Z1ArmConfig(**kwargs)
return Z1ArmController(config)
elif arm_type == "z1_dual":
from unitree_deploy.robot_devices.arm.z1_dual_arm import Z1_12_ArmController
config = Z1DualArmConfig(**kwargs)
return Z1_12_ArmController(config)
elif arm_type == "g1":
from unitree_deploy.robot_devices.arm.g1_arm import G1_29_ArmController
config = G1ArmConfig(**kwargs)
return G1_29_ArmController(config)
else:
raise ValueError(f"The motor type '{arm_type}' is not valid.")
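The if/elif dispatch above scales linearly with the number of arm types. A registry-decorator sketch of the same factory pattern, in the spirit of the repo's own `ArmConfig.register_subclass` (the `DummyZ1Controller` class and `make_arm` name are illustrative only):

```python
_ARM_REGISTRY: dict[str, type] = {}


def register_arm(arm_type: str):
    """Decorator that maps an arm-type string to its controller class."""
    def wrap(cls):
        _ARM_REGISTRY[arm_type] = cls
        return cls
    return wrap


@register_arm("z1")
class DummyZ1Controller:
    def __init__(self, **kwargs):
        self.kwargs = kwargs


def make_arm(arm_type: str, **kwargs):
    try:
        cls = _ARM_REGISTRY[arm_type]
    except KeyError:
        raise ValueError(f"The motor type '{arm_type}' is not valid.") from None
    return cls(**kwargs)


arm = make_arm("z1", robot_ip="127.0.0.1")
```

New controllers then register themselves at import time instead of requiring another elif branch in the factory.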


@@ -0,0 +1,294 @@
import os
import sys
import threading
import time
from typing import Callable
import numpy as np
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
import unitree_arm_interface as unitree_z1 # type: ignore
from unitree_deploy.robot_devices.arm.arm_indexs import Z1GripperArmJointIndex
from unitree_deploy.robot_devices.arm.configs import Z1ArmConfig
from unitree_deploy.robot_devices.arm.z1_arm_ik import Z1_Arm_IK
from unitree_deploy.robot_devices.robots_devices_utils import (
DataBuffer,
MotorState,
Robot_Num_Motors,
RobotDeviceAlreadyConnectedError,
)
from unitree_deploy.utils.joint_trajcetory_inter import JointTrajectoryInterpolator
from unitree_deploy.utils.rich_logger import RichLogger
class Z1LowState:
def __init__(self) -> None:
self.motor_state: list[MotorState] = [MotorState() for _ in range(Robot_Num_Motors.Z1_7_Num_Motors)]
class Z1ArmController:
def __init__(self, config: Z1ArmConfig):
self.motors = config.motors
self.init_pose = config.init_pose
self.unit_test = config.unit_test
self.control_dt = config.control_dt
self.robot_kp = config.robot_kp
self.robot_kd = config.robot_kd
self.max_pos_speed = config.max_pos_speed
self.log_level = config.log_level
self.q_target = self.init_pose
self.dq_target = np.zeros(len(Z1GripperArmJointIndex) - 1, dtype=np.float16)
self.ddq_target = np.zeros(len(Z1GripperArmJointIndex) - 1, dtype=np.float16)
self.tauff_target = np.zeros(len(Z1GripperArmJointIndex) - 1, dtype=np.float16)
self.ftip_target = np.zeros(len(Z1GripperArmJointIndex) - 1, dtype=np.float16)
self.time_target = time.monotonic()
self.DELTA_GRIPPER_CMD = 5.0 / 20.0 / 25.6
self.arm_cmd = "schedule_waypoint"
self.ctrl_lock = threading.Lock()
self.lowstate_buffer = DataBuffer()
self.z1_arm_ik = Z1_Arm_IK(unit_test=self.unit_test, visualization=False)
self.logger = RichLogger(self.log_level)
self.is_connected = False
self.grasped = False
@property
def motor_names(self) -> list[str]:
return list(self.motors.keys())
@property
def motor_models(self) -> list[str]:
return [model for _, model in self.motors.values()]
@property
def motor_indices(self) -> list[int]:
return [idx for idx, _ in self.motors.values()]
def _start_daemon_thread(self, target_fn: Callable[[], None], name: str | None = None) -> threading.Thread:
thread = threading.Thread(target=target_fn, name=name)
thread.daemon = True
thread.start()
return thread
def connect(self):
try:
if self.is_connected:
raise RobotDeviceAlreadyConnectedError(
"Z1_Arm is already connected. Do not run `robot.connect()` twice."
)
# Initialize arms
self.z1 = unitree_z1.ArmInterface()
self.z1_model = self.z1._ctrlComp.armModel
self.z1.setFsmLowcmd()
self.z1.lowcmd.setControlGain(self.robot_kp, self.robot_kd)
self.z1.sendRecv()
self.subscribe_thread = self._start_daemon_thread(
self._subscribe_motor_state, name="z1._subscribe_motor_state"
)
while not self.lowstate_buffer.get_data():
time.sleep(0.01)
                self.logger.warning("[Z1_ArmController] Waiting for state data...")
self.publish_thread = self._start_daemon_thread(self._ctrl_motor_state, name="z1._ctrl_motor_state")
self.is_connected = True
except Exception as e:
self.disconnect()
self.logger.error(f"❌ Error in Z1ArmController.connect: {e}")
def _subscribe_motor_state(self):
try:
while True:
lowstate = Z1LowState()
for motor_id in range(Robot_Num_Motors.Z1_7_Num_Motors - 1):
lowstate.motor_state[motor_id].q = self.z1.lowstate.getQ()[motor_id]
lowstate.motor_state[motor_id].dq = self.z1.lowstate.getQd()[motor_id]
gripper_q = self.z1.lowstate.getGripperQ()
lowstate.motor_state[Robot_Num_Motors.Z1_7_Num_Motors - 1].q = gripper_q
lowstate.motor_state[Robot_Num_Motors.Z1_7_Num_Motors - 1].dq = 0.0
self.lowstate_buffer.set_data(lowstate)
time.sleep(self.control_dt)
except Exception as e:
self.disconnect()
self.logger.error(f"❌ Error in Z1ArmController._subscribe_motor_state: {e}")
def _update_z1_arm(
self,
q: np.ndarray,
qd: np.ndarray | None = None,
qdd: np.ndarray | None = None,
ftip: np.ndarray | None = None,
tau: np.ndarray | None = None,
):
"""Update the state and command of a given robotic arm."""
current_gripper_q = self.read_current_gripper_q()
self.z1.q = q[: len(Z1GripperArmJointIndex) - 1]
self.z1.qd = self.dq_target if qd is None else qd
qdd = self.ddq_target if qdd is None else qdd
ftip = self.ftip_target if ftip is None else ftip
self.z1.tau = self.z1_model.inverseDynamics(self.z1.q, self.z1.qd, qdd, ftip) if tau is None else tau
self.z1.setArmCmd(self.z1.q, self.z1.qd, self.z1.tau)
gripper_q = q[len(Z1GripperArmJointIndex) - 1]
self.z1.gripperQ = np.clip(
gripper_q,
current_gripper_q - self.DELTA_GRIPPER_CMD * 3,
current_gripper_q + self.DELTA_GRIPPER_CMD * 3,
)
self.z1.setGripperCmd(self.z1.gripperQ, self.z1.gripperQd, self.z1.gripperTau)
self.z1.sendRecv()
time.sleep(self.control_dt)
self.grasped = abs(self.read_current_gripper_q() - current_gripper_q) < self.DELTA_GRIPPER_CMD / 12.0
def _drive_to_waypoint(self, target_pose: np.ndarray, t_insert_time: float):
curr_time = time.monotonic() + self.control_dt
t_insert = curr_time + t_insert_time
self.pose_interp = self.pose_interp.drive_to_waypoint(
pose=target_pose,
time=t_insert,
curr_time=curr_time,
max_pos_speed=self.max_pos_speed,
)
while time.monotonic() < t_insert:
self._update_z1_arm(self.pose_interp(time.monotonic()))
def _schedule_waypoint(
self,
arm_q_target: np.ndarray,
arm_time_target: float,
t_now: float,
start_time: float,
last_waypoint_time: float,
arm_tauff_target: np.ndarray | None = None,
) -> float:
target_time = time.monotonic() - time.perf_counter() + arm_time_target
curr_time = t_now + self.control_dt
target_time = max(target_time, curr_time + self.control_dt)
self.pose_interp = self.pose_interp.schedule_waypoint(
pose=arm_q_target,
time=target_time,
max_pos_speed=self.max_pos_speed,
curr_time=curr_time,
last_waypoint_time=last_waypoint_time,
)
last_waypoint_time = target_time
        self._update_z1_arm(q=self.pose_interp(t_now), tau=arm_tauff_target)
        return last_waypoint_time
def _ctrl_motor_state(self):
try:
self.pose_interp = JointTrajectoryInterpolator(
times=[time.monotonic()],
joint_positions=[self.read_current_arm_q()],
)
self.go_start()
arm_q_target = self.read_current_arm_q()
arm_tauff_target = self.tauff_target
arm_time_target = time.monotonic()
arm_cmd = "schedule_waypoint"
last_waypoint_time = time.monotonic()
while True:
start_time = time.perf_counter()
t_now = time.monotonic()
with self.ctrl_lock:
arm_q_target = self.q_target
arm_tauff_target = self.tauff_target
arm_time_target = self.time_target
arm_cmd = self.arm_cmd
if arm_cmd == "drive_to_waypoint":
self._drive_to_waypoint(target_pose=arm_q_target, t_insert_time=0.8)
elif arm_cmd == "schedule_waypoint":
self._schedule_waypoint(
arm_q_target=arm_q_target,
arm_time_target=arm_time_target,
t_now=t_now,
start_time=start_time,
last_waypoint_time=last_waypoint_time,
arm_tauff_target=arm_tauff_target,
)
except Exception as e:
self.disconnect()
self.logger.error(f"❌ Error in Z1ArmController._ctrl_motor_state: {e}")
def read_current_arm_q(self) -> np.ndarray:
        """Return current state q of the arm motors, including the gripper."""
return np.array([self.lowstate_buffer.get_data().motor_state[id].q for id in Z1GripperArmJointIndex])
def read_current_arm_q_without_gripper(self) -> np.ndarray:
        """Return current state q of the arm motors, excluding the gripper."""
return np.array([self.lowstate_buffer.get_data().motor_state[id].q for id in list(Z1GripperArmJointIndex)[:-1]])
def read_current_gripper_q(self) -> np.ndarray:
        """Return current state q of the gripper motor."""
return np.array([self.lowstate_buffer.get_data().motor_state[list(Z1GripperArmJointIndex)[-1].value].q])
def read_current_arm_dq(self) -> np.ndarray:
        """Return current state dq of the arm motors, including the gripper."""
return np.array([self.lowstate_buffer.get_data().motor_state[id].dq for id in Z1GripperArmJointIndex])
def read_current_arm_dq_without_gripper(self) -> np.ndarray:
        """Return current state dq of the arm motors, excluding the gripper."""
return np.array(
[self.lowstate_buffer.get_data().motor_state[id].dq for id in list(Z1GripperArmJointIndex)[:-1]]
)
def write_arm(
self,
q_target: list[float] | np.ndarray,
        tauff_target: list[float] | np.ndarray | None = None,
time_target: float | None = None,
cmd_target: str | None = None,
):
"""Set control target values q & tau of the left and right arm motors."""
with self.ctrl_lock:
self.q_target = q_target
self.tauff_target = tauff_target
self.time_target = time_target
self.arm_cmd = cmd_target
def arm_ik(self, ee_target: list[float] | np.ndarray) -> tuple[np.ndarray, np.ndarray] | None:
return self.z1_arm_ik.solve_ik(
ee_target, self.read_current_arm_q_without_gripper(), self.read_current_arm_dq_without_gripper()
)
def arm_fk(self, q: np.ndarray | None = None) -> np.ndarray | None:
return self.z1_model.forwardKinematics(
q if q is not None else self.read_current_arm_q(), len(Z1GripperArmJointIndex)
)
def go_start(self):
self._drive_to_waypoint(target_pose=self.init_pose, t_insert_time=1.0)
self.logger.success("Go Start OK!\n")
def go_home(self):
self.z1.loopOn()
self.z1.backToStart()
self.z1.loopOff()
time.sleep(0.5)
self.logger.success("Go Home OK!\n")
def disconnect(self):
self.is_connected = False
self.go_home()
def __del__(self):
if getattr(self, "is_connected", False):
self.disconnect()
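`_update_z1_arm` rate-limits the gripper with `np.clip` around the current position, stepping at most `DELTA_GRIPPER_CMD * 3` per control tick so a large command cannot slam the jaws. A self-contained sketch of that clamp (the constant is copied from `Z1ArmController`; the helper name is ours):

```python
import numpy as np

# max per-tick gripper step, rad (value from Z1ArmController)
DELTA_GRIPPER_CMD = 5.0 / 20.0 / 25.6


def clamp_gripper_cmd(target_q, current_q, scale=3):
    """Limit how far the gripper command may move in one control tick."""
    step = DELTA_GRIPPER_CMD * scale
    return float(np.clip(target_q, current_q - step, current_q + step))


# a full-open command from a closed gripper advances only one step
cmd = clamp_gripper_cmd(target_q=1.0, current_q=0.0)
```

Over successive 500 Hz ticks the command walks toward the target, which is what keeps the real gripper motion smooth.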


@@ -0,0 +1,253 @@
import time
import casadi
import meshcat.geometry as mg
import numpy as np
import pinocchio as pin
from pinocchio import casadi as cpin
from pinocchio.visualize import MeshcatVisualizer
from unitree_deploy.utils.weighted_moving_filter import WeightedMovingFilter
class Z1_Arm_IK:
def __init__(self, unit_test=False, visualization=False):
np.set_printoptions(precision=5, suppress=True, linewidth=200)
self.unit_test = unit_test
self.visualization = visualization
self.robot = pin.RobotWrapper.BuildFromURDF(
"unitree_deploy/robot_devices/assets/z1/z1.urdf", "unitree_deploy/robot_devices/assets/z1/"
)
self.mixed_jointsToLockIDs = ["base_static_joint"]
self.reduced_robot = self.robot.buildReducedRobot(
list_of_joints_to_lock=self.mixed_jointsToLockIDs,
reference_configuration=np.array([0.0] * self.robot.model.nq),
)
self.reduced_robot.model.addFrame(
pin.Frame(
"ee",
self.reduced_robot.model.getJointId("joint6"),
pin.SE3(np.eye(3), np.array([0.15, 0, 0])),
pin.FrameType.OP_FRAME,
)
)
# for i in range(self.reduced_robot.model.nframes):
# frame = self.reduced_robot.model.frames[i]
# frame_id = self.reduced_robot.model.getFrameId(frame.name)
# print(f"Frame ID: {frame_id}, Name: {frame.name}")
# Creating Casadi models and data for symbolic computing
self.cmodel = cpin.Model(self.reduced_robot.model)
self.cdata = self.cmodel.createData()
self.cq = casadi.SX.sym("q", self.reduced_robot.model.nq, 1)
self.cTf = casadi.SX.sym("tf", 4, 4)
cpin.framesForwardKinematics(self.cmodel, self.cdata, self.cq)
self.EE_ID = self.reduced_robot.model.getFrameId("link06")
self.translational_error = casadi.Function(
"translational_error",
[self.cq, self.cTf],
[
casadi.vertcat(
self.cdata.oMf[self.EE_ID].translation - self.cTf[:3, 3],
)
],
)
self.rotational_error = casadi.Function(
"rotational_error",
[self.cq, self.cTf],
[
casadi.vertcat(
cpin.log3(self.cdata.oMf[self.EE_ID].rotation @ self.cTf[:3, :3].T),
)
],
)
self.opti = casadi.Opti()
self.var_q = self.opti.variable(self.reduced_robot.model.nq)
self.var_q_last = self.opti.parameter(self.reduced_robot.model.nq) # for smooth
self.param_tf = self.opti.parameter(4, 4)
self.translational_cost = casadi.sumsqr(self.translational_error(self.var_q, self.param_tf))
self.rotation_cost = casadi.sumsqr(self.rotational_error(self.var_q, self.param_tf))
self.regularization_cost = casadi.sumsqr(self.var_q)
self.smooth_cost = casadi.sumsqr(self.var_q - self.var_q_last)
# Setting optimization constraints and goals
self.opti.subject_to(
self.opti.bounded(
self.reduced_robot.model.lowerPositionLimit,
self.var_q,
self.reduced_robot.model.upperPositionLimit,
)
)
self.opti.minimize(
50 * self.translational_cost
+ self.rotation_cost
+ 0.02 * self.regularization_cost
+ 0.1 * self.smooth_cost
)
# self.opti.minimize(20 * self.cost + self.regularization_cost)
opts = {
"ipopt": {"print_level": 0, "max_iter": 50, "tol": 1e-6},
"print_time": False, # print or not
"calc_lam_p": False, # https://github.com/casadi/casadi/wiki/FAQ:-Why-am-I-getting-%22NaN-detected%22in-my-optimization%3F
}
self.opti.solver("ipopt", opts)
self.init_data = np.zeros(self.reduced_robot.model.nq)
self.smooth_filter = WeightedMovingFilter(np.array([0.4, 0.3, 0.2, 0.1]), 6)
self.vis = None
if self.visualization:
# Initialize the Meshcat visualizer for visualization
self.vis = MeshcatVisualizer(
self.reduced_robot.model, self.reduced_robot.collision_model, self.reduced_robot.visual_model
)
self.vis.initViewer(open=True)
self.vis.loadViewerModel("pinocchio")
self.vis.displayFrames(True, frame_ids=[101, 102], axis_length=0.15, axis_width=5)
self.vis.display(pin.neutral(self.reduced_robot.model))
# Enable the display of end effector target frames with short axis lengths and greater width.
frame_viz_names = ["ee_target"]
frame_axis_positions = (
np.array([[0, 0, 0], [1, 0, 0], [0, 0, 0], [0, 1, 0], [0, 0, 0], [0, 0, 1]])
.astype(np.float32)
.T
)
frame_axis_colors = (
np.array([[1, 0, 0], [1, 0.6, 0], [0, 1, 0], [0.6, 1, 0], [0, 0, 1], [0, 0.6, 1]])
.astype(np.float32)
.T
)
axis_length = 0.1
axis_width = 10
for frame_viz_name in frame_viz_names:
self.vis.viewer[frame_viz_name].set_object(
mg.LineSegments(
mg.PointsGeometry(
position=axis_length * frame_axis_positions,
color=frame_axis_colors,
),
mg.LineBasicMaterial(
linewidth=axis_width,
vertexColors=True,
),
)
)
def solve_ik(self, wrist, current_lr_arm_motor_q=None, current_lr_arm_motor_dq=None):
if current_lr_arm_motor_q is not None:
self.init_data = current_lr_arm_motor_q
self.opti.set_initial(self.var_q, self.init_data)
# left_wrist, right_wrist = self.scale_arms(left_wrist, right_wrist)
if self.visualization:
self.vis.viewer["ee_target"].set_transform(wrist) # for visualization
self.opti.set_value(self.param_tf, wrist)
self.opti.set_value(self.var_q_last, self.init_data) # for smooth
try:
self.opti.solve()
# sol = self.opti.solve_limited()
sol_q = self.opti.value(self.var_q)
self.smooth_filter.add_data(sol_q)
sol_q = self.smooth_filter.filtered_data
v = current_lr_arm_motor_dq * 0.0 if current_lr_arm_motor_dq is not None else (sol_q - self.init_data) * 0.0  # zero velocity: rnea below then yields gravity-compensation torques only
self.init_data = sol_q
sol_tauff = pin.rnea(
self.reduced_robot.model,
self.reduced_robot.data,
sol_q,
v,
np.zeros(self.reduced_robot.model.nv),
)
if self.visualization:
self.vis.display(sol_q) # for visualization
return sol_q, sol_tauff
except Exception as e:
print(f"ERROR in convergence, plotting debug info.{e}")
sol_q = self.opti.debug.value(self.var_q)
self.smooth_filter.add_data(sol_q)
sol_q = self.smooth_filter.filtered_data
v = current_lr_arm_motor_dq * 0.0 if current_lr_arm_motor_dq is not None else (sol_q - self.init_data) * 0.0  # zero velocity, as in the success branch
self.init_data = sol_q
sol_tauff = pin.rnea(
self.reduced_robot.model,
self.reduced_robot.data,
sol_q,
v,
np.zeros(self.reduced_robot.model.nv),
)
if self.visualization:
self.vis.display(sol_q) # for visualization
# Fall back to the filtered debug solution when no joint state was provided by the caller.
return (current_lr_arm_motor_q if current_lr_arm_motor_q is not None else sol_q), np.zeros(self.reduced_robot.model.nv)
if __name__ == "__main__":
arm_ik = Z1_Arm_IK(unit_test=True, visualization=True)
# initial position
L_tf_target = pin.SE3(
pin.Quaternion(1, 0, 0, 0),
np.array([0.25, 0, 0.2]),
)
rotation_speed = 0.02
noise_amplitude_translation = 0.002
noise_amplitude_rotation = 0.1
user_input = input("Enter 's' to start the IK demo:\n")
if user_input.lower() == "s":
step = 0
while True:
# Apply rotation noise with bias towards y and z axes
rotation_noise_l = pin.Quaternion(
np.cos(np.random.normal(0, noise_amplitude_rotation) / 2),
0,
np.random.normal(0, noise_amplitude_rotation / 2),
0,
).normalized() # y bias
if step <= 120:
angle = rotation_speed * step
L_tf_target.rotation = (
rotation_noise_l * pin.Quaternion(np.cos(angle / 2), 0, np.sin(angle / 2), 0)
).toRotationMatrix() # y axis
L_tf_target.translation += np.array([0.001, 0.001, 0.001]) + np.random.normal(
0, noise_amplitude_translation, 3
)
else:
angle = rotation_speed * (240 - step)
L_tf_target.rotation = (
rotation_noise_l * pin.Quaternion(np.cos(angle / 2), 0, np.sin(angle / 2), 0)
).toRotationMatrix() # y axis
L_tf_target.translation -= np.array([0.001, 0.001, 0.001]) + np.random.normal(
0, noise_amplitude_translation, 3
)
sol_q, _ = arm_ik.solve_ik(L_tf_target.homogeneous)
step += 1
if step > 240:
step = 0
time.sleep(0.01)

View File

@@ -0,0 +1,347 @@
import os
import sys
import threading
import time
from typing import Callable
import numpy as np
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
import unitree_arm_interface as unitree_z1 # type: ignore
from unitree_deploy.robot_devices.arm.arm_indexs import Z1_12_JointArmIndex
from unitree_deploy.robot_devices.arm.configs import Z1DualArmConfig
from unitree_deploy.robot_devices.arm.z1_arm_ik import Z1_Arm_IK
from unitree_deploy.robot_devices.robots_devices_utils import (
DataBuffer,
MotorState,
Robot_Num_Motors,
RobotDeviceAlreadyConnectedError,
)
from unitree_deploy.utils.joint_trajcetory_inter import JointTrajectoryInterpolator
from unitree_deploy.utils.rich_logger import log_error, log_info, log_success, log_warning
class Z1_12_LowState:
def __init__(self):
self.motor_state = [MotorState() for _ in range(Robot_Num_Motors.Z1_12_Num_Motors)]
class Z1_12_ArmController:
def __init__(self, config: Z1DualArmConfig):
log_info("Initialize Z1_12_ArmController...")
self.left_robot_ip = config.left_robot_ip
self.left_robot_port1 = config.left_robot_port1
self.left_robot_port2 = config.left_robot_port2
self.right_robot_ip = config.right_robot_ip
self.right_robot_port1 = config.right_robot_port1
self.right_robot_port2 = config.right_robot_port2
self.robot_kp = config.robot_kp
self.robot_kd = config.robot_kd
self.max_pos_speed = config.max_pos_speed
self.init_pose_left = np.array(config.init_pose_left)
self.init_pose_right = np.array(config.init_pose_right)
self.init_pose = np.concatenate((self.init_pose_left, self.init_pose_right), axis=0)
self.unit_test = config.unit_test
self.control_dt = config.control_dt
self.motors = config.motors
self.mock = config.mock
self.q_target = np.concatenate((self.init_pose_left, self.init_pose_right))
self.tauff_target = np.zeros(len(Z1_12_JointArmIndex), dtype=np.float64)
self.dq_target = np.zeros(len(Z1_12_JointArmIndex) // 2, dtype=np.float64)
self.ddq_target = np.zeros(len(Z1_12_JointArmIndex) // 2, dtype=np.float64)
self.ftip_target = np.zeros(len(Z1_12_JointArmIndex) // 2, dtype=np.float64)
self.time_target = time.monotonic()
self.arm_cmd = "schedule_waypoint"
self.ctrl_lock = threading.Lock()
self.lowstate_buffer = DataBuffer()
self.stop_event = threading.Event()
self.z1_left_arm_ik = Z1_Arm_IK(unit_test=self.unit_test, visualization=False)
self.z1_right_arm_ik = Z1_Arm_IK(unit_test=self.unit_test, visualization=False)
self.arm_indices_len = len(Z1_12_JointArmIndex) // 2
self.is_connected = False
@property
def motor_names(self) -> list[str]:
return list(self.motors.keys())
@property
def motor_models(self) -> list[str]:
return [model for _, model in self.motors.values()]
@property
def motor_indices(self) -> list[int]:
return [idx for idx, _ in self.motors.values()]
def _start_daemon_thread(self, target_fn: Callable[[], None], name: str | None = None) -> threading.Thread:
thread = threading.Thread(target=target_fn, name=name)
thread.daemon = True
thread.start()
return thread
def initialize_arm(self, ip: str, port1: int, port2: int, name: str):
"""Initialize z1."""
arm = unitree_z1.ArmInterface(ip, port1, port2)
arm_model = arm._ctrlComp.armModel
arm.setFsmLowcmd()
return arm, arm_model
def set_control_gains(self, kp, kd):
"""Initialize kp kd."""
for arm in [self.z1_left, self.z1_right]:
arm.lowcmd.setControlGain(kp, kd)
arm.sendRecv()
def connect(self):
try:
if self.is_connected:
raise RobotDeviceAlreadyConnectedError(
"Z1_Dual_Arm is already connected. Do not run `robot.connect()` twice."
)
# Initialize arms
self.z1_left, self.z1_left_model = self.initialize_arm(
self.left_robot_ip, self.left_robot_port1, self.left_robot_port2, "left"
)
self.z1_right, self.z1_right_model = self.initialize_arm(
self.right_robot_ip, self.right_robot_port1, self.right_robot_port2, "right"
)
# Set control gains
self.set_control_gains(self.robot_kp, self.robot_kd)
# initialize subscribe thread
self.subscribe_thread = self._start_daemon_thread(self._subscribe_motor_state, name="z1_dual._subscribe_motor_state")
while not self.lowstate_buffer.get_data():
time.sleep(0.01)
log_warning("[Z1_12_ArmController] Waiting Get Data...")
self.publish_thread = self._start_daemon_thread(self._ctrl_motor_state, name="z1_dual._ctrl_motor_state")
self.is_connected = True
except Exception as e:
self.disconnect()
log_error(f"❌ Error in Z1_12_ArmController.connect: {e}")
def _subscribe_motor_state(self):
while True:
msg = {
"q": np.concatenate([self.z1_left.lowstate.getQ(), self.z1_right.lowstate.getQ()], axis=0),
"dq": np.concatenate([self.z1_left.lowstate.getQd(), self.z1_right.lowstate.getQd()], axis=0),
}
lowstate = Z1_12_LowState()
for id in range(Robot_Num_Motors.Z1_12_Num_Motors):
lowstate.motor_state[id].q = msg["q"][id]
lowstate.motor_state[id].dq = msg["dq"][id]
self.lowstate_buffer.set_data(lowstate)
time.sleep(self.control_dt)
def _update_z1_arm(
self,
arm,
arm_model,
q: np.ndarray,
qd: np.ndarray | None = None,
qdd: np.ndarray | None = None,
ftip: np.ndarray | None = None,
tau: np.ndarray | None = None,
):
"""Update the state and command of a given robotic arm."""
arm.q = q
arm.qd = self.dq_target if qd is None else qd
qdd = self.ddq_target if qdd is None else qdd
ftip = self.ftip_target if ftip is None else ftip
arm.tau = arm_model.inverseDynamics(arm.q, arm.qd, qdd, ftip) if tau is None else tau
arm.setArmCmd(arm.q, arm.qd, arm.tau)
arm.sendRecv()
def _drive_to_waypoint(self, target_pose: np.ndarray, t_insert_time: float):
curr_time = time.monotonic() + self.control_dt
t_insert = curr_time + t_insert_time
self.pose_interp = self.pose_interp.drive_to_waypoint(
pose=target_pose,
time=t_insert,
curr_time=curr_time,
max_pos_speed=self.max_pos_speed,
)
while time.monotonic() < t_insert:
self._update_z1_arm(
self.z1_left, self.z1_left_model, self.pose_interp(time.monotonic())[: self.arm_indices_len]
)
self._update_z1_arm(
self.z1_right, self.z1_right_model, self.pose_interp(time.monotonic())[self.arm_indices_len :]
)
time.sleep(self.control_dt)
def _schedule_waypoint(
self,
arm_q_target: np.ndarray,
arm_time_target: float,
t_now: float,
start_time: float,
last_waypoint_time: float,
arm_tauff_target: np.ndarray | None = None,
) -> float:
target_time = time.monotonic() - time.perf_counter() + arm_time_target  # convert a perf_counter timestamp to the monotonic clock base
curr_time = t_now + self.control_dt
target_time = max(target_time, curr_time + self.control_dt)
self.pose_interp = self.pose_interp.schedule_waypoint(
pose=arm_q_target,
time=target_time,
max_pos_speed=self.max_pos_speed,
curr_time=curr_time,
last_waypoint_time=last_waypoint_time,
)
last_waypoint_time = target_time
self._update_z1_arm(
arm=self.z1_left,
arm_model=self.z1_left_model,
q=self.pose_interp(t_now)[: self.arm_indices_len],
tau=arm_tauff_target[: self.arm_indices_len] if arm_tauff_target is not None else arm_tauff_target,
)
self._update_z1_arm(
arm=self.z1_right,
arm_model=self.z1_right_model,
q=self.pose_interp(t_now)[self.arm_indices_len :],
tau=arm_tauff_target[self.arm_indices_len :] if arm_tauff_target is not None else arm_tauff_target,
)
time.sleep(max(0, self.control_dt - (time.perf_counter() - start_time)))
def _ctrl_motor_state(self):
try:
self.pose_interp = JointTrajectoryInterpolator(
times=[time.monotonic()],
joint_positions=[self.read_current_arm_q()],
)
self.go_start()
arm_q_target = self.read_current_arm_q()
arm_tauff_target = self.tauff_target
arm_time_target = time.monotonic()
arm_cmd = "schedule_waypoint"
last_waypoint_time = time.monotonic()
while True:
start_time = time.perf_counter()
t_now = time.monotonic()
with self.ctrl_lock:
arm_q_target = self.q_target
arm_tauff_target = self.tauff_target
arm_time_target = self.time_target
arm_cmd = self.arm_cmd
if arm_cmd == "drive_to_waypoint":
self._drive_to_waypoint(target_pose=arm_q_target, t_insert_time=0.8)
elif arm_cmd == "schedule_waypoint":
self._schedule_waypoint(
arm_q_target=arm_q_target,
arm_time_target=arm_time_target,
t_now=t_now,
start_time=start_time,
last_waypoint_time=last_waypoint_time,
arm_tauff_target=arm_tauff_target,
)
except Exception as e:
self.disconnect()
log_error(f"❌ Error in Z1_12_ArmController._ctrl_motor_state: {e}")
def write_arm(
self,
q_target: list[float] | np.ndarray,
tauff_target: list[float] | np.ndarray | None = None,
time_target: float | None = None,
cmd_target: str | None = None,
):
"""Set control target values q & tau of the left and right arm motors."""
with self.ctrl_lock:
self.q_target = q_target
self.tauff_target = tauff_target
self.time_target = time_target
self.arm_cmd = cmd_target
def read_current_arm_q(self) -> np.ndarray | None:
"""Return current state q of the left and right arm motors."""
return np.array([self.lowstate_buffer.get_data().motor_state[id].q for id in Z1_12_JointArmIndex])
def read_current_arm_dq(self) -> np.ndarray | None:
"""Return current state dq of the left and right arm motors."""
return np.array([self.lowstate_buffer.get_data().motor_state[id].dq for id in Z1_12_JointArmIndex])
def arm_ik(self, l_tf_target, r_tf_target) -> np.ndarray | None:
current_lr_arm_q = self.read_current_arm_q()
current_lr_arm_dq = self.read_current_arm_dq()
left_sol_q, left_sol_tauff = self.z1_left_arm_ik.solve_ik(
l_tf_target,
current_lr_arm_q[: self.arm_indices_len],
current_lr_arm_dq[: self.arm_indices_len],
)
right_sol_q, right_sol_tauff = self.z1_right_arm_ik.solve_ik(
r_tf_target,
current_lr_arm_q[self.arm_indices_len :],
current_lr_arm_dq[self.arm_indices_len :],
)
sol_q = np.concatenate([left_sol_q, right_sol_q], axis=0)
sol_tauff = np.concatenate([left_sol_tauff, right_sol_tauff], axis=0)
return sol_q, sol_tauff
def arm_fk(self, left_q: np.ndarray | None = None, right_q: np.ndarray | None = None) -> np.ndarray | None:
left = self.z1_left_arm_ik.solve_fk(
left_q if left_q is not None else self.read_current_arm_q()[: self.arm_indices_len]
)
right = self.z1_right_arm_ik.solve_fk(
right_q if right_q is not None else self.read_current_arm_q()[self.arm_indices_len :]
)
return left, right
def go_start(self):
self._drive_to_waypoint(target_pose=self.init_pose, t_insert_time=1.0)
log_success("Go Start OK!\n")
def go_home(self):
if self.mock:
self.stop_event.set()
# self.subscribe_thread.join()
# self.publish_thread.join()
time.sleep(1)
else:
self.is_connected = False
self.z1_right.loopOn()
self.z1_right.backToStart()
self.z1_right.loopOff()
self.z1_left.loopOn()
self.z1_left.backToStart()
self.z1_left.loopOff()
time.sleep(0.5)
log_success("Go Home OK!\n")
def disconnect(self):
self.is_connected = False
self.go_home()
def __del__(self):
if getattr(self, "is_connected", False):
self.disconnect()

View File

@@ -0,0 +1,2 @@
*.gv
*.pdf

View File

@@ -0,0 +1,33 @@
# Unitree G1 Description (URDF & MJCF)
## Overview
This package includes a universal humanoid robot description (URDF & MJCF) for the [Unitree G1](https://www.unitree.com/g1/), developed by [Unitree Robotics](https://www.unitree.com/).
MJCF/URDF for the G1 robot:
| MJCF/URDF file name | `mode_machine` | Hip roll reduction ratio | Update status | dof#leg | dof#waist | dof#arm | dof#hand |
| ----------------------------- | :------------: | :----------------------: | ------------- | :-----: | :-------: | :-----: | :------: |
| `g1_23dof` | 1 | 14.5 | Beta | 6*2 | 1 | 5*2 | 0 |
| `g1_29dof` | 2 | 14.5 | Beta | 6*2 | 3 | 7*2 | 0 |
| `g1_29dof_with_hand` | 2 | 14.5 | Beta | 6*2 | 3 | 7*2 | 7*2 |
| `g1_29dof_lock_waist` | 3 | 14.5 | Beta | 6*2 | 1 | 7*2 | 0 |
| `g1_23dof_rev_1_0` | 4 | 22.5 | Up-to-date | 6*2 | 1 | 5*2 | 0 |
| `g1_29dof_rev_1_0` | 5 | 22.5 | Up-to-date | 6*2 | 3 | 7*2 | 0 |
| `g1_29dof_with_hand_rev_1_0` | 5 | 22.5 | Up-to-date | 6*2 | 3 | 7*2 | 7*2 |
| `g1_29dof_lock_waist_rev_1_0` | 6 | 22.5 | Up-to-date | 6*2 | 1 | 7*2 | 0 |
| `g1_dual_arm` | 9 | null | Up-to-date | 0 | 0 | 7*2 | 0 |
## Visualization with [MuJoCo](https://github.com/google-deepmind/mujoco)
1. Open MuJoCo Viewer
```bash
pip install mujoco
python -m mujoco.viewer
```
2. Drag and drop the MJCF/URDF model file (`g1_XXX.xml`/`g1_XXX.urdf`) into the MuJoCo Viewer window.
## Note for Teleoperation
g1_body29_hand14 is modified from [g1_29dof_with_hand_rev_1_0](https://github.com/unitreerobotics/unitree_ros/blob/master/robots/g1_description/g1_29dof_with_hand_rev_1_0.urdf).

View File

@@ -0,0 +1,381 @@
<mujoco model="g1_29dof_rev_1_0">
<!-- <compiler angle="radian" meshdir="assets"/> -->
<compiler angle="radian" meshdir="meshes"/>
<option integrator="implicitfast"/>
<default>
<default class="g1">
<site rgba="1 0 0 1" size="0.01" group="5"/>
<joint armature="0.01" frictionloss="0.3"/>
<position kp="500" dampratio="1" inheritrange="1"/>
<default class="visual">
<geom group="2" type="mesh" contype="0" conaffinity="0" density="0" material="metal"/>
</default>
<default class="collision">
<geom group="3" type="mesh"/>
<default class="foot">
<geom type="sphere" size="0.005" priority="1" friction="0.6" condim="3"/>
</default>
</default>
</default>
</default>
<asset>
<material name="black" rgba="0.2 0.2 0.2 1"/>
<material name="metal" rgba="0.7 0.7 0.7 1"/>
<mesh file="pelvis.STL"/>
<mesh file="pelvis_contour_link.STL"/>
<mesh file="left_hip_pitch_link.STL"/>
<mesh file="left_hip_roll_link.STL"/>
<mesh file="left_hip_yaw_link.STL"/>
<mesh file="left_knee_link.STL"/>
<mesh file="left_ankle_pitch_link.STL"/>
<mesh file="left_ankle_roll_link.STL"/>
<mesh file="right_hip_pitch_link.STL"/>
<mesh file="right_hip_roll_link.STL"/>
<mesh file="right_hip_yaw_link.STL"/>
<mesh file="right_knee_link.STL"/>
<mesh file="right_ankle_pitch_link.STL"/>
<mesh file="right_ankle_roll_link.STL"/>
<mesh name="waist_yaw_link" file="waist_yaw_link_rev_1_0.STL"/>
<mesh name="waist_roll_link" file="waist_roll_link_rev_1_0.STL"/>
<mesh name="torso_link" file="torso_link_rev_1_0.STL"/>
<mesh file="logo_link.STL"/>
<mesh file="head_link.STL"/>
<mesh file="left_shoulder_pitch_link.STL"/>
<mesh file="left_shoulder_roll_link.STL"/>
<mesh file="left_shoulder_yaw_link.STL"/>
<mesh file="left_elbow_link.STL"/>
<mesh file="left_wrist_roll_link.STL"/>
<mesh file="left_wrist_pitch_link.STL"/>
<mesh file="left_wrist_yaw_link.STL"/>
<mesh file="left_rubber_hand.STL"/>
<mesh file="right_shoulder_pitch_link.STL"/>
<mesh file="right_shoulder_roll_link.STL"/>
<mesh file="right_shoulder_yaw_link.STL"/>
<mesh file="right_elbow_link.STL"/>
<mesh file="right_wrist_roll_link.STL"/>
<mesh file="right_wrist_pitch_link.STL"/>
<mesh file="right_wrist_yaw_link.STL"/>
<mesh file="right_rubber_hand.STL"/>
</asset>
<worldbody>
<light pos="1 0 3.5" dir="0 0 -1" directional="true"/>
<body name="pelvis" pos="0 0 0.793" childclass="g1">
<inertial pos="0 0 -0.07605" quat="1 0 -0.000399148 0" mass="3.813" diaginertia="0.010549 0.0093089 0.0079184"/>
<freejoint name="floating_base_joint"/>
<geom class="visual" material="black" mesh="pelvis"/>
<geom class="visual" mesh="pelvis_contour_link"/>
<geom class="collision" mesh="pelvis_contour_link"/>
<site name="imu_in_pelvis" pos="0.04525 0 -0.08339" size="0.01"/>
<body name="left_hip_pitch_link" pos="0 0.064452 -0.1027">
<inertial pos="0.002741 0.047791 -0.02606" quat="0.954862 0.293964 0.0302556 0.030122" mass="1.35"
diaginertia="0.00181517 0.00153422 0.00116212"/>
<joint name="left_hip_pitch_joint" axis="0 1 0" range="-2.5307 2.8798" actuatorfrcrange="-88 88"/>
<geom class="visual" material="black" mesh="left_hip_pitch_link"/>
<geom class="collision" material="black" mesh="left_hip_pitch_link"/>
<body name="left_hip_roll_link" pos="0 0.052 -0.030465" quat="0.996179 0 -0.0873386 0">
<inertial pos="0.029812 -0.001045 -0.087934" quat="0.977808 -1.97119e-05 0.205576 -0.0403793" mass="1.52"
diaginertia="0.00254986 0.00241169 0.00148755"/>
<joint name="left_hip_roll_joint" axis="1 0 0" range="-0.5236 2.9671" actuatorfrcrange="-139 139"/>
<geom class="visual" mesh="left_hip_roll_link"/>
<geom class="collision" mesh="left_hip_roll_link"/>
<body name="left_hip_yaw_link" pos="0.025001 0 -0.12412">
<inertial pos="-0.057709 -0.010981 -0.15078" quat="0.600598 0.15832 0.223482 0.751181" mass="1.702"
diaginertia="0.00776166 0.00717575 0.00160139"/>
<joint name="left_hip_yaw_joint" axis="0 0 1" range="-2.7576 2.7576" actuatorfrcrange="-88 88"/>
<geom class="visual" mesh="left_hip_yaw_link"/>
<geom class="collision" mesh="left_hip_yaw_link"/>
<body name="left_knee_link" pos="-0.078273 0.0021489 -0.17734" quat="0.996179 0 0.0873386 0">
<inertial pos="0.005457 0.003964 -0.12074" quat="0.923418 -0.0327699 0.0158246 0.382067" mass="1.932"
diaginertia="0.0113804 0.0112778 0.00146458"/>
<joint name="left_knee_joint" axis="0 1 0" range="-0.087267 2.8798" actuatorfrcrange="-139 139"/>
<geom class="visual" mesh="left_knee_link"/>
<geom class="collision" mesh="left_knee_link"/>
<body name="left_ankle_pitch_link" pos="0 -9.4445e-05 -0.30001">
<inertial pos="-0.007269 0 0.011137" quat="0.603053 0.369225 0.369225 0.603053" mass="0.074"
diaginertia="1.89e-05 1.40805e-05 6.9195e-06"/>
<joint name="left_ankle_pitch_joint" axis="0 1 0" range="-0.87267 0.5236" actuatorfrcrange="-50 50"/>
<geom class="visual" mesh="left_ankle_pitch_link"/>
<geom class="collision" mesh="left_ankle_pitch_link"/>
<body name="left_ankle_roll_link" pos="0 0 -0.017558">
<inertial pos="0.026505 0 -0.016425" quat="-0.000481092 0.728482 -0.000618967 0.685065" mass="0.608"
diaginertia="0.00167218 0.0016161 0.000217621"/>
<joint name="left_ankle_roll_joint" axis="1 0 0" range="-0.2618 0.2618" actuatorfrcrange="-50 50"/>
<geom class="visual" material="black" mesh="left_ankle_roll_link"/>
<geom class="foot" pos="-0.05 0.025 -0.03"/>
<geom class="foot" pos="-0.05 -0.025 -0.03"/>
<geom class="foot" pos="0.12 0.03 -0.03"/>
<geom class="foot" pos="0.12 -0.03 -0.03"/>
<site name="left_foot"/>
</body>
</body>
</body>
</body>
</body>
</body>
<body name="right_hip_pitch_link" pos="0 -0.064452 -0.1027">
<inertial pos="0.002741 -0.047791 -0.02606" quat="0.954862 -0.293964 0.0302556 -0.030122" mass="1.35"
diaginertia="0.00181517 0.00153422 0.00116212"/>
<joint name="right_hip_pitch_joint" axis="0 1 0" range="-2.5307 2.8798" actuatorfrcrange="-88 88"/>
<geom class="visual" material="black" mesh="right_hip_pitch_link"/>
<geom class="collision" material="black" mesh="right_hip_pitch_link"/>
<body name="right_hip_roll_link" pos="0 -0.052 -0.030465" quat="0.996179 0 -0.0873386 0">
<inertial pos="0.029812 0.001045 -0.087934" quat="0.977808 1.97119e-05 0.205576 0.0403793" mass="1.52"
diaginertia="0.00254986 0.00241169 0.00148755"/>
<joint name="right_hip_roll_joint" axis="1 0 0" range="-2.9671 0.5236" actuatorfrcrange="-139 139"/>
<geom class="visual" mesh="right_hip_roll_link"/>
<geom class="collision" mesh="right_hip_roll_link"/>
<body name="right_hip_yaw_link" pos="0.025001 0 -0.12412">
<inertial pos="-0.057709 0.010981 -0.15078" quat="0.751181 0.223482 0.15832 0.600598" mass="1.702"
diaginertia="0.00776166 0.00717575 0.00160139"/>
<joint name="right_hip_yaw_joint" axis="0 0 1" range="-2.7576 2.7576" actuatorfrcrange="-88 88"/>
<geom class="visual" mesh="right_hip_yaw_link"/>
<geom class="collision" mesh="right_hip_yaw_link"/>
<body name="right_knee_link" pos="-0.078273 -0.0021489 -0.17734" quat="0.996179 0 0.0873386 0">
<inertial pos="0.005457 -0.003964 -0.12074" quat="0.923439 0.0345276 0.0116333 -0.382012" mass="1.932"
diaginertia="0.011374 0.0112843 0.00146452"/>
<joint name="right_knee_joint" axis="0 1 0" range="-0.087267 2.8798" actuatorfrcrange="-139 139"/>
<geom class="visual" mesh="right_knee_link"/>
<geom class="collision" mesh="right_knee_link"/>
<body name="right_ankle_pitch_link" pos="0 9.4445e-05 -0.30001">
<inertial pos="-0.007269 0 0.011137" quat="0.603053 0.369225 0.369225 0.603053" mass="0.074"
diaginertia="1.89e-05 1.40805e-05 6.9195e-06"/>
<joint name="right_ankle_pitch_joint" axis="0 1 0" range="-0.87267 0.5236" actuatorfrcrange="-50 50"/>
<geom class="visual" mesh="right_ankle_pitch_link"/>
<geom class="collision" mesh="right_ankle_pitch_link"/>
<body name="right_ankle_roll_link" pos="0 0 -0.017558">
<inertial pos="0.026505 0 -0.016425" quat="0.000481092 0.728482 0.000618967 0.685065" mass="0.608"
diaginertia="0.00167218 0.0016161 0.000217621"/>
<joint name="right_ankle_roll_joint" axis="1 0 0" range="-0.2618 0.2618" actuatorfrcrange="-50 50"/>
<geom class="visual" material="black" mesh="right_ankle_roll_link"/>
<geom class="foot" pos="-0.05 0.025 -0.03"/>
<geom class="foot" pos="-0.05 -0.025 -0.03"/>
<geom class="foot" pos="0.12 0.03 -0.03"/>
<geom class="foot" pos="0.12 -0.03 -0.03"/>
<site name="right_foot"/>
</body>
</body>
</body>
</body>
</body>
</body>
<body name="waist_yaw_link">
<inertial pos="0.003494 0.000233 0.018034" quat="0.289697 0.591001 -0.337795 0.672821" mass="0.214"
diaginertia="0.000163531 0.000107714 0.000102205"/>
<joint name="waist_yaw_joint" axis="0 0 1" range="-2.618 2.618" actuatorfrcrange="-88 88"/>
<geom class="visual" mesh="waist_yaw_link"/>
<body name="waist_roll_link" pos="-0.0039635 0 0.044">
<inertial pos="0 2.3e-05 0" quat="0.5 0.5 -0.5 0.5" mass="0.086" diaginertia="8.245e-06 7.079e-06 6.339e-06"/>
<joint name="waist_roll_joint" axis="1 0 0" range="-0.52 0.52" actuatorfrcrange="-50 50"/>
<geom class="visual" mesh="waist_roll_link"/>
<body name="torso_link">
<inertial pos="0.00203158 0.000339683 0.184568" quat="0.999803 -6.03319e-05 0.0198256 0.00131986"
mass="7.818" diaginertia="0.121847 0.109825 0.0273735"/>
<joint name="waist_pitch_joint" axis="0 1 0" range="-0.52 0.52" actuatorfrcrange="-50 50"/>
<geom class="visual" mesh="torso_link"/>
<geom class="collision" mesh="torso_link"/>
<geom pos="0.0039635 0 -0.044" quat="1 0 0 0" class="visual" material="black" mesh="logo_link"/>
<geom pos="0.0039635 0 -0.044" quat="1 0 0 0" class="collision" material="black" mesh="logo_link"/>
<geom pos="0.0039635 0 -0.044" quat="1 0 0 0" class="visual" material="black" mesh="head_link"/>
<geom pos="0.0039635 0 -0.044" quat="1 0 0 0" class="collision" material="black" mesh="head_link"/>
<site name="imu_in_torso" pos="-0.03959 -0.00224 0.14792" size="0.01"/>
<body name="left_shoulder_pitch_link" pos="0.0039563 0.10022 0.24778"
quat="0.990264 0.139201 1.38722e-05 -9.86868e-05">
<inertial pos="0 0.035892 -0.011628" quat="0.654152 0.0130458 -0.326267 0.68225" mass="0.718"
diaginertia="0.000465864 0.000432842 0.000406394"/>
<joint name="left_shoulder_pitch_joint" axis="0 1 0" range="-3.0892 2.6704" actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="left_shoulder_pitch_link"/>
<geom size="0.03 0.025" pos="0 0.04 -0.01" quat="0.707107 0 0.707107 0" type="cylinder"
rgba="0.7 0.7 0.7 1" class="collision"/>
<body name="left_shoulder_roll_link" pos="0 0.038 -0.013831" quat="0.990268 -0.139172 0 0">
<inertial pos="-0.000227 0.00727 -0.063243" quat="0.701256 -0.0196223 -0.00710317 0.712604" mass="0.643"
diaginertia="0.000691311 0.000618011 0.000388977"/>
<joint name="left_shoulder_roll_joint" axis="1 0 0" range="-1.5882 2.2515" actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="left_shoulder_roll_link"/>
<geom size="0.03 0.015" pos="-0.004 0.006 -0.053" type="cylinder" rgba="0.7 0.7 0.7 1" class="collision"/>
<body name="left_shoulder_yaw_link" pos="0 0.00624 -0.1032">
<inertial pos="0.010773 -0.002949 -0.072009" quat="0.716879 -0.0964829 -0.0679942 0.687134"
mass="0.734" diaginertia="0.00106187 0.00103217 0.000400661"/>
<joint name="left_shoulder_yaw_joint" axis="0 0 1" range="-2.618 2.618" actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="left_shoulder_yaw_link"/>
<geom class="collision" mesh="left_shoulder_yaw_link"/>
<body name="left_elbow_link" pos="0.015783 0 -0.080518">
<inertial pos="0.064956 0.004454 -0.010062" quat="0.541765 0.636132 0.388821 0.388129" mass="0.6"
diaginertia="0.000443035 0.000421612 0.000259353"/>
<joint name="left_elbow_joint" axis="0 1 0" range="-1.0472 2.0944" actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="left_elbow_link"/>
<geom class="collision" mesh="left_elbow_link"/>
<body name="left_wrist_roll_link" pos="0.1 0.00188791 -0.01">
<inertial pos="0.0171394 0.000537591 4.8864e-07" quat="0.575338 0.411667 -0.574906 0.411094"
mass="0.085445" diaginertia="5.48211e-05 4.96646e-05 3.57798e-05"/>
<joint name="left_wrist_roll_joint" axis="1 0 0" range="-1.97222 1.97222"
actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="left_wrist_roll_link"/>
<geom class="collision" mesh="left_wrist_roll_link"/>
<body name="left_wrist_pitch_link" pos="0.038 0 0">
<inertial pos="0.0229999 -0.00111685 -0.00111658" quat="0.249998 0.661363 0.293036 0.643608"
mass="0.48405" diaginertia="0.000430353 0.000429873 0.000164648"/>
<joint name="left_wrist_pitch_joint" axis="0 1 0" range="-1.61443 1.61443"
actuatorfrcrange="-5 5"/>
<geom class="visual" mesh="left_wrist_pitch_link"/>
<geom class="collision" mesh="left_wrist_pitch_link"/>
<body name="left_wrist_yaw_link" pos="0.046 0 0">
<inertial pos="0.0708244 0.000191745 0.00161742" quat="0.510571 0.526295 0.468078 0.493188"
mass="0.254576" diaginertia="0.000646113 0.000559993 0.000147566"/>
<joint name="left_wrist_yaw_joint" axis="0 0 1" range="-1.61443 1.61443"
actuatorfrcrange="-5 5"/>
<geom class="visual" mesh="left_wrist_yaw_link"/>
<geom class="collision" mesh="left_wrist_yaw_link"/>
<geom pos="0.0415 0.003 0" quat="1 0 0 0" class="visual" mesh="left_rubber_hand"/>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
<body name="right_shoulder_pitch_link" pos="0.0039563 -0.10021 0.24778"
quat="0.990264 -0.139201 1.38722e-05 9.86868e-05">
<inertial pos="0 -0.035892 -0.011628" quat="0.68225 -0.326267 0.0130458 0.654152" mass="0.718"
diaginertia="0.000465864 0.000432842 0.000406394"/>
<joint name="right_shoulder_pitch_joint" axis="0 1 0" range="-3.0892 2.6704" actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="right_shoulder_pitch_link"/>
<geom size="0.03 0.025" pos="0 -0.04 -0.01" quat="0.707107 0 0.707107 0" type="cylinder"
rgba="0.7 0.7 0.7 1" class="collision"/>
<body name="right_shoulder_roll_link" pos="0 -0.038 -0.013831" quat="0.990268 0.139172 0 0">
<inertial pos="-0.000227 -0.00727 -0.063243" quat="0.712604 -0.00710317 -0.0196223 0.701256"
mass="0.643" diaginertia="0.000691311 0.000618011 0.000388977"/>
<joint name="right_shoulder_roll_joint" axis="1 0 0" range="-2.2515 1.5882" actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="right_shoulder_roll_link"/>
<geom size="0.03 0.015" pos="-0.004 -0.006 -0.053" type="cylinder" rgba="0.7 0.7 0.7 1"
class="collision"/>
<body name="right_shoulder_yaw_link" pos="0 -0.00624 -0.1032">
<inertial pos="0.010773 0.002949 -0.072009" quat="0.687134 -0.0679942 -0.0964829 0.716879"
mass="0.734" diaginertia="0.00106187 0.00103217 0.000400661"/>
<joint name="right_shoulder_yaw_joint" axis="0 0 1" range="-2.618 2.618" actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="right_shoulder_yaw_link"/>
<geom class="collision" mesh="right_shoulder_yaw_link"/>
<body name="right_elbow_link" pos="0.015783 0 -0.080518">
<inertial pos="0.064956 -0.004454 -0.010062" quat="0.388129 0.388821 0.636132 0.541765" mass="0.6"
diaginertia="0.000443035 0.000421612 0.000259353"/>
<joint name="right_elbow_joint" axis="0 1 0" range="-1.0472 2.0944" actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="right_elbow_link"/>
<geom class="collision" mesh="right_elbow_link"/>
<body name="right_wrist_roll_link" pos="0.1 -0.00188791 -0.01">
<inertial pos="0.0171394 -0.000537591 4.8864e-07" quat="0.411667 0.575338 -0.411094 0.574906"
mass="0.085445" diaginertia="5.48211e-05 4.96646e-05 3.57798e-05"/>
<joint name="right_wrist_roll_joint" axis="1 0 0" range="-1.97222 1.97222"
actuatorfrcrange="-25 25"/>
<geom class="visual" mesh="right_wrist_roll_link"/>
<geom class="collision" mesh="right_wrist_roll_link"/>
<body name="right_wrist_pitch_link" pos="0.038 0 0">
<inertial pos="0.0229999 0.00111685 -0.00111658" quat="0.643608 0.293036 0.661363 0.249998"
mass="0.48405" diaginertia="0.000430353 0.000429873 0.000164648"/>
<joint name="right_wrist_pitch_joint" axis="0 1 0" range="-1.61443 1.61443"
actuatorfrcrange="-5 5"/>
<geom class="visual" mesh="right_wrist_pitch_link"/>
<geom class="collision" mesh="right_wrist_pitch_link"/>
<body name="right_wrist_yaw_link" pos="0.046 0 0">
<inertial pos="0.0708244 -0.000191745 0.00161742" quat="0.493188 0.468078 0.526295 0.510571"
mass="0.254576" diaginertia="0.000646113 0.000559993 0.000147566"/>
<joint name="right_wrist_yaw_joint" axis="0 0 1" range="-1.61443 1.61443"
actuatorfrcrange="-5 5"/>
<geom class="visual" mesh="right_wrist_yaw_link"/>
<geom class="collision" mesh="right_wrist_yaw_link"/>
<geom pos="0.0415 -0.003 0" quat="1 0 0 0" class="visual" mesh="right_rubber_hand"/>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</worldbody>
<actuator>
<position class="g1" name="left_hip_pitch_joint" joint="left_hip_pitch_joint"/>
<position class="g1" name="left_hip_roll_joint" joint="left_hip_roll_joint"/>
<position class="g1" name="left_hip_yaw_joint" joint="left_hip_yaw_joint"/>
<position class="g1" name="left_knee_joint" joint="left_knee_joint"/>
<position class="g1" name="left_ankle_pitch_joint" joint="left_ankle_pitch_joint"/>
<position class="g1" name="left_ankle_roll_joint" joint="left_ankle_roll_joint"/>
<position class="g1" name="right_hip_pitch_joint" joint="right_hip_pitch_joint"/>
<position class="g1" name="right_hip_roll_joint" joint="right_hip_roll_joint"/>
<position class="g1" name="right_hip_yaw_joint" joint="right_hip_yaw_joint"/>
<position class="g1" name="right_knee_joint" joint="right_knee_joint"/>
<position class="g1" name="right_ankle_pitch_joint" joint="right_ankle_pitch_joint"/>
<position class="g1" name="right_ankle_roll_joint" joint="right_ankle_roll_joint"/>
<position class="g1" name="waist_yaw_joint" joint="waist_yaw_joint"/>
<position class="g1" name="waist_roll_joint" joint="waist_roll_joint"/>
<position class="g1" name="waist_pitch_joint" joint="waist_pitch_joint"/>
<position class="g1" name="left_shoulder_pitch_joint" joint="left_shoulder_pitch_joint"/>
<position class="g1" name="left_shoulder_roll_joint" joint="left_shoulder_roll_joint"/>
<position class="g1" name="left_shoulder_yaw_joint" joint="left_shoulder_yaw_joint"/>
<position class="g1" name="left_elbow_joint" joint="left_elbow_joint"/>
<position class="g1" name="left_wrist_roll_joint" joint="left_wrist_roll_joint"/>
<position class="g1" name="left_wrist_pitch_joint" joint="left_wrist_pitch_joint"/>
<position class="g1" name="left_wrist_yaw_joint" joint="left_wrist_yaw_joint"/>
<position class="g1" name="right_shoulder_pitch_joint" joint="right_shoulder_pitch_joint"/>
<position class="g1" name="right_shoulder_roll_joint" joint="right_shoulder_roll_joint"/>
<position class="g1" name="right_shoulder_yaw_joint" joint="right_shoulder_yaw_joint"/>
<position class="g1" name="right_elbow_joint" joint="right_elbow_joint"/>
<position class="g1" name="right_wrist_roll_joint" joint="right_wrist_roll_joint"/>
<position class="g1" name="right_wrist_pitch_joint" joint="right_wrist_pitch_joint"/>
<position class="g1" name="right_wrist_yaw_joint" joint="right_wrist_yaw_joint"/>
</actuator>
<sensor>
<gyro site="imu_in_torso" name="imu-torso-angular-velocity" cutoff="34.9" noise="0.0005"/>
<accelerometer site="imu_in_torso" name="imu-torso-linear-acceleration" cutoff="157" noise="0.01"/>
<gyro site="imu_in_pelvis" name="imu-pelvis-angular-velocity" cutoff="34.9" noise="0.0005"/>
<accelerometer site="imu_in_pelvis" name="imu-pelvis-linear-acceleration" cutoff="157" noise="0.01"/>
</sensor>
<statistic center="1.0 0.7 1.0" extent="0.8"/>
<visual>
<headlight diffuse="0.6 0.6 0.6" ambient="0.1 0.1 0.1" specular="0.9 0.9 0.9"/>
<rgba haze="0.15 0.25 0.35 1"/>
<global azimuth="-140" elevation="-20"/>
</visual>
<asset>
<texture type="skybox" builtin="flat" rgb1="0 0 0" rgb2="0 0 0" width="512" height="3072"/>
<texture type="2d" name="groundplane" builtin="checker" mark="edge" rgb1="0.2 0.3 0.4" rgb2="0.1 0.2 0.3" markrgb="0.8 0.8 0.8" width="300" height="300"/>
<material name="groundplane" texture="groundplane" texuniform="true" texrepeat="5 5" reflectance="0.2"/>
</asset>
<worldbody>
<light pos="1 0 3.5" dir="0 0 -1" directional="true"/>
<geom name="floor" size="0 0 0.05" type="plane" material="groundplane"/>
</worldbody>
<keyframe>
<key name="stand"
qpos="
0 0 0.79
1 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0
0 0 0
0.2 0.2 0 1.28 0 0 0
0.2 -0.2 0 1.28 0 0 0
"
ctrl="
0 0 0 0 0 0
0 0 0 0 0 0
0 0 0
0.2 0.2 0 1.28 0 0 0
0.2 -0.2 0 1.28 0 0 0
"/>
</keyframe>
</mujoco>



@@ -0,0 +1,408 @@
<mujoco model="g1">
<compiler angle="radian" meshdir="meshes"/>
<asset>
<mesh name="pelvis" file="pelvis.STL"/>
<mesh name="pelvis_contour_link" file="pelvis_contour_link.STL"/>
<mesh name="left_hip_pitch_link" file="left_hip_pitch_link.STL"/>
<mesh name="left_hip_roll_link" file="left_hip_roll_link.STL"/>
<mesh name="left_hip_yaw_link" file="left_hip_yaw_link.STL"/>
<mesh name="left_knee_link" file="left_knee_link.STL"/>
<mesh name="left_ankle_pitch_link" file="left_ankle_pitch_link.STL"/>
<mesh name="left_ankle_roll_link" file="left_ankle_roll_link.STL"/>
<mesh name="right_hip_pitch_link" file="right_hip_pitch_link.STL"/>
<mesh name="right_hip_roll_link" file="right_hip_roll_link.STL"/>
<mesh name="right_hip_yaw_link" file="right_hip_yaw_link.STL"/>
<mesh name="right_knee_link" file="right_knee_link.STL"/>
<mesh name="right_ankle_pitch_link" file="right_ankle_pitch_link.STL"/>
<mesh name="right_ankle_roll_link" file="right_ankle_roll_link.STL"/>
<mesh name="waist_yaw_link" file="waist_yaw_link_rev_1_0.STL"/>
<mesh name="waist_roll_link" file="waist_roll_link_rev_1_0.STL"/>
<mesh name="torso_link" file="torso_link_rev_1_0.STL"/>
<mesh name="logo_link" file="logo_link.STL"/>
<mesh name="head_link" file="head_link.STL"/>
<mesh name="left_shoulder_pitch_link" file="left_shoulder_pitch_link.STL"/>
<mesh name="left_shoulder_roll_link" file="left_shoulder_roll_link.STL"/>
<mesh name="left_shoulder_yaw_link" file="left_shoulder_yaw_link.STL"/>
<mesh name="left_elbow_link" file="left_elbow_link.STL"/>
<mesh name="left_wrist_roll_link" file="left_wrist_roll_link.STL"/>
<mesh name="left_wrist_pitch_link" file="left_wrist_pitch_link.STL"/>
<mesh name="left_wrist_yaw_link" file="left_wrist_yaw_link.STL"/>
<mesh name="left_hand_palm_link" file="left_hand_palm_link.STL"/>
<mesh name="left_hand_thumb_0_link" file="left_hand_thumb_0_link.STL"/>
<mesh name="left_hand_thumb_1_link" file="left_hand_thumb_1_link.STL"/>
<mesh name="left_hand_thumb_2_link" file="left_hand_thumb_2_link.STL"/>
<mesh name="left_hand_middle_0_link" file="left_hand_middle_0_link.STL"/>
<mesh name="left_hand_middle_1_link" file="left_hand_middle_1_link.STL"/>
<mesh name="left_hand_index_0_link" file="left_hand_index_0_link.STL"/>
<mesh name="left_hand_index_1_link" file="left_hand_index_1_link.STL"/>
<mesh name="right_shoulder_pitch_link" file="right_shoulder_pitch_link.STL"/>
<mesh name="right_shoulder_roll_link" file="right_shoulder_roll_link.STL"/>
<mesh name="right_shoulder_yaw_link" file="right_shoulder_yaw_link.STL"/>
<mesh name="right_elbow_link" file="right_elbow_link.STL"/>
<mesh name="right_wrist_roll_link" file="right_wrist_roll_link.STL"/>
<mesh name="right_wrist_pitch_link" file="right_wrist_pitch_link.STL"/>
<mesh name="right_wrist_yaw_link" file="right_wrist_yaw_link.STL"/>
<mesh name="right_hand_palm_link" file="right_hand_palm_link.STL"/>
<mesh name="right_hand_thumb_0_link" file="right_hand_thumb_0_link.STL"/>
<mesh name="right_hand_thumb_1_link" file="right_hand_thumb_1_link.STL"/>
<mesh name="right_hand_thumb_2_link" file="right_hand_thumb_2_link.STL"/>
<mesh name="right_hand_middle_0_link" file="right_hand_middle_0_link.STL"/>
<mesh name="right_hand_middle_1_link" file="right_hand_middle_1_link.STL"/>
<mesh name="right_hand_index_0_link" file="right_hand_index_0_link.STL"/>
<mesh name="right_hand_index_1_link" file="right_hand_index_1_link.STL"/>
</asset>
<worldbody>
<body name="pelvis" pos="0 0 0.793">
<inertial pos="0 0 -0.07605" quat="1 0 -0.000399148 0" mass="3.813" diaginertia="0.010549 0.0093089 0.0079184"/>
<joint name="floating_base_joint" type="free" limited="false" actuatorfrclimited="false"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.2 0.2 0.2 1" mesh="pelvis"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="pelvis_contour_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="pelvis_contour_link"/>
<site name="imu_in_pelvis" size="0.01" pos="0.04525 0 -0.08339"/>
<body name="left_hip_pitch_link" pos="0 0.064452 -0.1027">
<inertial pos="0.002741 0.047791 -0.02606" quat="0.954862 0.293964 0.0302556 0.030122" mass="1.35" diaginertia="0.00181517 0.00153422 0.00116212"/>
<joint name="left_hip_pitch_joint" pos="0 0 0" axis="0 1 0" range="-2.5307 2.8798" actuatorfrcrange="-88 88"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.2 0.2 0.2 1" mesh="left_hip_pitch_link"/>
<geom type="mesh" rgba="0.2 0.2 0.2 1" mesh="left_hip_pitch_link"/>
<body name="left_hip_roll_link" pos="0 0.052 -0.030465" quat="0.996179 0 -0.0873386 0">
<inertial pos="0.029812 -0.001045 -0.087934" quat="0.977808 -1.97119e-05 0.205576 -0.0403793" mass="1.52" diaginertia="0.00254986 0.00241169 0.00148755"/>
<joint name="left_hip_roll_joint" pos="0 0 0" axis="1 0 0" range="-0.5236 2.9671" actuatorfrcrange="-139 139"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hip_roll_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hip_roll_link"/>
<body name="left_hip_yaw_link" pos="0.025001 0 -0.12412">
<inertial pos="-0.057709 -0.010981 -0.15078" quat="0.600598 0.15832 0.223482 0.751181" mass="1.702" diaginertia="0.00776166 0.00717575 0.00160139"/>
<joint name="left_hip_yaw_joint" pos="0 0 0" axis="0 0 1" range="-2.7576 2.7576" actuatorfrcrange="-88 88"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hip_yaw_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hip_yaw_link"/>
<body name="left_knee_link" pos="-0.078273 0.0021489 -0.17734" quat="0.996179 0 0.0873386 0">
<inertial pos="0.005457 0.003964 -0.12074" quat="0.923418 -0.0327699 0.0158246 0.382067" mass="1.932" diaginertia="0.0113804 0.0112778 0.00146458"/>
<joint name="left_knee_joint" pos="0 0 0" axis="0 1 0" range="-0.087267 2.8798" actuatorfrcrange="-139 139"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_knee_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_knee_link"/>
<body name="left_ankle_pitch_link" pos="0 -9.4445e-05 -0.30001">
<inertial pos="-0.007269 0 0.011137" quat="0.603053 0.369225 0.369225 0.603053" mass="0.074" diaginertia="1.89e-05 1.40805e-05 6.9195e-06"/>
<joint name="left_ankle_pitch_joint" pos="0 0 0" axis="0 1 0" range="-0.87267 0.5236" actuatorfrcrange="-50 50"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_ankle_pitch_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_ankle_pitch_link"/>
<body name="left_ankle_roll_link" pos="0 0 -0.017558">
<inertial pos="0.026505 0 -0.016425" quat="-0.000481092 0.728482 -0.000618967 0.685065" mass="0.608" diaginertia="0.00167218 0.0016161 0.000217621"/>
<joint name="left_ankle_roll_joint" pos="0 0 0" axis="1 0 0" range="-0.2618 0.2618" actuatorfrcrange="-50 50"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.2 0.2 0.2 1" mesh="left_ankle_roll_link"/>
<geom size="0.005" pos="-0.05 0.025 -0.03" rgba="0.2 0.2 0.2 1"/>
<geom size="0.005" pos="-0.05 -0.025 -0.03" rgba="0.2 0.2 0.2 1"/>
<geom size="0.005" pos="0.12 0.03 -0.03" rgba="0.2 0.2 0.2 1"/>
<geom size="0.005" pos="0.12 -0.03 -0.03" rgba="0.2 0.2 0.2 1"/>
</body>
</body>
</body>
</body>
</body>
</body>
<body name="right_hip_pitch_link" pos="0 -0.064452 -0.1027">
<inertial pos="0.002741 -0.047791 -0.02606" quat="0.954862 -0.293964 0.0302556 -0.030122" mass="1.35" diaginertia="0.00181517 0.00153422 0.00116212"/>
<joint name="right_hip_pitch_joint" pos="0 0 0" axis="0 1 0" range="-2.5307 2.8798" actuatorfrcrange="-88 88"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.2 0.2 0.2 1" mesh="right_hip_pitch_link"/>
<geom type="mesh" rgba="0.2 0.2 0.2 1" mesh="right_hip_pitch_link"/>
<body name="right_hip_roll_link" pos="0 -0.052 -0.030465" quat="0.996179 0 -0.0873386 0">
<inertial pos="0.029812 0.001045 -0.087934" quat="0.977808 1.97119e-05 0.205576 0.0403793" mass="1.52" diaginertia="0.00254986 0.00241169 0.00148755"/>
<joint name="right_hip_roll_joint" pos="0 0 0" axis="1 0 0" range="-2.9671 0.5236" actuatorfrcrange="-139 139"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hip_roll_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hip_roll_link"/>
<body name="right_hip_yaw_link" pos="0.025001 0 -0.12412">
<inertial pos="-0.057709 0.010981 -0.15078" quat="0.751181 0.223482 0.15832 0.600598" mass="1.702" diaginertia="0.00776166 0.00717575 0.00160139"/>
<joint name="right_hip_yaw_joint" pos="0 0 0" axis="0 0 1" range="-2.7576 2.7576" actuatorfrcrange="-88 88"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hip_yaw_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hip_yaw_link"/>
<body name="right_knee_link" pos="-0.078273 -0.0021489 -0.17734" quat="0.996179 0 0.0873386 0">
<inertial pos="0.005457 -0.003964 -0.12074" quat="0.923439 0.0345276 0.0116333 -0.382012" mass="1.932" diaginertia="0.011374 0.0112843 0.00146452"/>
<joint name="right_knee_joint" pos="0 0 0" axis="0 1 0" range="-0.087267 2.8798" actuatorfrcrange="-139 139"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_knee_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_knee_link"/>
<body name="right_ankle_pitch_link" pos="0 9.4445e-05 -0.30001">
<inertial pos="-0.007269 0 0.011137" quat="0.603053 0.369225 0.369225 0.603053" mass="0.074" diaginertia="1.89e-05 1.40805e-05 6.9195e-06"/>
<joint name="right_ankle_pitch_joint" pos="0 0 0" axis="0 1 0" range="-0.87267 0.5236" actuatorfrcrange="-50 50"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_ankle_pitch_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_ankle_pitch_link"/>
<body name="right_ankle_roll_link" pos="0 0 -0.017558">
<inertial pos="0.026505 0 -0.016425" quat="0.000481092 0.728482 0.000618967 0.685065" mass="0.608" diaginertia="0.00167218 0.0016161 0.000217621"/>
<joint name="right_ankle_roll_joint" pos="0 0 0" axis="1 0 0" range="-0.2618 0.2618" actuatorfrcrange="-50 50"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.2 0.2 0.2 1" mesh="right_ankle_roll_link"/>
<geom size="0.005" pos="-0.05 0.025 -0.03" rgba="0.2 0.2 0.2 1"/>
<geom size="0.005" pos="-0.05 -0.025 -0.03" rgba="0.2 0.2 0.2 1"/>
<geom size="0.005" pos="0.12 0.03 -0.03" rgba="0.2 0.2 0.2 1"/>
<geom size="0.005" pos="0.12 -0.03 -0.03" rgba="0.2 0.2 0.2 1"/>
</body>
</body>
</body>
</body>
</body>
</body>
<body name="waist_yaw_link">
<inertial pos="0.003494 0.000233 0.018034" quat="0.289697 0.591001 -0.337795 0.672821" mass="0.214" diaginertia="0.000163531 0.000107714 0.000102205"/>
<joint name="waist_yaw_joint" pos="0 0 0" axis="0 0 1" range="-2.618 2.618" actuatorfrcrange="-88 88"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="waist_yaw_link"/>
<body name="waist_roll_link" pos="-0.0039635 0 0.044">
<inertial pos="0 2.3e-05 0" quat="0.5 0.5 -0.5 0.5" mass="0.086" diaginertia="8.245e-06 7.079e-06 6.339e-06"/>
<joint name="waist_roll_joint" pos="0 0 0" axis="1 0 0" range="-0.52 0.52" actuatorfrcrange="-50 50"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="waist_roll_link"/>
<body name="torso_link">
<inertial pos="0.00203158 0.000339683 0.184568" quat="0.999803 -6.03319e-05 0.0198256 0.00131986" mass="7.818" diaginertia="0.121847 0.109825 0.0273735"/>
<joint name="waist_pitch_joint" pos="0 0 0" axis="0 1 0" range="-0.52 0.52" actuatorfrcrange="-50 50"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="torso_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="torso_link"/>
<geom pos="0.0039635 0 -0.044" quat="1 0 0 0" type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.2 0.2 0.2 1" mesh="logo_link"/>
<geom pos="0.0039635 0 -0.044" quat="1 0 0 0" type="mesh" rgba="0.2 0.2 0.2 1" mesh="logo_link"/>
<geom pos="0.0039635 0 -0.044" type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.2 0.2 0.2 1" mesh="head_link"/>
<geom pos="0.0039635 0 -0.044" type="mesh" rgba="0.2 0.2 0.2 1" mesh="head_link"/>
<site name="imu_in_torso" size="0.01" pos="-0.03959 -0.00224 0.14792"/>
<body name="left_shoulder_pitch_link" pos="0.0039563 0.10022 0.24778" quat="0.990264 0.139201 1.38722e-05 -9.86868e-05">
<inertial pos="0 0.035892 -0.011628" quat="0.654152 0.0130458 -0.326267 0.68225" mass="0.718" diaginertia="0.000465864 0.000432842 0.000406394"/>
<joint name="left_shoulder_pitch_joint" pos="0 0 0" axis="0 1 0" range="-3.0892 2.6704" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_shoulder_pitch_link"/>
<geom size="0.03 0.025" pos="0 0.04 -0.01" quat="0.707107 0 0.707107 0" type="cylinder" rgba="0.7 0.7 0.7 1"/>
<body name="left_shoulder_roll_link" pos="0 0.038 -0.013831" quat="0.990268 -0.139172 0 0">
<inertial pos="-0.000227 0.00727 -0.063243" quat="0.701256 -0.0196223 -0.00710317 0.712604" mass="0.643" diaginertia="0.000691311 0.000618011 0.000388977"/>
<joint name="left_shoulder_roll_joint" pos="0 0 0" axis="1 0 0" range="-1.5882 2.2515" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_shoulder_roll_link"/>
<geom size="0.03 0.015" pos="-0.004 0.006 -0.053" type="cylinder" rgba="0.7 0.7 0.7 1"/>
<body name="left_shoulder_yaw_link" pos="0 0.00624 -0.1032">
<inertial pos="0.010773 -0.002949 -0.072009" quat="0.716879 -0.0964829 -0.0679942 0.687134" mass="0.734" diaginertia="0.00106187 0.00103217 0.000400661"/>
<joint name="left_shoulder_yaw_joint" pos="0 0 0" axis="0 0 1" range="-2.618 2.618" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_shoulder_yaw_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_shoulder_yaw_link"/>
<body name="left_elbow_link" pos="0.015783 0 -0.080518">
<inertial pos="0.064956 0.004454 -0.010062" quat="0.541765 0.636132 0.388821 0.388129" mass="0.6" diaginertia="0.000443035 0.000421612 0.000259353"/>
<joint name="left_elbow_joint" pos="0 0 0" axis="0 1 0" range="-1.0472 2.0944" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_elbow_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_elbow_link"/>
<body name="left_wrist_roll_link" pos="0.1 0.00188791 -0.01">
<inertial pos="0.0171394 0.000537591 4.8864e-07" quat="0.575338 0.411667 -0.574906 0.411094" mass="0.085445" diaginertia="5.48211e-05 4.96646e-05 3.57798e-05"/>
<joint name="left_wrist_roll_joint" pos="0 0 0" axis="1 0 0" range="-1.97222 1.97222" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_wrist_roll_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_wrist_roll_link"/>
<body name="left_wrist_pitch_link" pos="0.038 0 0">
<inertial pos="0.0229999 -0.00111685 -0.00111658" quat="0.249998 0.661363 0.293036 0.643608" mass="0.48405" diaginertia="0.000430353 0.000429873 0.000164648"/>
<joint name="left_wrist_pitch_joint" pos="0 0 0" axis="0 1 0" range="-1.61443 1.61443" actuatorfrcrange="-5 5"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_wrist_pitch_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_wrist_pitch_link"/>
<body name="left_wrist_yaw_link" pos="0.046 0 0">
<inertial pos="0.0885506 0.00212216 -0.000374562" quat="0.487149 0.493844 0.513241 0.505358" mass="0.457415" diaginertia="0.00105989 0.000895419 0.000323842"/>
<joint name="left_wrist_yaw_joint" pos="0 0 0" axis="0 0 1" range="-1.61443 1.61443" actuatorfrcrange="-5 5"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_wrist_yaw_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_wrist_yaw_link"/>
<geom pos="0.0415 0.003 0" quat="1 0 0 0" type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hand_palm_link"/>
<geom pos="0.0415 0.003 0" quat="1 0 0 0" type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hand_palm_link"/>
<body name="left_hand_thumb_0_link" pos="0.067 0.003 0">
<inertial pos="-0.000884246 -0.00863407 0.000944293" quat="0.462991 0.643965 -0.460173 0.398986" mass="0.0862366" diaginertia="1.6546e-05 1.60058e-05 1.43741e-05"/>
<joint name="left_hand_thumb_0_joint" pos="0 0 0" axis="0 1 0" range="-1.0472 1.0472" actuatorfrcrange="-2.45 2.45"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hand_thumb_0_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hand_thumb_0_link"/>
<body name="left_hand_thumb_1_link" pos="-0.0025 -0.0193 0">
<inertial pos="-0.000827888 -0.0354744 -0.0003809" quat="0.685598 0.705471 -0.15207 0.0956069" mass="0.0588507" diaginertia="1.28514e-05 1.22902e-05 5.9666e-06"/>
<joint name="left_hand_thumb_1_joint" pos="0 0 0" axis="0 0 1" range="-0.724312 1.0472" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hand_thumb_1_link"/>
<geom size="0.01 0.015 0.01" pos="-0.001 -0.032 0" type="box" rgba="0.7 0.7 0.7 1"/>
<body name="left_hand_thumb_2_link" pos="0 -0.0458 0">
<inertial pos="-0.00171735 -0.0262819 0.000107789" quat="0.703174 0.710977 -0.00017564 -0.00766553" mass="0.0203063" diaginertia="4.61314e-06 3.86645e-06 1.53495e-06"/>
<joint name="left_hand_thumb_2_joint" pos="0 0 0" axis="0 0 1" range="0 1.74533" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hand_thumb_2_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hand_thumb_2_link"/>
</body>
</body>
</body>
<body name="left_hand_middle_0_link" pos="0.1192 0.0046 -0.0285">
<inertial pos="0.0354744 0.000827888 0.0003809" quat="0.391313 0.552395 0.417187 0.606373" mass="0.0588507" diaginertia="1.28514e-05 1.22902e-05 5.9666e-06"/>
<joint name="left_hand_middle_0_joint" pos="0 0 0" axis="0 0 1" range="-1.5708 0" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hand_middle_0_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hand_middle_0_link"/>
<body name="left_hand_middle_1_link" pos="0.0458 0 0">
<inertial pos="0.0262819 0.00171735 -0.000107789" quat="0.502612 0.491799 0.502639 0.502861" mass="0.0203063" diaginertia="4.61314e-06 3.86645e-06 1.53495e-06"/>
<joint name="left_hand_middle_1_joint" pos="0 0 0" axis="0 0 1" range="-1.74533 0" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hand_middle_1_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hand_middle_1_link"/>
</body>
</body>
<body name="left_hand_index_0_link" pos="0.1192 0.0046 0.0285">
<inertial pos="0.0354744 0.000827888 0.0003809" quat="0.391313 0.552395 0.417187 0.606373" mass="0.0588507" diaginertia="1.28514e-05 1.22902e-05 5.9666e-06"/>
<joint name="left_hand_index_0_joint" pos="0 0 0" axis="0 0 1" range="-1.5708 0" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hand_index_0_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hand_index_0_link"/>
<body name="left_hand_index_1_link" pos="0.0458 0 0">
<inertial pos="0.0262819 0.00171735 -0.000107789" quat="0.502612 0.491799 0.502639 0.502861" mass="0.0203063" diaginertia="4.61314e-06 3.86645e-06 1.53495e-06"/>
<joint name="left_hand_index_1_joint" pos="0 0 0" axis="0 0 1" range="-1.74533 0" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="left_hand_index_1_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="left_hand_index_1_link"/>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
<body name="right_shoulder_pitch_link" pos="0.0039563 -0.10021 0.24778" quat="0.990264 -0.139201 1.38722e-05 9.86868e-05">
<inertial pos="0 -0.035892 -0.011628" quat="0.68225 -0.326267 0.0130458 0.654152" mass="0.718" diaginertia="0.000465864 0.000432842 0.000406394"/>
<joint name="right_shoulder_pitch_joint" pos="0 0 0" axis="0 1 0" range="-3.0892 2.6704" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_shoulder_pitch_link"/>
<geom size="0.03 0.025" pos="0 -0.04 -0.01" quat="0.707107 0 0.707107 0" type="cylinder" rgba="0.7 0.7 0.7 1"/>
<body name="right_shoulder_roll_link" pos="0 -0.038 -0.013831" quat="0.990268 0.139172 0 0">
<inertial pos="-0.000227 -0.00727 -0.063243" quat="0.712604 -0.00710317 -0.0196223 0.701256" mass="0.643" diaginertia="0.000691311 0.000618011 0.000388977"/>
<joint name="right_shoulder_roll_joint" pos="0 0 0" axis="1 0 0" range="-2.2515 1.5882" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_shoulder_roll_link"/>
<geom size="0.03 0.015" pos="-0.004 -0.006 -0.053" type="cylinder" rgba="0.7 0.7 0.7 1"/>
<body name="right_shoulder_yaw_link" pos="0 -0.00624 -0.1032">
<inertial pos="0.010773 0.002949 -0.072009" quat="0.687134 -0.0679942 -0.0964829 0.716879" mass="0.734" diaginertia="0.00106187 0.00103217 0.000400661"/>
<joint name="right_shoulder_yaw_joint" pos="0 0 0" axis="0 0 1" range="-2.618 2.618" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_shoulder_yaw_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_shoulder_yaw_link"/>
<body name="right_elbow_link" pos="0.015783 0 -0.080518">
<inertial pos="0.064956 -0.004454 -0.010062" quat="0.388129 0.388821 0.636132 0.541765" mass="0.6" diaginertia="0.000443035 0.000421612 0.000259353"/>
<joint name="right_elbow_joint" pos="0 0 0" axis="0 1 0" range="-1.0472 2.0944" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_elbow_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_elbow_link"/>
<body name="right_wrist_roll_link" pos="0.1 -0.00188791 -0.01">
<inertial pos="0.0171394 -0.000537591 4.8864e-07" quat="0.411667 0.575338 -0.411094 0.574906" mass="0.085445" diaginertia="5.48211e-05 4.96646e-05 3.57798e-05"/>
<joint name="right_wrist_roll_joint" pos="0 0 0" axis="1 0 0" range="-1.97222 1.97222" actuatorfrcrange="-25 25"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_wrist_roll_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_wrist_roll_link"/>
<body name="right_wrist_pitch_link" pos="0.038 0 0">
<inertial pos="0.0229999 0.00111685 -0.00111658" quat="0.643608 0.293036 0.661363 0.249998" mass="0.48405" diaginertia="0.000430353 0.000429873 0.000164648"/>
<joint name="right_wrist_pitch_joint" pos="0 0 0" axis="0 1 0" range="-1.61443 1.61443" actuatorfrcrange="-5 5"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_wrist_pitch_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_wrist_pitch_link"/>
<body name="right_wrist_yaw_link" pos="0.046 0 0">
<inertial pos="0.0885506 -0.00212216 -0.000374562" quat="0.505358 0.513241 0.493844 0.487149" mass="0.457415" diaginertia="0.00105989 0.000895419 0.000323842"/>
<joint name="right_wrist_yaw_joint" pos="0 0 0" axis="0 0 1" range="-1.61443 1.61443" actuatorfrcrange="-5 5"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_wrist_yaw_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_wrist_yaw_link"/>
<geom pos="0.0415 -0.003 0" quat="1 0 0 0" type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hand_palm_link"/>
<geom pos="0.0415 -0.003 0" quat="1 0 0 0" type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hand_palm_link"/>
<body name="right_hand_thumb_0_link" pos="0.067 -0.003 0">
<inertial pos="-0.000884246 0.00863407 0.000944293" quat="0.643965 0.462991 -0.398986 0.460173" mass="0.0862366" diaginertia="1.6546e-05 1.60058e-05 1.43741e-05"/>
<joint name="right_hand_thumb_0_joint" pos="0 0 0" axis="0 1 0" range="-1.0472 1.0472" actuatorfrcrange="-2.45 2.45"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hand_thumb_0_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hand_thumb_0_link"/>
<body name="right_hand_thumb_1_link" pos="-0.0025 0.0193 0">
<inertial pos="-0.000827888 0.0354744 -0.0003809" quat="0.705471 0.685598 -0.0956069 0.15207" mass="0.0588507" diaginertia="1.28514e-05 1.22902e-05 5.9666e-06"/>
<joint name="right_hand_thumb_1_joint" pos="0 0 0" axis="0 0 1" range="-1.0472 0.724312" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hand_thumb_1_link"/>
<geom size="0.01 0.015 0.01" pos="-0.001 0.032 0" type="box" rgba="0.7 0.7 0.7 1"/>
<body name="right_hand_thumb_2_link" pos="0 0.0458 0">
<inertial pos="-0.00171735 0.0262819 0.000107789" quat="0.710977 0.703174 0.00766553 0.00017564" mass="0.0203063" diaginertia="4.61314e-06 3.86645e-06 1.53495e-06"/>
<joint name="right_hand_thumb_2_joint" pos="0 0 0" axis="0 0 1" range="-1.74533 0" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hand_thumb_2_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hand_thumb_2_link"/>
</body>
</body>
</body>
<body name="right_hand_middle_0_link" pos="0.1192 -0.0046 -0.0285">
<inertial pos="0.0354744 -0.000827888 0.0003809" quat="0.606373 0.417187 0.552395 0.391313" mass="0.0588507" diaginertia="1.28514e-05 1.22902e-05 5.9666e-06"/>
<joint name="right_hand_middle_0_joint" pos="0 0 0" axis="0 0 1" range="0 1.5708" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hand_middle_0_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hand_middle_0_link"/>
<body name="right_hand_middle_1_link" pos="0.0458 0 0">
<inertial pos="0.0262819 -0.00171735 -0.000107789" quat="0.502861 0.502639 0.491799 0.502612" mass="0.0203063" diaginertia="4.61314e-06 3.86645e-06 1.53495e-06"/>
<joint name="right_hand_middle_1_joint" pos="0 0 0" axis="0 0 1" range="0 1.74533" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hand_middle_1_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hand_middle_1_link"/>
</body>
</body>
<body name="right_hand_index_0_link" pos="0.1192 -0.0046 0.0285">
<inertial pos="0.0354744 -0.000827888 0.0003809" quat="0.606373 0.417187 0.552395 0.391313" mass="0.0588507" diaginertia="1.28514e-05 1.22902e-05 5.9666e-06"/>
<joint name="right_hand_index_0_joint" pos="0 0 0" axis="0 0 1" range="0 1.5708" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hand_index_0_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hand_index_0_link"/>
<body name="right_hand_index_1_link" pos="0.0458 0 0">
<inertial pos="0.0262819 -0.00171735 -0.000107789" quat="0.502861 0.502639 0.491799 0.502612" mass="0.0203063" diaginertia="4.61314e-06 3.86645e-06 1.53495e-06"/>
<joint name="right_hand_index_1_joint" pos="0 0 0" axis="0 0 1" range="0 1.74533" actuatorfrcrange="-1.4 1.4"/>
<geom type="mesh" contype="0" conaffinity="0" group="1" density="0" rgba="0.7 0.7 0.7 1" mesh="right_hand_index_1_link"/>
<geom type="mesh" rgba="0.7 0.7 0.7 1" mesh="right_hand_index_1_link"/>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</body>
</worldbody>
<actuator>
<motor name="left_hip_pitch_joint" joint="left_hip_pitch_joint"/>
<motor name="left_hip_roll_joint" joint="left_hip_roll_joint"/>
<motor name="left_hip_yaw_joint" joint="left_hip_yaw_joint"/>
<motor name="left_knee_joint" joint="left_knee_joint"/>
<motor name="left_ankle_pitch_joint" joint="left_ankle_pitch_joint"/>
<motor name="left_ankle_roll_joint" joint="left_ankle_roll_joint"/>
<motor name="right_hip_pitch_joint" joint="right_hip_pitch_joint"/>
<motor name="right_hip_roll_joint" joint="right_hip_roll_joint"/>
<motor name="right_hip_yaw_joint" joint="right_hip_yaw_joint"/>
<motor name="right_knee_joint" joint="right_knee_joint"/>
<motor name="right_ankle_pitch_joint" joint="right_ankle_pitch_joint"/>
<motor name="right_ankle_roll_joint" joint="right_ankle_roll_joint"/>
<motor name="waist_yaw_joint" joint="waist_yaw_joint"/>
<motor name="waist_roll_joint" joint="waist_roll_joint"/>
<motor name="waist_pitch_joint" joint="waist_pitch_joint"/>
<motor name="left_shoulder_pitch_joint" joint="left_shoulder_pitch_joint"/>
<motor name="left_shoulder_roll_joint" joint="left_shoulder_roll_joint"/>
<motor name="left_shoulder_yaw_joint" joint="left_shoulder_yaw_joint"/>
<motor name="left_elbow_joint" joint="left_elbow_joint"/>
<motor name="left_wrist_roll_joint" joint="left_wrist_roll_joint"/>
<motor name="left_wrist_pitch_joint" joint="left_wrist_pitch_joint"/>
<motor name="left_wrist_yaw_joint" joint="left_wrist_yaw_joint"/>
<motor name="left_hand_thumb_0_joint" joint="left_hand_thumb_0_joint"/>
<motor name="left_hand_thumb_1_joint" joint="left_hand_thumb_1_joint"/>
<motor name="left_hand_thumb_2_joint" joint="left_hand_thumb_2_joint"/>
<motor name="left_hand_middle_0_joint" joint="left_hand_middle_0_joint"/>
<motor name="left_hand_middle_1_joint" joint="left_hand_middle_1_joint"/>
<motor name="left_hand_index_0_joint" joint="left_hand_index_0_joint"/>
<motor name="left_hand_index_1_joint" joint="left_hand_index_1_joint"/>
<motor name="right_shoulder_pitch_joint" joint="right_shoulder_pitch_joint"/>
<motor name="right_shoulder_roll_joint" joint="right_shoulder_roll_joint"/>
<motor name="right_shoulder_yaw_joint" joint="right_shoulder_yaw_joint"/>
<motor name="right_elbow_joint" joint="right_elbow_joint"/>
<motor name="right_wrist_roll_joint" joint="right_wrist_roll_joint"/>
<motor name="right_wrist_pitch_joint" joint="right_wrist_pitch_joint"/>
<motor name="right_wrist_yaw_joint" joint="right_wrist_yaw_joint"/>
<motor name="right_hand_thumb_0_joint" joint="right_hand_thumb_0_joint"/>
<motor name="right_hand_thumb_1_joint" joint="right_hand_thumb_1_joint"/>
<motor name="right_hand_thumb_2_joint" joint="right_hand_thumb_2_joint"/>
<motor name="right_hand_index_0_joint" joint="right_hand_index_0_joint"/>
<motor name="right_hand_index_1_joint" joint="right_hand_index_1_joint"/>
<motor name="right_hand_middle_0_joint" joint="right_hand_middle_0_joint"/>
<motor name="right_hand_middle_1_joint" joint="right_hand_middle_1_joint"/>
</actuator>
<sensor>
<gyro name="imu-torso-angular-velocity" site="imu_in_torso" noise="5e-4" cutoff="34.9"/>
<accelerometer name="imu-torso-linear-acceleration" site="imu_in_torso" noise="1e-2" cutoff="157"/>
<gyro name="imu-pelvis-angular-velocity" site="imu_in_pelvis" noise="5e-4" cutoff="34.9"/>
<accelerometer name="imu-pelvis-linear-acceleration" site="imu_in_pelvis" noise="1e-2" cutoff="157"/>
</sensor>
<!-- setup scene -->
<statistic center="1.0 0.7 1.0" extent="0.8"/>
<visual>
<headlight diffuse="0.6 0.6 0.6" ambient="0.1 0.1 0.1" specular="0.9 0.9 0.9"/>
<rgba haze="0.15 0.25 0.35 1"/>
<global azimuth="-140" elevation="-20"/>
</visual>
<asset>
<texture type="skybox" builtin="flat" rgb1="0 0 0" rgb2="0 0 0" width="512" height="3072"/>
<texture type="2d" name="groundplane" builtin="checker" mark="edge" rgb1="0.2 0.3 0.4" rgb2="0.1 0.2 0.3" markrgb="0.8 0.8 0.8" width="300" height="300"/>
<material name="groundplane" texture="groundplane" texuniform="true" texrepeat="5 5" reflectance="0.2"/>
</asset>
<worldbody>
<light pos="1 0 3.5" dir="0 0 -1" directional="true"/>
<geom name="floor" size="0 0 0.05" type="plane" material="groundplane"/>
</worldbody>
</mujoco>
