Dynamic Whole-Body Dancing with Humanoid Robots -- A Model-Based Control Approach

Shibowen Zhang, Jiayang Wu, Guannan Liu, Helin Zhu, Junjie Liu, Zhehan Li, Junhong Guo, Xiaokun Leng, Hangxin Liu, Jingwen Zhang, Jikai Wang, Zonghai Chen, Zhicheng He, Jiayi Wang, Yao Su

Abstract

This paper presents an integrated model-based framework for generating and executing dynamic whole-body dance motions on humanoid robots. The framework operates in two stages: offline motion generation and online motion execution, both leveraging future state prediction to enable robust and dynamic dance motions in real-world environments. In the offline motion generation stage, human dance demonstrations are captured via a motion capture (MoCap) system, retargeted to the robot by solving a Quadratic Programming (QP) problem, and further refined using Trajectory Optimization (TO) to ensure dynamic feasibility. In the online motion execution stage, a centroidal dynamics-based Model Predictive Control (MPC) framework tracks the planned motions in real time and proactively adjusts swing foot placement to adapt to real-world disturbances. We validate our framework on the full-size humanoid robot Kuavo 4Pro, demonstrating dynamic dance motions both in simulation and in a four-minute live public performance with a team of four robots. Experimental results show that longer prediction horizons improve both motion expressiveness in planning and stability in execution.
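The geometric retargeting step described above minimizes the position error between robot joints and the human skeleton joints, which per frame reduces to a small least-squares QP over the robot's joint angles. The sketch below illustrates this idea on a planar 2-link chain using damped least-squares iterations (each iteration is the closed-form solution of the unconstrained QP); the kinematic model, link lengths, and damping value are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def fk(q, l1=1.0, l2=1.0):
    # Forward kinematics of a planar 2-link chain: end-effector position.
    p1 = np.array([l1 * np.cos(q[0]), l1 * np.sin(q[0])])
    p2 = p1 + np.array([l2 * np.cos(q[0] + q[1]), l2 * np.sin(q[0] + q[1])])
    return p2

def jacobian(q, l1=1.0, l2=1.0):
    # Analytic Jacobian of the end-effector position w.r.t. joint angles.
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def retarget_frame(p_human, q0, iters=100, damping=1e-3):
    # Each iteration solves the QP  min_dq ||J dq - e||^2 + damping ||dq||^2,
    # driving the robot joint positions toward the human marker position.
    q = q0.copy()
    for _ in range(iters):
        e = p_human - fk(q)
        J = jacobian(q)
        dq = np.linalg.solve(J.T @ J + damping * np.eye(2), J.T @ e)
        q += dq
    return q

# Illustrative target taken from a (hypothetical) human demonstration frame.
q = retarget_frame(np.array([1.2, 0.8]), np.array([0.3, 0.3]))
print(np.linalg.norm(fk(q) - np.array([1.2, 0.8])))
```

In the full pipeline this objective would stack many marker-to-joint error terms and add joint-limit constraints, turning the problem into a constrained QP solved per MoCap frame.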

Paper Structure

This paper contains 16 sections, 5 equations, 10 figures, 2 tables.

Figures (10)

  • Figure 1: Dynamic whole-body dance motions performed by a team of four humanoid robots.
  • Figure 2: Overview of the dance motion generation framework. (a) The MoCap system captures the motion patterns of markers on the human demonstrator using a high-resolution camera array. (b) Geometric motion retargeting morphs the human motions into robot motions by minimizing the position error between robot joints and human skeleton joints. The corresponding joints of these two models are marked in the same color in (b) and (c). (c) A snapshot of the simulation using the trajectory output by geometric motion retargeting shows a dynamically infeasible motion. (d) Dynamic motion retargeting adapts the trajectory to meet dynamic and kinematic constraints via TO. (e) A snapshot of the simulation using the trajectory after adjustment by dynamic motion retargeting shows a dynamically feasible motion.
  • Figure 3: The placement of the markers on the human body. The markers are grouped by different colors, red for head, green for torso, yellow for arms, pink for waist, purple for legs, and blue for feet.
  • Figure 4: Overview of the online motion execution framework. (i) The centroidal MPC takes the offline-generated dance trajectory and the robot's current state as inputs to compute an optimal reference state trajectory. (ii) The whole-body controller tracks the reference state trajectory and generates torque control commands for the motors.
  • Figure 5: Hardware design and configuration of the Kuavo 4Pro humanoid robot. The total height of the robot is 1.66 m with a mass of 55 kg. Each leg contains 6 DoF and each arm contains 7 DoF. The head integrates a 2-DoF actuation system with a RealSense D435 camera and a Livox Mid-360 LiDAR. The robot contains two computers: an Intel NUC computer (Core i9-13900H) for control and communications, and an NVIDIA Jetson AGX Orin (64 GB) for vision and AI algorithms.
  • ...and 5 more figures