
Learning Smooth and Robust Space Robotic Manipulation of Dynamic Target via Inter-frame Correlation

Siyi Lang, Hongyi Gao, Yingxin Zhang, Zihao Liu, Hanlin Dong, Zhaoke Ning, Zhiqiang Ma, Panfeng Huang

Abstract

On-orbit servicing represents a critical frontier in future aerospace engineering, with the manipulation of dynamic non-cooperative targets serving as a key technology. In microgravity environments, objects are typically free-floating, lacking the support and frictional constraints found on Earth, which significantly increases the complexity of space robotic manipulation tasks. Conventional planning- and control-based methods are primarily limited to known, static scenarios and lack real-time responsiveness. To achieve precise robotic manipulation of dynamic targets in unknown and unstructured space environments, this letter proposes a data-driven space robotic manipulation approach that integrates historical temporal information with an inter-frame correlation mechanism. By exploiting the temporal correlation between historical and current frames, the system can effectively capture motion features within the scene, thereby producing stable and smooth manipulation trajectories for dynamic targets. To validate the effectiveness of the proposed method, we developed a ground-based experimental platform consisting of a PIPER X robotic arm and a dual-axis linear stage, which accurately simulates microgravity free-floating motion in a 2D plane.
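To make the idea concrete, the sketch below shows, in PyTorch, one way a policy could fuse a historical frame with the current frame to extract a motion cue and decode a whole action chunk at once. Every module name, dimension, and the simple feature-difference motion cue are illustrative assumptions, not the authors' implementation; the paper's actual architecture is summarized in Fig. 2 and Fig. 3.

```python
import torch
import torch.nn as nn

class TemporalPolicy(nn.Module):
    """Illustrative only: encode the previous and current frames, derive a
    motion cue from their correlation, and decode a short action sequence."""

    def __init__(self, feat_dim: int = 128, action_dim: int = 7, horizon: int = 16):
        super().__init__()
        # Shared CNN backbone applied to each frame independently.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fuse current-frame features with the inter-frame motion cue.
        self.fuse = nn.Linear(2 * feat_dim, feat_dim)
        # Decode a whole trajectory chunk at once, which tends to yield
        # smoother motion than predicting one action per step.
        self.decoder = nn.Linear(feat_dim, horizon * action_dim)
        self.horizon, self.action_dim = horizon, action_dim

    def forward(self, prev_frame: torch.Tensor, curr_frame: torch.Tensor) -> torch.Tensor:
        f_prev = self.backbone(prev_frame)   # (B, feat_dim)
        f_curr = self.backbone(curr_frame)   # (B, feat_dim)
        motion = f_curr - f_prev             # crude stand-in for inter-frame correlation
        h = torch.relu(self.fuse(torch.cat([f_curr, motion], dim=-1)))
        return self.decoder(h).view(-1, self.horizon, self.action_dim)

policy = TemporalPolicy()
actions = policy(torch.randn(2, 3, 96, 96), torch.randn(2, 3, 96, 96))
print(actions.shape)  # torch.Size([2, 16, 7])
```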

Paper Structure

This paper contains 12 sections, 3 equations, 12 figures, and 1 table.

Figures (12)

  • Figure 1: Illustration of trajectory oscillation.
  • Figure 2: Overview of the proposed network architecture. The top-left portion displays the CVAE encoder, which predicts the style latent variable $z$ representing task characteristics. The bottom-left portion integrates an inter-frame correlation network to extract key motion tokens by comparing consecutive visual frames. The right portion shows the CVAE decoder, which maps the current multi-modal perception data and latent variables into the final robotic action sequences. (A minimal CVAE sketch follows the figure list.)
  • Figure 3: Schematic of the Inter-frame Correlation Network. The network processes spatio-temporal correlation information in two primary stages: the first stage performs cost volume calculation, encoding, and compression, transforming consecutive image features into cost volume semantic representations; the second stage implements a Spatial Self-Attention mechanism, utilizing intra-token, vertical, and horizontal self-attention blocks to enhance the network's ability to capture global spatial dependencies within the cost maps. (A sketch of these two stages also follows the figure list.)
  • Figure 4: Overview of the ground-based experimental platform.
  • Figure 5: Workflow of the experiment.
  • ...and 7 more figures
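For readers who want a concrete picture of the CVAE portion of Fig. 2, the following is a minimal sketch, assuming hypothetical dimensions and plain fully connected layers; the paper's actual encoder and decoder operate on multi-modal perception tokens, so this is a simplified stand-in rather than the authors' code.

```python
import torch
import torch.nn as nn

class StyleCVAE(nn.Module):
    """Sketch of the CVAE structure in Fig. 2: the encoder infers a style
    latent z from a demonstrated action sequence plus observations, and the
    decoder maps perception features and z back to an action sequence.
    All layer sizes here are illustrative assumptions."""

    def __init__(self, obs_dim=128, action_dim=7, horizon=16, z_dim=32):
        super().__init__()
        self.horizon, self.action_dim = horizon, action_dim
        self.enc = nn.Sequential(
            nn.Linear(horizon * action_dim + obs_dim, 256), nn.ReLU(),
            nn.Linear(256, 2 * z_dim),   # mean and log-variance of z
        )
        self.dec = nn.Sequential(
            nn.Linear(obs_dim + z_dim, 256), nn.ReLU(),
            nn.Linear(256, horizon * action_dim),
        )

    def forward(self, obs_feat, actions):
        # Encoder: q(z | actions, obs) as a diagonal Gaussian.
        stats = self.enc(torch.cat([actions.flatten(1), obs_feat], dim=-1))
        mu, logvar = stats.chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        # Decoder: reconstruct the action sequence from perception + z.
        recon = self.dec(torch.cat([obs_feat, z], dim=-1))
        return recon.view(-1, self.horizon, self.action_dim), mu, logvar

model = StyleCVAE()
obs, acts = torch.randn(2, 128), torch.randn(2, 16, 7)
recon, mu, logvar = model(obs, acts)
kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
loss = nn.functional.mse_loss(recon, acts) + 1e-2 * kl  # ELBO-style objective
```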
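Likewise, the two stages in Fig. 3 can be sketched as (i) an all-pairs cost volume between consecutive frame features and (ii) self-attention over the resulting tokens. The single multi-head attention block below is a stand-in for the intra-token, vertical, and horizontal attention blocks described in the caption; the shapes and the compression layer are assumptions.

```python
import torch
import torch.nn as nn

def cost_volume(feat_prev: torch.Tensor, feat_curr: torch.Tensor) -> torch.Tensor:
    """All-pairs correlation between two feature maps of shape (B, C, H, W).
    Returns (B, H*W, H*W): the similarity of every current-frame location to
    every previous-frame location. Illustrative, not the paper's exact form."""
    B, C, H, W = feat_curr.shape
    q = feat_curr.flatten(2).transpose(1, 2)  # (B, H*W, C)
    k = feat_prev.flatten(2)                  # (B, C, H*W)
    return torch.bmm(q, k) / C ** 0.5         # scaled dot-product correlation

class SpatialSelfAttention(nn.Module):
    """Self-attention over cost-volume tokens to capture global spatial
    dependencies (one block standing in for the intra-token, vertical,
    and horizontal attention blocks of Fig. 3)."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        x = self.norm(tokens)
        out, _ = self.attn(x, x, x)
        return tokens + out  # residual connection

# Tiny usage example on an 8x8 feature grid.
B, C, H, W = 1, 32, 8, 8
prev, curr = torch.randn(B, C, H, W), torch.randn(B, C, H, W)
cv = cost_volume(prev, curr)        # (1, 64, 64)
proj = nn.Linear(H * W, 128)        # compress each cost row into a token
tokens = SpatialSelfAttention(128)(proj(cv))
print(tokens.shape)  # torch.Size([1, 64, 128])
```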