Structure-Preserving Learning of Nonholonomic Dynamics

Thomas Beckers, Anthony Bloch, Leonardo Colombo

Abstract

Data-driven modeling is playing an increasing role in robotics and control, yet standard learning methods typically ignore the geometric structure of nonholonomic systems. As a consequence, the learned dynamics may violate the nonholonomic constraints and produce physically inconsistent motions. In this paper, we introduce a structure-preserving Gaussian process (GP) framework for learning nonholonomic dynamics. Our main ingredient is a nonholonomic matrix-valued kernel that incorporates the constraint distribution directly into the GP prior. This construction ensures that the learned vector field satisfies the nonholonomic constraints for all inputs. We show that the proposed kernel is positive semidefinite, characterize its associated reproducing kernel Hilbert space as a space of admissible vector fields, and prove that the resulting estimator admits a coordinate representation adapted to the constraint distribution. We also establish the consistency of the learned model. Numerical simulations on a vertical rolling disk illustrate the effectiveness of the proposed approach.

Paper Structure

This paper contains 10 sections, 6 theorems, 25 equations, 4 figures, 2 tables.

Key Result

Proposition 1

Let $\mathcal{H}_{kI_n}$ be the RKHS associated with the matrix-valued kernel $K_0(q,q')=k(q,q')I_n$, and define the operator $(Tg)(q):=P(q)g(q)$. Then the RKHS $\mathcal{H}_{\mathrm{NH}}$ associated with the nonholonomic kernel NHK is given by $\mathcal{H}_{\mathrm{NH}} = \{\, Tg : g\in \mathcal{H}_{kI_n} \,\}$.
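The construction in Proposition 1 can be illustrated with a short numerical sketch. The helper names (`projector`, `nh_kernel`), the squared-exponential base kernel, and the explicit projector formula $P(q) = I - A(q)^\top (A(q)A(q)^\top)^{-1} A(q)$ (the standard orthogonal projector onto $\ker A(q)$) are illustrative assumptions, not notation fixed by the paper:

```python
import numpy as np

def projector(A):
    """Orthogonal projector onto the null space of the constraint matrix A(q).

    P = I - A^T (A A^T)^{-1} A, assuming A has full row rank, so that
    P v lies in the constraint distribution ker A for any vector v.
    """
    n = A.shape[1]
    return np.eye(n) - A.T @ np.linalg.solve(A @ A.T, A)

def nh_kernel(q, qp, A_fun, ell=1.0):
    """Nonholonomic matrix-valued kernel K_NH(q,q') = P(q) k(q,q') P(q')^T.

    The scalar base kernel k is a squared exponential with lengthscale
    ell -- an illustrative choice of base kernel, not one fixed above.
    """
    k = np.exp(-np.sum((q - qp) ** 2) / (2.0 * ell ** 2))
    return k * projector(A_fun(q)) @ projector(A_fun(qp)).T
```

Because every column of $K_{\mathrm{NH}}(q,q')$ lies in the image of $P(q)$, it satisfies $A(q)K_{\mathrm{NH}}(q,q')=0$; a GP posterior mean, being a linear combination of kernel columns, therefore satisfies the nonholonomic constraints at every input. For the vertical rolling disk with $q=(x,y,\theta,\varphi)$ and constraints $\dot{x}=R\dot{\varphi}\cos\theta$, $\dot{y}=R\dot{\varphi}\sin\theta$, one can take $A(q)=\begin{pmatrix}1&0&0&-R\cos\theta\\0&1&0&-R\sin\theta\end{pmatrix}$ and check this identity numerically.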

Figures (4)

  • Figure 2: Trajectory comparison in the plane: true dynamics, nominal model, nonholonomic GP, and standard GP.
  • Figure 3: Planar trajectory error $\Delta(t)$ for the nominal, nonholonomic GP, and standard GP models.
  • Figure 4: Constraint violation metric $e_{\mathrm{nh}}(q)=\|A(q)\hat{f}(q)\|$ for the standard and nonholonomic GP models.
  • Figure 5: Pointwise field prediction error $e_f(q)=\|\hat{f}(q)-f^\star(q)\|$ on the test set.

Theorems & Definitions (18)

  • Definition 1
  • Definition 2
  • Proposition 1
  • Proof
  • Remark 1
  • Proposition 2
  • Proof
  • Proposition 3
  • Proof
  • Theorem 1
  • ...and 8 more