
Relaxing Continuous Constraints of Equivariant Graph Neural Networks for Physical Dynamics Learning

Zinan Zheng, Yang Liu, Jia Li, Jianhua Yao, Yu Rong

TL;DR

This work proposes a general Discrete Equivariant Graph Neural Network (DEGNN) and shows that discrete equivariant message passing can be constructed by transforming geometric features into permutation-invariant embeddings. DEGNN is data efficient, learning with less data, and generalizes across scenarios such as unobserved orientations.

Abstract

Incorporating Euclidean symmetries (e.g., rotation equivariance) as inductive biases into graph neural networks has improved their generalization ability and data efficiency in unbounded physical dynamics modeling. However, in various scientific and engineering applications, the symmetries of dynamics are frequently discrete due to boundary conditions. Thus, existing GNNs either overlook necessary symmetry, resulting in suboptimal representation ability, or impose excessive equivariance, which fails to generalize to unobserved symmetric dynamics. In this work, we propose a general Discrete Equivariant Graph Neural Network (DEGNN) that guarantees equivariance to a given discrete point group. Specifically, we show that such discrete equivariant message passing can be constructed by transforming geometric features into permutation-invariant embeddings. By relaxing continuous equivariance constraints, DEGNN can employ more geometric feature combinations to approximate unobserved physical object interaction functions. Two implementations of DEGNN are proposed, based on ranking and pooling permutation-invariant functions, respectively. We apply DEGNN to various physical dynamics, ranging from particle and molecular to crowd and vehicle dynamics. Across twenty scenarios, DEGNN significantly outperforms existing state-of-the-art approaches. Moreover, DEGNN is data efficient, learning with less data, and generalizes across scenarios such as unobserved orientations.
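
To make the core construction concrete, below is a minimal sketch (not the authors' code; `d4_elements` and `invariant_embedding` are hypothetical names) of the idea behind discrete equivariant message passing: transform a geometric feature by every element of a discrete point group such as $D_4$, then aggregate the resulting set with a permutation-invariant pooling function, which yields an embedding that is invariant to the group by construction.

```python
# Illustrative sketch only (assumed construction, not the authors' code):
# a D4-invariant embedding built by pooling a feature function over the
# group orbit of a geometric vector.
import numpy as np

def d4_elements():
    """The 8 elements of the dihedral group D4: rotations by multiples of
    90 degrees, each with and without a reflection about the x-axis."""
    mats = []
    for k in range(4):
        c, s = np.cos(k * np.pi / 2), np.sin(k * np.pi / 2)
        rot = np.array([[c, -s], [s, c]])
        mats.append(rot)                          # pure rotation
        mats.append(rot @ np.diag([1.0, -1.0]))   # rotation composed with reflection
    return mats

def invariant_embedding(x, feat_fn, pool=np.mean):
    """Pool a (learned) feature function over the D4 orbit of x.
    The orbit {O @ x : O in D4} is the same multiset for x and for any
    O' @ x with O' in D4, so a permutation-invariant pool over it is
    D4-invariant by construction."""
    orbit_feats = np.stack([feat_fn(O @ x) for O in d4_elements()])
    return pool(orbit_feats, axis=0)

# Sanity check: the embedding is unchanged under a 90-degree rotation.
x = np.array([0.3, -1.2])
feat = lambda v: np.tanh(3.0 * v)                 # stand-in for a learned MLP
rot90 = np.array([[0.0, -1.0], [1.0, 0.0]])
assert np.allclose(invariant_embedding(x, feat),
                   invariant_embedding(rot90 @ x, feat))
```

An equivariant (rather than invariant) output can then be recovered from such orbit aggregations, e.g., by group averaging, as sketched under Theorem 1 below.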

Paper Structure

This paper contains 55 sections, 2 theorems, 15 equations, 12 figures, and 6 tables.

Key Result

Theorem 1

For an arbitrary rotation or reflection matrix $\bm{O}\in P$, if $\phi$ is a permutation-invariant function, then the function $\mu_P$ satisfies $P$-equivariance, i.e., $\mu_P(\bm{O}\bm{x}) = \bm{O}\,\mu_P(\bm{x})$.
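
To see how a permutation-invariant $\phi$ can produce this equivariance, consider the standard group-averaging construction (an assumed form for illustration; the paper's exact $\mu_P$ may differ), in which the normalized sum over the group orbit plays the role of $\phi$:

```latex
% Sketch: group averaging over a discrete point group P. The averaged sum is
% a permutation-invariant aggregation (the role of \phi); f is any feature
% map. Assumed construction for illustration, not necessarily the paper's.
\[
  \mu_P(\bm{x}) = \frac{1}{|P|}\sum_{\bm{O}'\in P} \bm{O}'^{-1}\, f(\bm{O}'\bm{x}).
\]
% For any \bm{O}\in P, substitute \bm{O}'' = \bm{O}'\bm{O}, a bijection of P
% onto itself, so that \bm{O}'^{-1} = \bm{O}\,\bm{O}''^{-1}:
\[
  \mu_P(\bm{O}\bm{x})
  = \frac{1}{|P|}\sum_{\bm{O}'\in P} \bm{O}'^{-1} f(\bm{O}'\bm{O}\bm{x})
  = \bm{O}\,\frac{1}{|P|}\sum_{\bm{O}''\in P} \bm{O}''^{-1} f(\bm{O}''\bm{x})
  = \bm{O}\,\mu_P(\bm{x}).
\]
```

Note that the derivation only uses the permutation invariance of the aggregation over group elements, mirroring the theorem's hypothesis; no property of $f$ is needed.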

Figures (12)

  • Figure 1: Illustration of discrete equivariance. (a) Rotation equivariance of vehicle trajectories: trajectories show similar patterns when vehicles drive in lanes of opposite directions. (b) Reflection equivariance of molecular dynamics: the effects of boundaries and molecules are symmetric when the entire system is reflected.
  • Figure 2: Examples of N-body system trajectories. The square is the boundary, whose symmetry is described by the $D_4$ group, a specific point group. EGNN and GNN fail to distinguish the interactions of objects that result in different dynamics, while DEGNN successfully maps different messages to different equivariant embeddings.
  • Figure 3: Two realizations of permutation-invariant embedding functions (a sketch of both follows this list). Velocity features are omitted from the MLP inputs for simplicity. Ranking-based methods rearrange the unordered set into ordered features, while pooling-based approaches aggregate features with a permutation-invariant function.
  • Figure 4: MSE of models on the vehicle dataset. Models are trained on left-to-right trajectories and tested on right-to-left trajectories. (The official EqMotion implementation reports NaN on the Highway 5 dataset; $\mathrm{Log\,MSE} = \log(100 \times \mathrm{MSE} + 1)$.)
  • Figure 5: Generalization experiments across scenarios on the vehicle dynamics datasets. Rows and columns denote training and testing scenarios, respectively. ($\mathrm{Log\,MSE} = \log(\mathrm{MSE} + 1)$.)
  • ...and 7 more figures
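
The two realizations in Figure 3 can be sketched as follows (illustrative only; `ranked_embed`, `pooled_embed`, and the norm-based sort key are assumptions, not the paper's exact design):

```python
# Sketch of the two permutation-invariant realizations from Figure 3
# (illustrative; function names and the norm-based sort key are assumptions).
import numpy as np

def ranked_embed(feats):
    """Ranking-based: sort the unordered feature set into a canonical order
    (here, by vector norm) and flatten, so a downstream MLP always sees the
    same ordering regardless of input permutation."""
    order = np.argsort(np.linalg.norm(feats, axis=1))
    return feats[order].reshape(-1)

def pooled_embed(feats):
    """Pooling-based: aggregate the set with a symmetric function (sum),
    which is permutation-invariant by construction."""
    return feats.sum(axis=0)

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 4))        # an unordered set of 8 feature vectors
perm = rng.permutation(8)
assert np.allclose(ranked_embed(feats), ranked_embed(feats[perm]))
assert np.allclose(pooled_embed(feats), pooled_embed(feats[perm]))
```

Ranking preserves the individual features (at the cost of fixing a sort key), while pooling compresses the set into a single summary; both make the embedding independent of input ordering.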

Theorems & Definitions (7)

  • Definition 1
  • Definition 2
  • Definition 3
  • Definition 4
  • Remark
  • Theorem 1
  • Proposition 1