
Geometrically-Constrained Radar-Inertial Odometry via Continuous Point-Pose Uncertainty Modeling

Wooseong Yang, Dongjae Lee, Minwoo Jung, Ayoung Kim

Abstract

Radar odometry is crucial for robust localization in challenging environments; however, the sparsity of reliable returns and distinctive noise characteristics impede its performance. This paper introduces geometrically-constrained radar-inertial odometry and mapping that jointly consolidates point and pose uncertainty. We employ the continuous trajectory model to estimate the pose uncertainty at any arbitrary timestamp by propagating uncertainties of the control points. These pose uncertainties are continuously integrated with heteroscedastic measurement uncertainty during point projection, thereby enabling dynamic evaluation of observation confidence and adaptive down-weighting of uninformative radar points. By leveraging quantified uncertainties in radar mapping, we construct a high-fidelity map that improves odometry accuracy under imprecise radar measurements. Moreover, we reveal the effectiveness of explicit geometrical constraints in radar-inertial odometry when incorporated with the proposed uncertainty-aware mapping framework. Extensive experiments on diverse real-world datasets demonstrate the superiority of our method, yielding substantial performance improvements in both accuracy and efficiency compared to existing baselines.
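The abstract's core mechanism — propagating pose uncertainty from the continuous trajectory into each radar point and fusing it with per-point measurement noise to obtain a confidence weight — can be illustrated with a first-order (Jacobian-based) covariance propagation sketch. This is a minimal illustration under assumed conventions (left pose perturbation, pose covariance ordered [rotation, translation]); the function name and weighting rule are hypothetical, not the authors' exact formulation.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix: skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def propagate_point_covariance(R, t, p_body, Sigma_pose, Sigma_meas):
    """
    First-order propagation of pose and measurement uncertainty to a
    single radar return expressed in the global frame.

    R, t       : sensor pose (3x3 rotation, length-3 translation)
    p_body     : 3D point in the sensor frame
    Sigma_pose : 6x6 pose covariance, ordered [rotation, translation]
    Sigma_meas : 3x3 point covariance in the sensor frame
    """
    p_global = R @ p_body + t

    # Left perturbation: pose = (exp(d_theta) R, t + dt), so
    # p_global varies as -[R p]_x d_theta + dt to first order.
    J_pose = np.hstack([-skew(R @ p_body), np.eye(3)])  # 3x6

    Sigma_point = (J_pose @ Sigma_pose @ J_pose.T       # pose uncertainty
                   + R @ Sigma_meas @ R.T)              # measurement noise

    # Simple (hypothetical) confidence weight: uncertain points
    # contribute less to the scan-to-submap registration.
    weight = 1.0 / (1.0 + np.trace(Sigma_point))
    return p_global, Sigma_point, weight
```

Note how the rotational block of `J_pose` scales with the lever arm `R @ p_body`: distant returns inherit more pose uncertainty than nearby ones, which is what makes adaptive down-weighting of far, noisy points fall out of the propagation itself.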

Paper Structure

This paper contains 19 sections, 25 equations, 13 figures, 6 tables.

Figures (13)

  • Figure 1: We jointly model and continuously consolidate point-pose uncertainty in radar odometry and mapping, enabling the effective usage of explicit local geometry.
  • Figure 2: Our method consists of preprocessing, state estimation, and mapping. In preprocessing, we remove dynamic points from radar measurements using ego-velocity estimates and perform data association for the measurements used in the state update. The state estimation module iteratively performs residual computation, uncertainty propagation, and localizability-constrained IEKF updates until convergence. After convergence, we utilize high-confidence points to enhance mapping accuracy.
  • Figure 3: Conceptual diagram of our uncertainty propagation module. Uncertainties of B-spline control points are propagated to the pose at the radar timestamp and subsequently to individual radar returns. The resulting per-point uncertainties in the global frame serve as confidence weights in the observation model for local geometry-guided scan-to-submap registration.
  • Figure 4: Estimated trajectory for Street. Our method (blue) shows the best alignment with the ground-truth trajectory (black) and even surpasses LiDAR-Inertial Odometry (red).
  • Figure 5: Proposed
  • ...and 8 more figures
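The preprocessing step in Figure 2 — removing dynamic points from radar measurements using ego-velocity estimates — follows a standard Doppler-consistency idea: a static world point's measured radial velocity should equal the negated projection of the ego-velocity onto its line of sight. A minimal sketch under that assumption (function name and threshold are illustrative, not the paper's implementation):

```python
import numpy as np

def filter_dynamic_points(points, dopplers, v_ego, threshold=0.5):
    """
    Keep radar returns whose Doppler velocity is consistent with the
    estimated ego-motion, i.e. points that are likely static.

    points    : (N, 3) point positions in the sensor frame
    dopplers  : (N,) measured radial (Doppler) velocities [m/s]
    v_ego     : (3,) estimated sensor ego-velocity in the sensor frame
    threshold : residual bound [m/s] separating static from dynamic
    """
    # Unit line-of-sight direction of each return.
    dirs = points / np.linalg.norm(points, axis=1, keepdims=True)

    # For a static point, the relative velocity seen by the sensor is
    # -v_ego, so the expected Doppler is its radial component.
    expected = -dirs @ v_ego

    residual = np.abs(dopplers - expected)
    mask = residual < threshold
    return points[mask], mask
```

Points failing the consistency check (e.g. moving vehicles or pedestrians) are discarded before data association, so the state update only sees returns that agree with the rigid-world assumption.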