MRReP: Mixed Reality-based Hand-drawn Reference Path Editing Interface for Mobile Robot Navigation

Takumi Taki, Masato Kobayashi, Yuki Uranishi

Abstract

Autonomous mobile robots operating in human-shared indoor environments often require paths that reflect human spatial intentions, such as avoiding interference with pedestrian flow or maintaining comfortable clearance. However, conventional path planners primarily optimize geometric costs and provide limited support for explicit route specification by human operators. This paper presents MRReP, a Mixed Reality-based interface that enables users to draw a Hand-drawn Reference Path (HRP) directly on the physical floor using hand gestures. The drawn HRP is integrated into the robot navigation stack through a custom Hand-drawn Reference Path Planner, which converts the user-specified point sequence into a global path for autonomous navigation. We evaluated MRReP in a within-subject experiment against a conventional 2D baseline interface. The results demonstrated that MRReP improved path specification accuracy and usability, reduced perceived workload, and enabled more stable path specification in the physical environment. These findings suggest that direct path specification in MR is an effective approach for incorporating human spatial intention into mobile robot navigation. Additional material is available at https://mertcookimg.github.io/mrrep
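The paper does not give the planner's internals, but the conversion it describes, from a user-specified point sequence to a global path with a goal pose, can be illustrated with a minimal sketch. This is an assumption-laden example, not the authors' implementation: it resamples the drawn points at a fixed resolution (so downstream planners receive evenly spaced waypoints) and derives the goal heading from the final segment. The function name `hrp_to_global_path` and the 2D `(x, y)` floor-coordinate representation are hypothetical.

```python
import math

def hrp_to_global_path(points, resolution=0.05):
    """Resample a hand-drawn point sequence into evenly spaced waypoints
    and derive a goal pose (x, y, yaw) from the final segment.

    points: list of (x, y) floor coordinates in metres.
    resolution: target spacing between output waypoints in metres.
    """
    if len(points) < 2:
        raise ValueError("An HRP needs at least two points")

    path = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        seg_len = math.hypot(x1 - x0, y1 - y0)
        steps = max(1, int(seg_len / resolution))
        for i in range(1, steps + 1):
            t = i / steps  # linear interpolation along the segment
            path.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))

    # Goal orientation: heading of the last path segment.
    (px, py), (gx, gy) = path[-2], path[-1]
    goal_pose = (gx, gy, math.atan2(gy - py, gx - px))
    return path, goal_pose
```

In a ROS 2 / Navigation2 setting such as the one the paper describes, the resampled waypoints would then be packaged as a `nav_msgs/Path` and the goal pose as a `geometry_msgs/PoseStamped` for the navigation stack to consume.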

Paper Structure

This paper contains 24 sections, 21 figures, 5 tables, and 1 algorithm.

Figures (21)

  • Figure 3: Overview of MRReP. (a) A user draws a Hand-drawn Reference Path (HRP) directly on the physical floor using hand gestures and an MR-HMD. (b) The HRP is sent to the robot navigation system, enabling autonomous navigation along the specified path.
  • Figure 4: Example of an HRP and its corresponding global path. (a) HRP overlaid on the floor in HoloLens 2; the yellow pin indicates the goal. (b) Generated global path and costmap.
  • Figure 5: System architecture and data flow. The drawn HRP is stored in the HoloLens 2 database, transmitted to ROS 2 via the SEND function, converted into a global path and goal pose, and then used for robot navigation in Navigation2.
  • Figure 6: Menu panel. (a) Main menu in the initial state. (b) Path operation menu with selectable operation modes.
  • Figure 7: ADD operation. (a) A floor cursor appears when the user extends the hand. (b) Maintaining a pinch gesture draws an HRP as a sequence of waypoints. (c) Releasing the pinch completes the path and places a goal pin at the endpoint.
  • ...and 16 more figures