EBuddy: a workflow orchestrator for industrial human-machine collaboration

Michele Banfi, Rocco Felici, Stefano Baraldo, Oliver Avram, Anna Valente

Abstract

This paper presents EBuddy, a voice-guided workflow orchestrator for natural human-machine collaboration in industrial environments. EBuddy targets a recurrent bottleneck in tool-intensive workflows: expert know-how is effective but difficult to scale, and execution quality degrades when procedures are reconstructed ad hoc across operators and sessions. EBuddy operationalizes expert practice as a finite state machine (FSM)-driven application that provides an interpretable decision frame at runtime (the current state and its admissible actions), so that spoken requests are interpreted within state-grounded constraints while the system executes and monitors the corresponding tool interactions. Through modular workflow artifacts, EBuddy coordinates heterogeneous resources, including GUI-driven software and a collaborative robot, through fully voice-based interaction via automatic speech recognition and intent understanding. An industrial pilot on impeller blade inspection and repair preparation for directed energy deposition (DED), realized by human-robot collaboration, shows substantial reductions in end-to-end process duration across onboarding, 3D scanning and processing, and repair program generation, while preserving repeatability and keeping operator burden low.
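The state-grounded decision frame described above can be sketched minimally: an FSM exposes its current state and the commands admissible in that state, and a recognized spoken intent is executed only if it falls within those constraints. The state and command names (Ready, NextState, BackState) follow Figure 4; all other names and transitions here are hypothetical illustrations, not the paper's implementation.

```python
class WorkflowFSM:
    """Minimal sketch of an FSM-driven decision frame (illustrative only)."""

    def __init__(self, transitions, initial):
        # transitions maps: state -> {command: next_state}
        self.transitions = transitions
        self.state = initial

    def admissible_commands(self):
        # The interpretable decision frame: what the operator may say now.
        return sorted(self.transitions.get(self.state, {}))

    def handle_intent(self, command):
        # Apply a recognized spoken intent only if it is admissible
        # in the current state; otherwise reject it.
        targets = self.transitions.get(self.state, {})
        if command not in targets:
            return False  # outside the state-grounded constraints
        self.state = targets[command]
        return True


# Hypothetical workflow fragment mirroring the pilot's phases.
fsm = WorkflowFSM(
    {
        "Ready": {"NextState": "Scanning", "BackState": "Onboarding"},
        "Scanning": {"NextState": "RepairPrep", "BackState": "Ready"},
    },
    initial="Ready",
)
```

In this sketch, the speech pipeline's intent-understanding stage would emit a command string, and the FSM decides whether it is applicable; unrecognized or inadmissible intents are rejected rather than executed.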

Paper Structure

This paper contains 25 sections, 4 equations, 6 figures, 2 tables.

Figures (6)

  • Figure 1: Collaboration between users and EBuddy to achieve optimal tool interactions, guided by domain expert knowledge encoded within EBuddy's workflow framework.
  • Figure 2: The EBuddy preview tab enables users to initiate and configure the 3D Studio application in preview mode.
  • Figure 3: Graphical information managed by EBuddy: GUI-based Tools (left), EBuddy GUI (center), Helper app (right).
  • Figure 4: The EBuddy GUI displaying the TabComponents SM. The current state is Ready; the available commands are NextState and BackState.
  • Figure 5: Inputs and output of the language-based interaction for FSM navigation.
  • ...and 1 more figure