Usability Evaluation and Improvement of a Tool for Self-Service Learning Analytics

Shoeb Joarder, Mohamed Amine Chatti, Louis Born

Abstract

Self-Service Learning Analytics (SSLA) tools aim to support educational stakeholders in creating learning analytics indicators without requiring technical expertise. While such tools promise user control and transparency, their effectiveness and adoption depend critically on their usability. This paper presents a comprehensive usability evaluation and improvement of the Indicator Editor, a no-code, exploratory SSLA tool that enables non-technical users to implement custom learning analytics indicators through a structured workflow. Using an iterative evaluation approach, we conduct an exploratory qualitative user study, usability inspections of high-fidelity prototypes, and a workshop-based evaluation in an authentic educational setting with n = 46 students, using standardized instruments, namely the System Usability Scale (SUS), the User Experience Questionnaire (UEQ), and the Net Promoter Score (NPS). Based on the evaluation findings, we derive concrete design implications that inform improvements in workflow guidance, feedback, and information presentation in the Indicator Editor. Furthermore, our evaluation provides practical insights for the design of usable SSLA tools.
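
The evaluation relies on standardized instruments, two of which (SUS and NPS) are scored with well-known formulas. The following is an illustrative Python sketch of that conventional scoring, using hypothetical example data; it is not the paper's own analysis code.

    # Illustrative SUS and NPS scoring (standard formulas; example data is hypothetical).

    def sus_score(responses):
        """Compute the System Usability Scale score (0-100) for one participant.

        `responses` is a list of 10 Likert ratings (1-5) in questionnaire order.
        Odd-numbered items are positively worded, even-numbered items negatively worded.
        """
        if len(responses) != 10:
            raise ValueError("SUS requires exactly 10 item responses")
        contributions = [
            (r - 1) if i % 2 == 0 else (5 - r)  # index 0, 2, ... = items 1, 3, ...
            for i, r in enumerate(responses)
        ]
        return sum(contributions) * 2.5

    def net_promoter_score(ratings):
        """Compute the Net Promoter Score (-100 to +100) from 0-10 likelihood ratings."""
        promoters = sum(1 for r in ratings if r >= 9)
        detractors = sum(1 for r in ratings if r <= 6)
        return 100.0 * (promoters - detractors) / len(ratings)

    # Example: one participant's SUS responses and a small set of NPS ratings.
    print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
    print(net_promoter_score([9, 10, 7, 6, 8, 9]))    # -> ~33.3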

Paper Structure

This paper contains 32 sections and 5 figures.

Figures (5)

  • Figure 1: Evolution of the Indicator Editor interface across iterative design stages: a) the Original User Interface of the Indicator Editor, b) the Initial High-Fidelity Prototype, and c) the Intermediate High-Fidelity Prototype, developed in response to the identified usability findings. Across all iterations, the indicator creation workflow follows five main steps: 1) choose dataset, 2) apply filters, 3) select an analysis method, 4) select a visualization, and 5) preview and finalize the indicator.
  • Figure 2: Mean scores and 95% confidence intervals (p = 0.05) for the six User Experience Questionnaire (UEQ) dimensions of the Indicator Editor, measured on a scale from -3 (negative) to +3 (positive).
  • Figure 3: Aggregated mean UEQ scores for overall attractiveness, pragmatic quality (perspicuity, efficiency, dependability), and hedonic quality (stimulation, novelty) for the Indicator Editor.
  • Figure 4: UEQ item-level response distributions for the Indicator Editor.
  • Figure 5: The Final User Interface of the Indicator Editor. The Editor screen supports dataset selection, filtering, analysis, visualization, preview/customization, saving, and review of a selection summary (a–g). The My Indicators screen supports indicator management and indicator preview (h–i).
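
Figure 2 reports per-dimension UEQ means with 95% confidence intervals. The sketch below shows one conventional way to compute such values, assuming item responses have already been rescaled to the -3 to +3 range and aggregated into one score per participant and dimension; the function name, the normal-approximation interval, and the example data are illustrative assumptions, not taken from the paper.

    # Minimal sketch: per-dimension UEQ means with 95% confidence intervals.
    import numpy as np

    def mean_with_ci(scores, z=1.96):
        """Return (mean, lower, upper) using a normal-approximation 95% CI."""
        scores = np.asarray(scores, dtype=float)
        mean = scores.mean()
        half_width = z * scores.std(ddof=1) / np.sqrt(len(scores))
        return mean, mean - half_width, mean + half_width

    # Hypothetical per-participant dimension scores (one value per participant, -3..+3).
    dimensions = {
        "Attractiveness": [1.5, 2.0, 0.5, 1.0, 1.8],
        "Perspicuity":    [1.0, 1.5, 0.8, 1.2, 2.0],
        "Efficiency":     [0.8, 1.2, 1.5, 0.5, 1.0],
        "Dependability":  [1.0, 0.5, 1.2, 1.5, 0.8],
        "Stimulation":    [1.2, 1.8, 0.5, 1.0, 1.5],
        "Novelty":        [0.5, 1.0, 0.2, 0.8, 1.2],
    }

    for name, scores in dimensions.items():
        mean, lo, hi = mean_with_ci(scores)
        print(f"{name:15s} mean={mean:+.2f}  95% CI [{lo:+.2f}, {hi:+.2f}]")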