
Mixture-of-Experts in Remote Sensing: A Survey

Yongchuan Cui, Peng Liu, Lajiao Chen

Abstract

Remote sensing data analysis and interpretation present unique challenges due to the diversity of sensor modalities and the spatiotemporal dynamics of Earth observation data. The Mixture-of-Experts (MoE) model has emerged as a powerful paradigm for addressing these challenges by dynamically routing inputs to specialized experts designed for different aspects of a task. However, despite rapid progress, the community still lacks a comprehensive review of MoE for remote sensing. This survey provides the first systematic overview of MoE applications in remote sensing, covering fundamental principles, architectural designs, and key applications across a variety of remote sensing tasks. The survey also outlines future trends to inspire further research and innovation in applying MoE to remote sensing.
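To make the "dynamically routing inputs to specialized experts" mechanism concrete, the sketch below shows a minimal MoE layer with top-k gating in NumPy. This is an illustrative toy, not the architecture of any specific model in the survey: the expert count, top-k value, and the use of plain linear experts (rather than the small MLPs typically used in practice) are all assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    """Toy Mixture-of-Experts layer with top-k gating (illustrative sketch)."""

    def __init__(self, d_in, d_out, n_experts=4, top_k=2):
        self.top_k = top_k
        # One linear "expert" per slot; real experts are usually small MLPs.
        self.experts = [rng.standard_normal((d_in, d_out)) * 0.1
                        for _ in range(n_experts)]
        # Gating network: scores every expert for each input.
        self.gate = rng.standard_normal((d_in, n_experts)) * 0.1

    def __call__(self, x):
        # x: (batch, d_in). Score experts, keep only the top-k per input.
        scores = x @ self.gate                             # (batch, n_experts)
        top = np.argsort(scores, axis=1)[:, -self.top_k:]  # top-k expert ids
        # Renormalize the gate weights over the selected experts only.
        sel = np.take_along_axis(scores, top, axis=1)
        w = softmax(sel, axis=1)                           # (batch, top_k)
        out = np.zeros((x.shape[0], self.experts[0].shape[1]))
        for i in range(x.shape[0]):        # route each input to its experts
            for j, e in enumerate(top[i]):
                out[i] += w[i, j] * (x[i] @ self.experts[e])
        return out

layer = MoELayer(d_in=8, d_out=4, n_experts=4, top_k=2)
y = layer(rng.standard_normal((3, 8)))
print(y.shape)
```

Because only `top_k` of the `n_experts` weight matrices are evaluated per input, the layer's capacity grows with the number of experts while the per-input compute stays roughly constant; this sparse activation is the core appeal of MoE for large, heterogeneous workloads such as multi-sensor Earth observation data.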

Paper Structure

This paper contains 39 sections, 3 equations, 9 figures.

Figures (9)

  • Figure 1: Word cloud of the most frequent words appearing in MoE-related remote sensing papers.
  • Figure 2: Basic architecture of Mixture-of-Experts (MoE).
  • Figure 3: Overview of Mixture-of-Experts applications in remote sensing.
  • Figure 4: MoE in adaptive mixture-of-experts distillation (AMoED) [fu2025adaptive] for cross-satellite generalizable incremental scene classification.
  • Figure 5: MoE in mixture-of-spectral-spatial-experts state space model (MambaMoE) [xu2025mambamoe] for hyperspectral image classification.
  • ...and 4 more figures