Fundamental Analysis of Scalable Fluid Antenna Systems: Identifiability Limits, Information Theory, and Joint Processing

Tuo Wu, Kai-Kit Wong, Jie Tang, Ye Tian, Baiyang Liu, Maged Elkashlan, Kin-Fai Tong, Hing Cheung So, Matthew C. Valenti, Fumiyuki Adachi, Kwai-Man Luk

Abstract

Unlike fixed-position arrays with static observation entropy, the scalable fluid antenna system (S-FAS) can dynamically adjust its aperture to form different observation spaces with configuration-dependent entropy budgets. This reconfigurability requires an information-theoretic framework beyond traditional algebraic identifiability analysis. This paper establishes an observation entropy framework for S-FAS, which unifies the derivation of identifiability limits, the diagnosis of processing bottlenecks, and system design optimization. For an S-FAS with mutual coupling suppression, we derive a complete capacity hierarchy among compressed, extended, and jointly stacked configurations. The entropy framework reveals that sequential two-stage processing suffers from an information bottleneck that restricts achievable capacity, while the noise entropy ratio can be used to distinguish fundamental performance limits from algorithmic deficiencies. A joint MUSIC algorithm is proposed to approach the theoretical joint capacity bound. Extensive Monte Carlo simulations, validated by both algebraic and information-theoretic criteria, verify the derived capacity hierarchy and identifiability boundaries.

Paper Structure

This paper contains 32 sections, 11 theorems, 55 equations, 15 figures, 1 table, and 1 algorithm.

Key Result

Lemma 1

If the array manifold $\mathbf{A} \in \mathbb{C}^{M_{\text{eff}} \times K}$ has full column rank (i.e., $\text{rank}(\mathbf{A}) = K$) and $K < M_{\text{eff}}$, then the covariance matrix $\mathbf{R}$ has exactly $K$ eigenvalues larger than $\sigma_n^2$ and $M_{\text{eff}} - K$ eigenvalues equal to $\sigma_n^2$.
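Lemma 1 is straightforward to check numerically. The sketch below, with hypothetical parameters ($M_{\text{eff}} = 8$, $K = 3$, a half-wavelength ULA manifold, and a diagonal source covariance chosen for illustration), builds the ideal covariance $\mathbf{R} = \mathbf{A}\mathbf{R}_s\mathbf{A}^H + \sigma_n^2\mathbf{I}$ and confirms the stated split of its spectrum into $K$ signal eigenvalues above $\sigma_n^2$ and $M_{\text{eff}} - K$ noise eigenvalues equal to $\sigma_n^2$:

```python
import numpy as np

# Numerical check of Lemma 1 (illustrative parameters, not from the paper):
# with A full column rank and K < M_eff, R = A R_s A^H + sigma_n^2 I has
# exactly K eigenvalues above sigma_n^2 and M_eff - K equal to sigma_n^2.
M_eff, K, sigma_n2 = 8, 3, 0.1

# Steering matrix of a half-wavelength ULA at K distinct angles (radians);
# distinct angles make A Vandermonde-like, hence full column rank.
angles = np.array([-0.5, 0.1, 0.7])
m = np.arange(M_eff)[:, None]
A = np.exp(1j * np.pi * m * np.sin(angles)[None, :])  # M_eff x K

# Positive-definite source covariance and the ideal array covariance
Rs = np.diag([2.0, 1.5, 1.0])
R = A @ Rs @ A.conj().T + sigma_n2 * np.eye(M_eff)

# Hermitian eigen-decomposition, eigenvalues sorted in descending order
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
signal_eigs, noise_eigs = eigvals[:K], eigvals[K:]
print(np.all(signal_eigs > sigma_n2))      # True: K eigenvalues exceed sigma_n^2
print(np.allclose(noise_eigs, sigma_n2))   # True: the rest equal sigma_n^2
```

The same eigenvalue split is what subspace methods such as MUSIC exploit: the eigenvectors attached to the $M_{\text{eff}} - K$ noise eigenvalues span the noise subspace $\mathbf{U}_n$ referenced in the figure captions below.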

Figures (15)

  • Figure 1: Fundamental constraint: algebraic validation via $\dim(\mathbf{U}_n)$ versus the number of sources $K$, illustrating the noise-subspace dimension behavior.
  • Figure 2: Fundamental constraint: information-theoretic validation via the noise entropy ratio $\rho_n(K)$ at different SNR levels and numbers of sources $K$.
  • Figure 3: DoF reduction: validation of $K_{\max} = M - 2p - 1$ as a function of the number of edge removals $p$.
  • Figure 4: Compressed configuration: algebraic validation via $\dim(\mathbf{U}_n)$ versus the number of sources $K$ under realistic mutual coupling.
  • Figure 5: Compressed configuration: information-theoretic validation via the noise entropy ratio $\rho_n(K)$.
  • ...and 10 more figures

Theorems & Definitions (17)

  • Definition 1
  • Lemma 1
  • Proposition 1
  • Remark 1: Information-Theoretic Interpretation
  • Lemma 2
  • Theorem 1
  • Corollary 1
  • Remark 2
  • Proposition 2
  • Theorem 2
  • ...and 7 more