A Multi-resolution Low-rank Tensor Decomposition

Sergio Rozada, Antonio G. Marques

TL;DR

The paper addresses the challenge of efficiently decomposing high-order tensors by proposing a multi-resolution low-rank tensor decomposition (MRLR) that represents a tensor as a sum of low-rank components derived from multiple lower-order unfoldings. It introduces partitions ${\mathcal{P}}^{(l)}$ to obtain representations $\mathbf Z_l$ with rank constraints $R_l$, and develops an alternating least squares (ALS) algorithm that sequentially estimates these components, modeling each $\mathbf Z_i$ via a PARAFAC factorization and updating through matricized residuals ${\hat{\mathbf X}}^i$. The method is demonstrated on amino acids, video, and multivariate function data, showing that MRLR achieves lower normalized squared Frobenius error with fewer parameters than PARAFAC, particularly when using coarser-to-finer partitions. This work provides a scalable, structure-exploiting framework for tensor decomposition that leverages information at different dimensional orders, with practical impact for multi-way data analysis in fields such as chemometrics, computer vision, and signal processing. Future work includes exploring alternative partition designs and partition-aware initialization to further enhance performance.
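The multi-resolution idea above can be illustrated with a toy two-level sketch: fit a low-rank model on a coarse matricization of the tensor, then fit the residual on a finer unfolding, and return the sum. This is only a minimal illustration, not the paper's method: a truncated SVD stands in for the PARAFAC/ALS fits of each component $\mathbf Z_l$, and the particular partitions (mode-(1,2) vs. mode-1 unfoldings) are assumptions chosen for brevity.

```python
import numpy as np

def truncated_svd_approx(M, rank):
    """Best rank-`rank` approximation of a matrix via truncated SVD."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

def mrlr_two_level(X, rank_coarse, rank_fine):
    """Toy two-level multi-resolution low-rank approximation of a 3-way tensor.

    Coarse level: low-rank fit of X reshaped to (I*J, K), i.e., modes 1 and 2
    treated jointly. Finer level: low-rank fit of the residual's mode-1
    unfolding, reshaped to (I, J*K). SVD is a stand-in for the paper's
    PARAFAC/ALS updates on each partition.
    """
    I, J, K = X.shape
    # Coarse component: lower-order (matrix) view of the tensor.
    Z1 = truncated_svd_approx(X.reshape(I * J, K), rank_coarse).reshape(I, J, K)
    residual = X - Z1
    # Finer component: fit what the coarse view missed, on another unfolding.
    Z2 = truncated_svd_approx(residual.reshape(I, J * K), rank_fine).reshape(I, J, K)
    return Z1 + Z2

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 6, 7))
X_hat = mrlr_two_level(X, rank_coarse=2, rank_fine=2)
# Normalized squared Frobenius error, as used in the paper's figures.
nfe = np.linalg.norm(X - X_hat) ** 2 / np.linalg.norm(X) ** 2
```

Because the second level fits the residual of the first, the combined error can only match or improve on the coarse fit alone, which mirrors the sequential residual updates of the ALS scheme described above.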

Abstract

The (efficient and parsimonious) decomposition of higher-order tensors is a fundamental problem with numerous applications in a variety of fields. Several methods have been proposed in the literature to that end, with the Tucker and PARAFAC decompositions being the most prominent ones. Inspired by the latter, in this work we propose a multi-resolution low-rank tensor decomposition to describe (approximate) a tensor in a hierarchical fashion. The central idea of the decomposition is to recast the tensor into \emph{multiple} lower-dimensional tensors to exploit the structure at different levels of resolution. The method is first explained, an alternating least squares algorithm is discussed, and preliminary simulations illustrating the potential practical relevance are provided.

Paper Structure

This paper contains 10 sections, 18 equations, 3 figures.

Figures (3)

  • Figure 1: Normalized Squared Frobenius Error \ref{eq:NFE} between the original $5 \times 201 \times 61$ amino acids tensor and its approximation obtained via the MRLR and the PARAFAC tensor decompositions when the number of parameters (tensor rank) is increased.
  • Figure 2: Normalized Squared Frobenius Error \ref{eq:NFE} between the original $9 \times 36 \times 54 \times 3$ video signal tensor and its approximation obtained via the MRLR and the PARAFAC tensor decompositions when the number of parameters (tensor rank) is increased.
  • Figure 3: Normalized Squared Frobenius Error \ref{eq:NFE} between the $100 \times 100 \times 100$ tensor sampled from the multivariate function in \ref{eq:func_approx} and its approximation obtained via the MRLR and the PARAFAC tensor decompositions when the number of parameters (tensor rank) is increased.