Hierarchical Tensor Network Structure Search for High-Dimensional Data

Zheng Guo, Aditya Deshpande, Xinyu Wang, Brian C. Kiedrowski, Alex A. Gorodetsky

Abstract

Tensor network methods provide a scalable solution to represent high-dimensional data. However, their efficacy is often limited by static, expert-defined structures that fail to adapt to evolving data correlations. We address this limitation by formalizing the tensor network structural rounding problem and introducing the hierarchical structure search algorithm HISS, which automatically identifies near-optimal structures and index reshaping for arbitrary tree networks. To navigate the combinatorial explosion of the structural search space, HISS integrates stochastic sub-network sampling with hierarchical refinement. This approach utilizes entropy-guided index clustering to reduce dimensionality and targeted reshaping to expose latent data correlations. Numerical experiments on analytical functions and real-world physics applications, including thermal radiation transport, neutron diffusion, and computational fluid dynamics, demonstrate that HISS exhibits empirical polynomial scaling with dimensionality relative to the sampling budget, bypassing the scalability barriers in prior work. HISS achieves compression ratios $2.5\times$ to $100\times$ higher than standard fixed formats such as Tensor Train and Hierarchical Tucker (peaking at $1000\times$). Furthermore, HISS discovers structures that generalize effectively: applying a structure optimized for one data instance to related target data typically maintains compression performance within $10\%$ of the result obtained by performing structure search on that target data. These results highlight HISS as a robust, automated tool for adaptive data representation and high-dimensional simulation compression with tensor network methods.
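
To make the entropy-guided clustering idea from the abstract concrete, here is a minimal NumPy sketch (not the authors' implementation; all function names are hypothetical): for every pair of indices of a dense tensor, it matricizes the tensor with that pair on the rows and scores the pair by the Shannon entropy of the normalized singular-value spectrum. Low-entropy pairs are strongly correlated and are natural candidates to cluster.

    import itertools
    import numpy as np

    def spectrum_entropy(T, modes):
        """Entropy of the singular-value spectrum of the matricization
        that places `modes` on the rows and all other modes on the columns."""
        rest = [m for m in range(T.ndim) if m not in modes]
        M = np.transpose(T, list(modes) + rest).reshape(
            int(np.prod([T.shape[m] for m in modes])), -1)
        s = np.linalg.svd(M, compute_uv=False)
        p = s**2 / np.sum(s**2)          # normalized spectrum
        p = p[p > 1e-15]
        return float(-np.sum(p * np.log(p)))

    def pair_scores(T):
        """Score every index pair; low entropy suggests a good cluster."""
        return sorted((spectrum_entropy(T, pair), pair)
                      for pair in itertools.combinations(range(T.ndim), 2))

    # Toy check: modes 0 and 1 are coupled by construction, so the split
    # ({0,1} | {2,3}) is rank-one and its entropy is numerically zero.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    T = np.einsum('ij,k,l->ijkl', A,
                  rng.standard_normal(4), rng.standard_normal(4))
    print(pair_scores(T)[0])

A greedy variant would repeatedly merge the lowest-scoring pair of clusters until a target effective dimensionality is reached, which is roughly the role clustering plays in step b2 of Figure 2 below.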

Figures (20)

  • Figure 1: Traditional tensor network structure search [guo2025tensor], with an extension to support index reshaping. It enumerates different reshapings of the indices and searches for an optimal structure under every reshaping configuration. This naïve extension often fails as dimensionality increases.
  • Figure 2: Hierarchical tensor network structure search (Hiss): b1 stochastically samples a sub-network from the input network, refines the local structure, and substitutes the resulting network into the input; b2 groups indices into sets and performs structure search on the clustered indices to obtain an optimized network; b3 partitions the network into subnets for recursive optimization, eventually reintegrating them into a refined tree.
  • Figure 3: Cost computation for index pairs. Given a tree network, the process orthonormalizes the nodes relative to $\mathscr{G}_1$. Then, it merges two nodes $\mathscr{G}_1$ and $\mathscr{G}_2$, computes singular values for the index pair $(I_1, I_2)$, swaps the index $I_1$ to the right of $I_2$, and repeats the previous steps until the costs of all index pairs involving $I_1$ have been computed. Orthonormalized nodes are marked as squares, and operating nodes are highlighted in green. (A dense-tensor sketch of this pair-cost sweep appears after this figure list.)
  • Figure 4: Illustration of singular value computation in tensor networks with clustered indices. The initial tensor network has $6$ free indices that form $4$ clusters $\{I_1\}, \{I_2, I_4\}, \{I_3, I_5\}, \{I_6\}$. The two rows describe the singular value computation for $\mathcal{N}^{(I_2, I_3, I_4, I_5)}$ and $\mathcal{N}^{(I_1, I_3, I_5)}$ respectively. Swapping nodes are highlighted in green.
  • Figure 5: Resolution of structural conflicts via lazy merge during structure transformations. The modified sub-networks are bolded and highlighted in green. Node splits are marked with orange lines, and the resulting edges are also highlighted in orange. (a) The algorithm resolves the conflict by merging the overlapping core tensors, and the intermediate node is split to satisfy the target bi-partition. (b) The algorithm only resolves the conflicting bi-partition $\{I_1,I_2,I_3\} \mid \overline{\{I_1,I_2,I_3\}}$ without merging $\{I_1\} \mid \overline{\{I_1\}}$. (A minimal sketch of the underlying node-split primitive appears after this figure list.)
  • ...and 15 more figures
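
As a complement to Figure 3, the following sketch emulates the pair-cost sweep on a small dense tensor. This is a hypothetical simplification, not the paper's tree-based procedure: in HISS the cost of an index pair is obtained by orthonormalizing toward the operating node, merging, computing singular values, and swapping to expose each new pair, whereas on a small dense array the same $\varepsilon$-truncated ranks can be read off directly from matricizations.

    import numpy as np

    def truncated_rank(T, modes, eps=1e-8):
        """eps-truncated rank of the bi-partition (`modes` | rest)."""
        rest = [m for m in range(T.ndim) if m not in modes]
        M = np.transpose(T, list(modes) + rest).reshape(
            int(np.prod([T.shape[m] for m in modes])), -1)
        s = np.linalg.svd(M, compute_uv=False)
        return int(np.sum(s > eps * s[0]))

    def pair_costs(T, pivot, eps=1e-8):
        """Cost of grouping `pivot` with every other index, i.e. the
        ranks the Figure 3 sweep would produce for pairs (I_pivot, I_j)."""
        return {j: truncated_rank(T, (pivot, j), eps)
                for j in range(T.ndim) if j != pivot}

    rng = np.random.default_rng(1)
    T = rng.standard_normal((2, 3, 2, 3))
    print(pair_costs(T, pivot=0))    # generic ranks: {1: 6, 2: 4, 3: 6}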
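
Finally, a minimal sketch (hypothetical names, dense cores only) of the node-split primitive that the lazy-merge resolution in Figure 5 relies on: a merged core is split into two cores realizing a target bi-partition of its indices via a truncated SVD, with the rank of the new edge set by a relative singular-value tolerance.

    import numpy as np

    def split_node(core, left_modes, eps=1e-8):
        """Split `core` into (left, right) cores joined by a new edge so
        that `left_modes` end up on `left`; `eps` is a relative tolerance."""
        right_modes = [m for m in range(core.ndim) if m not in left_modes]
        M = np.transpose(core, list(left_modes) + right_modes)
        lshape = M.shape[:len(left_modes)]
        rshape = M.shape[len(left_modes):]
        U, s, Vt = np.linalg.svd(M.reshape(int(np.prod(lshape)), -1),
                                 full_matrices=False)
        r = max(1, int(np.sum(s > eps * s[0])))   # rank of the new edge
        left = (U[:, :r] * s[:r]).reshape(*lshape, r)
        right = Vt[:r].reshape(r, *rshape)
        return left, right

    # Usage: enforce the bi-partition {I1, I3} | {I2, I4} on a 4-way core.
    rng = np.random.default_rng(2)
    G = rng.standard_normal((3, 4, 3, 4))
    L, R = split_node(G, left_modes=[0, 2])
    recon = np.einsum('acr,rbd->abcd', L, R)      # back to (I1, I2, I3, I4)
    print(np.linalg.norm(recon - G) / np.linalg.norm(G))   # ~ 1e-15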