A Matrix Product State Model for Simultaneous Classification and Generation
Alex Mossi, Bojan Žunkovic, Kyriakos Flouris
TL;DR
This work introduces a Matrix Product State (MPS) model that acts as both a classifier and a generator, trained within a GAN-inspired regime to improve generative realism without sacrificing classification accuracy. The method uses embedding functions that map inputs to an MPS-compatible representation, together with an exact sampling procedure for non-normalized probability distributions based on reduced density matrices. It compares Fourier and Legendre embeddings and demonstrates that GAN-style training reduces outliers and improves generation quality (measured by an FID-like score) while preserving classification accuracy; latent-space dynamics provide insight into class structure and robustness to perturbations. The approach offers a scalable, tensor-network-based alternative for joint classification-generation tasks on low-dimensional data and points to extensions with richer embeddings and integration into broader generative frameworks.
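The embedding functions mentioned above map each (scalar) input feature to a local vector that an MPS site can consume. The exact definitions are not given in this summary, so the following is a minimal sketch assuming a complex Fourier basis and shifted Legendre polynomials; the function names, feature dimension `d`, and normalization are illustrative choices, not the paper's exact formulas:

```python
import numpy as np
from numpy.polynomial import legendre

def fourier_embedding(x, d):
    """Map x in [0, 1] to d complex Fourier features (illustrative choice)."""
    k = np.arange(d)
    return np.exp(2j * np.pi * k * x) / np.sqrt(d)

def legendre_embedding(x, d):
    """Map x in [0, 1] to the first d Legendre polynomials,
    evaluated on the shifted interval [-1, 1]."""
    t = 2.0 * x - 1.0
    # np.eye(d)[n] selects the coefficient vector of the degree-n polynomial
    return np.array([legendre.legval(t, np.eye(d)[n]) for n in range(d)])

def embed_sample(xs, d=4):
    """An input of N scalar features becomes N local feature vectors,
    one per MPS site."""
    return np.stack([fourier_embedding(x, d) for x in xs])
```

Either local map turns an N-feature input into an N-site product state, which is then contracted against the MPS for classification or used as the target distribution for generation.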
Abstract
Quantum machine learning (QML) is a rapidly expanding field that merges the principles of quantum computing with the techniques of machine learning. One of the powerful mathematical frameworks in this domain is tensor networks. These networks approximate high-order tensors by contracting networks of lower-rank tensors. Initially developed for simulating quantum systems, tensor networks have become integral to quantum computing and, by extension, to QML. Drawing inspiration from these quantum methods, specifically Matrix Product States (MPS), we apply them in a classical machine learning setting. Their ability to efficiently represent and manipulate complex, high-dimensional data makes them effective in a supervised learning framework. Here, we present an MPS model in which the MPS functions as both a classifier and a generator. The dual functionality of this novel MPS model permits a strategy that enhances the traditional training of supervised MPS models. This framework is inspired by generative adversarial networks and is geared towards generating more realistic samples by reducing outliers. In addition, our contributions offer insights into the mechanics of tensor network methods for generation tasks. Specifically, we discuss alternative embedding functions and a new sampling method for non-normalized MPSs.
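The sampling method for non-normalized MPSs can be understood as sequential conditional sampling: the probability of each value at site k is obtained from the left reduced density matrix (built from the already-sampled sites) contracted with a precomputed right environment, with normalization deferred to each local step. The sketch below assumes a real-valued open-boundary MPS over discrete variables and is an illustration of this generic scheme, not the paper's exact implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_mps(n_sites, phys_dim=2, bond_dim=4):
    """Random open-boundary MPS; tensors have shape (D_left, d, D_right)."""
    dims = [1] + [bond_dim] * (n_sites - 1) + [1]
    return [rng.standard_normal((dims[k], phys_dim, dims[k + 1]))
            for k in range(n_sites)]

def right_environments(mps):
    """R[k] sums |psi|^2 over sites k..N-1; no global normalization needed."""
    n = len(mps)
    R = [None] * (n + 1)
    R[n] = np.ones((1, 1))
    for k in range(n - 1, -1, -1):
        A = mps[k]
        R[k] = sum(A[:, s, :] @ R[k + 1] @ A[:, s, :].conj().T
                   for s in range(A.shape[1]))
    return R

def sample(mps):
    """Draw one configuration from p(s) = |psi(s)|^2 / Z, site by site."""
    R = right_environments(mps)
    rho = np.ones((1, 1))          # left reduced density matrix
    out = []
    for k, A in enumerate(mps):
        d = A.shape[1]
        # unnormalized conditional p(s_k | s_1..s_{k-1})
        p = np.array([np.trace(A[:, s, :].conj().T @ rho @ A[:, s, :]
                               @ R[k + 1]).real for s in range(d)])
        p /= p.sum()
        s = rng.choice(d, p=p)
        out.append(s)
        rho = A[:, s, :].conj().T @ rho @ A[:, s, :]
        rho /= np.trace(rho @ R[k + 1])   # renormalize for stability
    return out
```

Because each step divides by the local partition function, the MPS never needs to be normalized globally, which is the property the text refers to; the per-step renormalization of `rho` also keeps the recursion numerically stable for longer chains.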
