
"Extrinsic" and "intrinsic" data in quantum measurements: asymptotic convex decomposition of positive operator valued measures

Andreas Winter

TL;DR

The paper addresses how to separate extrinsic (non-informative) data from intrinsic (informative) data in quantum measurements described by POVMs, deriving an asymptotically tight convex-decomposition theorem. The intrinsic data rate is given by the Holevo mutual information $I(\lambda; \hat{\rho})$, while the extrinsic rate is $H(\lambda)-I(\lambda; \hat{\rho})$, with block-length $l$ controlling the exponential rates via $M=\exp(l I(\lambda; \hat{\rho})+O(\sqrt{l}))$ and $N=\exp(l(H(\lambda)-I(\lambda; \hat{\rho}))+O(\sqrt{l}))$. A strong converse establishes the asymptotic optimality of these rates, and the framework extends to quantum instruments and Kraus maps, linking to entropy exchange and the Holevo bound. The discussion clarifies the data-information distinction, connects to classical sufficiency, and outlines open questions for arbitrarily varying sources, illustrating broad implications for quantum measurement theory and open-dynamics entropy analysis.
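The intrinsic rate above is the Holevo mutual information of the ensemble, i.e. the standard Holevo quantity $\chi = S(\sum_i p_i \rho_i) - \sum_i p_i S(\rho_i)$. As a minimal numerical sketch (the two-state qubit ensemble below is an illustrative choice, not from the paper), one can compute the intrinsic/extrinsic split for a uniform mixture of two non-orthogonal pure states:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log2 rho], in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # discard numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def holevo_information(probs, states):
    """Holevo quantity: S(average state) - average of the S(rho_i)."""
    avg = sum(p * r for p, r in zip(probs, states))
    return von_neumann_entropy(avg) - sum(
        p * von_neumann_entropy(r) for p, r in zip(probs, states))

# Illustrative ensemble: equal mixture of |0><0| and |+><+|
ket0 = np.array([[1.0], [0.0]])
ketp = np.array([[1.0], [1.0]]) / np.sqrt(2)
states = [ket0 @ ket0.T, ketp @ ketp.T]
probs = [0.5, 0.5]

chi = holevo_information(probs, states)
H_lambda = 1.0  # Shannon entropy of the uniform label distribution, in bits
print(f"intrinsic rate I = {chi:.4f} bits, extrinsic rate = {H_lambda - chi:.4f} bits")
```

Because the two states are non-orthogonal, the intrinsic rate (about 0.60 bits here) is strictly below the full data rate $H(\lambda) = 1$ bit, leaving a strictly positive extrinsic remainder — the data/information gap stressed in the discussion.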

Abstract

We study the problem of separating the data produced by a given quantum measurement (on states from a memoryless source which is unknown except for its average state), described by a positive operator valued measure (POVM), into a "meaningful" (intrinsic) and a "not meaningful" (extrinsic) part. We are able to give an asymptotically tight separation of this form, with the "intrinsic" data quantified by the Holevo mutual information of a certain state ensemble associated to the POVM and the source, in a model that can be viewed as the asymptotic version of the convex decomposition of POVMs into extremal ones. This result is applied to a similar separation theorem for quantum instruments and quantum operations, in their Kraus form. Finally we comment on links to related subjects: we stress the difference between data and information (in particular by pointing out that information typically is strictly less than data), derive the Holevo bound from our main result, and look at its classical case: we show that this includes the solution to the problem of extrinsic/intrinsic data separation with a known source, then compare with the well-known notion of sufficient statistics. The result on decomposition of quantum operations is used to exhibit a new aspect of the concept of entropy exchange of an open dynamics. An appendix collects several estimates for mixed state fidelity and trace norm distance, which seem to be new, in particular a construction of canonical purification of mixed states that turns out to be valuable in analyzing their fidelity.

Paper Structure

This paper contains 14 sections, 12 theorems, 158 equations, and 3 figures.

Key Result

Theorem 2

There exist POVMs ${\bf A}^{(\nu)}$ on $[m]^l$, $\nu=1,\ldots,N$, each supported on a set of cardinality at most $M$, where $M=\exp(l I(\lambda;\hat{\rho})+O(\sqrt{l}))$ and $N=\exp(l(H(\lambda)-I(\lambda;\hat{\rho}))+O(\sqrt{l}))$, such that for ${\bf A}=\frac{1}{N}\sum_\nu {\bf A}^{(\nu)}$ condition (CM) is satisfied.
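The rate bookkeeping in Theorem 2 is worth making explicit: per signal, the total outcome data $H(\lambda)$ splits into an intrinsic part $I(\lambda;\hat{\rho})$ (the size of each sub-POVM) and an extrinsic part $H(\lambda)-I(\lambda;\hat{\rho})$ (the number of sub-POVMs), up to $O(\sqrt{l})$ corrections. A minimal sketch with hypothetical rate values (the numbers 1.0 and 0.6 are assumptions for illustration, not from the paper):

```python
import math

H = 1.0   # assumed entropy of the outcome distribution, bits per signal
I = 0.6   # assumed Holevo mutual information, bits per signal
l = 1000  # block length

log2_M = l * I        # intrinsic data: log2(outcomes per sub-POVM), up to O(sqrt(l))
log2_N = l * (H - I)  # extrinsic data: log2(number of sub-POVMs), up to O(sqrt(l))

# The two parts together account for the full data rate of the measurement:
assert math.isclose(log2_M + log2_N, l * H)
print(f"log2 M ~ {log2_M:.0f} bits, log2 N ~ {log2_N:.0f} bits, total ~ {l * H:.0f} bits")
```

The strong converse mentioned in the TL;DR says this split cannot be improved: no scheme can push the intrinsic rate below $I(\lambda;\hat{\rho})$ asymptotically.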

Figures (3)

  • Figure 1: The source represents a number of possible states encountered by the POVM, but there is no way of knowing which is present (apart from the a priori distribution). The data produced by the measurement is then stored in a record. The rates of these processes are represented by the sizes of the different boxes and the width of the data-flow arrows: originally the rates of the source and of the measurement outcomes are both large.
  • Figure 2: A nice way of picturing the content of theorem [thm:ex-and-in] is in the form of an elaborate bottleneck between source and outcomes: it is supplied from outside with the extrinsic data $\nu$, and conditional on this and the incoming $k$ produces the intrinsic data $j^l$. Only the intrinsic data are correlated to the signal $k$, while the extrinsic data (though evidently an indispensable part of the whole data) are independent of it. To put it pointedly: while it is difficult and possibly ambiguous to speak of "useful data", one can clearly identify data of no import in all respects: the unrelated randomness $\nu$. This is put into focus by theorem [thm:ex-and-in], and our concept of usefulness is just the remainder after extracting as much uselessness as possible.
  • Figure 3: In [massar:popescu] and [winter:massar:POVMcompr] the original POVM is replaced by an "equivalent" one (as made precise in theorems [thm:massar:popescu] and [thm:POVM:compr:1]) with far fewer outcomes. So, POVM and data record require much lower rates of processing and storage, respectively. Of course, compared to theorem [thm:ex-and-in] we lose many potential measurement results in constructing the new POVM.

Theorems & Definitions (15)

  • Example 1
  • Theorem 2
  • Lemma 3: Ahlswede & Winter [ahlswede:winter:QID], Thm. A.19
  • Lemma 4: Lemma V.9 of [winter:ieee_strong]
  • Theorem 5: Massar & Popescu [massar:popescu]
  • Theorem 6: Winter & Massar [winter:massar:POVMcompr]
  • Theorem 7: Winter & Massar [winter:massar:POVMcompr]
  • Theorem 8
  • Remark 9
  • Theorem 10
  • ...and 5 more