Accessible information

The accessible information is the amount of classical information that can be extracted from a quantum system by an optimal measurement when the information is encoded using a particular ensemble of quantum states.

More precisely, consider an ensemble $E = \{(p_x, \rho_x)\}$, where the probabilities $p_x$ are the distribution of the random variable $X$ that labels the states. Let $Y_P$ be the random variable describing the outcomes of the measurement given by a POVM $P$. The mutual information between $X$ and $Y_P$


$$ I(X : Y_P) = H(X) - H(X \mid Y_P) $$

quantifies how much information $Y_P$ contains about $X$. The accessible information is the maximum of this mutual information over all possible POVMs,


$$ I_{\mathrm{acc}}(E) = \max_{P} I(X : Y_P). $$
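
As a concrete illustration (not part of the original article; the ensemble, the measurement family and the function names are all assumptions made here), the following Python sketch evaluates $I(X : Y_P)$ for a small qubit ensemble and scans a one-parameter family of projective measurements to estimate the accessible information numerically:

```python
import numpy as np

def mutual_information(probs, states, povm):
    """I(X : Y_P) in bits for the ensemble {(p_x, rho_x)} and the POVM {P_y}."""
    # Joint distribution p(x, y) = p_x * Tr(rho_x P_y).
    joint = np.array([[p * np.real(np.trace(rho @ P)) for P in povm]
                      for p, rho in zip(probs, states)])
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    h = lambda d: -sum(q * np.log2(q) for q in np.ravel(d) if q > 1e-12)
    return h(px) + h(py) - h(joint)  # I(X:Y) = H(X) + H(Y) - H(X,Y)

def pure(theta):
    """Projector onto cos(theta)|0> + sin(theta)|1>."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# Illustrative ensemble: two equiprobable, non-orthogonal pure qubit states.
probs = [0.5, 0.5]
states = [pure(0.0), pure(np.pi / 8)]

# Brute-force scan over projective measurements in one plane of the Bloch sphere.
best = max(mutual_information(probs, states, [pure(phi), pure(phi + np.pi / 2)])
           for phi in np.linspace(0.0, np.pi, 2001))
print(f"estimated accessible information: {best:.4f} bits")
```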

Bounds

The accessible information is bounded from above by the Holevo quantity [Holevo 1973],


$$ I_{\mathrm{acc}}(E) \le \chi(E) = S\left(\sum_x p_x \rho_x\right) - \sum_x p_x S(\rho_x), $$
where $S$ denotes the von Neumann entropy.
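
A sketch of how this bound can be evaluated numerically (the helper names are illustrative assumptions, not taken from the article or from SOMIM):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    return -sum(lam * np.log2(lam) for lam in evals if lam > 1e-12)

def holevo_quantity(probs, states):
    """chi(E) = S(sum_x p_x rho_x) - sum_x p_x S(rho_x), in bits."""
    rho_avg = sum(p * rho for p, rho in zip(probs, states))
    return von_neumann_entropy(rho_avg) - sum(
        p * von_neumann_entropy(rho) for p, rho in zip(probs, states))

# For the ensemble in the first sketch, holevo_quantity(probs, states)
# upper-bounds the mutual information found there, as the Holevo bound guarantees.
```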

Substituting the subentropy $Q$ for the von Neumann entropy in the Holevo quantity gives a lower bound [Jozsa, Robb, Wootters 1994],


$$ I_{\mathrm{acc}}(E) \ge Q\left(\sum_x p_x \rho_x\right) - \sum_x p_x Q(\rho_x). $$
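
A hedged numerical sketch of this lower bound, assuming the closed-form expression for the subentropy of a density operator with a non-degenerate spectrum; degenerate eigenvalues would require taking limits, which this sketch does not handle:

```python
import numpy as np

def subentropy(rho, tol=1e-12):
    """Q(rho) = -sum_k (prod_{j!=k} lam_k/(lam_k - lam_j)) lam_k log2(lam_k),
    assuming a non-degenerate spectrum; zero eigenvalues contribute nothing."""
    lams = np.linalg.eigvalsh(rho)
    q = 0.0
    for k, lk in enumerate(lams):
        if lk < tol:
            continue  # lim_{lam -> 0} lam log lam = 0
        weight = np.prod([lk / (lk - lj) for j, lj in enumerate(lams) if j != k])
        q -= weight * lk * np.log2(lk)
    return q

def subentropy_lower_bound(probs, states):
    """Q(sum_x p_x rho_x) - sum_x p_x Q(rho_x), a lower bound on I_acc."""
    rho_avg = sum(p * rho for p, rho in zip(probs, states))
    return subentropy(rho_avg) - sum(
        p * subentropy(rho) for p, rho in zip(probs, states))
```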

SOMIM (open-source code)

There is an open-source program called SOMIM (Search for Optimal Measurements by an Iterative Method) that calculates the maximal mutual information, i.e. the accessible information. For a given set of statistical operators, SOMIM finds the POVMs that maximize the mutual information, and thus determines the accessible information together with one POVM, or all POVMs, that attain it. The maximization procedure is a steepest-ascent method that follows the gradient in the space of POVMs, with conjugate gradients used as a speed-up.
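
As a toy illustration of the steepest-ascent idea only (this is not SOMIM and not its actual update rule, which operates in the full POVM space), one can follow a finite-difference gradient of the mutual information over a one-parameter family of projective qubit measurements, reusing the helpers and example ensemble from the first sketch:

```python
import numpy as np

# Assumes mutual_information, pure, probs and states from the first sketch
# are in scope; the step size and iteration count are arbitrary choices.

def objective(phi):
    """Mutual information for the projective measurement {|phi>, |phi + pi/2>}."""
    return mutual_information(probs, states, [pure(phi), pure(phi + np.pi / 2)])

phi, step, eps = 0.3, 0.2, 1e-6
for _ in range(300):
    grad = (objective(phi + eps) - objective(phi - eps)) / (2 * eps)  # finite-difference gradient
    phi += step * grad                                                # steepest-ascent update
print(f"locally optimal angle ~ {phi:.4f} rad, I ~ {objective(phi):.4f} bits")
```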

The complete set of files including the codes and manual can be found at the SOMIM website: http://www.quantumlah.org/publications/software/SOMIM/.

Category:Quantum Information Theory
