Quantum relative entropy

The quantum relative entropy, in analogy with the classical relative entropy or Kullback-Leibler divergence, is a measure of distinguishability between two quantum states. It was introduced by Umegaki and is defined as


D(ρ∣∣σ) = Tr ρ(log ρ − log σ)

where ρ and σ are two density matrices.
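As a rough numerical illustration of the definition, the sketch below evaluates D(ρ∣∣σ) for two single-qubit states via matrix logarithms. It is only a sketch under the assumption that both states are full rank (so that log σ is well defined); the example states, the function name and the use of the natural logarithm (giving the result in nats) are choices made here, not part of the original definition.

```python
# Minimal numerical sketch of the definition above (illustrative, not from the
# original text).  Assumes both states are full rank so that log(sigma) is
# well defined; uses the natural logarithm, so the result is in nats.
import numpy as np
from scipy.linalg import logm

def relative_entropy(rho, sigma):
    """D(rho||sigma) = Tr[rho (log rho - log sigma)]."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Two full-rank single-qubit density matrices
rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])
sigma = np.eye(2) / 2          # maximally mixed state

print(relative_entropy(rho, sigma))   # strictly positive, since rho != sigma
print(relative_entropy(rho, rho))     # 0 (up to numerical precision)
```

When the support of ρ is not contained in the support of σ, D(ρ∣∣σ) is defined to be +∞, so a more careful implementation would handle the supports explicitly rather than calling a matrix logarithm directly.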

It has a number of applications, including its use in proving strong subadditivity of the von Neumann entropy. One can also define a relative entropy distance to a set of states by minimising D(ρ∣∣σ) over all σ in some convex set. If the convex set is chosen to be the set of separable states, the relative entropy distance is an entanglement measure, called the relative entropy of entanglement. If the set is chosen to be the product states, the relative entropy distance is the quantum mutual information.
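For the product-state case, the minimum over product states σ_A ⊗ σ_B is attained at the product of the marginals of ρ, so the relative entropy distance equals S(ρ_A) + S(ρ_B) − S(ρ_AB). The following sketch checks this identity on an illustrative two-qubit state (a noisy Bell state); the particular state and the helper functions are assumptions made for the example, not part of the original text.

```python
# Sketch of the product-state case: for a bipartite state rho_AB, the minimum of
# D(rho_AB||sigma_A x sigma_B) over product states is attained at the product of
# its marginals, and equals S(rho_A) + S(rho_B) - S(rho_AB), the quantum mutual
# information.  The two-qubit example state is an assumption made for illustration.
import numpy as np
from scipy.linalg import logm

def entropy(rho):
    """Von Neumann entropy S(rho) = -Tr[rho log rho], in nats."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def relative_entropy(rho, sigma):
    """D(rho||sigma) = Tr[rho (log rho - log sigma)]."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

def marginals(rho_AB):
    """Partial traces of a two-qubit state rho_AB over B and over A."""
    r = rho_AB.reshape(2, 2, 2, 2)          # indices (a, b, a', b')
    rho_A = np.einsum('ijkj->ik', r)        # trace out B
    rho_B = np.einsum('ijik->jk', r)        # trace out A
    return rho_A, rho_B

# Noisy Bell state: full rank and correlated between A and B
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = 0.5 * np.outer(phi, phi) + 0.5 * np.eye(4) / 4
rho_A, rho_B = marginals(rho_AB)

lhs = relative_entropy(rho_AB, np.kron(rho_A, rho_B))     # relative entropy distance
rhs = entropy(rho_A) + entropy(rho_B) - entropy(rho_AB)   # quantum mutual information
print(lhs, rhs)   # the two values agree up to numerical precision
```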

