Quantum mutual information
In quantum information theory, quantum mutual information, or von Neumann mutual information, named after John von Neumann, is a measure of correlation between the subsystems of a quantum state. It is the quantum mechanical analog of the Shannon mutual information.
Motivation
For simplicity, it will be assumed that all objects in the article are finite-dimensional. The definition of quantum mutual entropy is motivated by the classical case. For a probability distribution of two variables p(x, y), the two marginal distributions are

    p(x) = Σ_y p(x, y),    p(y) = Σ_x p(x, y).
The classical mutual information I(X, Y) is defined by

    I(X, Y) = S(p(x)) + S(p(y)) − S(p(x, y)),
where S(q) denotes the Shannon entropy of the probability distribution q.
One can calculate directly

    S(p(x)) + S(p(y)) = −Σ_x p(x) log p(x) − Σ_y p(y) log p(y),
    S(p(x, y)) = −Σ_{x,y} p(x, y) log p(x, y).

So the mutual information is

    I(X, Y) = Σ_{x,y} p(x, y) log [ p(x, y) / (p(x) p(y)) ].
But this is precisely the relative entropy between p(x, y) and p(x)p(y). In other words, if we assume the two variables x and y to be uncorrelated, mutual information is the discrepancy in uncertainty resulting from this (possibly erroneous) assumption.
It follows from the property of relative entropy that I(X,Y) ≥ 0 and equality holds if and only if p(x, y) = p(x)p(y).
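As an illustration of the two equivalent expressions above, the following Python sketch computes I(X, Y) for a small, arbitrarily chosen joint distribution both as an entropy difference and as the relative entropy between p(x, y) and p(x)p(y); the example distribution is an assumption made purely for demonstration.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# An arbitrary 2x2 joint distribution p(x, y) (rows: x, columns: y).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

# Mutual information as an entropy difference.
mi_entropies = shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy.flatten())

# Mutual information as the relative entropy between p(x, y) and p(x)p(y).
prod = np.outer(p_x, p_y)
mask = p_xy > 0
mi_relative = np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask]))

print(mi_entropies, mi_relative)   # both ≈ 0.278 bits
```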
Definition
The quantum mechanical counterparts of classical probability distributions are density matrices.
Consider a composite quantum system whose state space is the tensor product

    H = H_A ⊗ H_B.
Let ρ^AB be a density matrix acting on H. The von Neumann entropy of a density matrix ρ, which is the quantum mechanical analog of the Shannon entropy, is given by

    S(ρ) = −Tr ρ log ρ.
For a probability distribution p(x, y), the marginal distributions are obtained by summing away the variable x or y. The corresponding operation for density matrices is the partial trace. So one can assign to ρ^AB a state on the subsystem A by

    ρ^A = Tr_B ρ^AB,
where Tr_B is the partial trace with respect to system B. This is the reduced state of ρ^AB on system A. The reduced von Neumann entropy of ρ^AB with respect to system A is S(ρ^A).
S(ρ^B) is defined in the same way.
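As a concrete sketch of the partial trace and the reduced entropy, assuming a two-qubit system with A as the first tensor factor, the reshape-based reduction below is a standard NumPy idiom rather than anything specific to this article; the Bell state is chosen only as a familiar example.

```python
import numpy as np

def partial_trace_B(rho_AB, dim_A, dim_B):
    """Trace out subsystem B of a density matrix on H_A ⊗ H_B."""
    rho = rho_AB.reshape(dim_A, dim_B, dim_A, dim_B)
    return np.einsum('ijkj->ik', rho)

def von_neumann_entropy(rho):
    """S(ρ) = -Tr ρ log2 ρ, computed from the eigenvalues, in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

# Example: the Bell state |Φ+> = (|00> + |11>)/√2.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_AB = np.outer(phi, phi.conj())

rho_A = partial_trace_B(rho_AB, 2, 2)
print(rho_A)                       # 0.5 * identity: the reduced state is maximally mixed
print(von_neumann_entropy(rho_A))  # 1.0 bit, even though the joint state is pure
```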
Technical Note: In mathematical language, passing from the classical to the quantum setting can be described as follows. The algebra of observables of a physical system is a C*-algebra and states are unital linear functionals on the algebra. Classical systems are described by commutative C*-algebras, therefore classical states are probability measures. Quantum mechanical systems have non-commutative observable algebras; in concrete considerations, quantum states are density operators. If the probability measure μ is a state on a classical composite system consisting of two subsystems A and B, we project μ onto system A to obtain the reduced state. As stated above, the quantum analog of this is the partial trace operation, which can be viewed as projection onto a tensor component.
It can now be seen that the appropriate definition of quantum mutual information should be

    I(A : B) = S(ρ^A) + S(ρ^B) − S(ρ^AB).
Quantum mutual information can be interpreted the same way as in the classical case: it can be shown that

    I(A : B) = S(ρ^AB ‖ ρ^A ⊗ ρ^B),

where S( · ‖ · ) denotes the quantum relative entropy.
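Putting the pieces together, here is a minimal, self-contained Python sketch of I(A : B) = S(ρ^A) + S(ρ^B) − S(ρ^AB) for a two-qubit state; for the Bell state the result is 2 bits, twice the maximum of the classical mutual information for a pair of bits, while a product state gives 0.

```python
import numpy as np

def vn_entropy(rho):
    """Von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

def reduced_state(rho_AB, dims, keep):
    """Reduced density matrix of a bipartite state; keep = 0 for A, 1 for B."""
    dA, dB = dims
    rho = rho_AB.reshape(dA, dB, dA, dB)
    return np.einsum('ijkj->ik', rho) if keep == 0 else np.einsum('ijil->jl', rho)

def quantum_mutual_information(rho_AB, dims):
    """I(A:B) = S(rho_A) + S(rho_B) - S(rho_AB), in bits."""
    rho_A = reduced_state(rho_AB, dims, keep=0)
    rho_B = reduced_state(rho_AB, dims, keep=1)
    return vn_entropy(rho_A) + vn_entropy(rho_B) - vn_entropy(rho_AB)

# Bell state: maximally correlated two-qubit state.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_bell = np.outer(phi, phi.conj())
print(quantum_mutual_information(rho_bell, (2, 2)))   # 2.0 bits

# Product state |0><0| ⊗ |0><0|: no correlation.
rho_prod = np.kron(np.diag([1.0, 0.0]), np.diag([1.0, 0.0]))
print(quantum_mutual_information(rho_prod, (2, 2)))   # 0.0 bits
```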
Open-source code (SOMIM and SeCQC)
SOMIM: SOMIM (Search for Optimal Measurements by an Iterative Method) is an open-source program that calculates the maximal mutual information, i.e. the accessible information. For a given set of statistical operators, SOMIM finds the POVMs that maximize the accessible information, and thus determines the accessible information and one or all of the POVMs that attain it. The maximization procedure is a steepest-ascent method that follows the gradient in POVM space, and also uses conjugate gradients for speed-up.
The complete set of files including the codes and manual can be found at the SOMIM website: http://www.quantumlah.org/publications/software/SOMIM/.
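To illustrate what SOMIM maximizes, though not the steepest-ascent/conjugate-gradient method it actually uses, the following Python sketch brute-forces the accessible information of an assumed ensemble of two equiprobable pure qubit states by sweeping over real projective measurements; both the ensemble and the restriction to projective measurements are illustrative assumptions, not part of SOMIM.

```python
import numpy as np

def mutual_information(p_joint):
    """Classical mutual information (in bits) of a joint distribution p(i, k)."""
    p_i = p_joint.sum(axis=1)
    p_k = p_joint.sum(axis=0)
    mask = p_joint > 0
    return np.sum(p_joint[mask] * np.log2(p_joint[mask] / np.outer(p_i, p_k)[mask]))

# Assumed ensemble: two equiprobable pure qubit states separated by angle theta.
theta = np.pi / 4
kets = [np.array([1.0, 0.0]),
        np.array([np.cos(theta), np.sin(theta)])]
priors = [0.5, 0.5]

# Brute-force sweep over real projective measurements {|phi>, |phi_perp>}.
best = 0.0
for phi in np.linspace(0, np.pi, 1001):
    e0 = np.array([np.cos(phi), np.sin(phi)])
    e1 = np.array([-np.sin(phi), np.cos(phi)])
    # Joint distribution p(i, k) = prior_i * |<e_k|psi_i>|^2.
    p = np.array([[priors[i] * np.dot(e, kets[i]) ** 2 for e in (e0, e1)]
                  for i in range(2)])
    best = max(best, mutual_information(p))

print(best)   # accessible information of this two-state ensemble, ≈ 0.4 bits
```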
SeCQC:
SeCQC (Search for the classical Capacity of Quantum Channels) is another open-source program. Given a quantum channel, SeCQC finds the statistical operators and POVM outcomes that maximize the accessible information, and thus determines the classical capacity of the quantum channel.
The complete set of files including the codes and manual can be found at the SeCQC website: http://www.quantumlah.org/publications/software/SeCQC/.
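For a rough picture of the quantity SeCQC optimizes, the sketch below brute-forces an estimate of the classical capacity of an assumed qubit depolarizing channel by sweeping over two-state equiprobable input ensembles and real projective measurements; these restrictions, and the choice of channel, are illustrative assumptions and have nothing to do with SeCQC's actual search procedure.

```python
import numpy as np

def h2(x):
    """Binary Shannon entropy in bits."""
    return 0.0 if x <= 0 or x >= 1 else -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def depolarizing(rho, p):
    """Assumed example channel: qubit depolarizing channel with noise parameter p."""
    return (1 - p) * rho + p * np.eye(2) / 2

def accessible_info(rhos, priors, n_phi=361):
    """Brute-force accessible information over real projective qubit measurements."""
    best = 0.0
    for phi in np.linspace(0, np.pi, n_phi):
        e0 = np.array([np.cos(phi), np.sin(phi)])
        e1 = np.array([-np.sin(phi), np.cos(phi)])
        p = np.array([[priors[i] * float(e @ rhos[i] @ e) for e in (e0, e1)]
                      for i in range(len(rhos))])
        p_i, p_k = p.sum(axis=1), p.sum(axis=0)
        mask = p > 1e-15
        best = max(best, np.sum(p[mask] * np.log2(p[mask] / np.outer(p_i, p_k)[mask])))
    return best

# Outer sweep over two-state input ensembles (equal priors, separation angle theta).
noise = 0.2
best_rate = 0.0
for theta in np.linspace(0, np.pi, 91):
    kets = [np.array([1.0, 0.0]), np.array([np.cos(theta), np.sin(theta)])]
    rhos = [depolarizing(np.outer(k, k), noise) for k in kets]
    best_rate = max(best_rate, accessible_info(rhos, [0.5, 0.5]))

print(best_rate)          # ≈ 0.531 bits for this channel
print(1 - h2(noise / 2))  # analytic value 1 - h2(p/2) for the depolarizing channel
```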