
In quantum statistical mechanics, the von Neumann entropy, named after John von Neumann, is the extension of classical Gibbs entropy concepts to the field of quantum mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is

$$S=-\operatorname {tr} (\rho \ln \rho ),$$

where $$\operatorname {tr}$$ denotes the trace and ln denotes the (natural) matrix logarithm. If ρ is written in terms of its eigenvectors $$|1\rangle ,|2\rangle ,|3\rangle ,\dots$$ as

$$\rho =\sum _{j}\eta _{j}\left|j\right\rangle \left\langle j\right|~,$$

then the von Neumann entropy is merely

$$S=-\sum _{j}\eta _{j}\ln \eta _{j}.$$

In this form, S can be seen as the information-theoretic Shannon entropy of the eigenvalue distribution {ηj}.
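As a concrete numerical illustration of the eigenvalue formula above (a minimal sketch using NumPy; the helper name `von_neumann_entropy` is mine, not from the article):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_j eta_j ln eta_j over the eigenvalues of rho."""
    eigs = np.linalg.eigvalsh(rho)   # rho is Hermitian, so eigvalsh applies
    eigs = eigs[eigs > 1e-12]        # by convention, 0 ln 0 = 0
    return float(-np.sum(eigs * np.log(eigs)))

# The maximally mixed qubit state has entropy ln 2
rho = np.eye(2) / 2
print(von_neumann_entropy(rho))      # ≈ 0.6931
```

Working with the eigenvalues avoids computing the matrix logarithm directly; both routes give the same value of S.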

The von Neumann entropy is also used in different forms (conditional entropies, relative entropies, etc.) in the framework of quantum information theory to characterize the entropy of entanglement.

Background

John von Neumann established a rigorous mathematical framework for quantum mechanics in his 1932 work Mathematical Foundations of Quantum Mechanics. In it, he provided a theory of measurement, where the usual notion of wave-function collapse is described as an irreversible process (the so-called von Neumann or projective measurement).

The density matrix was introduced, with different motivations, by von Neumann and by Lev Landau. The motivation that inspired Landau was the impossibility of describing a subsystem of a composite quantum system by a state vector. On the other hand, von Neumann introduced the density matrix in order to develop both quantum statistical mechanics and a theory of quantum measurements.

The density matrix formalism, thus developed, extended the tools of classical statistical mechanics to the quantum domain. In the classical framework, the probability distribution and partition function of the system allow us to compute all possible thermodynamic quantities. Von Neumann introduced the density matrix to play the same role in the context of quantum states and operators in a complex Hilbert space. The knowledge of the statistical density matrix operator would allow us to compute all average quantum entities in a conceptually similar, but mathematically different, way.

Let us suppose we have a set of wave functions |Ψ〉 that depend parametrically on a set of quantum numbers n1, n2, ..., nN. The natural variable which we have is the amplitude with which a particular wavefunction of the basic set participates in the actual wavefunction of the system. Let us denote the square of this amplitude by p(n1, n2, ..., nN). The goal is to turn this quantity p into the classical density function in phase space. We have to verify that p goes over into the density function in the classical limit, and that it has ergodic properties. After checking that p(n1, n2, ..., nN) is a constant of motion, an ergodic assumption for the probabilities p(n1, n2, ..., nN) makes p a function of the energy only.

After this procedure, one finally arrives at the density matrix formalism when seeking a form where p(n1, n2, ..., nN) is invariant with respect to the representation used. In the form it is written, it will only yield the correct expectation values for quantities which are diagonal with respect to the quantum numbers n1, n2, ..., nN.

Expectation values of operators which are not diagonal involve the phases of the quantum amplitudes. Suppose we encode the quantum numbers n1, n2, ..., nN into the single index i or j. Then our wave function has the form

$$\left|\Psi \right\rangle \,=\,\sum _{i}a_{i}\,\left|\psi _{i}\right\rangle .$$

The expectation value of an operator B which is not diagonal in these wave functions is then

$$\left\langle B\right\rangle \,=\,\sum _{i,j}a_{i}^{*}a_{j}\,\left\langle i\right|B\left|j\right\rangle .$$

The role which was originally reserved for the quantities $$\left|a_{i}\right|^{2}$$ is thus taken over by the density matrix of the system S, with matrix elements

$$\left\langle j\right|\,\rho \,\left|i\right\rangle \,=\,a_{j}\,a_{i}^{*}.$$

The expectation value then takes the compact form

$$\left\langle B\right\rangle \,=\,\operatorname {tr} (\rho B)~.$$

The invariance of the above expression follows from matrix theory: since the trace is invariant under a change of basis, the expectation value of a quantum operator, represented as a matrix, is obtained by taking the trace of the product of the density operator $${\hat {\rho }}$$ and the operator $${\hat {B}}$$ (the Hilbert–Schmidt scalar product between operators). The matrix formalism here is developed in the statistical mechanics framework, although it applies as well to finite quantum systems, which is usually the case, where the state of the system cannot be described by a pure state, but only by a statistical operator $${\hat {\rho }}$$ of the above form. Mathematically, $${\hat {\rho }}$$ is a positive-semidefinite Hermitian matrix with unit trace.
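The equivalence between the amplitude sum and the trace formula can be checked directly (a small numerical sketch, with illustrative variable names, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random normalized amplitudes a_i in a 3-dimensional basis
a = rng.normal(size=3) + 1j * rng.normal(size=3)
a /= np.linalg.norm(a)

# Density matrix <j|rho|i> = a_j a_i^*  (here a rank-one projector |Psi><Psi|)
rho = np.outer(a, a.conj())

# A random Hermitian observable B
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = (M + M.conj().T) / 2

# <B> computed two ways: sum_{ij} a_i^* a_j <i|B|j>  versus  tr(rho B)
expect_amplitudes = (a.conj() @ B @ a).real
expect_trace = np.trace(rho @ B).real
print(np.isclose(expect_amplitudes, expect_trace))  # True
```

Because the trace is basis-independent, the second form gives the same answer in any representation, which is the point of the density-matrix formulation.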
Definition

Given the density matrix ρ, von Neumann defined the entropy as

$$S(\rho )\,=\,-\operatorname {tr} (\rho \ln \rho ),$$

which is a proper extension of the Gibbs entropy (up to a factor kB) and the Shannon entropy to the quantum case. To compute S(ρ) it is convenient (see logarithm of a matrix) to compute the eigendecomposition of $$~\rho =\sum _{j}\eta _{j}\left|j\right\rangle \left\langle j\right|$$. The von Neumann entropy is then given by

$$S(\rho )\,=\,-\sum _{j}\eta _{j}\ln \eta _{j}~.$$

Since, for a pure state, the density matrix is idempotent, ρ = ρ2, the entropy S(ρ) for it vanishes. Thus, if the system is finite (finite-dimensional matrix representation), the entropy S(ρ) quantifies the departure of the system from a pure state. In other words, it codifies the degree of mixing of the state describing a given finite system. Measurement decoheres a quantum system into something noninterfering and ostensibly classical; so, e.g., the vanishing entropy of a pure state $$\Psi =(\left|0\right\rangle +\left|1\right\rangle )/{\sqrt {2}}$$, corresponding to a density matrix

$$\rho ={1 \over 2}{\begin{pmatrix}1&1\\1&1\end{pmatrix}}$$

increases to $$S=\ln 2\approx 0.69$$ for the measurement outcome mixture

$$\rho ={1 \over 2}{\begin{pmatrix}1&0\\0&1\end{pmatrix}}$$

as the quantum interference information is erased.
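The two 2×2 density matrices above can be compared numerically (a minimal sketch assuming NumPy; the helper `S` is mine):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy from the eigenvalues of a density matrix."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]        # 0 ln 0 is taken as 0
    return float(-np.sum(eigs * np.log(eigs)))

# Pure superposition (|0> + |1>)/sqrt(2): off-diagonal coherences present
rho_pure = 0.5 * np.array([[1, 1], [1, 1]])

# After measurement the interference terms are erased
rho_mixed = 0.5 * np.array([[1, 0], [0, 1]])

print(S(rho_pure))    # 0.0
print(S(rho_mixed))   # ln 2 ≈ 0.693
```

The pure state has eigenvalues {1, 0}, hence zero entropy; erasing the off-diagonal terms spreads the weight to {1/2, 1/2} and raises the entropy to ln 2.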
Properties

Some properties of the von Neumann entropy:

S(ρ) is zero if and only if ρ represents a pure state.
S(ρ) is maximal and equal to ln N for a maximally mixed state, N being the dimension of the Hilbert space.
S(ρ) is invariant under changes in the basis of ρ, that is, S(ρ) = S(UρU†), with U a unitary transformation.
S(ρ) is concave, that is, given a collection of positive numbers λi which sum to unity ($$\Sigma _{i}\lambda _{i}=1$$) and density operators ρi, we have

$$S{\bigg (}\sum _{i=1}^{k}\lambda _{i}\,\rho _{i}{\bigg )}\,\geq \,\sum _{i=1}^{k}\lambda _{i}\,S(\rho _{i}).$$

S(ρ) satisfies the bound

$$S{\bigg (}\sum _{i=1}^{k}\lambda _{i}\,\rho _{i}{\bigg )}\,\leq \,\sum _{i=1}^{k}\lambda _{i}\,S(\rho _{i})-\sum _{i=1}^{k}\lambda _{i}\ln \lambda _{i},$$

where equality is achieved if the ρi have orthogonal support, and as before the ρi are density operators and the λi are a collection of positive numbers which sum to unity ($$\Sigma _{i}\lambda _{i}=1$$).

S(ρ) is additive for independent systems. Given two density matrices ρA , ρB describing independent systems A and B, we have

$$S(\rho _{A}\otimes \rho _{B})=S(\rho _{A})+S(\rho _{B}).$$

S(ρ) is strongly subadditive for any three systems A, B, and C:

$$S(\rho _{ABC})+S(\rho _{B})\leq S(\rho _{AB})+S(\rho _{BC}).$$

This automatically means that S(ρ) is subadditive:

$$S(\rho _{AC})\leq S(\rho _{A})+S(\rho _{C}).$$
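Several of the properties listed above can be spot-checked on random density matrices (a numerical sketch assuming NumPy; the helpers `S` and `random_density_matrix` are illustrative, not from the article):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy from the eigenvalues of a density matrix."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]
    return float(-np.sum(eigs * np.log(eigs)))

def random_density_matrix(d, rng):
    # A A† / tr(A A†) is positive semidefinite with unit trace
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = A @ A.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(1)
rho_a = random_density_matrix(2, rng)
rho_b = random_density_matrix(3, rng)
rho_c = random_density_matrix(2, rng)

# Additivity for independent systems: S(rho_A ⊗ rho_B) = S(rho_A) + S(rho_B)
print(np.isclose(S(np.kron(rho_a, rho_b)), S(rho_a) + S(rho_b)))  # True

# Concavity: S(λ ρ_A + (1-λ) ρ_C) >= λ S(ρ_A) + (1-λ) S(ρ_C)
lam = 0.3
mix = lam * rho_a + (1 - lam) * rho_c
print(S(mix) >= lam * S(rho_a) + (1 - lam) * S(rho_c))            # True
```

Additivity follows because the eigenvalues of a tensor product are products of the factors' eigenvalues, so the logarithms add.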

Below, the concept of subadditivity is discussed, followed by its generalization to strong subadditivity.

If ρA, ρB are the reduced density matrices of the general state ρAB, then

$$\left|S(\rho _{A})\,-\,S(\rho _{B})\right|\,\leq \,S(\rho _{AB})\,\leq \,S(\rho _{A})\,+\,S(\rho _{B})~.$$

The right-hand inequality is known as subadditivity. The two inequalities together are sometimes known as the triangle inequality. They were proved in 1970 by Huzihiro Araki and Elliott H. Lieb. While in Shannon's theory the entropy of a composite system can never be lower than the entropy of any of its parts, in quantum theory this is not the case; i.e., it is possible that S(ρAB) = 0 while S(ρA) = S(ρB) > 0.

Intuitively, this can be understood as follows: In quantum mechanics, the entropy of the joint system can be less than the sum of the entropy of its components because the components may be entangled. For instance, as seen explicitly, the Bell state of two spin-½s,

$$\left|\psi \right\rangle =\left(\left|\uparrow \downarrow \right\rangle +\left|\downarrow \uparrow \right\rangle \right)/{\sqrt {2}},$$

is a pure state with zero entropy, but each spin has maximum entropy when considered individually in its reduced density matrix. The entropy in one spin can be "cancelled" by being correlated with the entropy of the other. The left-hand inequality can be roughly interpreted as saying that entropy can only be cancelled by an equal amount of entropy.

If system A and system B have different amounts of entropy, the smaller can only partially cancel the greater, and some entropy must be left over. Likewise, the right-hand inequality can be interpreted as saying that the entropy of a composite system is maximized when its components are uncorrelated, in which case the total entropy is just a sum of the sub-entropies. This may be more intuitive in the phase-space formulation, instead of the Hilbert-space one, where the von Neumann entropy amounts to minus the expected value of the ★-logarithm of the Wigner function, −∫ f ★ log★ f dx dp, up to an offset shift. Up to this normalization offset shift, the entropy is majorized by that of its classical limit.
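The Bell-state example can be verified by taking a partial trace (a minimal sketch assuming NumPy; the helper `S` and the einsum-based partial trace are illustrative):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy from the eigenvalues of a density matrix."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]
    return float(-np.sum(eigs * np.log(eigs)))

# Normalized Bell state (|up,down> + |down,up>)/sqrt(2)
# in the product basis {|uu>, |ud>, |du>, |dd>}
psi = np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)

# Reduced density matrix of spin A: partial trace over B
rho_4 = rho_ab.reshape(2, 2, 2, 2)    # tensor indices (a, b, a', b')
rho_a = np.einsum('abcb->ac', rho_4)  # sum over the repeated B index

print(S(rho_ab))   # 0.0  — the joint state is pure
print(S(rho_a))    # ln 2 — each spin alone is maximally mixed
```

This is exactly the situation S(ρAB) = 0 with S(ρA) = S(ρB) > 0 described above: the joint state carries no entropy, yet each subsystem does.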
Main article: Strong subadditivity of quantum entropy

The von Neumann entropy is also strongly subadditive. Given three Hilbert spaces, A, B, C,

$$S(\rho _{ABC})\,+\,S(\rho _{B})\,\leq \,S(\rho _{AB})\,+\,S(\rho _{BC}).$$

This is a more difficult theorem and was proved first by J. Kiefer in 1959 and independently by Elliott H. Lieb and Mary Beth Ruskai in 1973, using a matrix inequality of Elliott H. Lieb proved in 1973. By using the proof technique that establishes the left side of the triangle inequality above, one can show that the strong subadditivity inequality is equivalent to the following inequality.

$$S(\rho _{A})\,+\,S(\rho _{C})\,\leq \,S(\rho _{AB})\,+\,S(\rho _{BC})$$

when ρAB, etc. are the reduced density matrices of a density matrix ρABC. If we apply ordinary subadditivity to the left side of this inequality, and consider all permutations of A, B, C, we obtain the triangle inequality for ρABC: Each of the three numbers S(ρAB), S(ρBC), S(ρAC) is less than or equal to the sum of the other two.
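Strong subadditivity can be spot-checked on a random tripartite state (a numerical sketch assuming NumPy; `S` and `partial_trace` are illustrative helpers, not library functions):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy from the eigenvalues of a density matrix."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]
    return float(-np.sum(eigs * np.log(eigs)))

def partial_trace(rho, dims, keep):
    """Reduced density matrix on the subsystems in `keep` (0-indexed)."""
    n = len(dims)
    t = rho.reshape(dims + dims)          # indices (i_1..i_n, i_1'..i_n')
    for k in sorted(set(range(n)) - set(keep), reverse=True):
        m = t.ndim // 2
        t = np.trace(t, axis1=k, axis2=k + m)  # trace out subsystem k
    d = int(np.prod([dims[k] for k in sorted(keep)]))
    return t.reshape(d, d)

# Random mixed state on three qubits A, B, C
rng = np.random.default_rng(2)
A = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
rho_abc = A @ A.conj().T
rho_abc /= np.trace(rho_abc).real

dims = [2, 2, 2]
S_abc = S(rho_abc)
S_b   = S(partial_trace(rho_abc, dims, {1}))
S_ab  = S(partial_trace(rho_abc, dims, {0, 1}))
S_bc  = S(partial_trace(rho_abc, dims, {1, 2}))

# Strong subadditivity: S(ABC) + S(B) <= S(AB) + S(BC)
print(S_abc + S_b <= S_ab + S_bc + 1e-9)  # True
```

A single random state is of course no proof, but the check makes the roles of the four reduced entropies in the inequality concrete.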

See also

Entropy (information theory)
Linear entropy
Partition function (mathematics)
Quantum conditional entropy
Quantum mutual information
Quantum entanglement
Wehrl entropy

References

Bengtsson, Ingemar; Zyczkowski, Karol. Geometry of Quantum States: An Introduction to Quantum Entanglement (1st ed.). p. 301.
Nielsen, Michael A. and Isaac Chuang (2001). Quantum computation and quantum information (Repr. ed.). Cambridge [u.a.]: Cambridge Univ. Press. p. 700. ISBN 978-0-521-63503-5.
Von Neumann, John (1932). Mathematische Grundlagen der Quantenmechanik. Berlin: Springer. ISBN 3-540-59207-5.; Von Neumann, John (1955). Mathematical Foundations of Quantum Mechanics. Princeton University Press. ISBN 978-0-691-02893-4.
Landau, L. (1927). "Das Daempfungsproblem in der Wellenmechanik". Zeitschrift für Physik. 45 (5–6): 430–464. Bibcode:1927ZPhy...45..430L. doi:10.1007/BF01343064.
Zachos, C. K. (2007). "A classical bound on quantum entropy". Journal of Physics A: Mathematical and Theoretical. 40 (21): F407. arXiv:hep-th/0609148. Bibcode:2007JPhA...40..407Z. doi:10.1088/1751-8113/40/21/F02.
Huzihiro Araki and Elliott H. Lieb, Entropy Inequalities, Communications in Mathematical Physics, vol 18, 160–170 (1970).
Zurek, W. H. (2003). "Decoherence, einselection, and the quantum origins of the classical". Reviews of Modern Physics. 75 (3): 715. arXiv:quant-ph/0105127. Bibcode:2003RvMP...75..715Z. doi:10.1103/RevModPhys.75.715.
Kiefer, J. (July 1959). "Optimum Experimental Designs". Journal of the Royal Statistical Society: Series B (Methodological). 21 (2): 272–310.
Ruskai, Mary Beth. "Evolution of a Fundamental Theorem on Quantum Entropy". youtube.com. World Scientific. Retrieved 20 August 2020. "Invited talk at the Conference in Honour of the 90th Birthday of Freeman Dyson, Institute of Advanced Studies, Nanyang Technological University, Singapore, 26-29 August 2013. The note on Kiefer (1959) is at the 26:40 mark."
Elliott H. Lieb and Mary Beth Ruskai, Proof of the Strong Subadditivity of Quantum-Mechanical Entropy, Journal of Mathematical Physics, vol 14, 1938–1941 (1973).
Elliott H. Lieb, Convex Trace Functions and the Wigner–Yanase–Dyson Conjecture, Advances in Mathematics, vol 67, 267–288 (1973).
