Vlatko Vedral
- Published in print:
- 2006
- Published Online:
- January 2010
- ISBN:
- 9780199215706
- eISBN:
- 9780191706783
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199215706.003.0005
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
Information is often considered classical: in a definite state rather than in a superposition of states. It seems rather strange to consider information in superpositions, and some would conclude on this basis that quantum information can never exist and that we only have access to classical information. It turns out, however, that quantum information can be quantified in the same way as classical information, using Shannon's prescription. There is a unique measure (up to a constant additive or multiplicative term) of quantum information, the von Neumann entropy S, such that S is purely a function of the probabilities of outcomes of measurements made on a quantum system (that is, a function of the density operator); S is a continuous function of probability; and S is additive. This chapter discusses the fidelity of pure quantum states, Helstrom's discrimination, quantum data compression, the entropy of observation, conditional entropy and mutual information, relative entropy, and the statistical interpretation of relative entropy.
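As a concrete illustration of the von Neumann entropy this abstract refers to, here is a minimal NumPy sketch (the function name and examples are illustrative, not from the chapter). Because S depends only on the spectrum of the density operator, it reduces to the Shannon entropy of the eigenvalue distribution:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits, via the eigenvalues of rho."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]                 # convention: 0 * log 0 = 0
    return float(-np.sum(lam * np.log2(lam))) + 0.0  # +0.0 normalizes IEEE -0.0

# A pure state has zero entropy ...
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
# ... while the maximally mixed qubit attains the maximum for one qubit, 1 bit.
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0
```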
Robert Alicki and Mark Fannes
- Published in print:
- 2001
- Published Online:
- February 2010
- ISBN:
- 9780198504009
- eISBN:
- 9780191708503
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198504009.003.0009
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter begins with a discussion of the various meanings of entropy: Boltzmann entropy, Shannon entropy, and von Neumann entropy. It then presents the basic properties of the von Neumann entropy, such as concavity, sub-additivity, strong sub-additivity, and continuity. It proves the existence of the mean entropy for shift-invariant states on quantum spin chains, and derives an expression for the mean entropy of quasi-free fermions on a chain. The chapter defines the quantum relative entropy and presents its basic properties, including its behaviour with respect to completely positive maps. The maximum entropy principle defines thermal equilibrium states (Gibbs states); this variational principle is illustrated by the Hartree–Fock approximation for a model of interacting fermions. The entropy of an equilibrium state for a free quantum particle on a compact Riemannian manifold is also estimated. Finally, the notion of relative entropy is formulated in the algebraic setting using the relative modular operator.
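The quantum relative entropy mentioned here, S(ρ‖σ) = Tr[ρ(log ρ − log σ)], can be sketched numerically (a hedged illustration, not the book's formulation; the helper names are invented). It is non-negative by Klein's inequality and finite only when the support of ρ lies inside that of σ:

```python
import numpy as np

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log2 rho - log2 sigma)], in bits.

    Assumes supp(rho) is contained in supp(sigma); otherwise the
    relative entropy is infinite (the tiny clamp below would blow up).
    """
    def logm(m):
        w, v = np.linalg.eigh(m)
        w = np.maximum(w, 1e-300)          # avoid log(0); zero modes of rho drop out
        return v @ np.diag(np.log2(w)) @ v.conj().T
    return float(np.trace(rho @ (logm(rho) - logm(sigma))).real)

rho = np.diag([0.75, 0.25])                # a biased qubit state
sigma = np.eye(2) / 2                      # the maximally mixed state
print(quantum_relative_entropy(rho, sigma))  # ≈ 0.189 bits = 1 - S(rho)
```

For σ maximally mixed this equals log2(d) − S(ρ), which gives a quick sanity check on the eigen-decomposition route.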
Vlatko Vedral
- Published in print:
- 2006
- Published Online:
- January 2010
- ISBN:
- 9780199215706
- eISBN:
- 9780191706783
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199215706.003.0014
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This book has discussed the foundations of quantum information science as well as the relationship between physics and information theory in general. It has considered the quantum equivalents of the Shannon coding and channel capacity theorems. The von Neumann entropy plays a role analogous to the Shannon entropy, and the Holevo bound is the analogue of Shannon's mutual information used to quantify the capacity of a classical channel. Quantum systems can process information more efficiently than classical systems in a number of different ways. Quantum teleportation and quantum dense coding can be performed using quantum entanglement. Entanglement is an excess of correlations that can exist in quantum physics and is impossible to reproduce classically (with what are termed "separable" states). The book has also demonstrated how to discriminate entangled from separable states using entanglement witnesses, shown how to quantify entanglement, and looked at quantum computation and quantum algorithms.
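The Holevo bound χ = S(Σᵢ pᵢρᵢ) − Σᵢ pᵢS(ρᵢ) mentioned above caps the classical information extractable from a quantum ensemble. A small sketch under assumed conventions (names are illustrative), using two non-orthogonal pure qubit states, shows χ falling strictly below 1 bit:

```python
import numpy as np

def entropy(rho):
    """von Neumann entropy in bits, via the spectrum of rho."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam))) + 0.0

def holevo_chi(probs, states):
    """chi = S(sum_i p_i rho_i) - sum_i p_i S(rho_i)."""
    avg = sum(p * r for p, r in zip(probs, states))
    return entropy(avg) - sum(p * entropy(r) for p, r in zip(probs, states))

# Ensemble: |0> and |+> with equal probability. Pure states contribute no
# entropy, so chi is just the entropy of the average state.
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
chi = holevo_chi([0.5, 0.5], [np.outer(ket0, ket0), np.outer(ketp, ketp)])
print(chi)  # ≈ 0.601 bits, strictly less than 1 bit
```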
Sandip Tiwari
- Published in print:
- 2017
- Published Online:
- August 2017
- ISBN:
- 9780198759874
- eISBN:
- 9780191820847
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198759874.003.0001
- Subject:
- Physics, Condensed Matter Physics / Materials, Atomic, Laser, and Optical Physics
Information is physical, so its manipulation through devices is subject to its own mechanics: a science and engineering of behavioural description intermingled with the principles of classical, quantum, and statistical mechanics. This chapter unifies these principles and physical laws and draws out their implications for the nanoscale. It develops the ideas of state machines, the Church–Turing thesis and its embodiment in various state machines, probabilities, Bayesian principles, and entropy in its various forms (Shannon, Boltzmann, von Neumann, algorithmic), with an eye on the principle of maximum entropy as a tool for manipulating information. Notions of conservation and non-conservation are applied to example circuit forms, folding in adiabatic, isothermal, reversible, and irreversible processes. This brings out the implications of fluctuations and transitions, the interplay of errors and stability, and the energy cost of determinism. The chapter concludes by discussing networks as tools for understanding information flow and decision making, and with an introduction to entanglement in quantum computing.
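The "energy cost of determinism" this abstract alludes to is usually made concrete via Landauer's principle: erasing one bit of Shannon entropy dissipates at least k_B·T·ln 2. A small sketch (function names are illustrative, not from the chapter):

```python
import math

def shannon_entropy(probs):
    """H = -sum p log2 p, in bits; the 0 * log 0 = 0 convention applies."""
    return -sum(p * math.log2(p) for p in probs if p > 0) + 0.0

def landauer_limit_joules(temperature_kelvin):
    """Minimum energy to erase one bit: k_B * T * ln 2 (Landauer's principle)."""
    k_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)
    return k_B * temperature_kelvin * math.log(2)

# A fair bit carries 1 bit of entropy; resetting it deterministically to 0
# destroys that entropy and must dissipate at least the Landauer bound.
print(shannon_entropy([0.5, 0.5]))       # 1.0
print(landauer_limit_joules(300.0))      # ≈ 2.87e-21 J at room temperature
```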
Sandip Tiwari
- Published in print:
- 2020
- Published Online:
- November 2020
- ISBN:
- 9780198759867
- eISBN:
- 9780191820830
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198759867.003.0002
- Subject:
- Physics, Condensed Matter Physics / Materials
Chapter 2 brings out the links between entropy and energy through their intimate connection to information. Probabilities, as a statistical tool when there are unknowns, connect to information as well as to the various forms of entropy. Entropy is a variable introduced to characterize circumstances involving unknowns; Boltzmann entropy, von Neumann entropy, Shannon entropy, and others can all be viewed from this common viewpoint. The chapter broadens the discussion to include Fisher entropy, a measure that stresses locality, and the principle of minimum negentropy (or maximum entropy), to show how a variety of physical descriptions, represented by equations such as the Schrödinger equation, diffusion equations, and Maxwell–Boltzmann distributions, can be seen through a probabilistic, information-centric perspective.
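The locality of the Fisher measure mentioned here can be seen in its definition, I[p] = ∫ (p′(x))²/p(x) dx: unlike Shannon entropy it depends on the gradient of the density, not only on its values. A numerical sketch on a discretized Gaussian (illustrative only; for a Gaussian of standard deviation σ the exact value is 1/σ²):

```python
import numpy as np

def fisher_information(p, dx):
    """I[p] = integral of (p'(x))^2 / p(x) dx on a uniformly discretized density."""
    dp = np.gradient(p, dx)                # central differences in the interior
    return float(np.sum(dp**2 / p) * dx)

sigma = 2.0
x = np.linspace(-10 * sigma, 10 * sigma, 20001)
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
print(fisher_information(p, x[1] - x[0]))  # ≈ 0.25 = 1/sigma^2
```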