Category: Characterization of Quantum Information
-

Entanglement Measures
Owing to the importance of entanglement as a resource in quantum information processing, it is necessary to construct measures of entanglement between two component systems. We saw in Chapter 4 a condition for the separability of 2-qubit states. For a generic higher dimensional density matrix to be separable, a test known as the positive…
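The truncated sentence above appears to refer to the positive partial transpose (PPT) criterion; assuming so, here is a minimal NumPy sketch of that test for a 2-qubit density matrix (the helper names `partial_transpose` and `is_ppt` are illustrative, not from the text):

```python
import numpy as np

def partial_transpose(rho, dA=2, dB=2):
    """Transpose subsystem B of a density matrix on a dA x dB system."""
    r = rho.reshape(dA, dB, dA, dB)                # indices (a, b, a', b')
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)  # swap b <-> b'

def is_ppt(rho):
    """True if the partial transpose has no negative eigenvalue."""
    return bool(np.all(np.linalg.eigvalsh(partial_transpose(rho)) > -1e-12))

# Bell state (|00> + |11>)/sqrt(2): entangled, so it fails the PPT test.
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
bell = np.outer(phi, phi)

# Maximally mixed 2-qubit state: separable, so it passes.
mixed = np.eye(4) / 4
```

For 2×2 and 2×3 systems a positive partial transpose is both necessary and sufficient for separability; in higher dimensions it is only necessary.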
-

Fidelity
Another important measure for comparing probability distributions is the fidelity, which is easily extended to quantum states. This is variously defined in different texts, but we will stick to a simple operational definition here: \(F(p(x), q(x)) = \sum_x \sqrt{p(x)\,q(x)}\). (11.50) The square root is used so that we have \(F(p(x), p(x)) = 1\).…
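Since Eq. (11.50) is just a sum over the distribution, it is straightforward to compute numerically; a short sketch (the helper name `fidelity` is ours):

```python
import numpy as np

def fidelity(p, q):
    """Classical fidelity F(p, q) = sum_x sqrt(p(x) q(x)), cf. Eq. (11.50)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

p = [0.5, 0.5]   # fair coin
q = [0.9, 0.1]   # biased coin
# fidelity(p, p) is exactly 1, as the square root in the definition ensures;
# fidelity(p, q) lies strictly between 0 and 1 for distinct overlapping distributions.
```

Identical distributions give fidelity 1, and distributions with disjoint support give 0, which is what makes this a useful comparison measure.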
-

Distance Measures
An important consideration in information theory is the comparison of two systems: probability distributions in the classical context and states (pure or mixed) in the quantum. For such comparisons, various measures collectively labeled distance measures have been proposed. We’ll consider some of them here, to educate ourselves in the concepts involved.…
-

Entropy of composite systems
Some of the properties of the von Neumann entropy for composite systems are similar to those of Shannon entropy, while some others are quite different. We discuss a few here. 1. Concavity: \(S(\rho)\) is a concave function. That is, for a linear combination of states \(\rho = c_1 \rho_A + c_2 \rho\)…
-

Properties of the von Neumann entropy
Some properties of the von Neumann entropy immediately follow from the definition. 1. The minimum value of \(S(\rho)\), zero, occurs for pure states: \(S(\rho) \ge 0\). (11.29) Thus even though a pure state embodies probabilities of measurement outcomes, the information carried by it is zero since it represents a…
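The point that a pure state carries zero entropy even though its measurement outcomes are probabilistic can be checked directly; a sketch assuming \(S(\rho) = -\mathrm{Tr}(\rho \log_2 \rho)\) (the helper name is ours):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]          # convention: 0 log 0 = 0
    return float(-np.sum(eigs * np.log2(eigs)))

# Pure state (|0> + |1>)/sqrt(2): measuring in the computational basis gives
# 0 or 1 with probability 1/2 each, yet S(rho) = 0.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
pure = np.outer(psi, psi)

# The maximally mixed qubit attains the maximum S = log2(2) = 1.
mixed = np.eye(2) / 2
```

The entropy vanishes for the pure state because its density matrix has a single unit eigenvalue; the uncertainty in the measurement record is basis-dependent, while \(S(\rho)\) is not.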
-

Mathematical characteristics of the entropy function
We will denote by an ensemble X the collection of events (represented by a random variable) x occurring with probability p(x): \(X \equiv \{x, p(x)\}\). (11.4) Definition 11.1. The entropy function for an ensemble X is given by \(H(X) = -k \sum_x p(x) \log p(x)\). (11.5) The number k is a constant which depends…
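Equation (11.5) translates directly to code; a minimal sketch in which choosing \(k = 1/\ln 2\) (equivalently, taking the logarithm base 2) gives the entropy in bits, a common convention (the function name is ours):

```python
import math

def shannon_entropy(probs, k=1.0):
    """H(X) = -k * sum_x p(x) log p(x), cf. Eq. (11.5).

    Terms with p(x) = 0 are skipped, following the convention 0 log 0 = 0.
    With k = 1/ln(2) the result is in bits; k = 1 gives nats.
    """
    return -k * sum(p * math.log(p) for p in probs if p > 0)

bits = 1 / math.log(2)
# A fair coin carries one bit of entropy; a certain outcome carries none.
h_fair = shannon_entropy([0.5, 0.5], k=bits)
h_sure = shannon_entropy([1.0], k=bits)
```

The constant k thus fixes only the unit of information, not the structure of the measure.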
-

Measures of Information
We would like to develop a measure for the rather abstract concept of information, so that efficiencies of different protocols or physical systems of communication can be compared, and more efficient systems designed. We need to first have a model for the process we are describing, and Shannon’s proposed model has stuck (Figure 11.1).…
-

information
When we manipulate physical systems for various purposes, we are essentially encoding and decoding the information content in those systems in precisely defined terms tailored to our purpose. Conversationally speaking of information, one thinks of the “new knowledge” gained when a particular physical process is completed, such as watching television, reading an article, or…