•What is information? It is facts provided or learned about something or someone.
•The definition above is qualitative. In a scientific theory, there is a need to quantify
it.
•Q. Does every piece of information convey something? What determines whether
one piece of information is more meaningful than another?
•Let’s consider an example:
•Consider the statement “It is hot in Agra today”.
•Does it convey enough information? Is it hot compared to New Delhi? Is it hotter than it was yesterday? – This is imprecise information.
•Suppose we define a day to be hot if the temperature is in the range 36 ≤ T ≤ 41.
Then we have a little more knowledge.
•Our information is better, but there is still an equal probability (1/6) of T being 36, 37, 38, 39, 40 or 41.
•Additional information – yesterday’s temperature was 39 and it is hotter today.
Entropy
•Measurement is very important in information theory.
•Physicists use entropy to measure the amount of disorder
in a physical system.
•Entropy is the key concept of quantum information
theory.
•In information theory, entropy is the expected value (average) of the information contained in each message received. It measures how much uncertainty there is in the state of a physical system.
•It can be considered as the degree of randomness in a message.
•The entropy of a message is its amount of uncertainty: it increases when the message is closer to random and decreases when it is less random.
•The less likely an event is, the more information it
provides when it occurs.
•The message “aaaaa” appears to be very structured and not random at all. It contains much less information than the message “alphabet”, which is somewhat structured but more random.
•The first message has low entropy and the second one has higher entropy.
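A quick check of this intuition: a minimal Python sketch (the helper name char_entropy is illustrative, not from the slides) that applies the Shannon entropy formula given later in these slides to the character frequencies of each message.

```python
from collections import Counter
from math import log2

def char_entropy(message: str) -> float:
    """Shannon entropy, in bits per character, of the character frequencies in a message."""
    counts = Counter(message)
    n = len(message)
    # H = sum_i p_i * log2(1 / p_i), where p_i is the relative frequency of each character.
    return sum((c / n) * log2(n / c) for c in counts.values())

print(char_entropy("aaaaa"))     # 0.0   -- fully structured, no uncertainty
print(char_entropy("alphabet"))  # 2.75  -- more random, higher entropy
```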
•The key concept of classical information theory is the Shannon entropy. Claude E. Shannon introduced Shannon’s entropy, used for measuring the entropy of a classical system. It measures the amount of uncertainty (disorder) in a classical system.
•Von Neumann entropy is used for measuring the entropy of a quantum system. It gauges the uncertainty (degree of mixedness) in a given quantum system.
Shannon Entropy
The Shannon entropy can be used to define other measures of information which
capture the relationships between two random variables X and Y. Four such
measures are the following:
•Relative entropy: Relative entropy (the Kullback–Leibler divergence) measures the closeness of two probability distributions.
•Joint entropy: joint entropy measures the combined information in two
random variables.
•Conditional entropy: Conditional entropy measures the information
contained in one random variable given that the outcome of another
random variable is known.
•Mutual information: Mutual information measures the correlation between two random variables, in terms of how much knowledge of one random variable reduces the uncertainty about the other random variable.
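A hedged Python sketch of these four measures (the example joint distribution and all variable names are illustrative, not from the slides). It uses the standard identities H(X|Y) = H(X,Y) − H(Y) and I(X;Y) = H(X) + H(Y) − H(X,Y), and evaluates the relative entropy between the joint distribution and the product of its marginals, which for that choice equals the mutual information.

```python
from math import log2

# Illustrative joint distribution p(x, y) for two correlated binary random variables X and Y.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
p_x = {x: sum(p for (xi, _), p in p_xy.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in p_xy.items() if yi == y) for y in (0, 1)}

def H(dist):
    """Shannon entropy of a probability distribution given as a dict of probabilities."""
    return sum(p * log2(1 / p) for p in dist.values() if p > 0)

H_XY = H(p_xy)                    # joint entropy H(X,Y)
H_X, H_Y = H(p_x), H(p_y)
H_X_given_Y = H_XY - H_Y          # conditional entropy H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X + H_Y - H_XY           # mutual information I(X;Y)

# Relative entropy (Kullback-Leibler divergence) D(p || q) between the joint p(x,y)
# and the product of its marginals q(x,y) = p(x)p(y); for this q it equals I(X;Y).
D = sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items() if p > 0)

print(H_XY, H_X_given_Y, I_XY, D)  # ~1.72, ~0.72, ~0.28, ~0.28 bits
```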
Shannon’s entropy
•For a discrete random variable X corresponding to a physical process with possible outcomes x₁, x₂, …, xₙ occurring with probabilities p₁, …, pₙ, where ∑ pᵢ = 1.
•The Shannon entropy measures the average uncertainty associated with the events X = xᵢ.
•The Shannon entropy associated with this probability distribution is defined by
•H(p₁, …, pₙ) = − ∑ pᵢ log₂(pᵢ) = ∑ pᵢ log₂(1/pᵢ)
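•For example, in the hot-day slide above, the temperature is equally likely to be 36, 37, 38, 39, 40 or 41 (pᵢ = 1/6 each), so H = ∑ (1/6) log₂ 6 = log₂ 6 ≈ 2.585 bits of uncertainty. Learning that today is hotter than yesterday’s 39 leaves only the outcomes 40 and 41, reducing the uncertainty to log₂ 2 = 1 bit.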
Von Neumann Entropy
•Quantum entropy refers to the measure of uncertainty or
information content within a quantum system.
•It quantifies the amount of information that is missing or unknown
about a quantum state.
• Unlike classical entropy, which deals with disorder or randomness in classical systems, quantum entropy deals with the complexities arising from superposition, entanglement, and the probabilistic nature of quantum states. Quantum entropy plays a pivotal role in understanding the behavior of quantum systems.
•It provides fundamental insights into the information content,
complexity, and predictability of quantum states.
•As quantum mechanics governs the behavior of particles at a
fundamental level, understanding and quantifying entropy in this
context are crucial for various applications, including quantum
computing, cryptography, and information theory.
•Named after mathematician John von Neumann, Von Neumann entropy is
a key concept in quantum information theory.
•It quantifies the amount of uncertainty or information content associated
with a quantum state.
•Von Neumann defined the entropy of a quantum state ρ by the formula
•S(ρ) = − tr(ρ log₂ ρ),  where ρ = ∑ pᵢ |ψᵢ⟩⟨ψᵢ|.
•Suppose we have a mixture of quantum states |ψᵢ⟩, where each state occurs with probability pᵢ. Each |ψᵢ⟩ can be represented by a vector in the 2ⁿ-dimensional state space.
•If λᵢ are the eigenvalues of the density matrix ρ, then Von Neumann’s definition can be re-expressed as
•S(ρ) = ∑ λᵢ log₂(1/λᵢ)
•The Von Neumann entropy S(ρ) provides a measure of the degree of mixedness of the ensemble.
•S(ρ) = 0 for a pure state.
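A minimal Python/NumPy sketch of this eigenvalue form (the function name and the two test states are illustrative):

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -tr(rho log2 rho), computed from the eigenvalues of the density matrix."""
    eigenvalues = np.linalg.eigvalsh(rho)            # rho is Hermitian, so eigenvalues are real
    eigenvalues = eigenvalues[eigenvalues > 1e-12]    # drop (near-)zero eigenvalues: 0*log(0) = 0
    return float(np.sum(eigenvalues * np.log2(1 / eigenvalues)))

# A pure state |0><0| has entropy 0; the maximally mixed qubit state I/2 has entropy 1 bit.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
mixed = np.eye(2) / 2
print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0
```

The second example matches the maximally mixed single-qubit state, the quantum analogue of a fair coin, which has the maximum possible one-qubit entropy of 1 bit.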
Classical versus Quantum Entropy
•Classical Information:
•Classical information is based on classical physics and follows classical
laws of physics.
•It is represented in bits (0 or 1) and operates on classical states that
are deterministic and definite.
•Information in classical systems can be copied without limitation; the no-cloning restriction applies only to quantum information.
•Classical information processing relies on classical computers that use
classical bits for computation.
•Classical information theory, developed by Claude Shannon, deals
with encoding, transmitting, and decoding information in classical
systems.
•Quantum Information:
•Quantum information is based on quantum physics and operates using
quantum states and qubits.
•It is represented in qubits, which can exist in superposition states (0, 1, or both simultaneously) due to quantum superposition.
•Quantum information cannot be cloned perfectly due to the no-cloning
theorem.
•Quantum computers leverage quantum bits (qubits) and quantum gates
for computation, enabling parallel processing and potential exponential
speedup.
•Quantum information theory extends classical information theory into the
quantum realm, dealing with quantum encoding, transmission, and
decoding.