Focused on basic terminology used in statistical mechanics, the relationship between information theory and statistical mechanics, and a few terms related to quantum mechanics.
A Presentation on: "Information Theory and Statistical Mechanics"
Table of contents
Abstract
Introduction
Basic terminology used in statistical mechanics
Critique of the approach to statistical mechanics
Thermodynamic entropy and its relation to information entropy
Shannon entropy vs von Neumann entropy
Discussion
Conclusion
Abstract
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available. It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
Introduction
Statistical mechanics is a branch of physics that deals with many-particle systems using probabilistic/statistical methods at the microscopic level. In information theory, the term entropy was actually coined after the thermodynamic entropy. The thermodynamic entropy was first introduced by Clausius (around 1850), whereas its probabilistic-statistical interpretation is due to Boltzmann (1872). It is virtually impossible to miss the functional resemblance between the two notions of entropy, and indeed it was recognized by Shannon and von Neumann.
Introduction (contd)
The Maximum Entropy Principle: the oldest concept that ties the two fields, it has attracted a great deal of attention, not only from information theorists but also from researchers in related fields such as signal processing and image processing. If, in a certain problem, the observed data come from an unknown probability distribution, but we do have some knowledge (stemming, e.g., from measurements) of certain moments of the underlying quantity/signal/random variable, then we assume that the unknown underlying probability distribution is the one with maximum entropy subject to (s.t.) the moment constraints corresponding to this knowledge (see the sketch below).
Large deviations theory as a bridge between information theory and statistical physics: both information theory and statistical physics have an intimate relation to large deviations theory, a branch of probability theory which focuses on assessing the exponential rates of decay of probabilities of rare events; its most fundamental mathematical tool is the Chernoff bound.
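As a rough illustration of the maximum-entropy principle, the sketch below (not from the slides; the energy levels, target mean, and function names are made up) finds the maximum-entropy distribution subject to a single mean-value constraint. The solution necessarily takes the Gibbs form p_i proportional to exp(-beta*e_i), and beta is found numerically so that the constraint is met.

```python
import numpy as np

def maxent_distribution(energies, mean_energy, beta_lo=-50.0, beta_hi=50.0, tol=1e-10):
    """Maximum-entropy distribution subject to the single moment constraint
    sum_i p_i * e_i = mean_energy.  The maximiser has the Gibbs form
    p_i ~ exp(-beta * e_i); beta is located by bisection."""
    energies = np.asarray(energies, dtype=float)

    def mean_at(beta):
        w = np.exp(-beta * (energies - energies.min()))  # shift for numerical stability
        p = w / w.sum()
        return np.dot(p, energies)

    lo, hi = beta_lo, beta_hi
    for _ in range(200):                 # mean_at is decreasing in beta, so bisection works
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > mean_energy:
            lo = mid                     # mean still too high: increase beta
        else:
            hi = mid
        if hi - lo < tol:
            break
    beta = 0.5 * (lo + hi)
    w = np.exp(-beta * (energies - energies.min()))
    return beta, w / w.sum()

# Illustrative values (not from the slides): three energy levels with a known mean energy
beta, p = maxent_distribution([0.0, 1.0, 2.0], mean_energy=0.6)
print("beta =", beta)
print("max-entropy distribution:", p)
```

For a constraint on the mean energy, this reproduces exactly the canonical (Boltzmann) form discussed later in the slides, which is the content of Jaynes' observation.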
Basic terms in Statistical Mechanics
System: characterised by pressure, number of particles, temperature, energy, and volume. Pressure, number of particles, temperature, energy, and volume are macroscopic parameters.
Basic terms in Statistical Mechanics
Ensemble: a collection of many systems. All systems in the ensemble should be macroscopically identical and independent (i.e., one system does not interact with another).
Basic terms in Statistical Mechanics
Types of ensemble
1. Microcanonical ensemble: every system in the ensemble has the same fixed E, V, N. The outer walls should be rigid, impermeable, and insulated.
Basic terms in Statistical Mechanics
Types of ensemble
2. Canonical ensemble: every system in the ensemble has the same fixed T, V, N. The outer walls should be rigid, impermeable, and conducting.
Basic terms in Statistical Mechanics
Types of ensemble
3. Grand canonical ensemble: every system in the ensemble has the same fixed T, V, µ, where µ is the chemical potential. The outer walls should be rigid, permeable, and conducting.
Basic terms in Statistical Mechanics
Macrostate: a state described by the parameters of the whole system; these are the macroscopic parameters.
Microstate: a state described at the level of the individual particles or molecules inside the system, i.e., each particle's velocity, rotational motion, and degrees of freedom.
Basic terms in Statistical Mechanics
Static system: when the system is static (at rest), the position of each particle is determined by a 3D coordinate system.
Basic terms in Statistical Mechanics
Dynamic system: when the system is dynamic (in motion), the position of a particle cannot be determined by a 3D coordinate system alone; each particle's momentum must also be known.
x, y, z: position space. Px, Py, Pz: momentum space. Together they are called phase space.
Basic terms in Statistical Mechanics
Equipartition theorem: for a system obeying classical statistical mechanics in thermal equilibrium at absolute temperature T, the mean energy associated with each degree of freedom is kT/2.
k: Boltzmann constant. T: temperature.
Degree of freedom: the number of independent motions a particle can perform.
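A quick numerical check of the kT/2 value (my own illustration; the 300 K temperature is an arbitrary choice) using the standard value of the Boltzmann constant:

```python
# Equipartition at an illustrative temperature of 300 K
k_B = 1.380649e-23                          # Boltzmann constant, J/K
T = 300.0                                   # temperature, K

energy_per_dof = 0.5 * k_B * T              # kT/2 per (quadratic) degree of freedom
monatomic_mean_energy = 3 * energy_per_dof  # 3 translational DOF -> (3/2) kT

print(f"kT/2 at 300 K     = {energy_per_dof:.3e} J")
print(f"(3/2) kT at 300 K = {monatomic_mean_energy:.3e} J")
```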
Basic terms in Statistical Mechanics
Boltzmann canonical distribution law: n_i = A·e^(−ε_i/kT), where n_i is the number of particles in the i-th cell, ε_i is the energy of the i-th cell, and A is a normalisation constant. This equation gives the number of gas molecules in each cell as a function of the energy associated with each particle in that cell and is called Boltzmann's canonical distribution law.
Let n1, n2, …, ni, … be the numbers of gas molecules in cell 1, cell 2, …, cell i, … in the equilibrium state. Since the gas molecules are moving continuously, the ni change continuously in many different ways but always keep values close to those of the equilibrium state, i.e., the state of maximum probability. The changes in the ni obey the fundamental hypothesis of statistical mechanics.
Basic terms in Statistical Mechanics
Boltzmann canonical distribution law (constraints):
1. The total number of molecules is constant: N = n1 + n2 + n3 + … + ni + … = constant, so δn1 + δn2 + δn3 + … + δni + … = 0.
2. The total energy of the system is constant: E = ε1·n1 + ε2·n2 + ε3·n3 + … + εi·ni + … = constant, so ε1·δn1 + ε2·δn2 + ε3·δn3 + … + εi·δni + … = 0.
3. When the gas is in equilibrium, the thermodynamic probability ω is maximum, so δ(log ω) = 0.
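A minimal sketch of the distribution law (not from the slides; the cell energies, particle number, and kT value are made up): occupation numbers proportional to exp(−ε_i/kT), normalised so that the total particle number N is fixed, after which the total energy E is also a definite constant.

```python
import numpy as np

def boltzmann_occupation(N, energies, kT):
    """Occupation numbers n_i proportional to exp(-eps_i / kT),
    normalised so that sum(n_i) = N."""
    energies = np.asarray(energies, dtype=float)
    weights = np.exp(-energies / kT)
    return N * weights / weights.sum()

# Made-up cell energies (in units where kT = 1) for a gas of N = 1e6 molecules
energies = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
n = boltzmann_occupation(1e6, energies, kT=1.0)
print("occupation numbers:", n.round(1))
print("total N:", n.sum())                  # constraint 1: fixed particle number
print("total E:", (n * energies).sum())     # constraint 2: total energy of the distribution
```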
Basic terms in Statistical Mechanics
Classical statistics (observable, distinguishable particles): in classical mechanics, all particles in the system are considered distinguishable and can be labelled. This means that individual particles in a system can be tracked; as a result, exchanging the positions of any two particles leads to a completely different configuration.
Quantum statistics (non-observable, indistinguishable particles): all particles in the system are considered indistinguishable, i.e., you cannot label the particles. Interchanging particle positions does not change the system.
Basic terms in Statistical Mechanics
Maxwell–Boltzmann statistics: obeyed by identical, distinguishable particles. Indistinguishable particles require quantum statistics, of which there are two kinds: 1. Bose–Einstein statistics, 2. Fermi–Dirac statistics (see the counting example below).
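The distinction can be made concrete with a small counting exercise (my own illustration, not from the slides): two particles distributed over two cells give 4 configurations when the particles are distinguishable (Maxwell–Boltzmann), 3 when they are indistinguishable bosons (Bose–Einstein), and 1 when they are indistinguishable fermions with at most one particle per cell (Fermi–Dirac).

```python
from itertools import product, combinations_with_replacement, combinations

cells, particles = 2, 2

# Maxwell-Boltzmann: distinguishable particles -> ordered assignments of particles to cells
mb = list(product(range(cells), repeat=particles))

# Bose-Einstein: indistinguishable particles, any occupancy -> multisets of cells
be = list(combinations_with_replacement(range(cells), particles))

# Fermi-Dirac: indistinguishable particles, at most one per cell
fd = list(combinations(range(cells), particles))

print("Maxwell-Boltzmann:", len(mb), mb)   # 4 configurations
print("Bose-Einstein    :", len(be), be)   # 3 configurations
print("Fermi-Dirac      :", len(fd), fd)   # 1 configuration
```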
Critique of the approach to statistical mechanics
The approach rests on two conditions. First, there is a microscopic theory of the phenomenon under consideration --- for the moment, the existence of a (quantum or classical) Hamiltonian formulation, which ensures the existence of a preferred Liouville form. Thus it makes sense to discuss probabilities of trajectories (an ensemble), independently of where those probabilities come from. Second, there exists a viable macroscopic, coarse-grained description, which is only the case if experiment says so --- the key is what Jaynes sometimes calls "reproducibility". If a phenomenon is not readily reproduced, then clearly one has not gained sufficient control over enough variables --- e.g., it was an experimental fact that controlling the temperature and volume of a gas was sufficient to determine its pressure.
Thermodynamic entropy and its relation to information entropy
Thermodynamic entropy is a measure of the number of microenergy states: S = k log N, where k is the Boltzmann constant and N is the number of microenergy states. If you have two non-interacting systems A and B, you want the entropy of the combined system to be S_AB = S_A + S_B. If the two systems have N_A and N_B states respectively, then the combined system has N_A · N_B states, so to get additivity in the entropy you need to take the log.
Information entropy is based on the number of letters in the alphabet, but this makes entropy relative to the lexicon of the observer. Thermodynamic entropy, however, is based on a frame of reference (the system being investigated). Further, it is based on distinguishable differences: if two indistinguishable gases are combined, the entropy does not change. The number of distinguishable elements might be used as the "alphabet" in thermodynamics, but this still seems to require an observer.
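The additivity argument can be checked numerically; in the sketch below (the state counts are made-up illustrative values) the entropy of the combined system equals the sum of the individual entropies because the logarithm turns the product N_A · N_B into a sum.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_states):
    """S = k log N for N microstates."""
    return k_B * math.log(n_states)

# Two non-interacting systems with made-up state counts
N_A, N_B = 10**20, 10**22
print(boltzmann_entropy(N_A) + boltzmann_entropy(N_B))  # S_A + S_B
print(boltzmann_entropy(N_A * N_B))                     # combined system: N_AB = N_A * N_B
# The two printed values agree: log(N_A * N_B) = log(N_A) + log(N_B).
```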
Thermodynamic entropy and its relation to information entropy (contd)
Entropy can be used in several different ways. Entropy is sometimes used to describe a system state (how much entropy is in the system, as in the Shannon, Boltzmann, and Gibbs equations), a change in state (Clausius' equation), or even a force.
One area where thermodynamics and information converge is artificial intelligence, specifically the idea of the Boltzmann Machine. Using simulated annealing, Boltzmann Machines "heat up" their internal weights and then "cool"; this cooling process minimizes internal energy (a sketch of simulated annealing follows below).
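For orientation, here is a generic simulated-annealing sketch, not the Boltzmann Machine training procedure itself; the toy energy function, cooling schedule, and parameter values are made up. It illustrates the "heat up then cool" idea: uphill moves are accepted with probability exp(−ΔE/T), and T is gradually lowered.

```python
import math
import random

def anneal(energy, state, neighbour, T0=5.0, cooling=0.95, steps=2000, seed=0):
    """Generic simulated annealing: accept uphill moves with probability
    exp(-dE/T) and slowly lower the temperature T."""
    rng = random.Random(seed)
    T = T0
    best = state
    for _ in range(steps):
        candidate = neighbour(state, rng)
        dE = energy(candidate) - energy(state)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            state = candidate
            if energy(state) < energy(best):
                best = state
        T *= cooling                      # cooling step: lower the temperature
    return best

# Toy example (made up): minimise a tilted double-well "energy" in one dimension
energy = lambda x: (x * x - 1.0) ** 2 + 0.3 * x
neighbour = lambda x, rng: x + rng.gauss(0.0, 0.3)
print("low-energy state found:", anneal(energy, state=2.5, neighbour=neighbour))
```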
Shannon entropy vs von Neumann entropy
Classical information processing: the bit, 0 or 1. Quantum information processing: the qubit. The most basic unit of quantum information (the quantum bit, or qubit) is written as a combination |ψ⟩ = α|0⟩ + β|1⟩ of the states |0⟩ and |1⟩, defined by the complex numbers α and β with |α|² + |β|² = 1.
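A minimal numerical illustration of a qubit state (the amplitudes are arbitrary made-up values): the state is normalised so that |α|² + |β|² = 1, and the squared magnitudes give the probabilities of measuring |0⟩ and |1⟩.

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1>, with made-up amplitudes, then normalised
alpha, beta = 1.0 + 1.0j, 0.5 - 0.5j
psi = np.array([alpha, beta])
psi = psi / np.linalg.norm(psi)          # enforce |alpha|^2 + |beta|^2 = 1

p0 = abs(psi[0]) ** 2                    # probability of measuring |0>
p1 = abs(psi[1]) ** 2                    # probability of measuring |1>
print("state:", psi)
print("P(0) =", p0, " P(1) =", p1, " sum =", p0 + p1)
```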
Shannon entropy vs von Neumann entropy
Information theory rests on the notion of the entropy of a random variable X, introduced by Shannon in 1948. Let X be a random variable; that is, let Ω be a finite sample space and let p(x) = Pr[X = x], where p(x) ≥ 0 and Σ_{x∈Ω} p(x) = 1, be the probability distribution of X on Ω. Then Shannon's entropy is defined by H(X) = −Σ_{x∈Ω} p(x) log₂ p(x). We shall take logarithms to base 2, which corresponds to measuring entropy in bits.
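A short sketch of computing Shannon's entropy in bits (the example distributions are my own):

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), in bits; terms with p(x) = 0 contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

print(shannon_entropy([0.5, 0.5]))       # fair coin: 1 bit
print(shannon_entropy([0.9, 0.1]))       # biased coin: about 0.469 bits
print(shannon_entropy([0.25] * 4))       # uniform over four outcomes: 2 bits
```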
Shannon entropy vs von Neumann entropy
There is a notion of entropy introduced by von Neumann in 1927, which actually contains Shannon's notion as a special case. Let ρ be the density matrix of a quantum state with n parts 1, 2, 3, …, n; that is, ρ is a self-adjoint matrix whose rows and columns are indexed by 1, …, n.
Shannon entropy vs von Neumann entropy
The eigenvalues λ1, …, λn of ρ satisfy λi ≥ 0 and λ1 + … + λn = 1. Then von Neumann's entropy is defined by S(ρ) = −Σ_i λi log₂ λi. If f is a numerical function and ρ = Σ_i λi vi vi*, where the vi are normalized eigenvectors of ρ, we shall agree that f(ρ) = Σ_i f(λi) vi vi*. With this convention, von Neumann's entropy can be written as S(ρ) = −Tr(ρ log₂ ρ), where Tr denotes the trace of the matrix.
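Since S(ρ) depends only on the eigenvalues of ρ, it can be computed by diagonalising the density matrix. A short sketch with two standard examples, a pure state (entropy 0) and the maximally mixed qubit (entropy 1 bit):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    eigenvalues = np.linalg.eigvalsh(rho)        # rho is self-adjoint
    eigenvalues = eigenvalues[eigenvalues > 1e-12]
    return float(-np.sum(eigenvalues * np.log2(eigenvalues)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])        # pure state: entropy 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])       # maximally mixed qubit: 1 bit
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
# For a diagonal rho the eigenvalues are just probabilities,
# so S(rho) reduces to Shannon's entropy of those probabilities.
```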
Discussion
We consider laws under equilibrium conditions. We still do not have a complete and satisfactory theory that is free from objection on mathematical grounds, involves no arbitrary assumptions, and automatically includes an explanation of nonequilibrium conditions and irreversible processes, since equilibrium thermodynamics is merely an ideal limiting case of the behavior of matter.
Shannon only claimed to have identified the entropy functional; he was not interested in "energy" or "temperature". That does not mean these concepts cannot exist within the formalism of information theory: "temperature" and "energy" are just names of variables that appear in certain physical problems. Jaynes realized this and drew the connection between information theory and statistical mechanics.
Conclusion
The von Neumann–Shannon expression for entropy gives a measure of the amount of uncertainty represented by a probability distribution. Entropy thus becomes a primitive concept with which we work, more fundamental even than energy. The von Neumann entropy calculates the entropy of a quantum system. Shannon only claimed to have identified the entropy functional; he was not interested in "energy" or "temperature".