Thermodynamics and the Gibbs Paradox

825 views 35 slides Nov 08, 2021

About This Presentation

LAWS OF THERMODYNAMICS


Slide Content

Thermodynamics and
the Gibbs Paradox
Presented by: Chua Hui Ying Grace
Goh Ying Ying
Ng Gek Puey Yvonne

Overview
The three laws of thermodynamics
The Gibbs Paradox
The Resolution of the Paradox
Gibbs / Jaynes
Von Neumann
Shu Kun Lin’s revolutionary idea
Conclusion

The Three Laws of Thermodynamics
1st Law: Energy is always conserved.
2nd Law: The entropy of the Universe always increases.
3rd Law: The entropy of a perfect crystalline substance is taken as zero at the absolute temperature of 0 K.

Unravel the mystery of The
Gibbs Paradox

The mixing of
non-identical gases

Shows obvious increase in entropy (disorder)

The mixing of identical gases

Shows zero increase in entropy, as the process is reversible

Compare the two scenarios of mixing and we realize that the entropy of mixing jumps discontinuously from a finite value to zero the moment the gases become identical. This is the Gibbs Paradox.
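The contrast between the two scenarios can be checked numerically with the classical ideal-gas entropy of mixing, ΔS = −nR Σ xᵢ ln xᵢ (a standard result, not shown on the slides). A minimal sketch in Python:

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def mixing_entropy(mole_fractions, n_total):
    """Ideal-gas entropy of mixing: dS = -n_total * R * sum x_i ln x_i (J/K)."""
    s = sum(x * math.log(x) for x in mole_fractions if x > 0)
    return -n_total * R * s if s else 0.0

# Two non-identical gases, 1 mol each: dS = 2 R ln 2 > 0
dS_different = mixing_entropy([0.5, 0.5], 2.0)

# "Mixing" a gas with itself is a single component (x = 1): dS = 0
dS_identical = mixing_entropy([1.0], 2.0)

print(round(dS_different, 2))  # 11.53 J/K
print(dS_identical)            # 0.0
```

Treating identical gases as a single component makes the formula return exactly zero, which is the discontinuity the paradox turns on.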

To resolve the contradiction,
look at how people do this:
1. Gibbs / Jaynes
2. Von Neumann
3. Lin Shu Kun

Gibbs’ opinion
When 2 non-identical gases mix and the entropy increases, we imply that the gases can be separated and returned to their original state.
When 2 identical gases mix, it is impossible to separate the two gases into their original state, as there is no recognizable difference between the gases.

Gibbs’ opinion (2)
Thus, these two cases stand on different footing and should not be compared with each other.
The entropy change produced by mixing gases of different kinds is independent of the nature of the gases,
and hence independent of the degree of similarity between them.

[Figure: entropy S versus similarity Z; on Gibbs’ view, S stays at S_max for all Z < 1 and drops to S = 0 only at Z = 1.]

Jaynes’ explanation
The entropy of a macrostate is given as

S(X) = k log W(C)

where S(X) is the entropy associated with a chosen set of macroscopic quantities, and W(C) is the phase volume occupied by all the microstates in a chosen reference class C.

Jaynes’ explanation (2)
This thermodynamic entropy S(X) is not a
property of a microstate, but of a certain
reference class C(X) of microstates
For entropy to always increase, we need to
specify the variables we want to control and
those we want to change.
Any manipulation of variables outside this
chosen set may cause us to see a violation of
the second law.
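Jaynes’ point that S(X) = k log W(C) depends on the chosen reference class can be illustrated by counting microstates for a toy macrostate. The left-half/right-half description below is a hypothetical example of ours, not from the slides:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_of_macrostate(N, n_left):
    """S(X) = k log W(C), where W counts the microstates compatible with
    the macroscopic statement 'n_left of the N particles sit in the
    left half of the box' (the chosen reference class C)."""
    W = math.comb(N, n_left)
    return k_B * math.log(W)

# The even split is compatible with the most microstates,
# so that macrostate carries the largest entropy.
S_even = entropy_of_macrostate(100, 50)
S_skew = entropy_of_macrostate(100, 10)
print(S_even > S_skew)  # True
```

Changing the macroscopic variables we track (the reference class) changes W, and hence the entropy we assign, which is why manipulating variables outside the chosen set can look like a violation of the second law.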

Von Neumann’s Resolution
Makes use of the quantum mechanical approach to the problem.
He derives the equation

S / (Nk) = log 2 − [(1 + Z)/2] log(1 + Z) − [(1 − Z)/2] log(1 − Z)

where Z measures the overlap between the quantum states of the two gases, i.e. their degree of similarity (Z = 0 for orthogonal states, Z = 1 for identical states).

Von Neumann’s Resolution (2)
Hence when Z = 0 the entropy is at its highest,
and when Z = 1 the entropy is at its lowest.
Therefore entropy decreases continuously
with increasing similarity
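As a check on the limits quoted above, here is a small sketch of the overlap formula (as commonly reconstructed from von Neumann's treatment; the function name and the symbol d are our own):

```python
import math

def mixing_entropy_per_Nk(d):
    """S/(Nk) = ln 2 - (1+d)/2 ln(1+d) - (1-d)/2 ln(1-d),
    with d in [0, 1] the overlap (similarity) of the two states:
    d = 0 orthogonal (fully distinguishable), d = 1 identical."""
    terms = sum((1 + s * d) / 2 * math.log(1 + s * d)
                for s in (+1, -1) if 1 + s * d > 0)
    return math.log(2) - terms

print(round(mixing_entropy_per_Nk(0.0), 3))  # 0.693 = ln 2 (maximum)
print(mixing_entropy_per_Nk(1.0))            # 0.0 (identical gases)
```

Between the endpoints the value falls smoothly, which is exactly the continuous decrease of entropy with similarity described on the slide.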

[Figure: entropy S versus similarity Z; on von Neumann’s account, S falls continuously from S_max at Z = 0 to S = 0 at Z = 1.]

Resolving the Gibbs Paradox: using entropy and its
revised relation with similarity, as proposed by Lin Shu Kun.
• Draws a connection between information theory and entropy
• Proposes that entropy increases continuously with the similarity
of the gases

Analyse 3 concepts!
(1) high symmetry = high similarity,
(2) entropy = information loss and
(3) similarity = information loss.
Why does “entropy increases with similarity” hold?
Due to Lin’s proposition that
• entropy is the degree of symmetry, and
• information is the degree of non-symmetry

(1) high symmetry = high similarity
•symmetry is a measure of indistinguishability
•high symmetry contributes to high indistinguishability
similarity can be described as a continuous measure of
imperfect symmetry
High symmetry → indistinguishability → high similarity

(2) entropy = information loss
An increase in entropy means an increase in disorder.
A decrease in entropy reflects an increase in order.
A more ordered system is more highly organized and
thus possesses greater information content.

Do you have any
idea what the
picture is all about?

From the previous example,
• greater entropy results in less information being registered:
higher entropy, higher information loss.
Thus if the system is more ordered,
• it has lower entropy and thus less information loss.
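The slide’s link between order, entropy and information content mirrors Shannon’s measure. As an illustrative example of ours (not from the slides), a perfectly ordered symbol string has zero Shannon entropy while a varied one does not:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """H = sum p_i log2(1/p_i), in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return sum(c / n * math.log2(n / c) for c in counts.values())

ordered = "AAAAAAAA"     # one symbol only: perfectly ordered
disordered = "ABCDABCD"  # four symbols, evenly used: more disorder

print(shannon_entropy(ordered))     # 0.0
print(shannon_entropy(disordered))  # 2.0
```

The ordered string needs no information to describe beyond its single symbol; the disordered one needs two bits per symbol, matching the idea that higher entropy means more information lost about the arrangement.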

(3) similarity = information loss
For a system with distinguishable particles:
Information on N particles
= different information for each particle
= N pieces of information.
For a system with indistinguishable particles:
Information on N particles
= information on 1 particle
= 1 piece of information.
High similarity (high symmetry) means there is greater information loss.
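The N-pieces-versus-1-piece counting above can be sketched in a few lines; the helper name is hypothetical:

```python
def pieces_of_information(labels):
    """Count distinct per-particle descriptions (hypothetical helper).
    Distinguishable particles each need their own description;
    indistinguishable particles share a single one."""
    return len(set(labels))

# N = 5 distinguishable particles: 5 pieces of information
print(pieces_of_information(["p1", "p2", "p3", "p4", "p5"]))  # 5

# N = 5 indistinguishable particles: 1 piece of information
print(pieces_of_information(["p"] * 5))  # 1
```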

Concepts explained:
(1) high symmetry = high similarity
(2) entropy = information loss and
(3) similarity = information loss
After establishing the links between the various concepts:
if a system is highly symmetrical, it has high similarity,
hence greater information loss, and thus higher entropy.

The mixing of identical
gases (revisited)

Lin’s Resolution of the Gibbs Paradox
Compared to the non-identical gases, we have less
information about the identical gases
According to his theory,
less information=higher entropy
Therefore, the mixing of identical gases should also result in an
increase in entropy.

Comparing the 3 graphs
[Figures: entropy S versus similarity Z for each resolution.
Gibbs: S = S_max for all Z < 1, dropping to S = 0 at Z = 1.
Von Neumann: S falls continuously from S_max at Z = 0 to S = 0 at Z = 1.
Lin: S rises continuously from S = 0 at Z = 0 to S_max at Z = 1.]
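The three positions can be compared as toy curves of S against similarity Z. The Gibbs and von Neumann shapes follow the text (constant with a drop only at Z = 1; continuous decrease); Lin's curve is sketched as a simple linear rise purely to show the shape, which is an assumption on our part:

```python
import math

S_MAX = 1.0  # entropy in arbitrary units

def S_gibbs(z):
    """Gibbs: mixing entropy is independent of similarity and drops
    discontinuously to zero only when the gases become identical."""
    return 0.0 if z == 1.0 else S_MAX

def S_von_neumann(z):
    """Von Neumann: continuous decrease, via the overlap formula
    normalised so that S(0) = S_MAX."""
    terms = sum((1 + s * z) / 2 * math.log(1 + s * z)
                for s in (+1, -1) if 1 + s * z > 0)
    return S_MAX * (math.log(2) - terms) / math.log(2)

def S_lin(z):
    """Lin: continuous increase with similarity. The linear rise is
    assumed here purely for illustration."""
    return S_MAX * z

for z in (0.0, 0.5, 1.0):
    print(z, S_gibbs(z), round(S_von_neumann(z), 3), S_lin(z))
```

All three agree only at the orthogonal end (Z = 0); they disagree sharply about what happens as the gases become identical, which is why the next slides ask whether the resolutions are even comparable.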

Why are there differentways in
resolving the paradox?
Different ways of considering entropy:
Lin (static entropy): considers the
configurations of fixed particles in a system.
Gibbs and von Neumann (dynamic entropy):
depends on changes in the dispersal of
energy among the microstates of atoms and
molecules.

We cannot compare the two
ways of resolving the paradox!
Since Lin’s definition of entropy is
essentially different from that of Gibbs
and von Neumann, it is unjustified to
compare the two ways of resolving the
paradox.

Conclusion
The Gibbs Paradox poses a problem for
the second law due to an inadequate
understanding of the system involved.
Lin’s novel idea sheds new light on
entropy and information theory, but
also leaves conflicting grey areas
for further exploration.

Acknowledgements
We would like to thank
Dr. Chin Wee Shong for her support and
guidance throughout the semester
Dr Kuldip Singh for his kind support
And all who have helped in one way or another