Entropy and the Second Law of Thermodynamics

Lecture 2
What is Entropy?

We have an intuitive sense of what energy is.
o We know it takes energy to throw a ball, and we can feel that energy when the ball strikes us.
o We sense that after a long workout, we have expended a lot of energy.
o (That said, we have no intuitive sense that energy is conserved or that the sum of work and heat is constant. Joule's proposal of this to the British Association in 1843 was met by "entire incredulity".)
We generally don't have a good intuitive sense of what entropy is.
o Although we do have a sense that there is often a natural direction and progression of things.
o We also know that some things are hard, if not impossible, to undo: "All the king's horses and all the king's men could not put Humpty Dumpty together again."
So we will start by trying to get a sense of just what entropy is.
 
Getting to know Entropy
 
Imagine a box containing two different gases (for example, He and Ne) on either side of a removable partition.
What happens when you remove the partition?
Did the energy state of the system change?
Is the process reversible? ("All the king's horses and all the king's men…")
What changed?
 
The Second Law
 
It is impossible to construct a machine that is able to convey heat by a cyclical process from one reservoir at a lower temperature to another at a higher temperature unless work is done by some outside agency (i.e., air conditioning is never free).
Heat cannot be entirely extracted from a body and turned into work (thus car engines always have cooling systems).
Every system left to itself will change toward a condition of maximum probability.
The entropy of the universe always increases.
The Second Law
My favorites:
o You can't shovel manure into the rear end of a horse and expect to get hay out of its mouth.
o If you push a car backwards down the street, water and CO₂ won't spontaneously turn back into gasoline.

The Second Law states that there is a natural direction in which reactions will tend to proceed.
 
Statistical Mechanics and Entropy
 
The Microscopic Viewpoint
 
Back to our Box
 
Imagine there were just two atoms in each side of our box.
o Only one possible arrangement.
If we remove the partition, there are 2⁴ = 16 possible arrangements.
Basic Postulate of Statistical Mechanics: a system is equally likely to be found in any of the states accessible to it.
The odds of atoms being arranged the same as the initial way: 1 in 16.
When we removed the partition, we simply increased the number of possible arrangements.
Suppose we had a mole of gas. What are the odds of atoms being arranged the same way?
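The counting on this slide is easy to check with a few lines of Python (an illustration, not part of the original lecture):

```python
from itertools import product

# After the partition is removed, each of the 4 atoms can sit on the
# left ("L") or right ("R") side independently.
arrangements = list(product("LR", repeat=4))
print(len(arrangements))            # 2**4 = 16 possible arrangements

# Exactly one arrangement matches the initial configuration
# (say, atoms 1-2 on the left, atoms 3-4 on the right):
p_initial = 1 / len(arrangements)
print(p_initial)                    # 1 in 16 = 0.0625
```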
 
Another Example
 
Imagine two copper blocks at different temperatures separated by an insulator.
What happens if we remove the insulator?
Suppose we initially had 1 unit (quantum) of energy in the left block and 5 in the right (total of 6). How will they be distributed after we remove the insulator?
How many ways can we arrange the energy?
 
How many combinations are there that correspond to the first block having 1 quantum?

Each of these arrangements is equally possible.
o Although since we can't tell the individual quanta apart, some of these arrangements –combinations– are identical.

How many when the left block has 2?
 
Is there a rule we can use to figure out questions like this?

Ω(e) = E!/(e!(E-e)!)

where E is the total number of energy units and e is the number the left block has.

Doing the math, we find there are 15 ways.
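Python's `math.comb` implements exactly this formula, so the slide's counts are easy to verify (a quick check, not from the original slides):

```python
from math import comb

E = 6  # total quanta shared between the two blocks

# Omega(e) = E!/(e!(E-e)!) counts the ways the left block can hold e quanta.
print(comb(E, 2))   # 15 ways when the left block has 2
print(comb(E, 1))   # 6 ways when it has 1 (the initial state)
```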
 
How many all together?

The system is symmetrical, so there are also 15 ways of distributing the energy when the left block has 4 and the right 2.
There are 20 ways of distributing our 6 units of energy so that each block has three, and only one way each when all the energy is on one side.
Doing the math, there are a total of 64 ways.
A simpler way: 6 quanta distributed between 2 blocks can be distributed in 2⁶ = 64 ways.
o The chances of any particular arrangement occurring are (½)⁶.
 
What's the point?

There were 20 ways of distributing the energy equally, 2×15 ways where one side had 4 and the other 2, 2×6 ways where one side had 1 and the other 5, and 2 ways where one side had none.
The point is that there are many ways to distribute energy for some values of e and only a few for other values.
The chances of an equal distribution are 20×(½)⁶ = 20/64 = 0.3125. The chances of the left block having all the energy are only 1/64.
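All of these counts fall out of the same combinations formula; a short enumeration (illustrative, not from the slides) confirms the numbers:

```python
from math import comb

E = 6
# Ways the left block can hold e quanta, for e = 0..6:
ways = [comb(E, e) for e in range(E + 1)]
print(ways)                    # [1, 6, 15, 20, 15, 6, 1]
print(sum(ways))               # 64 combinations in total, i.e. 2**6

p_equal = comb(E, 3) / 2**E
print(p_equal)                 # 20/64 = 0.3125
print(comb(E, 0) / 2**E)       # 1/64 for all the energy on one side
```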
 
Calculating Probabilities
 
Suppose there are 20 quanta of energy to distribute. Too many to count individual combinations! We need an equation.
The equation will simply be the number of combinations corresponding to a given state of the system, Ω(e) (e.g., one block having e = 5 quanta), times the probability of any particular combination occurring, in this case (½)²⁰:

P(e) = Ω(e) × C

o where C is the probability of any combination occurring, (½)²⁰, and Ω is

Ω(e) = E!/(e!(E-e)!)

More generally,

P(e) = E!/(e!(E-e)!) × p^e × q^(E-e)

o where p is the probability of an energy unit being in the left block and q is the probability of it being in the right. This equation is known as the binomial distribution (it can be computed with the BINOMDIST function in Excel).
o In this case, p and q are equal, so it reduces to:

P(e) = E!/(e!(E-e)!) × p^E
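As a sketch, the binomial distribution above can be written as a small Python function (the name `binom_pmf` is chosen here for illustration; Excel's BINOMDIST is the equivalent):

```python
from math import comb

def binom_pmf(e, E, p):
    """P(e) = E!/(e!(E-e)!) * p**e * q**(E-e), with q = 1 - p.
    Equivalent to Excel's BINOMDIST(e, E, p, FALSE)."""
    q = 1 - p
    return comb(E, e) * p**e * q**(E - e)

# With E = 20 quanta and p = q = 1/2:
print(binom_pmf(10, 20, 0.5))   # the equal split, the single most likely state
print(binom_pmf(20, 20, 0.5))   # all 20 quanta in the left block: (1/2)**20
```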
 
 
Energy Distribution
 
We can use the BINOMDIST function in Excel to compute the probability.
We see an equal or nearly equal distribution is the most likely outcome.
o Imagine if we had 10²⁰ quanta to distribute – one block having a few more or less would make little difference.
The point is that energy is distributed equally between the blocks simply because that is the most likely outcome.
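The same point can be made numerically (an illustration, not from the slides): even with only E = 20 quanta, the distribution already clusters around the equal split.

```python
from math import comb

E = 20
# Probability of the left block holding e quanta, for e = 0..20:
probs = [comb(E, e) / 2**E for e in range(E + 1)]

# The equal split e = 10 is the single most likely outcome...
print(max(range(E + 1), key=lambda e: probs[e]))   # 10

# ...and nearly-equal splits (8 to 12 quanta on the left) already
# account for most of the total probability:
print(sum(probs[8:13]))
```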
Entropy and Thermodynamics

Unlike a simple physical system (a ball rolling down a hill), in thermodynamics whether or not a chemical system is at equilibrium depends not on its total energy, but on how that energy is internally distributed.
Clearly, it would be useful to have a function that could predict the internal distribution of energy at equilibrium.
o The function that predicts how energy (or atoms) will be distributed at equilibrium is entropy.
We found that in this case (blocks of equal size and composition), energy will be distributed equally between them at equilibrium.
How do we develop this mathematically?
Predicting the Equilibrium Distribution

Consider again the energy distribution.
What is maximized when the system is at equilibrium?
How do we know when that function is maximized?
What mathematical characteristic does the maximum of a function have?

Its first differential is 0.
Probability Function

So the function we want is the probability, P, of the system (a copper block in this case) being found in a particular state, e. As we have seen, this is proportional to the number of combinations, Ω(e), corresponding to that state:

P(e) = CΩ(e)

o So we can use the function Ω(e) in place of P(e).
A difficulty here is that both P(e) and Ω(e) are multiplicative. So the number of accessible states for the two blocks is Ω_left × Ω_right. Our other thermodynamic properties are additive.
How do we convert a multiplicative property to an additive one?
o Take the log: ln(Ω).
 
Defining Entropy

Ludwig Boltzmann defined entropy, S, as

S = k ln Ω

o where k is Boltzmann's constant.
o R = k × N_A
Entropy is a measure of the randomness of a system.
An increase in entropy of a system corresponds to a decrease in knowledge of it.
We can decrease the entropy of a "system", but only by increasing the entropy of its surroundings.
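Taking the log is what makes entropy additive where microstate counts are multiplicative. A minimal sketch (the microstate counts 15 and 20 are illustrative values, not from the slides):

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K

def S(omega):
    """Boltzmann entropy: S = k ln(omega)."""
    return k * math.log(omega)

# Microstate counts multiply, but entropies add:
omega_left, omega_right = 15.0, 20.0
S_total = S(omega_left * omega_right)
print(math.isclose(S_total, S(omega_left) + S(omega_right)))   # True
```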
 
Temperature & Statistical Mechanics

We noted that we can find the maximum of the probability function by taking its derivative.
For our blocks, equilibrium occurs where

d ln Ω(E)_left/dE_left = d ln Ω(E)_right/dE_right

So this function, ∂ln Ω/∂E, appears to be a useful one as well; we'll call this function β.
The two blocks are in equilibrium when β_left = β_right.
What other variable is equal when the two blocks are in equilibrium?
o Temperature!
It turns out that β = 1/kT.
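This equilibrium condition can be demonstrated numerically. As an assumption not made in the lecture, give each block N internal oscillators so that a block holding e quanta has Ω(e) = C(e + N - 1, e) microstates (the standard "Einstein solid" toy model); the most probable split then equalizes β in both blocks:

```python
from math import comb, log

# Toy model (an assumption, not from the lecture): each copper block has
# N internal oscillators, so a block holding e quanta has
# Omega(e) = C(e + N - 1, e) microstates (an "Einstein solid").
N, E = 50, 60          # oscillators per block, total quanta to share

def omega(e):
    return comb(e + N - 1, e)

# The most probable split maximizes Omega_left * Omega_right:
best = max(range(E + 1), key=lambda e: omega(e) * omega(E - e))
print(best)            # identical blocks end up sharing the energy equally

# At that split, the finite-difference beta = d ln(Omega)/dE is the same
# for both blocks:
beta_left = log(omega(best + 1) / omega(best))
beta_right = log(omega(E - best + 1) / omega(E - best))
print(abs(beta_left - beta_right) < 1e-12)   # True
```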
 
The Second Law
 
Experience has shown that in any spontaneous transition, the increase in entropy will always exceed the ratio of heat exchanged to temperature.

Entropy also has the interesting property that in any spontaneous reaction, the total entropy of the system plus its surroundings must never decrease.
In our example, this is a simple consequence of the observation that the final probability, P(E), and therefore also Ω, will be maximum and hence never less than the original one.
Any decrease in entropy of one of the blocks must be at least compensated for by an increase in entropy of the other block.
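A one-line check (with illustrative numbers, not values from the lecture) shows why heat flowing from hot to cold obeys this rule:

```python
# Heat Q flows from a hot block to a cold one; the cold block gains
# entropy Q/T_cold while the hot block loses only Q/T_hot.
# (Illustrative temperatures and heat, not values from the lecture.)
Q = 100.0                        # heat transferred, in joules
T_hot, T_cold = 400.0, 300.0     # temperatures, in kelvin

dS_total = Q / T_cold - Q / T_hot
print(dS_total)        # positive: total entropy increases
print(dS_total > 0)    # True
```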
 
The Second Law

In 1856 German physicist Rudolf Clausius proposed what he called the "second fundamental theorem" as

dS ≥ dQ/T

In the case of a (fictive) reversible process, this becomes an equality. (Actually, Clausius originally wrote it as an equality.)
Boltzmann's statistical interpretation came nearly 40 years later.
 
 
Integrating factors and exact differentials

Any inexact differential that is a function of only two variables can be converted to an exact differential.
dW is an inexact differential, and dV is an exact differential. Since dW_rev = -PdV, dW_rev can be converted to a state function by dividing by P, since

dV = -dW_rev/P

Similarly for heat:

dQ_rev/T = dS

Here we convert heat to the state function entropy by dividing by T.
 
Back to our box
 
Entropy relates not just to the distribution of energy, but of atoms and molecules as well.
In our box with two gases, the most likely state after we removed the partition was one in which the He and Ne atoms would be randomly mixed, with equal numbers on both sides.
What changed when we removed the partition from our box was that the entropy of the system increased.