### Course: Physics library > Unit 10

Lesson 3: Laws of thermodynamics


# More on entropy

Distinguishing between microstates and macrostates. How entropy S is defined for a system and not for a specific microstate. Created by Sal Khan.

## Want to join the conversation?

- I'm confused. If there are an infinite amount of points in a 3-dimensional space, then can't a molecule occupy an infinite amount of different states? (orientation, position, etc.) So even if you increase the volume, it's still an infinite amount of states, right? Or am I just misunderstanding the use of the word 'state'?(7 votes)
- You need to know that not all infinities are treated the same way here. In statistical mechanics, states are counted by dividing space (really, phase space) into tiny cells of finite size, so a gas in a larger volume genuinely has more distinguishable states available to it.

That is how entropy works.(5 votes)

- Can someone explain to me why elements in their standard states do not have 0 entropy? (i.e. S = 0)

Thank you(3 votes)- It depends on the standard state. A gas will always have high entropy, because there is little order, and the same goes for liquids. For a solid to have an entropy of zero, the temperature must be 0 K, so that there is almost no movement causing disorder. Hope this helps(8 votes)

- One definition of entropy I have read is "a measure of the energy that is no longer available to perform useful work within the current environment", but it seems much different from the definition given here. Can someone help me make the connection between the two definitions?(4 votes)
- Suppose you have a gas that is expanding and moving a piston from left to right, thus doing "useful", i.e. mechanical, work. Since the gas particles are transferring a net rightward momentum to the piston, there is a net rightward velocity to the particles. There are not as many ways to get this state as there are to get a state where the particles are moving completely randomly. The piston eventually stops moving because the excess rightward momentum of the particles has all been transferred to the piston and now the particles are moving completely randomly, and the entropy of the gas is higher. Then no more useful work is possible. The particles are still moving and still have kinetic energy, but there's no way to capture it to do mechanical work.(2 votes)

- If heat is added to an ideal gas (+q), how much of that heat will be converted to mechanical work (-w)? Given that entropy is "a thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work", would it be as simple as dividing the heat put into the system (+q) by the temperature of the system (T), Q/T = S, and deducting that S value from the heat put into the system to find the total amount of energy left to do mechanical work? If U = Q - W and 10 J of heat is added to the system, is the change in internal energy (U), which is the amount of energy not lost to work after the heat was added, equal to entropy's negative effect on converting that heat into mechanical work (-S)?(3 votes)
- but then why do we associate entropy with disorder ?? how does a disordered system take on more states ??(2 votes)
- Entropy is the measure of disorder in a system.

In a solid substance, molecules don't move freely, and thus the entropy of that system is small.

In a liquid, the molecules move a bit more freely, and hence the entropy of this system is more.

In a gaseous system, the entropy is more as compared to the solid and liquid substance.(3 votes)

- Can someone give me a super quick and easy definition of entropy?(1 vote)
- Entropy is just the measure of chaos within a system. The more disordered a system is, the more entropy it has. For example, a neat deck of playing cards has little entropy. However, if you throw the deck in the air, all the cards will go in random directions. Now, the cards have more entropy. Hope this helps!(3 votes)

- So only a system has entropy and a state doesn't, right? I mean, if the system dissipates some heat then there would be an entropy change, right?(2 votes)
- If the larger box had all 8 particles in one corner, would that change anything in relation to whether one can call it more/less entropic than the small box? I realise that they're both microstates, and it doesn't tell you much about the system, but for that minuscule moment, would it change anything?(1 vote)
- The entropy is determined by the number of possible states, not by the particular state at one moment. 8 in one corner is just one of many, many possible states, and we would expect that state to have as much likelihood as any other, so there's nothing special about it that would make you change your view of the entropy. If the particles were constrained to one corner, that would be a different story.(4 votes)
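The answer above can be checked with a quick calculation. Treating the particles as independent and uniformly distributed (a toy-model assumption), the chance that all of them happen to be in one half of the box falls off as (1/2)^N:

```python
def prob_all_in_half(n_particles):
    # Each particle independently has probability 1/2 of being in a
    # given half of the box, so all n land there with (1/2)**n.
    return 0.5 ** n_particles

print(prob_all_in_half(8))  # 0.00390625, i.e. 1 chance in 256
```

So any one specific microstate (including "all 8 in one corner") is as likely as any other, but the macrostate "all particles bunched in one half" is rare because far fewer microstates belong to it.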

- Change in entropy (heat absorbed by the system divided by the temperature at which the heat was absorbed) is +ve when heat is actually absorbed or received, or the velocities of the particles increase.

But in the case of **doubling the volume of the container**, the volume increased but no heat is absorbed, so how is the **change in entropy positive**?(1 vote)- More possible states for the molecules to be arranged in.(3 votes)
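To put a number on the doubling case: a free expansion absorbs no heat, but entropy is a state function, so we compute the change along a reversible isothermal path between the same endpoints, which gives ΔS = nR ln(V2/V1) > 0. A small sketch, with illustrative numbers:

```python
import math

R = 8.314  # J/(mol*K), ideal gas constant

def delta_s_isothermal(n_moles, v_initial, v_final):
    # Entropy change of an ideal gas between two states at the same
    # temperature: dS = n * R * ln(V2 / V1). The same result applies to a
    # free expansion, because entropy depends only on the endpoints.
    return n_moles * R * math.log(v_final / v_initial)

# Doubling the volume of 1 mol of ideal gas:
print(round(delta_s_isothermal(1.0, 1.0, 2.0), 2))  # 5.76 J/K (= R ln 2)
```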

- Does an increase in temperature on a system result in an increase in entropy?(1 vote)
- My textbook says that entropy is a quantity which can only increase, like time does. The second law of thermodynamics also says that in any energy conversion the total entropy will not decrease. I hope this helped.(2 votes)

## Video transcript

I'm going to do one more
video on entropy. Maybe I'll do more
in the future. But I thought I would at least
right now do one more, because I really want to make clear
the idea of entropy as a macrostate variable. So let me write that down. An S, which is entropy, is
a macrostate variable. I'm going to write macro
in bold red. It's a macrostate variable. And I want to emphasize this. And I talked a little bit about
it in my previous video, but I think even then I wasn't
as exact as I needed to be. And the reason why I want to
say it's macro is because there's a very strong temptation
to point to particular micro states and
say, does this have higher entropy or lower entropy
than another? For example, the classic one. Even I did this. Actually, let me make a
bigger, thicker line. So you have this box that
has a divider in it. We've gone through this
multiple times. Let me draw the divider
like right there. OK. So at first if we have a
system where all of the molecules are over here-- so
that's our first scenario. And then our second scenario,
we've studied this a lot. We blow up the wall here. And we actually calculated
the entropy. Control. OK. Let me copy it and then
I'm going to paste it. Let me put these next
to each other. All right. So I have these two things. And then the second time,
I blew away this wall. Let me blow it away. Let me erase the wall there. And then we said, once the
system reaches equilibrium again-- remember, these
macrostate variables, like pressure, like volume, like
temperature, like entropy, are only defined once the system
is in equilibrium. So once the system is in
equilibrium again, these particles are now-- I wanted the
same number of particles, so let me erase some of these
particles, let me move them. Let me see if I can
select some. There. So we'll have a more even--
Let me just redraw. 1, 2, 3, 4, 5, 6,
7, 8 particles. Let me erase what I have. And
I'll make it with a more even distribution. So then, once I blow away the
wall, I might have 1, 2, 3, 4, 5, 6, 7, 8. Now, the reason why I'm doing
all of this is because there's a temptation to say that this
state, what I just drew you when I, you know, made sure to
blow these away and draw 8 more, I drew a microstate. This is a microstate. Anytime someone is actually
drawing molecules for you, they're drawing a microstate. Now, I want to be very clear. This microstate does not have
more entropy than this microstate. In fact, microstates don't have entropy. Entropy does not make sense for a single microstate. What you can say is a system--
and this time, I'm going to draw it without the particles. That if I have a container
that is this big, that contains-- so it has
some volume. So volume is equal to v1. Its temperature is equal
to a t1, and it has 8 particles in it. This has some entropy
associated with it. And what we can say, is if we
were to double the size of this container, which we did by
blowing away that wall, now all of a sudden our volume is
equal to 2 times v1, if we say this is double. Our temperature is still
equal to t1. We saw that a few videos ago. And we still have 8 molecules. The entropy of this
system is higher. So now entropy is higher. And I want to make this very
clear, because you never see it drawn this way. People always want to draw
the actual molecules. But that confuses the issue. When you draw actual molecules,
you're showing a particular state. For example, this system, if
we were to actually measure the microstate, it could be--
there's a very, very small, almost infinitesimal probability--
but all of the molecules, all 8 molecules might
be right there. I mean, it's almost, you know,
you could wait for the whole universe to come and go, and
it might not happen. But there is some probability
it would happen. So you can't assign entropy
to a particular state. All you can do is assign it
to a particular system. I want to be clear about that. So even I talked about
a clean and dirty room and all of that. Clean versus dirty room. And the point I was making is,
the entropy of a room is not dependent on its cleanliness
or its dirtiness. In fact, you could kind
of view these as states of a room. But even more, these
really aren't even states of the room. Because when a room is clean, or
a room is dirty, at a macro level they're static. If my books are lying like--
you know, sometimes people look at a deck of cards and
say, oh, if I have all my cards stacked up like this, or
if I have all my cards that are all messy like that, that
this has higher entropy. And I want to make
it very clear. I mean, maybe you can kind
of make an analogy. But that's not the case. Both of these systems
are macrostates. For example, it's not like
these cards are vibrating around any more than
these cards are. It's not like these can take
on more configurations than these cards can. So when you talk about entropy,
you're trying to take a macro variable that's
describing what's happening at a micro level. And the cards themselves are not
the micro level, because they're not vibrating around
continuously due to some kinetic energy or whatever. It's the cards' molecules that
are at the micro level. And if these cards, if you have
the same mass of cards as you have here, and if they're
at the same temperature, the molecules in these cards can
take on just as many states as the molecules in these cards. So they're going to have
the same entropy. Entropy is a macrostate
variable, or a macrostate function, that describes
the number of states a system can take on. So here, I would view the
cards as a system. And what we care about is not
the number of configurations the cards themselves
can take on. The cards aren't constantly
vibrating and changing from one thing to another. It's at the atomic level, at the
molecular level, that as long as we're above absolute
zero, which we pretty much always are, things are going
to be vibrating around continuously, and continuously
changing their states. So it's almost impossible
to measure the state. And since it's impossible to
measure the state, we use something like entropy to
say, well, how many states can we have? And I mean, all of these things,
whether it's entropy, internal energy, pressure, volume, or temperature. These are all, if you can think
about it in some way, these are shortcuts around
having to actually measure what each molecule is doing. And entropy, you can kind of
view it as a meta shortcut. I mean, temperature tells you
average kinetic energy, this tells you all of the energy
that's in it, this tells you, you know, how frequently the
molecules are bumping against a certain area. This tells you, on average, kind
of where the outermost molecules are. Entropy is kind of a, you
can almost view it as a metastate variable. It tells you how many states,
how many microstates the system can take on. And so I just want to make this
very clear, because this is often confused, and there's
a very, very, very strong temptation to point to a
particular state and to say that that has higher entropy
than another, that somehow this state is more entropic
than that state. That's not the case. This system is more entropic
than this system, than this half box. It has more volume. If it has the same temperature
with more volume, then its particles can take on more
possible scenarios at any given moment in time. Anyway, hopefully you found
that a little bit useful. I just want to make it very,
very, very clear. Because this is often, often,
often confused. It is a macrostate variable for
a system, where a system is composed of things that are
bumping around randomly. Every millionth of a second,
they're changing states, so it's very hard to even measure
one of the microstates. You can't point to a microstate,
and say, oh, this has higher entropy
than another. Anyway. See you in the next video.
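The counting argument in the transcript can be made concrete with Boltzmann's formula S = k ln Ω. If we coarse-grain the box into M cells and treat the 8 particles as independent, then Ω = M^N, and doubling the volume (and hence M) raises the entropy by N·k·ln 2, no matter how fine the coarse-graining is. A toy sketch, where the cell count M is an arbitrary modeling choice:

```python
import math

K_B = 1.380649e-23  # J/K, Boltzmann constant

def boltzmann_entropy(num_cells, num_particles):
    # S = k * ln(Omega), with Omega = num_cells ** num_particles for
    # independent, distinguishable particles. Computed via logarithms
    # to avoid forming the (possibly huge) integer Omega.
    return K_B * num_particles * math.log(num_cells)

N, M = 8, 1000  # 8 particles; box coarse-grained into 1000 cells
delta_s = boltzmann_entropy(2 * M, N) - boltzmann_entropy(M, N)
print(math.isclose(delta_s, N * K_B * math.log(2)))  # True: independent of M
```

This is why entropy attaches to the system (how many states it can take on) rather than to any one drawing of the molecules: the snapshot is a single microstate, while S counts all of them.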