- Introduction to entropy
- Second Law of Thermodynamics
- Work done by isothermal process
- Carnot cycle and Carnot engine
- Proof: Volume ratios in a Carnot cycle
- Proof: S (or entropy) is a valid state variable
- Thermodynamic entropy definition clarification
- Reconciling thermodynamic and state definitions of entropy
- Entropy intuition
- More on entropy
- Maxwell's demon
Introduction to entropy, and how entropy relates to the number of possible states for a system.
- Wait, if energy cannot be destroyed or created, then how does the entropy of the universe increase over time?(49 votes)
- Entropy is not energy; entropy is how the energy in the universe is distributed. There is a constant amount of energy in the universe, but the way it is distributed is always changing. When the way the energy is distributed changes from a less probable distribution (e.g. one particle has all the energy in the universe and the rest have none!) to a more probable distribution (e.g. most particles have an amount of energy close to the average), we say that the entropy increases.(126 votes)
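The "more probable distribution" idea above can be made concrete with a toy count. This is only a sketch: the model (indistinguishable energy quanta shared among a few particles) and the specific numbers are illustrative, not from the answer itself.

```python
from math import comb

# Toy model: q indistinguishable energy quanta shared among n particles.
# The number of ways (microstates) to distribute them is the
# stars-and-bars count C(q + n - 1, n - 1).
def microstates(q, n):
    return comb(q + n - 1, n - 1)

q, n = 10, 4
total = microstates(q, n)   # every possible distribution: 286
concentrated = n            # "one particle holds all q quanta": n = 4 ways

print(total)                # 286
print(concentrated / total) # ≈ 0.014: concentrated distributions are rare
```

Even in this tiny system, the "all the energy in one particle" distributions are a small fraction of the states; with realistic particle counts that fraction becomes vanishingly small, which is why energy tends to spread out.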
- So, is entropy like a disease or virus, or is it actually profitable for this Sun-dude to have entropy?
Like, is it a bad thing(-) or a good thing(+)? Or does it make no difference?
How does knowledge of entropy affect us? Does it appear in our daily life or something?(19 votes)
- It's just a tendency of the universe. We, as living beings, fight against the increase of entropy in our bodies, so it's a little bad for business if you consider that the increase of entropy is inevitable.
On the other hand, it doesn't really matter for non-living things. In fact, it may even be beneficial if you consider that non-living things tend toward the most stable state, such as an electron that "wants" to be in the shell closest to the nucleus. Evenly distributed heat is supposedly the state of highest entropy, and that's where it all tends: the heat death of the universe is when everything finally becomes just heat, evenly distributed across space.(14 votes)
- Please provide some clarification.
I just don't see the moon - sun comparison as being a good example. For instance, I easily imagine a piece of sun breaking off and evolving to a collection of fused protons collapsing into a sphere similar to moon. But I can't imagine the reverse... a group of moons coming together to become our fusion-driven Sun.
I think I get Sal's mediating point that on the quantity part of the entropy equation, the Sun wins hands-down. And in the context of the age of the Universe, the variance between the Moon's and Sun's energy states is likely small. But from my earth-view, it looks like the Moon has run its course and reached its lowest state ... unlike the Sun, whose future processes are described to us in science class.(12 votes)
- What you describe, a piece of the Sun breaking off and fusing together to a cold lump of rock, can only happen when the energy in that piece of the Sun escapes the system. By the first law of thermodynamics, energy is never destroyed. In the Sun's case, it radiates outward in the form of light, and this light would ultimately increase the entropy of some other place in the universe.
On the other hand, imagine what would happen if the Moon collided with some other cool object like the Earth. The particles from the Moon and the Earth would be sent in all sorts of unpredictable directions, which would require lots of information to describe. In other words, entropy increases. Chaos ensues, if you like. Both bodies would also heat up from the friction. And if enough moons collided, a star could theoretically be born (though most stars are currently fusing hydrogen or helium, which the Moon has fairly little of).(16 votes)
- Why does hot water have less entropy (ΔS < 0) and cold water more entropy (ΔS > 0)?(5 votes)
- Actually, it's the other way around: at the same pressure, hot water has more entropy than cold water, because the extra thermal energy can be spread over many more microstates (adding heat reversibly increases entropy by ΔS = q/T, so heating always raises S). The sign of ΔS only tells you whether entropy increased or decreased during a process, not whether the water is hot or cold. (Water's expansion below 4 degrees Celsius is a separate, anomalous volume effect.)(5 votes)
- Is there a mathematical formula for entropy in general (not necessarily for thermodynamic entropy)?(3 votes)
- First it’s helpful to properly define entropy, which is a measure of how dispersed matter and energy are in a certain region at a particular temperature. There is also a more general, information-theoretic entropy (Shannon entropy), which measures the uncertainty of a probability distribution and shares the same mathematical form; but in chemistry and physics, entropy is intrinsically a thermodynamic property.
As far as a formula for entropy, well there isn’t just one. There are several because entropy can be explained and used in a variety of ways.
One equation is Boltzmann’s equation: S = k*ln(W), where S is entropy (the usual variable for entropy), k is Boltzmann’s constant (the gas constant divided by Avogadro’s number, approximately 1.38 x 10^(-23) J/K), and W is the number of microstates, which is a unitless quantity. Entropy therefore has the same units as Boltzmann’s constant, J/K, reflecting the fact that entropy describes how much energy can be dispersed at a given temperature. Boltzmann’s formula takes a more mathematical approach to entropy through W. A microstate is a particular way to arrange a system's energy. So W counts how many energetically equivalent microstates a system can organize its energy into while producing the same macrostate (the system as we observe it on a human scale).
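Here's a quick numerical sketch of Boltzmann's equation (the function and variable names are my own):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)

def boltzmann_entropy(w):
    """S = k * ln(W), where W is the number of microstates."""
    return K_B * math.log(w)

# Doubling the number of microstates adds k*ln(2) of entropy,
# no matter how large W already is:
s1 = boltzmann_entropy(1_000_000)
s2 = boltzmann_entropy(2_000_000)
print(s2 - s1)   # ≈ 9.57e-24 J/K, i.e. K_B * ln(2)
```

The logarithm is why entropy is additive: combining two independent systems multiplies their microstate counts but adds their entropies.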
Another formula expresses the second law of thermodynamics: ΔSuniv > 0, which put into words states that any spontaneous process increases the entropy of the universe (creates a positive change). A spontaneous process is one which happens naturally without the need of outside energy or work to help it along. The law essentially states that the universe prefers greater entropy: it allows any process which causes a net increase in entropy, but prohibits any process which causes a net decrease. This is why gas particles spread out in a container instead of concentrating themselves into a small area.
Usually when we do chemistry, though, we’re concerned not so much with the entropy of the universe directly, but rather with the entropy change which accompanies a chemical reaction. So we can treat a reaction as a system whose entropy we care about, with the entropy of the surroundings (everything but the system itself) also being influenced by the system. Together we get a formula for the overall change of entropy based on the system and surroundings: ΔSuniv = ΔSsys + ΔSsur, where ΔSsys is the change in the entropy of the system and ΔSsur is the change in the entropy of the surroundings. Using this formula we can judge whether a reaction is spontaneous based on whether its entropy change, and the accompanying entropy change of the surroundings, produce a positive or negative change in entropy for the universe (following the second law).
Measuring the change in entropy for a system is relatively easy, but measuring the change in entropy for the surroundings is harder to do directly. However, we can relate the change in the surroundings’ entropy to another thermodynamic quantity we know well: the enthalpy change of the system, which we can determine experimentally using calorimetry. Since a change in enthalpy for the system creates an accompanying change in entropy for the surroundings, we can use the formula: ΔSsur = -ΔHsys/T, where ΔHsys is the change in enthalpy for the system, and T is the temperature in kelvins. This equation only works at constant pressure, since enthalpy equals heat only at constant pressure. And it only works at a specific temperature (different values of T give different entropy changes), since entropy is temperature dependent. Note the negative sign in the equation: the system’s enthalpy change creates an opposite-signed entropy change in the surroundings. So if a reaction is exothermic (a negative change in enthalpy), heat is given off by the reaction and dispersed into the surroundings, increasing the surroundings’ entropy. The opposite happens for an endothermic reaction.
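A minimal sketch of this bookkeeping, with made-up numbers for a hypothetical exothermic reaction (the values and function name are illustrative only):

```python
# Entropy change of the surroundings at constant P and T: ΔSsur = -ΔHsys / T
def delta_s_sur(delta_h_sys, t):
    """delta_h_sys in J, t in K; returns ΔSsur in J/K."""
    return -delta_h_sys / t

dh_sys = -100_000.0   # J: exothermic, so heat flows into the surroundings
ds_sur = delta_s_sur(dh_sys, 298.0)
print(round(ds_sur, 1))   # 335.6 J/K: the surroundings' entropy rises

# Even if the system's own entropy drops, the universe's can still rise:
ds_sys = -150.0           # J/K, an assumed value for the system
ds_univ = ds_sys + ds_sur
print(round(ds_univ, 1))  # 185.6 J/K > 0, so the process is spontaneous
```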
And then finally, probably the most useful formula is Gibbs free energy, which measures how spontaneous a reaction is. Using the above equations and some algebra we can construct the formula: ΔG = ΔH – TΔSsys, where ΔG is the change in free energy for the system. The change in Gibbs free energy is also related to the change in entropy for the universe: ΔG = -TΔSuniv (again through some algebra). Since the change in the universe’s entropy must be positive for a process to be spontaneous, ΔG has to be negative. Likewise, a positive change in free energy means the process is nonspontaneous, because it would decrease the entropy of the universe. Free energy is useful because it essentially predicts whether a reaction will happen, and it’s grounded in the second law through the entropy change of the universe.
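Here's a short sketch of using ΔG = ΔH – TΔS, with approximate textbook values for melting ice (ΔH ≈ +6010 J/mol, ΔS ≈ +22.0 J/(mol·K); treat them as rough, assumed inputs):

```python
def gibbs(delta_h, delta_s, t):
    """ΔG = ΔH - T*ΔS; negative means spontaneous."""
    return delta_h - t * delta_s

dh, ds = 6010.0, 22.0                 # J/mol and J/(mol*K), approximate
print(gibbs(dh, ds, 263))   # 224.0 J/mol (> 0): ice does not melt at -10 °C
print(gibbs(dh, ds, 283))   # -216.0 J/mol (< 0): ice melts at +10 °C

# ΔG = 0 at T = ΔH/ΔS ≈ 273 K, which is the melting point.
print(round(dh / ds))       # 273
```

Notice how the sign of ΔG flips with temperature: an endothermic, entropy-increasing process becomes spontaneous once T is large enough for the TΔS term to win.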
There are even more formulas involving entropy I could go over (some that use equilibrium constants), but those are the most crucial, in my opinion, for actually using entropy to solve problems.
Hope that helps.(5 votes)
- I encountered a question on a practice Chemistry Subject Test. It looked like this:
Which of the following spontaneous reactions has the largest increase in entropy?
The two choices that I was choosing between were
(A) H2CO3(s) --> H2O(l) + CO2(g)
(E) 2NH3 (g) --> N2(g) + 3H2(g)
I chose (A) because the substance goes from a solid to a liquid and a gas while simultaneously increasing the number of molecules. I thought this would be a more significant change than (E), which increases the number of molecules but stays a gas rather than changing from solid to liquid/gas. However, the answer was (E). Could someone explain why?(4 votes)
- I know this is years old, but in case I can help someone along the road:
In chemistry, if a reaction results in more molecules than you began with, that is an increase in entropy. Here, answer (A) goes from 1 reactant to 2 products. However, answer (E) goes from 2 reactant molecules to 4 product molecules. Therefore, we would say that reaction has the largest increase in entropy. This is how I remember it from school; I guess you could think of it as if more molecules --> more possible configurations --> more entropy.(4 votes)
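The "count the molecules" rule of thumb from the answer above, written out as a tiny check (the helper function is just for illustration):

```python
# (A) H2CO3(s) -> H2O(l) + CO2(g):  1 reactant molecule -> 2 product molecules
# (E) 2 NH3(g) -> N2(g) + 3 H2(g):  2 reactant molecules -> 4 product molecules

def delta_molecules(reactant_coeffs, product_coeffs):
    """Net change in molecule count from stoichiometric coefficients."""
    return sum(product_coeffs) - sum(reactant_coeffs)

print(delta_molecules([1], [1, 1]))   # (A): +1
print(delta_molecules([2], [1, 3]))   # (E): +2, the larger increase
```

(E) also has the advantage that every one of its extra molecules is a gas, and gases contribute far more configurations than liquids or solids.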
- I'm a bit confused about what is considered as a "high" entropy. If an object is able to have more molecule configurations, does that mean it has a higher entropy?(5 votes)
- Yes. More configurations = more entropy. In fact, entropy is k times the natural log of the number of configurations: S = k ln W (W is also often written as omega). (To really nail down what these configurations are, we need to get into quantum mechanics. Also, if we are interested in things that radiate photons, we must take into account the number of possible configurations of the photons. And if we are interested in changes to the molecules themselves, then we need to take into account the possible configurations of the particles which make up the molecules.)(4 votes)
- Which has more entropy? A star or a black hole with the same diameter? The black hole has more mass, but the star would have the ability to move the molecules more. Or is that incorrect?(3 votes)
- A black hole has the maximum possible entropy for the volume of space inside its event horizon.(5 votes)
- Which state has the lowest and highest entropy among plasma, gas, liquid, and solid?(2 votes)
- lowest entropy: solid
highest entropy: plasma
Because the particles in a solid are least dispersed, the entropy is lowest. It is the opposite with plasma, where the particles are most dispersed and free to move around.(4 votes)
- I'm a little confused - if entropy is always increasing, how come chemical equations sometimes have fewer products than reactants (for example, synthesis reactions)? My chem teacher said that if there are fewer products, the entropy is decreasing. Does this contradict the 2nd law of thermodynamics?(2 votes)
- So entropy isn't the only thermodynamic variable we consider with chemical reactions. We also consider changes in enthalpy (essentially heat) and another one called Gibbs free energy which combines enthalpy, entropy, and temperature to give us an idea if a reaction will happen or not.
But anyway entropy can be negative (or 0) for a chemical reaction and still occur because there are other variables to consider in a chemical reaction which also help it proceed.
But still, the second law says that entropy ALWAYS increases, which remains true even for reactions where entropy decreases. And this is where it's necessary to understand the idea of system and surroundings. A system is the object of interest which we study; in chemistry the system is often the reaction itself, the reactants and products. The surroundings are everything other than the system: the container the reaction takes place in, the air or water surrounding the chemicals, and basically the rest of the universe.
So we can have a chemical reaction (a system) where the entropy decreases, but the surrounding's entropy increases to such an extent that the overall change in entropy is increasing. So the second law is true, but we just have to consider more than just the reaction itself to see it work properly.
Hope that helps.(4 votes)
What I want to do in this video is start exploring entropy. When you first get exposed to the idea of entropy it seems a little bit mysterious. But as we do more videos we'll hopefully build a very strong intuition of what it is. So one of the more typical definitions, or a lot of the definitions you'll see of entropy, will involve the word disorder. So it might be considered the disorder of a system. Now with just that definition in your head, I want you to pause this video and compare this system to this system. I want you to compare this room to this room, and ask yourself which of these has more entropy. And then I want you to compare the moon here to the sun, and these clearly aren't at scale, the sun would be way more massive, or way larger, if I was drawing it to scale. But which of these systems has more entropy? Alright, so I'm assuming you've had a go at it. So when you look at these rooms you might say, okay, this room over here, this looks ordered, it's a clean room. And this over here looks disordered, it's a messy room. So if all you had is this definition, you'd say, okay, maybe this one is more disordered, maybe this one has more entropy. And you wouldn't be alone in thinking that. In fact, even a lot of textbooks will use this analogy of a clean room versus a messy room, with the messy room somehow being indicative of having more entropy. But this isn't exactly the case. This form of disorder is not the same thing as this form of disorder. So let me make this very, very clear. Something being messy does not equal entropy. To think about what disorder means in the entropy sense we're going to have to flex our visualization muscles a little bit more, but hopefully it'll all sink in. Entropy, this kind of disorder, is more about the number of states that a system can take on. What do I mean by states of a system? Well if I have a container like this, and if I have four molecules that are bouncing around.
So I have this magenta molecule, I have this blue molecule, I have this yellow molecule right over here, and then I have a green molecule. Well this would be a particular state, a particular configuration. But that system, with these molecules bouncing around, could take on other configurations, or other states. Maybe the yellow molecule is over here, they bounce around enough for the yellow molecule to get there, the blue molecule to get over here, maybe the pink molecule is now over here, and the green molecule is now over here. And so a system can take on a bunch of different states. I've just drawn two states for this system, but there could be many, many more states for this system. So each of these is a particular state for the system. So imagine this system where I have this box with the four molecules in it, and let's compare it to another system where I have a larger box. And let's say it has even more molecules in it. Let's say that it has two yellow molecules, let's say that it has a blue molecule, let's say that it has a green molecule, let's say that it has a magenta molecule, this is fun. Let's say it has a mauve molecule right over here. So in this system that is larger, there are more places for the molecules to be, and there are actually more molecules in it. It can actually take on more configurations or more states. I've just drawn one of them, but there are many more. If you imagine these molecules all bouncing around in different ways, there are many, many different states that it could take on. So without even knowing what the actual molecules are doing at a given moment in time, we would say that this system has more possible states relative to this one, which has fewer possible states. And because this system over here has more possible states, more configurations, it would take more to tell you exactly where everything is. We would say that it has more entropy.
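The counting argument in this passage can be sketched with a toy model (a sketch, not from the video: assume each of N distinguishable molecules can occupy any of M distinct positions in the box, so W = M^N):

```python
import math

# W = M**N: each of N molecules independently picks one of M positions.
def num_states(cells, molecules):
    return cells ** molecules

small = num_states(10, 4)    # small box, 4 molecules: 10,000 states
large = num_states(20, 6)    # bigger box, more molecules: 64,000,000 states

# Entropy in units of Boltzmann's constant is ln(W), so the larger
# system, with more room and more molecules, has more entropy:
print(math.log(small))   # ≈ 9.2
print(math.log(large))   # ≈ 18.0
```

This is why both box size and molecule count matter: each extra position multiplies W, and each extra molecule raises it to a higher power.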
So when we talk about disorder, we're really talking about the number of states something could have. And it makes sense that for this thing, you could imagine there's a lot more stuff moving around in a lot more different directions, and they have a lot more space to move around. So it makes sense that the system as a whole has more entropy. So when we talk about entropy we're not talking about any one of the particular states, any one of the particular configurations; we're talking about the system as a whole without really knowing exactly where the molecules are. In this example with the rooms, we're just talking about particular states. Messy is a particular state, clean is a particular state. But we're not talking about the number of configurations that a room could actually have. In fact, if this room is larger, this room actually could have more configurations. And if we're talking about the molecular level, if this room were warm and this room were cold, and actually if this room is just larger, it's going to have more molecules in it. And those molecules could be arranged in way more configurations, so there could be an argument that this room actually has a higher entropy. And so using that same reasoning, let's go back to that comparison of the moon and the sun. Which of these would have more entropy? Well let's think about it. The sun is larger, it has way, way more molecules, and those molecules are moving around way faster, they're hotter, and they're moving past each other. While the moon is small, it's cold, it has fewer molecules. It's for the most part rigid, it doesn't have a very high temperature, so these things aren't moving around a lot. It has way fewer states, way fewer configurations than the sun does. So the sun's entropy, if you view the sun as a system, is way higher than the moon's. Its entropy is much larger than the entropy of the moon.
Think about how much information you would need. You would need a lot of information if someone wanted to tell you where every molecule or every atom on the moon is. But you would need even more to know where every atom or molecule on the sun is at a given moment. If you're just looking at the sun: wow, all these things are moving around, it's this huge volume, and all of these molecules are very energetic. So hopefully this starts to give you a sense of what entropy is. And you might say, okay, this is all a fun intellectual discussion, what's the big deal? But the big deal is that to some degree you can describe the universe in terms of entropy. As we learn in the second law of thermodynamics, the entropy of the universe is constantly increasing. We are constantly moving to a universe with more possible states, which has all sorts of interesting implications.