Maxwell's Demon?


TallDave
Posts: 3152
Joined: Wed Jul 25, 2007 7:12 pm

Post by TallDave »

kcdodd wrote:All we can do is say what the probability of the value of q is: P(q) = N(q)/(N(1)+N(2)+N(3)+N(4)). To actually know the value of q you have to perform a measurement, and measurements are not free, which goes right back to my previous post.
Shrug. We don't really care about measuring it at any given point in time. We're only describing probabilities over time. It's enough to know that it must eventually happen to say entropy must decrease at some point.
kcdodd wrote:If you were obsessed with details, you would ask: how many ways can I have exactly 1000-1101? Just one... With that additional knowledge the same state suddenly has low entropy.


Sure, and if you identify every atom in each state, you can say one arrangement of the atoms in a chamber of mixed gases is as likely as any other, including one where the atoms spell out your name. The math works out the same for how often you will see your initial separated state. At that level of detail, any state you pick is equally unlikely, but there are so many possible states that you will have to wait a very long time to see your name, or any other particular state you pick.

Of course, there is no entropy if you make every state equally meaningful. All arrangements have equal order, all arrangements have equal probability. But some arrangements have particular consequences, and we can assign meaning on that basis, at an aggregate level, without being subjective.

For instance, if you have a lot of heat in one place and very little in an adjoining space, you can do work. It's very unlikely the heat will do anything but flow to the areas of less heat, so you can use the difference in heat to drive a piston, etc. Of course, you could say all the possible distributions of energy in the system are equally meaningful and equally probable, but only a tiny subset of them allow you to do work. We say those states have lower entropy because there are relatively few of them among the total possible states, and we say distributions that can do little or no work have high entropy, because there are more of them (even though each individual high-entropy state is no more or less common than any individual low-entropy state). So the distinction between low and high entropy can be more than arbitrary.

This is why the entropic distinction is between (0/4, 4/0) and (2/2) rather than between single states like 0110-1001 vs. 1001-1100. Entropy is only meaningful when you talk about distributions rather than single states.
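To put numbers on that, here's a quick back-of-the-envelope sketch in Python (my own illustration, assuming the thread's toy box: eight cells holding four 1s and four 0s, split 4-and-4 by the divider):

Code:
from math import comb

# The toy box: eight cells, four 1s and four 0s, with a divider splitting
# them 4-and-4. A macrostate is q, the number of 1s on the left; a
# microstate is a specific pattern like 0110-1001.
total = comb(8, 4)  # 70 distinct microstates with exactly four 1s

for q in range(5):
    n_q = comb(4, q) * comb(4, 4 - q)  # q 1s on the left, 4 - q on the right
    print(f"q = {q}: N(q) = {n_q:2d} microstates, P(q) = {n_q / total:.3f}")

The 2/2 macrostate owns 36 of the 70 microstates; the separated 0/4 and 4/0 macrostates own one each. That lopsided count is the whole distinction.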

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

kcdodd wrote:All we can do is say what the probability of the value of q is: P(q) = N(q)/(N(1)+N(2)+N(3)+N(4)). To actually know the value of q you have to perform a measurement, and measurements are not free, which goes right back to my previous post.

TallDave wrote:Shrug. We don't really care about measuring it at any given point in time. We're only describing probabilities over time. It's enough to know that it must eventually happen to say entropy must decrease at some point.

Once the system is relaxed, the probability over time is constant until you perform a measurement. All states are still possible for ever and ever, not just a billion billion years. If the number of *possible* states is constant, then entropy is constant. And constant is constant is constant is constant. Where is the disconnect here?
Carter

TallDave
Posts: 3152
Joined: Wed Jul 25, 2007 7:12 pm

Post by TallDave »

kcdodd wrote:Once the system is relaxed, the probability over time is constant until you perform a measurement.
The probabilities describe the likelihood that entropy has decreased or increased at any point you measure it. Since the probability of measuring any given decrease is nonzero, you could eventually measure a total decrease if you measure enough times. It might happen the first time, the 10^100th time, or whenever.

I'm not sure what your point is with the QM measurement-disturbance thing. Why would the disturbance matter anyway, since the effect we're looking at is random? If you say it's a closed system and we can't measure a closed system, then you're just negating the concept of a closed system as something we can ever say anything about.

Are you arguing that the increase in entropy from the work to measure must exceed the reduction in the box? That's easy to get around: if you don't want to measure the same box 10^100 times, you could measure 10^100 boxes once each, and choose boxes whose possible entropy reduction is greater than the cost to measure.
kcdodd wrote:If the number of *possible* states is constant, then entropy is constant.
If that were true, every system's entropy would be constant, including the universe's.

Why do we see entropy increase, then? Because, as in the box, even though the number of possible states is constant, there are a whole lot more possible states at higher entropies than at lower ones, so over time we're much more likely to move into higher-entropy states than into lower-entropy ones.

Entropy is currently low in the universe, relative to a totally relaxed state. So if, at any given point in time, you were to look around and measure the entropy of the universe, you'd find it was higher than the last time you measured, because there are vastly more possible states in the high-entropy direction.
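And as a sanity check on the odds (again my own sketch, same toy box): sample the relaxed box over and over and count how often a look catches it fully separated.

Code:
import random

# Sample the relaxed eight-cell box repeatedly; q = 0 or q = 4 means all
# four 1s were caught on one side -- the low-entropy, separated macrostate.
bits = [1, 1, 1, 1, 0, 0, 0, 0]
trials = 100_000
separated = 0

for _ in range(trials):
    random.shuffle(bits)  # a fresh, fully relaxed microstate
    q = sum(bits[:4])     # count the 1s on the left half
    if q in (0, 4):
        separated += 1

print(f"caught separated {separated} times in {trials} trials "
      f"(expected rate 2/70 = {2 / 70:.4f})")

About 2.9% of looks catch the separated state in this tiny box; make the box bigger and that rate collapses toward zero.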

TallDave
Posts: 3152
Joined: Wed Jul 25, 2007 7:12 pm

Post by TallDave »

Anyways, my point was just that the 2nd Law is a macroscopic theory; it's not totally inviolable, it's just overwhelmingly likely in most systems of sizes relevant to humans. In tiny systems like the 4-particle one, you can see entropy decrease pretty often:
Microscopic systems
Thermodynamics is a theory of macroscopic systems and therefore the second law applies only to macroscopic systems with well-defined temperatures. For example, in a system of two molecules, there is a non-trivial probability that the slower-moving ("cold") molecule transfers energy to the faster-moving ("hot") molecule. Such tiny systems are not part of classical thermodynamics, but they can be investigated by quantum thermodynamics by using statistical mechanics. For any isolated system with a mass of more than a few picograms, probabilities of observing a decrease in entropy approach zero.[3]
http://en.wikipedia.org/wiki/2nd_law_of_thermodynamics

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois

Post by MSimon »

TallDave wrote:Anyways, my point was just that the 2nd Law is a macroscopic theory; it's not totally inviolable, it's just overwhelmingly likely in most systems of sizes relevant to humans. In tiny systems like the 4-particle one, you can see entropy decrease pretty often:
Microscopic systems
Thermodynamics is a theory of macroscopic systems and therefore the second law applies only to macroscopic systems with well-defined temperatures. For example, in a system of two molecules, there is a non-trivial probability that the slower-moving ("cold") molecule transfers energy to the faster-moving ("hot") molecule. Such tiny systems are not part of classical thermodynamics, but they can be investigated by quantum thermodynamics by using statistical mechanics. For any isolated system with a mass of more than a few picograms, probabilities of observing a decrease in entropy approach zero.[3]
http://en.wikipedia.org/wiki/2nd_law_of_thermodynamics
If there is a population inversion - most of the atoms are excited - you can have negative temperatures in macroscopic systems. However, the sign is a function of the math, not of "real" temperatures.

http://en.wikipedia.org/wiki/Negative_temperature
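A rough sketch of where that sign comes from (my own two-level toy model, not taken from the linked article): with N atoms and n of them excited, S/k = ln C(N, n) and E = n*eps, so 1/kT is roughly ln((N - n)/n)/eps, which goes negative once more than half the atoms are excited.

Code:
from math import log

# Two-level toy model (an illustration, not from the linked article):
# N atoms, n excited, each excitation worth energy eps. Stirling gives
# dS/dE = (k/eps) * ln((N - n)/n), which is negative once n > N/2.
N, eps = 100, 1.0

for n in (10, 50, 90):
    beta = log((N - n) / n) / eps  # beta = 1/(kT), in units of 1/k
    if beta > 0:
        label = "positive temperature"
    elif beta == 0:
        label = "infinite temperature"
    else:
        label = "negative temperature (population inversion)"
    print(f"n = {n:3d} excited: 1/kT = {beta:+.3f} -> {label}")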
Engineering is the art of making what you want from what you can get at a profit.

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

Yes, you can measure a drop in entropy. And yes, I am arguing that you cannot measure a closed system. The very act of measuring means it is no longer closed, by definition.

The number of possible states is not constant. Going back to the example: if we have a wall and, say, q = 2, then there are N = 36 possible states. But when we remove the wall and allow the system to relax, then SUM(N) = 70 possible states. If we now perform a measurement to find the value of q, we reduce the number of possible states by SUM(N) - N(q). And say we define entropy as S = k*ln(N) for simplicity. The very act of measuring then changes the entropy of the system by delta(S) = k*(ln(N(q)) - ln(70)), which requires work.
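Putting numbers on that delta (a quick sketch of the same counting, with entropy in units of k and q running 0 through 4 so that SUM(N) = 70):

Code:
from math import comb, log

# delta(S)/k = ln(N(q)) - ln(SUM(N)): N(q) counts the microstates
# compatible with a measured value of q, while SUM(N) = 70 counts every
# microstate available to the relaxed, unmeasured box.
total = comb(8, 4)  # 70

for q in range(5):
    n_q = comb(4, q) * comb(4, 4 - q)
    delta_s = log(n_q) - log(total)  # never positive: measuring removes entropy
    print(f"q = {q}: N(q) = {n_q:2d}, delta(S)/k = {delta_s:+.3f}")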
Carter

TallDave
Posts: 3152
Joined: Wed Jul 25, 2007 7:12 pm

Post by TallDave »

Great, then you've just negated closed systems as thought experiments. Congrats, I guess.
kcdodd wrote:The number of possible states is not constant.
Yes it is: there are only eight bits with two possible values, and you are restricted to sets that have four 1s and four 0s. No, the time before you open the divider doesn't count.
kcdodd wrote:If we now perform a measurement to find the value of q, we reduce the number of possible states by SUM(N) - N(q).
Really? Which states have you eliminated? How exactly have you made them impossible by measuring? Close the box, and they're all equally probable again. Did we just change the entropy again by closing it?
kcdodd wrote:And say we define entropy as S = k*ln(N) for simplicity. The very act of measuring then changes the entropy of the system by delta(S) = k*(ln(N(q)) - ln(70)), which requires work.
Not sure where k comes from.

In any case, it doesn't matter if you measure. Entropy increases and decreases whether you measure it or not.
Last edited by TallDave on Wed Jul 15, 2009 6:08 am, edited 1 time in total.

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

If you know the system is in the order 2/2, you have eliminated all states of order 0/4, 1/3, 3/1, and 4/0. Obviously the number of states is not the same before and after the measurement. It's your prerogative to completely ignore the math.
Last edited by kcdodd on Wed Jul 15, 2009 6:11 am, edited 1 time in total.
Carter

TallDave
Posts: 3152
Joined: Wed Jul 25, 2007 7:12 pm

Post by TallDave »

You haven't eliminated anything, you've just measured it. Those states aren't eliminated from possibility, they're just observed not to have happened at a given point in time.

Again, what happens when you close the box after measuring? Does entropy increase again from the act of not observing it?

That's a fairly useless definition of entropy. If the box had all the heat on one side, your definition says it has the same entropy as one that's mixed, because we can't see it; the entropy only changes if we measure it, and apparently reappears when we close it. I guess if I can ignore the math, you can ignore reality.

Anyways, why not go further? Why not explicitly measure 1001-1101, and eliminate every other possibility? Now entropy is zero. So the unmeasured box has full entropy, the measured box has zero. This definition of entropy resolves to a simple binary: did you measure the box?
Last edited by TallDave on Wed Jul 15, 2009 6:32 am, edited 1 time in total.

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois

Post by MSimon »

So what is the entropy of Schroedinger's Cat?
Engineering is the art of making what you want from what you can get at a profit.

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

TallDave wrote:You haven't eliminated anything, you've just measured it. Those states aren't eliminated from possibility, they're just observed not to have happened at a given point in time.
At the time you are observing, all those states are eliminated. When you make a measurement you are basically replacing the wall momentarily, which makes all those other states inaccessible.
TallDave wrote:Again, what happens when you close the box after measuring? Does entropy increase again from the act of not observing it?
When you stop observing, you remove the wall again. The system relaxes from whatever state you just observed. If you perform successive measurements much faster than the system can relax, then you will get the same result, or at least results which indicate that not all states are equally likely. For example, if you start out in state 0/4, remove the wall, and then immediately make a measurement, then it is much more likely to be in state 0/4 or 1/3, while 4/0 is much more remote than it normally would be in a completely relaxed state.

This is simple to understand: rapidly removing and replacing the wall does not allow the two sides to mix, thus preventing any changes between them. Only after time passes between measurements could you reasonably assume all states are equally likely once again.
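To illustrate, here is a toy relaxation model (the swap dynamics are my own assumption, nothing specified above): each step swaps one random cell on the left with one random cell on the right, so the sides mix one particle at a time.

Code:
import random
from collections import Counter

# Toy dynamics: each step swaps a randomly chosen left cell with a
# randomly chosen right cell, mixing the two sides one particle at a time.
def q_distribution(steps, trials=50_000):
    counts = Counter()
    for _ in range(trials):
        left, right = [0, 0, 0, 0], [1, 1, 1, 1]  # start in macrostate 0/4
        for _ in range(steps):
            i, j = random.randrange(4), random.randrange(4)
            left[i], right[j] = right[j], left[i]
        counts[sum(left)] += 1  # q = number of 1s on the left
    return counts

for steps in (1, 3, 25):
    c = q_distribution(steps)
    row = "  ".join(f"q={q}: {c[q] / sum(c.values()):.3f}" for q in range(5))
    print(f"after {steps:2d} swaps:  {row}")

Right after the wall comes out, the box can only be in macrostates near 0/4; after a few dozen swaps the relaxed 1:16:36:16:1 spread over all 70 states reappears.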
TallDave wrote:That's a fairly useless definition of entropy. If the box had all the heat on one side, your definition says it has the same entropy as one that's mixed because we can't see it, and the entropy only changes if we measure it, and apparently reappears when we close it. I guess it's your prerogative to ignore reality.
If you *know* the heat to be on one side, then it does not have the same entropy as when it is mixed. The process of relaxation is where the system maximizes the entropy, or, equivalently, maximizes the number of states. If you have a model for the system you can model the process of relaxation without performing measurements. But you do not gain any new information from a model.
TallDave wrote:Anyways, why not go further? Why not explicitly measure 1001-1101, and eliminate every other possibility? Now entropy is zero. So the unmeasured box has full entropy, the measured box has zero. This definition of entropy resolves to a simple binary: did you measure the box?
Yes. The state right after you close the box will be very near 1001-1101, but after relaxation it could be anything again. You can make as many measurements and do as much work on the system as you desire. The more possibilities you eliminate, the more work you have to do to remove entropy from the system and force it into a specific state.
Last edited by kcdodd on Wed Jul 15, 2009 6:45 am, edited 2 times in total.
Carter

TallDave
Posts: 3152
Joined: Wed Jul 25, 2007 7:12 pm

Post by TallDave »

Simon,

mass * energy / watts

It's a standard meow function.

kcdodd,

Again, that definition of entropy resolves to: did you measure the box? Not very interesting.
kcdodd wrote:If you *know* the heat to be on one side, then it does not have the same entropy as when it is mixed.
If you measure and find all the heat on one side, that state has less entropy than if you measure and find it mixed. We know this is true because it can do more work. Your definition ignores that.

Did you see the wiki? You can actually observe decreases.
Last edited by TallDave on Wed Jul 15, 2009 6:49 am, edited 1 time in total.

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

We are talking about fluctuations of entropy. Of course we have to define what it means to measure it for it to mean anything. But we can stick to handwaving if that's less boring.
Carter

TallDave
Posts: 3152
Joined: Wed Jul 25, 2007 7:12 pm

Post by TallDave »

Yes, "how much entropy is in the box" is much less boring than "did we measure the box?" Your definition tells us nothing useful. If that's handwaving, then hurray for handwaving.

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

How can you say how much entropy is in the box if you never measure the box?
Carter
