
To: FreedomProtector; grey_whiskers

Based on that link, Sewell seems unaware that entropy is a state function. Randomness has little to do with the entropy; no given configuration is random, only processes are random.

44 posted on 10/23/2006 3:27:34 PM PDT by Doctor Stochastic (Vegetabilisch = chaotisch ist der Charakter der Modernen. - Friedrich Schlegel)

To: Doctor Stochastic; grey_whiskers
Based on that link, Sewell seems unaware that entropy is a state function. Randomness has little to do with the entropy; no given configuration is random, only processes are random.

The thermodynamic state of a system is its condition as described by its physical characteristics. Temperature and internal energy are both state functions, and entropy is a state function as well. Although entropy is normally expressed on a macroscopic scale (Clausius, Kelvin-Planck, etc.), it is also used in statistical mechanics, where the microstate of a system is given by the exact position and velocity of every air molecule in the room. Entropy can be defined as the logarithm of the number of microstates. The microscopic and macroscopic descriptions can be shown to be related "rigorously by using integrals over appropriately defined quantities instead of simply counting states".
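A toy sketch of "counting microstates" (a hypothetical two-state model chosen for illustration, not anything from the thread): for N particles each in one of two states, the macrostate "n particles in state A" comprises W = C(N, n) microstates, and the entropy is Boltzmann's constant times ln W.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(W):
    """Entropy as the (scaled) logarithm of the number of microstates."""
    return k_B * math.log(W)

# Hypothetical toy system: N two-state particles.
# The macrostate "n particles in state A" has W = C(N, n) microstates.
N = 100
for n in (0, 25, 50):
    W = math.comb(N, n)
    print(f"n={n:3d}  W={W:.3e}  S={entropy(W):.3e} J/K")
```

The evenly split macrostate (n = 50) has by far the most microstates and hence the highest entropy, which is the sense in which a disorderly arrangement is "more probable" than an orderly one.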

Entropy describes the state of the positions and velocities of the matter in the system. Configurations can be random or orderly.

From Physics for Scientists and Engineers, Third Edition, Raymond A. Serway, 1992 Updated Printing

As you look around at the beauties of nature, it is easy to recognize that the events of natural processes have in them a large element of chance. For example, the spacing between trees in a natural forest is quite random. On the other hand, if you were to discover a forest where all the trees were equally spaced, you would probably conclude that the forest was man-made. Likewise, leaves fall to the ground with random arrangements. It would be highly unlikely to find the leaves laid out in perfectly straight rows or in one neat pile. We can express the results of such observations by saying that a disorderly arrangement is much more probable than an orderly one if the laws of nature are allowed to act without interference.
One of the main results of statistical mechanics is that isolated systems tend toward disorder and entropy is a measure of disorder. In light of this new view of entropy, Boltzmann found that an alternative method for calculating entropy, S, is through use of the important relation

S = k ln W

where k is Boltzmann's constant, and W (not to be confused with work) is a number proportional to the probability of the occurrence of a particular event..." ....several probability examples follow...... Consider a container of gas consisting of 10^23 molecules. If all of them were found moving in the same direction with the same speed at the same instant, the outcome would be similar to drawing marbles from the bag [half are red and are replaced] 10^23 times and finding a red marble on every draw. This is clearly an unlikely set of events.
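Serway's marble estimate can be checked numerically. The probability itself underflows double-precision floating point for 10^23 draws, so the sketch below (just this back-of-the-envelope calculation, nothing more) works with its base-10 logarithm instead:

```python
import math

p_red = 0.5          # half the marbles are red, drawn with replacement
n_draws = 10**23     # Serway's number of draws / molecules

# P(all red) = p_red ** n_draws underflows to 0.0 in floating point,
# so compute its base-10 logarithm instead.
log10_prob = n_draws * math.log10(p_red)
print(f"P(all red) ~ 10^({log10_prob:.3e})")
```

The result is on the order of 10^(-3×10^22), far below any probability that could ever be realized by chance, which is the quantitative content of "clearly an unlikely set of events."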

Temperature and internal energy are both state functions, and the entropy function is a state function as well. Entropy is a description of the probability of a given configuration, or a "logarithm of the number of microstates".

One of the main results of statistical mechanics is that isolated systems tend toward disorder and entropy is a measure of disorder.

There are many more interesting properties of entropy that the second law gives us; one can define isentropic and adiabatic processes, irreversibility, etc...

Sewell adds: "According to these equations, the thermal order in an open system can decrease in two different ways -- it can be converted to disorder, or it can be exported through the boundary. It can increase in only one way: by importation through the boundary."
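For reference, Sewell's two channels correspond to the standard entropy balance for an open system (written here in common textbook notation, not Sewell's own):

dS/dt = -∮ J_s · n dA + S_gen,   with S_gen ≥ 0

where J_s is the entropy flux through the boundary surface (n the outward normal) and S_gen is the internal entropy production. Because S_gen is never negative, "thermal order" (negative entropy) can decrease by internal conversion to disorder or by export through the boundary, but can increase only by import through the boundary.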

Sewell's tautology that "if an increase in order is extremely improbable when a system is closed, it is still extremely improbable when a system is open, unless something is entering which makes it not extremely improbable" is simply a more general statement.

Although the application of vector calculus is obvious once boundaries are defined for thermodynamic systems, Sewell's mathematical deduction and resulting tautology are brilliant and give us another interesting practical property of entropy deduced from the second law, in addition to the ones related to isentropic and adiabatic processes, irreversibility, etc., which we already know.

Thanks for the ping, Doctor Stochastic!
45 posted on 10/23/2006 7:18:50 PM PDT by FreedomProtector

To: Doctor Stochastic
Randomness has little to do with the entropy; no given configuration is random, only processes are random.

With a name like Dr. Stochastic, you come up with this? C'mon, doc.... Suppose there's a configuration C(t), subjected to a random process. Are you really gonna make the claim that you know C(t+1) with probability=1? Because that's what your comment seems to be saying.

51 posted on 10/24/2006 8:30:40 AM PDT by r9etb

