# Not Even Wrong Concepts in Physics: Entropy

Pentcho Valev
The following argument is obviously valid:

If there is no evidence that the entropy is a state function for ANY system, then the concept of entropy is not even wrong.

Is there any evidence that the entropy is a state function for ANY system? No. If you define the entropy S as a quantity that obeys the equation dS = dQrev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS. Clausius was very impressed by this state-function property and set out to prove that the entropy (so defined) is a state function for ANY system. So "Entropy is a state function" became a fundamental theorem in thermodynamics. Clausius deduced it from the assumption that any cycle can be decomposed into small Carnot cycles, and this deduction remains the only justification of "Entropy is a state function" to this day:
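As a sanity check of the ideal-gas case conceded above, the cyclic integrals of dQ and dQ/T can be computed in closed form for a simple rectangular cycle (two isochores, two isotherms). A minimal sketch, assuming 1 mol of a monatomic ideal gas and arbitrarily chosen corner states:

```python
import math

R, Cv, n = 8.314, 1.5 * 8.314, 1.0   # gas constant; monatomic ideal gas, 1 mol

# Rectangle in the (T, V) plane: two isochores and two isotherms (corner
# values are arbitrary assumptions for illustration).
T_lo, T_hi, V_lo, V_hi = 300.0, 500.0, 0.01, 0.05

# Heat absorbed on each leg (ideal gas: dQ = n*Cv*dT + n*R*T*dV/V)
Q_legs = [
    n * Cv * (T_hi - T_lo),                # isochore at V_lo, heating
    n * R * T_hi * math.log(V_hi / V_lo),  # isotherm at T_hi, expansion
    n * Cv * (T_lo - T_hi),                # isochore at V_hi, cooling
    n * R * T_lo * math.log(V_lo / V_hi),  # isotherm at T_lo, compression
]

# Corresponding integrals of dQ_rev/T on each leg
S_legs = [
    n * Cv * math.log(T_hi / T_lo),
    n * R * math.log(V_hi / V_lo),
    n * Cv * math.log(T_lo / T_hi),
    n * R * math.log(V_lo / V_hi),
]

print(f"cyclic dQ   = {sum(Q_legs):.1f} J   (nonzero: heat is path dependent)")
print(f"cyclic dQ/T = {sum(S_legs):.2e} J/K (zero: a state function)")
```

The nonzero cyclic dQ shows heat is not a state function, while the vanishing cyclic dQ/T is exactly the state-function property under discussion, at least for the ideal gas.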

http://mutuslab.cs.uwindsor.ca/schurko/ ... 40_l10.pdf
"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

http://ronispc.chem.mcgill.ca/ronis/chem213/hnd8.pdf
"Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum."
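The limit argument in the quoted notes can be checked numerically for the ideal-gas case: pick a smooth reversible cycle that is not built from Carnot legs, chop it into many small steps, and sum dQrev/T. A sketch, assuming an elliptical cycle in the (V, T) plane for 1 mol of a monatomic ideal gas (the cycle's shape and sizes are arbitrary choices):

```python
import math

R, Cv, n = 8.314, 1.5 * 8.314, 1.0       # monatomic ideal gas, 1 mol
V0, T0, a, b = 0.02, 400.0, 0.005, 50.0  # ellipse centre and radii (assumed)

# Sum dQ_rev/T over N small steps around the cycle; as the steps shrink,
# the sum tends to zero, as the quoted limit argument claims.
N = 50000
total = 0.0
for k in range(N):
    th1, th2 = 2 * math.pi * k / N, 2 * math.pi * (k + 1) / N
    T1, V1 = T0 + b * math.sin(th1), V0 + a * math.cos(th1)
    T2, V2 = T0 + b * math.sin(th2), V0 + a * math.cos(th2)
    Tm, Vm = 0.5 * (T1 + T2), 0.5 * (V1 + V2)     # midpoint values
    dQ = n * Cv * (T2 - T1) + (n * R * Tm / Vm) * (V2 - V1)  # dQ = dU + p dV
    total += dQ / Tm
print(f"cyclic integral of dQ_rev/T over {N} steps: {total:.2e}")
```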

The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles.

Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about.

https://en.wikipedia.org/wiki/History_of_entropy
"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"

Pentcho Valev

Anonymous
Let me guess: you never had a class on thermodynamics. All your "knowledge" comes from Google University, mangled by your deeply flawed brain. In addition, you have absolutely no idea how the scientific method works. Using arguments such as "obviously false" is laughable. Every statement needs to be proven, shown to follow from previously proven statements, or postulated on the basis of empirical evidence. What your statement actually says is: "Pentcho is too stupid to understand entropy, so it must be wrong."

And while there are some jokes about it (like von Neumann's), because it is not very intuitive, scientists understand it very well. It is just that you are not among them. Of course the Carnot cycle can be broken into smaller cycles; that is clear to everyone who understands what thermodynamic work is and can integrate. So claiming that a macroscopic Carnot cycle cannot be broken into smaller ones goes against either the established concept of reversible work or the basics of integral calculus. Are you now going to claim that Newton messed that up?
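The subdivision argument invoked here can be illustrated concretely: cut a rectangular reversible cycle into two sub-cycles along an interior isotherm. The shared edge is traversed in opposite directions, so its dQrev/T contributions cancel, and the sub-cycle sums add up to the full-cycle sum. A sketch for 1 mol of a monatomic ideal gas, with states chosen arbitrarily:

```python
import math

R, Cv, n = 8.314, 1.5 * 8.314, 1.0   # monatomic ideal gas, 1 mol (assumed)

def edge(a, b):
    """Integral of dQ_rev/T along a single isochore or isotherm leg
    from state a = (T1, V1) to state b = (T2, V2)."""
    (T1, V1), (T2, V2) = a, b
    return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

def cycle(corners):
    """Sum of dQ_rev/T over the closed cycle through the given corners."""
    pts = list(corners) + [corners[0]]
    return sum(edge(p, q) for p, q in zip(pts, pts[1:]))

big    = [(300, 0.01), (500, 0.01), (500, 0.05), (300, 0.05)]
bottom = [(300, 0.01), (400, 0.01), (400, 0.05), (300, 0.05)]
top    = [(400, 0.01), (500, 0.01), (500, 0.05), (400, 0.05)]

# The interior isotherm at T = 400 appears in both sub-cycles with
# opposite orientation, so its two contributions cancel:
print(edge((400, 0.01), (400, 0.05)) + edge((400, 0.05), (400, 0.01)))
print(cycle(big), cycle(bottom) + cycle(top))  # equal (and both ~ 0 here)
```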

Pentcho Valev
The version of the second law of thermodynamics stated as "Entropy always increases" (a version which, according to A. Eddington, holds "the supreme position among the laws of Nature") is in fact a theorem deduced by Clausius in 1865:

http://philsci-archive.pitt.edu/archive/00000313/
Jos Uffink, Bluff your Way in the Second Law of Thermodynamics, p. 37: "Hence we obtain: THE ENTROPY PRINCIPLE (Clausius' version) For every nicht umkehrbar [irreversible] process in an adiabatically isolated system which begins and ends in an equilibrium state, the entropy of the final state is greater than or equal to that of the initial state. For every umkehrbar [reversible] process in an adiabatical system, the entropy of the final state is equal to that of the initial state."

Clausius' deduction was based on three postulates:

Postulate 1 (implicit): The entropy is a state function.

Postulate 2: Clausius' inequality (formula 10 on p. 33 in Uffink's paper) is correct.

Postulate 3: Any irreversible process can be closed by a reversible process to become a cycle.

All three postulates remain totally unjustified even today; Postulate 3 is almost obviously false:

Uffink, p.39: "A more important objection, it seems to me, is that Clausius bases his conclusion that the entropy increases in a nicht umkehrbar [irreversible] process on the assumption that such a process can be closed by an umkehrbar [reversible] process to become a cycle. This is essential for the definition of the entropy difference between the initial and final states. But the assumption is far from obvious for a system more complex than an ideal gas, or for states far from equilibrium, or for processes other than the simple exchange of heat and work. Thus, the generalisation to all transformations occurring in Nature is somewhat rash."

Note that, even if Clausius's theorem were correct (it is not), it only holds for "an adiabatically isolated system which begins and ends in an equilibrium state". This means that all applications of "Entropy always increases" to processes which do not begin and end in equilibrium are unjustified (even if the theorem were correct!). Needless to say, scientists couldn't care less about that.
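A worked instance of the principle as Uffink states it: free expansion of an ideal gas into vacuum inside an adiabatically isolated container. The process begins and ends in equilibrium, Q = W = 0 so T is unchanged, and the entropy difference is evaluated along a reversible isothermal path between the same two states (the volumes below are arbitrary assumptions):

```python
import math

R, n = 8.314, 1.0          # gas constant; 1 mol of ideal gas
V1, V2 = 0.01, 0.02        # m^3: volume doubles on free expansion (assumed)

# The irreversible path itself has dQ_rev undefined; the entropy difference
# is computed along a reversible isothermal path linking the same two
# equilibrium states: dS = n * R * ln(V2 / V1).
dS = n * R * math.log(V2 / V1)
print(f"entropy change = {dS:.3f} J/K (positive, as the principle requires)")
```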

Pentcho Valev

Anonymous
"Note that, even if Clausius's theorem were correct (it is not), it only holds for "an adiabatically isolated system which begins and ends in an equilibrium state". This means that all applications of "Entropy always increases" to processes which do not begin and end in equilibrium are unjustified (even if the theorem were correct!). Needless to say, scientists couldn't care less about that."

This is ********. The fact that entropy may not increase in any system other than an isolated one is well known and explained in detail in every introductory course on thermodynamics. That is why other potentials (e.g. the Gibbs or Helmholtz free energy) are used, depending on the system. Entropy is always produced, but the system may expel it into the surroundings. The fact is that all thermodynamic potentials (state functions) can be derived from the entropy law.
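The step from the entropy law to a usable potential can be made concrete: at constant T and p, total entropy production >= 0 is equivalent to dG = dH - T*dS <= 0. A sketch using rounded textbook values for the melting of ice (enthalpy of fusion ~ 6010 J/mol, entropy of fusion ~ 22.0 J/(mol K)):

```python
# At constant T and p, the entropy law (total entropy production >= 0)
# is equivalent to dG = dH - T*dS <= 0 for the system alone.
dH, dS = 6010.0, 22.0   # J/mol and J/(mol*K): approximate fusion values for ice

for T in (263.15, 273.15, 283.15):   # K: below, at, and above the melting point
    dG = dH - T * dS
    verdict = "spontaneous" if dG < 0 else "not spontaneous"
    print(f"T = {T:.2f} K: dG = {dG:+.0f} J/mol -> melting {verdict}")
```

dG changes sign at roughly 273 K, so the criterion recovers the familiar melting point; the small nonzero value there is just rounding in the input data.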

"Any time scientists use the term "entropy", they don't know what they are talking about."

Well it is quite clear that you do not know what you are talking about. You are just embarrassing yourself with your total lack of knowledge, understanding and critical thinking skills.

Anonymous
>>>This is ********

troll