Talk:Entropy/Archive 5

Entropy (energy dispersal)

Hi, everyone. I have been trying to avoid this article owing to the character of the talk-page discussion, with regard to certain individuals who are using foul language and to those who are trying to push “original” conceptions onto this page. For the record, Nonsuch, Jheald, and Yevgeny are correct in their viewpoint and I commend them for their efforts. As for User:Jim62sch, who said “Helllooooo???!!!! It really doesn't fucking matter”, you are disqualified from any further intellectual discourse. As for everyone else, let me remind you that the point of Wikipedia is to publish established information. Presently, we have accumulated over 200 kilobytes of talk-page discussion, all because of the personal views of one person, whom I won’t name, who states that he has previously taught entropy to thousands of humanities students even though, as he states, he didn’t really know what it was; who has never read any of the original manuscripts of Sadi Carnot, Émile Clapeyron, or Rudolf Clausius; and who says that he is poor, or in his own words a “baby”, at math and statistics, but that after reading this article by Kelvin:

He suddenly figured it all out. The point of a talk page is not to debate someone’s personal theories.

I do not want to get involved in this debate, but I happened to glance at the entropy page a few days ago, and I notice that the un-named individual is referenced nine times in the entropy article, as well as in two external links, all essentially going to the views of one person. There is even a “religious-viewpoint” entropy link in the external links. Now, I am no expert on entropy, but I do own about a dozen books on entropy:

  • Greven, A. (2003). Entropy – Princeton Series in Applied Mathematics
  • Dugdale, J.S. (1998). Entropy and its Physical Meaning
  • Fast, J.D. (1962). Entropy – the significance of the concept of Entropy and its Applications in Science and Technology
  • Brooks, D.R. (1998). Evolution as Entropy – Toward a Unified Theory of Biology
  • Yockey, H.P. (2005). Information Theory, Evolution, and the Origin of Life
  • Bailey, K. (1990). Social Entropy Theory
  • Sanford, J.C. (2005). Genetic Entropy and the Mystery of the Genome
  • Chalidize, V. (2000). Entropy Demystified – Potential Order, Life, and Money
  • Bent, H.A. (1965). The Second Law – an Introduction to Classical and Statistical Thermodynamics
  • Atkins, P.W. (1984). The Second Law

And the unnamed person is not mentioned in these books, nor is his “personal” theory on energy dispersal. Because of this, I would favor the recommendation of User Par, who suggests removing all of the energy-dispersal parts. However, although the un-named person has never published his “energy dispersal” views in either book form or peer-reviewed article form, I do remember having seen his name mentioned in one chemistry book, somewhere?

Thus, to remedy this awkward situation, I am going to be bold and move all of this “energy dispersal” stuff, as well as all of the related talk-page material, to its own page, i.e. entropy (energy dispersal) and Talk:Entropy (energy dispersal), and cite the unknown person as the primary author of the concept. In this manner, the introductory or novice reader can get the core, basic, historical, published, sourced entropy viewpoint on the main page and go to the “see also” section for other such side-line “energy dispersal” theories. Please do not revert this change; this unnecessary original-research discussion has gone on long enough. Thanks: --Sadi Carnot 12:51, 6 October 2006 (UTC)

I have finished the move. I put the "see main" link of Entropy (energy dispersal) below the "Ice melting example" and also added it to the "see also" section. I cleaned the article a bit, added an image, cleaned the external links a bit (now that there are two pages), and clarified the intro. With this done, anyone who wants to debate "energy dispersal" can do so on the Talk:Entropy (energy dispersal) page. Later: --Sadi Carnot 13:42, 6 October 2006 (UTC)
Sadi, where to start? That you are not an expert on entropy is clear, no matter how many books on entropy you claim to own. (Also, there's a bit of a gulf between owning books and having read and understood them.) But I digress... That Frank Lambert is an expert on entropy is also clear. Have you bothered to research just how many schools (both secondary schools and colleges) link to his work, or how much impact his work has had on high school and college textbooks? That his work does not agree with your "opinion" of what entropy is, is of little or no value. That you are having difficulty conceptualising the nexus between energy dispersal and entropy is, frankly, no one's problem but your own -- until, of course, you bring that opinion here, make allegations bordering on libel, and proclaim yourself to be some sort of Gen-Xpert whom all should heed.
As for whether or not you think I'm fit to join the supposed intellectual discourse you wish to lead, I care not a whit -- although I admit that I find your arrogant dismissal amusing, as it is hardly within your purview to promulgate such inane fiats. •Jim62sch• 21:46, 6 October 2006 (UTC)
For the record, Sadi, you're not in a position to determine who's "disqualified" - if you have concerns about the conduct of a user there are procedures to deal with that. The "awkward situation" of extended debate results from some users' decision to produce elaborate arguments contesting a sourced viewpoint without producing sources critical of the sources given. Your unilateral decision to side-line “energy dispersal” approaches removes an approach which a number of authors have found useful in putting over the concepts of thermodynamics. Frank Lambert, whose name you seem to have some difficulty with, has been cited as his work is readily accessible. You appear to be demanding that the article advocate a single point of view and relegate to "See also" a valid approach for beginners which you have some problem with, though there appears to be nothing in this approach that contradicts historical viewpoints, other than making the generally accepted criticism that "disorder" can be confusing. This useful viewpoint should have a proper and proportionate place in the article. ...dave souza, talk 14:54, 6 October 2006 (UTC)
Sadi, or Libb, or whatever your name is -- when we archive, we archive the ENTIRE discussion, not just the parts we feel like archiving. Capisce? Your little powerplay here is coming to an end. •Jim62sch• 22:06, 6 October 2006 (UTC)

Frank Lambert's entropy theories

Gentlemen, I have been working to build this entropy page for some time. My only interest is to see basic information on this page. I do not wish to get dragged into an argument with anyone. My change was only to move Lambert's entropy theories (which are self-sourced by his own websites) to their own page. Please let my revisions sit for a week to see what everyone else thinks according to a vote. If, after we vote, the editors here favor keeping whole sections devoted to Lambert's theories on "energy dispersal" here, with 9 links to his websites as references, then fine with me. Please let the changes sit for seven days and then we can revert or keep according to consensus. Thanks: --Sadi Carnot 12:50, 7 October 2006 (UTC)

Votes? We don't do "votes". As for consensus, that is trumped by several other policies, most notably WP:NPOV, WP:RS and WP:V.
Now, from what I can see, this boils down to a refusal to consider, and in some respects a desire to stifle, a concept with which you are either unfamiliar or that does not gibe (in some unclear way) with what you may have learned. Our purpose here is not to stifle ideas -- especially when they are notable and verifiable.
Next, I reverted your version. If you wish to make such drastic changes to the article, you need to bring them up on talk first...not make the changes and see where it goes.
I'm not sure precisely what "My only interest is to see basic information on this page" means, i.e., what the implication of "basic information" might be.
Finally, I do not expect to see any further pot shots taken at Frank Lambert, nor will the attitude displayed on July 13 ("one chemical engineer (me) is more knowledgeable than any three chemists (you) put together." [1]) be tolerated. We are here to hammer out this article, and to include ALL notable viewpoints. •Jim62sch• 13:18, 7 October 2006 (UTC)
You have broken the WP:3RR rule (twice on the main page and once in archiving 95 kb worth of talk page), it is clear that you do not want to vote to find consensus, and it is obvious (from your edit history) that you are making these edits based on a religious bias. As such, I have requested administrator help for this page:
Talk page problems
Admin help
Mediation request
Entropy (energy dispersal) - Votes For Deletion
Finally, the comment you are referring to was made in agitation after a personal attack was directed not only at me but also at several other editors on this page (on several different talk pages). --Sadi Carnot 14:34, 7 October 2006 (UTC)
Obviously, counting is not one of your strong points. Given that I have made 3 mainspace edits to the article in 24 hours, I could not have violated 3RR. It really is that simple, Sadi -- in fact, I refer you to WP:3RR.
Religious bias? Have you been drinking? You make this pronouncement based on my edit history? Please, humour me by explaining this massive leap in logic.
Again, regarding all of your odd accusations, you have failed to provide links to support your delusions.
BTW, I've seen your "requests for help", even commented on one. •Jim62sch• 21:15, 7 October 2006 (UTC)


I have to agree with Sadi that this article would be better free of the theory of Frank Lambert. I am going further and have proposed Entropy (energy dispersal) for deletion. Whereas Lambert's negative statements about entropy not being "disorder" are fine, they are hardly original. But his own theory hasn't gathered enough recognition to be included in an encyclopedia. --Pjacobi 14:58, 7 October 2006 (UTC)
Yes, I will vote to delete that article. Also, these votes are reprinted from the above maze of argument to highlight consensus (which goes back to Lambert’s push to eliminate “disorder” from the article):
"...but the current view of Entropy is moving very much away from any descriptions involving disorder." That's simply not true in general. Certainly, evidence has been advanced that describing entropy as 'disorder' should be handled with care, and that introductory physical chemistry textbooks are avoiding the use of the word. And certainly there are some scholars who vigorously oppose any mention of 'disorder' at all. But I don't see any move away from 'disorder' and similar descriptions in the wider scientific community. Nor even, broadly, among physical chemists. Nonsuch 07:30, 17 July 2006 (UTC)
I plan to remove all references to entropy increase as a dispersal of energy. The bottom line is that nobody, Frank Lambert included, ever quantitatively defines energy dispersal. Please, somebody, define it, or let's forget about it. PAR 03:59, 28 September 2006 (UTC)
I support PAR in saying that defining entropy as "dispersal of energy" is incorrect, as we already discussed several months ago, when Frank Lambert (the author of the reference) was here in Wikipedia and suggested these ideas. Yevgeny Kats 20:44, 28 September 2006 (UTC)
It's true, an increase in entropy (for an overall fixed total energy) means there's less energy available hypothetically to do useful work. But insisting the energy has become more dispersed seems (at least to me) just not quite the right word. Jheald 15:24, 29 September 2006 (UTC)
Moreover, I’ve just seen that this “energy dispersal” view has infected the second law of thermodynamics article as well as that talk page. --Sadi Carnot 15:20, 7 October 2006 (UTC)
Infected? Smug little thing aren't you, Sadi? (Also seems to me you're doing a real disservice to the good name of the real Sadi Carnot).
Also, could one of you explain the need to make a relatively simple concept like entropy more complex than M-theory? •Jim62sch• 21:20, 7 October 2006 (UTC)

To aid Sadi, PJacobi, Nonsuch and others in assessing the breadth of acceptance of my views in the chemistry area of the scientific community, I am assembling a list of the published books, textbooks, and peer-reviewed journals that refer to my writing about energy dispersal, entropy, and the second law. It will be posted here by 23:00 GMT 8 October. FrankLambert 21:17, 7 October 2006 (UTC)

Your expertise is most welcome; per WP:NOR: "'No original research' does not prohibit experts on a specific topic from adding their knowledge to Wikipedia. It does, however, prohibit expert editors from drawing on their personal and direct knowledge if such knowledge is unverifiable. Wikipedia welcomes the contributions of experts, as long as these contributions come from verifiable (i.e. published) sources. Thus, if an editor has published the results of his or her research elsewhere, in a reputable publication, then the editor may cite that source while writing in the third person and complying with our NPOV policy." KillerChihuahua?!? 15:22, 8 October 2006 (UTC)

"List of notable textbooks in statistical mechanics" proposed for deletion

People following this page might like to express their views in the AfD, one way or the other. Jheald 03:55, 8 October 2006 (UTC).


Jheald's humorous gimmick reminds me of a 2002 email to me from an author of three editions of a widely used general chemistry text. From an early chapter, the text was replete with the use of disorder as a theme for spontaneous chemical behavior, culminating in his thermodynamics chapter. (Thus, he was extremely upset by my article decrying the use of "Entropy is disorder".) He took three pages to explicitly develop the frightening case: "... you denigrate three of the pillars of classical thermodynamics: Boltzmann, Helmholtz, and Gibbs -- and the modern Gregory Chaitin, one of the seminal thinkers of our time..."
The fourth edition of his text appeared in 2006. There is not one use of the word "disorder" in it and he defines entropy as a measure of the dispersal of energy.
Real science moves, changes. But be reassured, Jheald, valid new ideas only refine, they do not displace true foundations -- and they are ultimately accepted by critical thinkers. [Hang in there until late today!] FrankLambert 16:22, 8 October 2006 (UTC)


The fact that science is a living, breathing discipline, one that changes willingly as new information is received, is what makes science great. Scientific theories, hypotheses, and ideas must be dynamic, must be tentative, should aim for parsimony and should be progressive -- thus they should change as needed, with these changes (if the theory is good) strengthening the theory. That people were taught, incorrectly, that entropy was chaos or disorder is unfortunate, but that the teaching of entropy is evolving to dismiss that notion is wonderful. (BTW, note my various comments to this effect here). •Jim62sch• 17:00, 8 October 2006 (UTC)
No humorous gimmick, Frank. If you click through to the Wikipedia article linked to above, you will see that someone has indeed proposed the article should be deleted - that it should cease to exist, that no trace of it be left on Wikipedia. The machinery has been started. The article will get deleted, or not, according to the balance of comments that get received on the AfD deletion talk page. Jheald 17:10, 8 October 2006 (UTC).
Ah, this just gets better. Dig away. •Jim62sch• 18:15, 8 October 2006 (UTC)

Thanks for the heads-up on this AfD, Jheald. Your vote to keep the article seems right to me, and most of the other contributors appear to support this position so the page should not be in danger. ...dave souza, talk 19:01, 8 October 2006 (UTC)

Physical chemistry

To resolve this issue, I am going to type up the correct presentation of entropy by Nobelist Gilbert Lewis and Merle Randall from their famous 1923 textbook Thermodynamics and the Free Energy of Chemical Substances. According to both chemistry historian Henry Leicester, from his The Historical Background of Chemistry (1956), and Nobelist Ilya Prigogine, from his 1998 textbook Modern Thermodynamics, before 1923 chemists did not make use of “entropy” but instead used the concept of chemical affinity to calculate the driving force of chemical reactions. According to these sources, Lewis and Randall’s influential textbook led to the replacement of the term “affinity” by the term “free energy” in the English-speaking world.

Hence, all modern-day chemistry textbooks are based on Lewis and Randall’s description of entropy, which, building on the work of Rudolf Clausius and Willard Gibbs, they define as a “scale of irreversibility” that quantitatively measures the irreversible "work energy", or the "degree of degradation", of the system in an irreversible process. This is the correct perspective. I will type up the full presentation, from their textbook, over the next few days. (Sorry, I ran out of time today.) --Sadi Carnot 18:01, 8 October 2006 (UTC)
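For concreteness, the classical Clausius definition that Lewis and Randall build on reduces, for a reversible isothermal process, to ΔS = q_rev/T. As a minimal illustrative sketch only (the figures below are standard reference values for water, not taken from Lewis and Randall's text), the article's ice-melting example works out as:

```python
# Entropy change for melting one mole of ice at its melting point,
# using the classical definition dS = q_rev / T for a reversible,
# isothermal process. Values are standard reference data (approximate).

DELTA_H_FUS = 6010.0   # J/mol, enthalpy of fusion of water
T_MELT = 273.15        # K, melting point of ice at 1 atm

# For melting at constant T and P, q_rev equals the enthalpy of fusion.
delta_S = DELTA_H_FUS / T_MELT  # J/(mol*K)

print(f"dS = {delta_S:.1f} J/(mol*K)")  # about 22.0 J/(mol*K)
```

The sign and magnitude are the point: melting absorbs heat reversibly at fixed temperature, so the entropy of the system increases by roughly 22 J/(mol·K).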

Of interest to all here

see [2]:

Ah, so you misused Admin tools, eh? (Yes, the protect tag is only for admins). This shall prove to be interesting, I think.
From WP:PPOL
Administrators have the ability to protect pages so that they cannot be edited, or images so that they cannot be overwritten, except by other administrators. Administrators can also protect pages from moves only. Administrators have the additional ability to protect pages from being edited by unregistered or very new users.
These abilities are only to be used in limited circumstances as protected pages are considered harmful.
Admins must not protect pages they are actively engaged in editing, except in the case of simple vandalism.
Clearly an RfC is in order here, if not an RfA. I've contacted several Admins to get their opinion on how best to deal with your behaviour. BTW: KillerChihuahua, the person you reverted, is an admin. As I said, this shall prove interesting, I think. •Jim62sch• 18:11, 8 October 2006 (UTC)

Jim, the page is only locked so that we can all come to an agreement and reasonably solve this issue. People have been trying to push Frank Lambert's personal theories, which do not accord with thermodynamics textbooks, into this article since June of '06. This is a long-standing issue. --Sadi Carnot 18:17, 8 October 2006 (UTC)

However, my good man, you have no right to lock, protect, etc., the page; only an Admin (which you are not; see here, all) can lock it. I could just as easily unlock it (as vandalism), but I'll let an admin do it. Also, the motive behind your archiving is rather transparent, and we'll deal with that in due time. Finally, your edit summary is rather misleading -- there is no clear-cut consensus -- there are 5 people who want your version, which has the merit of being agreed to in the same way that many believed in geocentrism, and 4 who want the other version, or want that version to remain until the issue is resolved (not, of course, that this is a vote). You see, Sadi, even if there were consensus, it would not override WP:V, WP:RS, WP:NPOV nor WP:NOR (which, as pointed out by KillerChihuahua (see WP:LA), Frank is in no danger of violating). Thus, you see, your points are, how can I put this, irrelevant. •Jim62sch• 19:15, 8 October 2006 (UTC)
NOTICE: Adding the tag does not make it protected - it is tag vandalism, adding an inaccurate tag. Don't do it again, Sadi. Even if you were an Admin and could protect the page, it would be misuse of the tool to protect a page you were editing; or your preferred version of any page. KillerChihuahua?!? 19:33, 8 October 2006 (UTC)
I assumed that since the tag saved on the edit, I had the rights to use it. Whatever the case, given that this page has been reverted close to a dozen times, by multiple seasoned editors, someone should have put a lock on the page long ago so that we could discuss the issue properly on the talk page. There are so many issues that are awry here, e.g. self-promotion, intelligent-design issues, divine-intervention comments and edits, using multiple reference links to the same website instead of standard article or textbook references, using months and hundreds of kilobytes of talk-page space to debate someone’s pet theory, using talk-page space to argue that laws of science are false, etc., that this whole situation is making a mockery of the Wikipedia science section. --Sadi Carnot 11:50, 9 October 2006 (UTC)

Notable content

Concerning this edit: [3] "Rv non-notable person's theory deleted again" This makes no sense. Either we should not have an article on this person, and we do, or he is notable. Pjacobi, please explain your reasoning - thanks. KillerChihuahua?!? 20:03, 8 October 2006 (UTC)

Entropy is a mainstay of physics. We can add the interpretations of any number of Nobel laureates or other first-class physicists. Or what's covered in known good textbooks. But not those of just any college professor.
He may be notable enough for wanting to reform how entropy is taught in some areas of the world in some faculties, but the evidence hasn't been presented yet, e.g. at Wikipedia:Articles for deletion/Entropy (energy dispersal). Note the comment of User:Linas, one of our best contributors in mathematics. Given his judgement, I don't give Lambert's approach a glorious future.
Pjacobi 20:25, 8 October 2006 (UTC)
Teached? ROFL
You've not been back to Wikipedia:Articles for deletion/Entropy (energy dispersal) yet, have you?
Heaven forfend Linas might be wrong. Couldn't happen.
I'd be very interested in knowing what your take is on entropy -- no doubt the usual chaos and disorder, eh? •Jim62sch• 21:29, 8 October 2006 (UTC)
Thanks for helping improve my English skills.
And, microstates per macrostate, didn't I state that already elsewhere? S = k ln (Unkenntnis) as Becker taught (hah!) us in "Theorie der Wärme".
Pjacobi 21:36, 8 October 2006 (UTC)
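Pjacobi's shorthand above is the Boltzmann relation S = k ln W, with W the number of microstates per macrostate (Becker's "Unkenntnis", literally our "ignorance" of which microstate the system occupies). A minimal sketch of that relation, assuming nothing beyond the standard value of the Boltzmann constant:

```python
import math

# Boltzmann entropy S = k * ln(W), where W counts the microstates
# compatible with a given macrostate. K_B is the Boltzmann constant
# (exact in SI units since the 2019 redefinition).
K_B = 1.380649e-23  # J/K

def boltzmann_entropy(W: int) -> float:
    """Entropy in J/K of a macrostate with W equally likely microstates."""
    return K_B * math.log(W)

# One microstate means complete knowledge of the system: zero entropy.
print(boltzmann_entropy(1))  # 0.0
# Doubling the number of microstates adds exactly k*ln(2) of entropy.
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```

The "dispersal" and "disorder" pictures debated on this page are both informal glosses on this same counting of microstates.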
Hah? Was ist das?
In any case, it's interesting to see how many minds Frank Lambert has managed to pollute with his silly take on entropy. [4], [5], [6], [7],

[8], [9], [10], [11], [12], [13], [14] . Oh yeah, there are more -- I'm just "warming up".  ;)

BTW, PJ, you never did answer my final question. •Jim62sch• 23:29, 8 October 2006 (UTC)
Another final question? Your last final question I answered above. And the most general treatment of entropy without building upon the view of statistical mechanics is the classical treatise by Carathéodory. --Pjacobi 08:01, 9 October 2006 (UTC)

Non-notable?

There are two authors of a popular scientific book who used my ideas as a starting point for a couple of pages and ascribed that start to me. Far more important, the names of some 36 textbook authors (who have been so convinced of the validity of my approach to entropy that they have risked their reputations on it) and their text titles are listed below.

The actual truth is that not one of the authors would risk a minute, or their publishers a dollar, on my approach if they were not absolutely sure it is the wave of the future! (All but one of the 38 authors in the list below have a Ph.D., and all but two are professors of chemistry or professors emeriti.)

Popular Scientific Book

“Into the Cool – Energy Flow, Thermodynamics, and Life” by Eric D. Schneider and Dorion Sagan; University of Chicago Press, 2005, 362 pages, ISBN 0-226-73936-8

Dorion Sagan, the eldest son of Carl Sagan and a professional science writer, chose my introduction to the second law for his collaboration with Dr. Schneider on energy flows in life and between earth and sun. Specifically:
p. 82: “Chemist Frank L. Lambert, professor emeritus at Occidental College, Los Angeles, points out that the second law explains …” (discussion continues through to mid-page 84)

Textbooks

“Chemistry, The Molecular Science”, Second Edition J. W. Moore, C. L. Stanistski, P. C. Jurs; Thomson Learning, 2005, 1248 pages, ISBN 0-534-42201-2

p. xiii, xiv: “New in This Edition …Revised Chapters 14 and 18 to more clearly present entropy and dispersal of energy (see Lambert, F. L. J.Chem.Educ 1999, 76, 1685; 2002, 79, 187)”

“Chemistry, The Molecular Nature of Matter and Change”, Fourth Edition M. S. Silberberg; McGraw-Hill, 2006, 1183 pages, ISBN 0-07-255820-2

p. xviii: “Chapter 20 has been completely rewritten to reflect a new approach to the coverage of entropy. The vague notion of “disorder” (with analogies to macroscopic systems) has been replaced with the idea that entropy is related to the dispersal of a system’s energy and the freedom of motion of its particles.”
p. xix: “The following experts helped me keep specific content areas up-to-date and accurate: Frank Lambert of Occidental College for insightful advice and comments on the coverage of entropy (Chapters 13 and 20); …”

“Conceptual Chemistry”, Second Edition J. Suchocki; Benjamin Cummings, 2004, 706 pages, ISBN 0-8053-3228-6

Suchocki’s text introduces chemical principles to non-science majors, and is used in the type of ‘hard science’ course that has become required in most colleges and some universities. This is a major reason, I would think, why students of this sort, who may become leaders in society, are aided by Wikipedia’s providing them additional information. Publishers do not release figures about the success of their texts, but Dr. Suchocki told me that this is the second best-selling chemistry text for non-science students among some 24 competitors.
p. xxi: “New to the Second Edition…Perhaps the most significant change is the addition of a section on the second law of thermodynamics…This section was developed with the great assistance of Frank L. Lambert, Professor Emeritus of Occidental College. Frank aptly points out that entropy is merely a gauge of the natural tendency of energy to disperse. For more on this fresh and simple approach, be sure to explore www.secondlaw.com. “ [[During the US school year, secondlaw.com has ~20,000 readers monthly, out of some 100,000 ‘hits’. Perhaps, for a month each year, a few hundred come from this text.]]

“Chemistry, Matter and Its Changes”, Fourth Edition J. E. Brady and F. Senese; John Wiley, 2004, 1256 pages, ISBN 0-471-21517-1

p. iv-v: “Chapter by chapter changes… In Chapter 14 we discuss spontaneous mixing as a tendency toward more probable states rather than a tendency toward disorder in preparation for an improved treatment of entropy in Chapter 20. … We have changed our approach to presenting Thermodynamics (Chapter 20). This chapter now explains entropy as a measure of the number of equivalent ways to spread energy through a system.” [I worked directly with Drs. Brady and Senese on their revision.]

“General Chemistry”, Eighth Edition D. D. Ebbing and S. D. Gammon; Houghton-Mifflin, 2005, 1200 pages, ISBN 0-618-399410

p. xxiv: “Revision of Material on Thermodynamics…the discussion of entropy in Chapter 19 was revised to present entropy as a measure of the dispersal of energy.” [An entire special page with colored background and featured as “Entropy and Disorder” in the seventh edition was deleted in the eighth along with the old definition “Entropy is a thermodynamic quantity that is a measure of randomness or disorder”.]


“Chemistry: A General Chemistry Project of the American Chemical Society”, First Edition J. Bell, et al.; W. H. Freeman, 2005, 820 pages, ISBN 0-7167-3126-6

p. 18 of the Beta Version of Chapter 8, Entropy and Molecular Organization: “Entropy is the thermodynamic quantity related to W, the number of possible arrangements of particles and energy: the larger the number of possible arrangements, the larger the entropy of the system.” [Although Dr. Bell uses the word “disorder” twice in the first three lines of this chapter, it does not appear elsewhere and his pages of discussion lead directly to the above definition of entropy only in terms of arrangements of particles and energy. (Of course, I would prefer “the number of accessible arrangements of particles with their energy”!)]


“Atkins’ Physical Chemistry”, Eighth Edition P. Atkins and J. de Paula; W. H. Freeman, 1072 pages, ISBN 0-7167-8759-8

Previous editions of “Atkins” -- the world-wide best seller in physical chemistry texts -- had included the view of entropy change as the dispersal of energy, but that latter phrase always ended with “into a more disordered form”. This leads, far too readily, to “into disorder”. Indeed, there were some 27 instances of using order-to-disorder contrasts as a rationale for entropy increase in the chapter dealing with entropy in the seventh edition. I urged the authors to consider, as they well knew would be better, simplifying the statement as they have done on p. 78: “entropy…is a measure of the energy dispersed in a process.” (The word “disorder” is introduced only twice in the chapter of the eighth edition that introduces entropy, and then only to discourage its use.)

“Physical Chemistry for the Life Sciences”, First Edition P. Atkins and J. de Paula; W. H. Freeman, 699 pages, ISBN 0-7167-8628-1

This new text of physical chemistry differs somewhat from “Atkins” in its presentation of entropy by stating that “The measure of the dispersal of energy or matter used in thermodynamics is called the entropy, S.”
It is certainly true that the dispersal of energy separate from matter is important in physics but in chemistry, as I have said repeatedly, the energy that is becoming dispersed is that of molecular motion – energy that is inseparably associated with physical molecules and their momenta. Thus, I prefer never separating the two in statements or students will inevitably get a sense of molecules being able to be somewhere or effect some change without regard to their speeding in real space.


“Chemistry: The Central Science”, Tenth Edition T. L. Brown, H. E. LeMay, and B. E. Bursten; Prentice Hall, 2006, 1248 pages, ISBN 0-13-109686-9

p. xxx: “Special Thanks to Those Who Provided Valuable Feedback to the Authors: Frank L. Lambert, Professor Emeritus, Occidental College…” [[The Ninth Edition defined entropy only by “The disorder [of a system] is expressed by a thermodynamic quantity called entropy…The more disordered a system, the larger is its entropy.” In the Tenth Edition, this has been deleted, at my urging. However, unfortunately for beginning students, the authors together (one of whom had completely changed the definition to involve energy dispersal) decided to be ambiguous on p. 808 with “Entropy has been variously associated with the extent of randomness in a system or with the extent to which energy is distributed among the various molecules of a system.” This is too much for a student to analyze and evaluate.]]


I do not have copies of the seven other US chemistry textbooks that have changed their approach to entropy on the basis of my articles. If the above list is not convincing and such further evidence would be conclusive, I would be glad to supply it. However, the basic information (lacking only specific page numbers) is at www.entropysite.com/#whatsnew at December 2005 under the following authors/titles:

1. Oxtoby, Gillis and Nachtreib, 5th edition of “Principles of Modern Chemistry”
2. Petrucci, Harwood and Herring, 9th edition of “General Chemistry”
3. Hill, Petrucci, McCreary and Perry, 4th edition of “General Chemistry”
4. Ebbing, Gammon and Ragsdale, 2nd edition of “Essentials of General Chemistry”
5. Moog, Spencer and Farrell, “Thermodynamics, A Guided Inquiry”
6. Kotz, Treichel and Weaver, 6th edition of “Chemistry and Chemical Reactivity”
7. Olmsted and Williams, 4th edition of “Chemistry”

My major recent articles are all accessible under "articles" at www.entropysite.com . Their locations in issues of the Journal of Chemical Education ("JCE") are given below.

1. “Shuffled Cards, Messy Desks and Disorderly Dorm Rooms – Examples of Entropy Increase? Nonsense!”, JCE, 76, 1999, pp. 1385-1387.
2. “Disorder – A Cracked Crutch for Supporting Entropy Discussions”, JCE, 79, 2002, pp. 187-192.
3. “Entropy Is Simple, Qualitatively”, JCE, 79, 2002, pp.1241-1246.

The following list is of peer-reviewed articles that cited one or more of mine.

To 2 above: “The Concentration Dependence of the ΔS Term in the Gibbs Free Energy Function: Application to Reversible Reactions in Biochemistry”, R. K. Gary, JCE, 81, 2004, pp. 1599-1604.
To 2 and 3 above: “Introduction of Entropy via the Boltzmann Distribution in Undergraduate Physical Chemistry: A Molecular Approach”, E. I. Kozliak, JCE, 81, 2004, pp. 1595-1598.
To 1, 2, and 3 above: “Teaching Entropy Analysis in the First-Year High School Course and Beyond”, T. H. Bindel, JCE, 81, 2004, pp. 1581-1594.
To 1, 2, and 3 above : “Entropy and Constraint of Motion”, Professor W. B. Jensen, JCE, 81, 2004, pp. 639-640.
To 2 above: “Disorder in Unstretched Rubber Bands?”, Professor W. J. Hirsch, JCE, 79, 2002, pp. 200A-200B. FrankLambert 00:27, 9 October 2006 (UTC)


It would be more graceful to stop now. Trying to promote one's own notability on Wikipedia usually ends rather nastily. In the AfD, John Baez has just weighed in. You can only hope that he doesn't use this example to illustrate self-promotion on Wikipedia. --Pjacobi 07:58, 9 October 2006 (UTC)
This is not trying to promote his own notability Pjacobi, you clearly are confused about what "self-promotion" means. This is a very appropriate list of published works which cite his work. I'm reminded of the harassment User:William M. Connolley had to endure from people, and he was only published in Nature SFAIK, and has been thoroughly vindicated (rightly so). It would be more graceful for you to work on the article, Pj, and stop making snide remarks about an editor who is recognized in his field. No wonder so many experts and notable authors are driven off WP; if there is undue weight given, address that, and don't accuse someone who, when asked to provide proof he is knowledgeable, provides such a list. The man's damned if he does and damned if he doesn't - provide a list, and be accused of self-promotion (wrongly, since this is a talk page); don't provide a list, and keep getting attacked as non-notable. You're discouraging expert editors, and that is against the encyclopedia's best interest. KillerChihuahua?!? 11:23, 9 October 2006 (UTC)
Regarding entropy, you can hardly claim both (a) having found a new definition and (b) being used as a reference in several respected textbooks. I'd be much happier if Lambert could clarify whether he claims (a) or (b) (and hopefully (b)).
And notability has to be put in proportion: the current article version doesn't even mention Caratheodory or Lieb; by comparison, it would be highly out of proportion to mention Lambert.
Pjacobi 11:37, 9 October 2006 (UTC)
As Lambert is actually available, let's let him explain his A vs. B position. It seems to me that we should look into adding Caratheodory and/or Lieb; that notable people are not mentioned is not a reason to remove content but rather a reason to add it. That said, we probably need to get a short list going here on talk and weigh the value of adding different people to the article in order to avoid undue bloat. KillerChihuahua?!? 12:17, 9 October 2006 (UTC)

I have already started a category "thermodynamicists" list of established thermodynamicists; Constantin Caratheodory is a good classic example, as his theories are respected and written up in hundreds of books and articles. --Sadi Carnot 13:24, 9 October 2006 (UTC)

I have just noticed this current fracas on the topic of entropy. While I haven't yet had a chance to closely familiarize myself with the explanatory merits (if any) of Frank Lambert's approach, I need to say I am disturbed by the treatment he appears to be receiving from some editors. Although WP guidelines discourage self-aggrandizement through rules like WP:NOR and guidelines or policies such as WP:BIO, WP:OWN, etc., it is plain that the authors of sources verified per WP:VER have a right to participate in the formation of articles. WP:NPOV#Undue_weight plainly allows for the inclusion of notable minority views in proportion to their comparative notability. While Mr. Lambert's approach does not at first blush square with classical thermodynamics (again, I haven't familiarized myself yet), I see absolutely no reason why it should be deleted out of hand as was done recently.

It seems to me the appropriate approach here would be to include this material in a reasonable summary form, linking to another main article if it's too lengthy to explain in a brief section, and note for the reader of the article that it's "an alternative modern approach", or some other way of accurately expressing its positioning in the marketplace of ideas (per WP:VER of course). . ... Kenosis 16:00, 9 October 2006 (UTC)

Concur absolutely with you, Kenosis. FYI, Pjacobi didn't intend his statements to be so harsh, and has apologised below, along with a partial retraction. I suggest we move past that and back to content, as Pjacobi has so excellently suggested. KillerChihuahua?!? 18:55, 9 October 2006 (UTC)

Intelligent design

Comment: for everyone to note, two of those who want to keep Lambert's article and related theories (e.g. "entropy = dispersal, not disorder"), i.e. User:Jim62sch and User talk:Dave souza, seem, based on their edit and comment history, to be doing this only for intelligent design purposes, whatever those are. On the second law of thermodynamics talk page, for example, User:Dave souza has stated that the second law of thermodynamics as stated by Rudolf Clausius in 1854, i.e. “heat cannot of itself pass from a colder body to a hotter body”, is now incorrect due to recent talk page discussions. He seems to think that Lambert's website theories are the correct ones rather than Clausius's, and he is the one who started this mess by adding 9 ref links to Lambert's website. Whatever the case, Souza's views and edits are not scientific.

Moreover, today I looked through my collection of entropy-related textbooks (70 thermodynamics textbooks, 7 physics textbooks, 5 biochemistry textbooks, 4 chemistry textbooks, 3 physical chemistry textbooks, and others) and the “energy dispersal” concept is not there, and neither is Frank Lambert. This is entirely a sideline website theory that happens to have good search rankings because the author has bought up all of the URLs related to entropy, e.g. ‘entropysimple.com’, ‘entropysite.com’, ‘2ndlaw.com’, ‘secondlaw.com’, etc., and uses them to push his own personal theories. Now, there is certainly nothing wrong with that. But self-published website theories do not establish scientific correctness, nor are they a reason for inclusion in the Wikipedia entropy article, which is where all this mess stems from. --Sadi Carnot 10:18, 9 October 2006 (UTC)

For everyone to note, Sadi has completely misconstrued my suggestion that consideration be given to this discussion where other editors indicated that the statement by Clausius refers to "a statistical effect, so the theory really states that it is statistically improbable for this to occur." Evidently he feels that such holy writ should not be questioned, but it seemed a reasonable point to me. ...dave souza, talk 15:56, 9 October 2006 (UTC)
  • Second warning: Personal attacks on Jim62sch and Dave souza, two highly respected editors, will not gain you any points here, Sadi.

I note that in your second paragraph you actually address the article, which is good - but as there is a list of textbooks above, it appears your collection may be out of date. KillerChihuahua?!? 11:27, 9 October 2006 (UTC)

I am not making personal attacks, I am stating facts. This is a clear case of religion vs. science. I am the one who wrote most of this article, particularly the history section. I have been reading chemical thermodynamics, thermodynamics, thermal physics, and chemistry books almost daily for over ten years now. Entropy has been an established concept for 182 years. Yet now, all of a sudden, a random college professor named Frank Lambert, who has never published his energy dispersal theory for peer review, seems to think he has revolutionized chemistry and wants to go down in history as someone who coined a new version of the second law. There is a famous and humorous quote, which goes back before 1950, which states that "there are as many versions of the second law of thermodynamics as there are thermodynamicists." If we are to quote from these hundreds of possible scientists, I suggest that we start with those thermodynamicists most respected in science for their conception of entropy. I will be glad to type up and add over 20-30 respected chemists, physicists, and thermodynamicists, referenced from the classic articles and books, to this article over the said theory (which I might add is wrong) if this will remedy the problem. We are supposed to be building an Encyclopedia to be comparable with Britannica. Linking to unpublished sideline website theories is going to turn this article into a joke. --Sadi Carnot 12:19, 9 October 2006 (UTC)
Utter nonsense, and an ad hom. You need to go back and actually read Jim62sch's and Dave souza's contributions to Intelligent design if you have been so obtuse as to think they are proponents. I have lost track of how many times Jim has been attacked by ID proponents for his efforts to prevent the ID article from becoming a silly Discovery Institute promotional article. Further, whether they are Bible-thumping Christians or Wiccans or Muslims or Atheists has no bearing on this article. That is a classic example of an ad hominem, and you are continuing your pattern of personal attacks, even after being warned. Do it again and I will block you to give you time to read the WP:NPA and possibly enough time to read and familiarize yourself with all the WP:RULES. KillerChihuahua?!? 13:25, 9 October 2006 (UTC)
I would hope to see the personal differences here set aside as quickly as possible, and the disagreement resolved on its merits.

It appears to me that part of the problem here is over the common meaning of the word "disorder" or "disordered". Despite the use of the word to describe entropy, despite the involvement of chaotic factors and difficulties defining the boundary characteristics of closed systems, entropy is a process that is as orderly and quantifiable as are "permeability", "diffusion", "homeostasis", "absorption", "osmosis", "convection" and other such dynamics. It seems to me the use of analogies (such as were used in the recently deleted section) does not threaten the order of the cosmos or even the order of WP, nor should it threaten editors of the current article. Unlike the article on the Second law of thermodynamics, this article need not, in my judgment, be necessarily limited to formulaic and/or logical positivist explanations of the concept of entropy. Surely there must be room to gain a stable consensus on agreeable ways to explain this stuff to the reader in an analogical way, per WP:NOR, WP:VER, and WP:NPOV#undue_weight of course. ... Kenosis 16:34, 9 October 2006 (UTC)

Can somebody pls clarify over which part of the article (if at all) we are arguing in this section? --Pjacobi
I'd love to, but so far as I can tell this section was started by Sadi to malign Dave souza and Jim62sch, accusing them of being Intelligent design proponents and of inserting a religious POV into this article (no diffs or examples given for this assertion). If you have edited the ID article with them, I'm sure you see the bizarreness of this accusation. If he was trying to actually discuss content of the article in this section, he failed utterly. Sadi has been warned twice by me about violating NPA, he has been informed of various other rules and guidelines he has been violating and/or unaware of by both me and Pjacobi, and his attack article and POV split article have been edited by others to a more neutral and informative position. He stated on the 2LOT talk page that he is taking a wikibreak - hopefully he will come back renewed and with a more productive approach. KillerChihuahua?!? 19:00, 9 October 2006 (UTC)
AFAIK I've stayed out of ID as us Krauts see this as an internal affair of the U.S.
So perhaps the relevant part of this discussion can move to Talk:Second law of thermodynamics, where it seems to need clarifying:
  1. why (if at all) ID proponents see the 2LOT as an argument for the necessity of an intelligent designer
  2. why that argument is bogus
19:10, 9 October 2006 (UTC)

Yup, been there done that, got tired of it and stopped editing 2LOT for about six months. See Talk:Second law of thermodynamics/creationism for the old discussion. Wade wanted it in, I didn't, vote on the straw poll was about split. Even Wade (a creationist and ID proponent) knew it was bogus, so there shouldn't be any editing wars about that, at least. And now I move we close this section and get back to the Entropy article here. KillerChihuahua?!? 19:50, 9 October 2006 (UTC)

I want to thank KC for her spirited defense of Dave's and my honour. •Jim62sch• 21:18, 9 October 2006 (UTC)

Caratheodory and other notable people

We have one mention of Lieb and two of Caratheodory as suggested additions to the article. Anyone else?

RESPONSE from Lambert

I find it difficult, as I must, to refrain from vigorous and unacceptable language. Please consider it interspersed between each word of the following.

My goal in spending a great deal of time on this Talk:Entropy list in July, and now, was only to have the Wikipedia Entropy section written so as to aid beginning students and the general public who might access Entropy. My contributions to the field, as you now can see in detail at "Non-notable?"(written due to YOUR DEMAND, not my choice) have been adopted by the majority of authors of US general chemistry texts to be the way entropy should be presented to beginners. (The change also in Atkins is not trivial.)

However, those views that were patiently and slowly developed in July and recently were not considered/deliberated seriously by Sadi Carnot, Kats and the three or four others -- a very small number of individuals, none with experience in educating chemistry students -- who have dominated the Talk:Entropy page since I first saw it in July.

I care NOTHING about my name being mentioned ANYWHERE! In a series of statements from Carnot, Pjacobi and others recently, anything I wrote at the time or in the past was denigrated because it was my POV or that I was not notable because my views had not "come from others". THAT is the only reason that I took a day -- I do not type rapidly, I am now 88 -- to prepare the formal list of texts and literature citations.

And yet -- is it conceivable? -- Sadi Carnot, after that detailed list of my peer-reviewed articles and the amazingly long list of textbooks that have changed in only four years, writes of me "....who has never published...for a peer-review.." Sadi demonstrates here that he does not read carefully. His ten year investment in reading [in the same careless manner he approaches reading my presentations? I hope not.] has resulted in his sense of ownership of the Entropy article in Wikipedia. My peer-review has not been only three seminal articles. It consists of 36 established textbook authors, my peers and superiors, whose jaundiced eyes are far more reliable, I believe, as to the educational worth of any statement about entropy than those of a wannabe Sadi Carnot.

Forget me and my name. What I think is important is that this approach involving energy dispersal be made an initial part of the Entropy page because young students are impatient -- some even desperate -- when they hit Wikipedia for a quick explanation of entropy. They are NOT, in general, going to do a scholarly scan of the whole article and then go to other links. I believe the same is true for most of the general public who will be accessing Entropy in Wikipedia. As I've said to some of you, "Does Wikipedia serve the learner as well as the learned?"

Forget me. Omit my name completely from anywhere. Literally. I do NOT care. But I DO care for what has been my life's devotion. So, for the sake of millions of students now and in the future, please don't omit an approach to entropy that is now seen as valid by so many text authors. FrankLambert 17:23, 9 October 2006 (UTC)


Eighty-eight? Hope your health is stable and we can call on you for another decade at least, should we need to get clarifications directly from a primary-source author. ;-) .
I wrote 88, not 188 ! I did not know Lazare Carnot personally :-) FrankLambert 23:48, 9 October 2006 (UTC)
Well, OK, so it's secondary or tertiary source material. But it's plainly not "original research" in WP parlance. ... Kenosis 00:42, 10 October 2006 (UTC)

As I said above, I trust the personal frustrations can be put aside, and a stable consensus achieved about how to integrate a few useful analogic explanations into the article. Speaking for myself to all involved editors, in my judgment this article needn't be a totally positivist, technical explanation (as would be more arguable in the article on the second law, though even there laypersons may deserve a good common-sense analogy or two to help them along). Certainly there must be some reasonable way to accomplish a workable combination of technical accuracy and reasonable analogies that would be understandable to the average layperson. ... Kenosis 18:06, 9 October 2006 (UTC)

Without retreating on the front of content issues, I want to apologize for my remarks about self-promotion and everything which may be read as a personal attack. I'd guess that User:Sadi Carnot's creation of the biography article Frank Lambert was a disservice to you. --Pjacobi 18:19, 9 October 2006 (UTC)
It's probable that the brief summaries I've introduced into articles have failed to do Frank Lambert's views justice, but the point doesn't seem to have come across that his approach is an introductory presentation for school students and undergraduates which leads on to, and does not contradict, both classical and statistical thermodynamics. The "dispersal" description is used as a less ambiguous analogy for beginners than the "disorder" approach which has led to much misinterpretation, though it can be useful as long as "disorder" is carefully defined. Others have expressed valid concerns that the introductory analogy should be accurate and robust, and have questioned its application to advanced problems. Evidently several authors of school and undergraduate chemistry textbooks did not find this a concern. Anyway, this approach has widespread currency, not least because of recommendations such as Chemical Education Resources and Instructional Resources for Chemistry Educators. If the approach is valid as an introduction for beginners I feel we should mention it in that context, if there are reliable sources showing problems with this approach we should point them out, and should note any limitations. This in no way implies that we should override or neglect more established approaches. ...dave souza, talk 19:38, 9 October 2006 (UTC)
It looks to me like you have a good handle on how to phrase this to add it into the article, Dave - care to give it a try and see if we can move forward on this? KillerChihuahua?!? 19:51, 9 October 2006 (UTC)
I'd say this part On a molecular basis, entropy increase means that a system changes from having fewer accessible microstates to having a larger number of accessible microstates (from [15]) is uncontroversial, not to say canonical. For the "macroscopic" side, the most needed clarification by FrankLambert would be, whether he assumes to have found something new -- or if not, give references to earlier authors giving this interpretation.
Also, without proposing this as an absolute criterion for inclusion, results from Google Scholar seem to indicate a rather low impact of the approach.
Pjacobi 20:02, 9 October 2006 (UTC)
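As an aside, the "more accessible microstates" description quoted above can be made concrete with a toy combinatorial model. The two-box particle model below is my own illustration (not taken from any post in this thread): with N labelled particles split between two halves of a box, the macrostate "n particles in the left half" has Ω = C(N, n) microstates, and S = k_B ln Ω.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def microstates(N: int, n_left: int) -> int:
    """Number of microstates of the macrostate 'n_left of N particles in the left half'."""
    return math.comb(N, n_left)

def entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B ln(omega)."""
    return k_B * math.log(omega)

N = 100
omega_clumped = microstates(N, 10)  # 10 particles left, 90 right
omega_even = microstates(N, 50)     # even split

# The even split is reachable via vastly more microstates, so it has the
# higher entropy: a change toward it is an entropy increase.
assert omega_even > omega_clumped
dS = entropy(omega_even) - entropy(omega_clumped)
print(dS > 0)  # True
```

The point of the sketch is only that "entropy increase" and "more accessible microstates" are the same statement once Ω is counted.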
The manner in which the energy dispersal theory is applied to the entropy of mixing example needs to be clarified before the theory is credible. The entropy of mixing example stands as a case in which there is certainly no NET spatial energy dispersal, yet there is entropy increase. Saying that the energy of one of the mixing gases disperses does not constitute what is commonly known as dispersal since energy is not "tied" to a particular particle for any more than the time until its next collision. The dispersal of the particles does not go in lockstep with the dispersal of the energy they happen to have at some instant. If there is an energy dispersal in the mixing case which is inextricably bound to the entropy increase, this is not it. I understand that a discussion of the merits of the theory or lack thereof is of little value in this argument, but I just couldn't help myself. PAR 00:49, 10 October 2006 (UTC)
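PAR's mixing puzzle can at least be stated quantitatively. The sketch below uses the standard ideal entropy-of-mixing formula, ΔS = -nR Σ x_i ln x_i; the function name and the worked numbers are my own choices for illustration. Note that the formula depends only on mole fractions: temperature, pressure and energy density are unchanged by the mixing, which is exactly the difficulty raised above.

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def mixing_entropy(n_total: float, mole_fractions: list[float]) -> float:
    """Ideal entropy of mixing: Delta_S = -n R sum(x_i ln x_i).

    Depends only on the mole fractions x_i; T and p drop out entirely.
    """
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# 1 mol of gas A mixing with 1 mol of gas B at the same T and p:
dS = mixing_entropy(2.0, [0.5, 0.5])
print(dS)  # ~11.53 J/K, i.e. 2 R ln 2
```

So the entropy increase on mixing is real and calculable even though no temperature gradient exists, which is why the "spatial energy dispersal" reading of this case is contested in the discussion that follows.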
RE "The manner in which the energy dispersal theory is applied to the entropy of mixing example needs to be clarified before the theory is credible.":

First, it is not the credibility of the theory that is at issue, but the usefulness of the explanation to the reader of the WP article as proposed to be drawn from Lambert's example of a growing contemporary pedagogical approach, that of describing entropy as "energy dispersal" rather than as "disorder".

Second, there is no contradiction at all between describing entropy as "energy dispersal" and the mixing example. The mixing example is a combination of two basic concerns, that of the two substances mixing physically and that of their respective energy seeking equilibrium among the two original sets of particles now mixed. ... Kenosis 03:02, 10 October 2006

If the entropy of mixing case is not a case of energy dispersal but rather of particle dispersal, then how is it useful to say that entropy increase is conceptually the same as energy dispersal? Why not say "entropy increase is energy and/or particle dispersal?"
"their respective energy seeking equilibrium among the two original sets of particles now mixed" is not the same as the spatial dispersion of energy. In the mixing example, the energy density, the temperature, the pressure are all constants during the mixing. Please define the concept "dispersal of energy" for the mixing example. I cannot believe that clearly defining it detracts from its usefulness.PAR 07:38, 10 October 2006 (UTC)
RE "Why not say "entropy increase is energy and/or particle dispersal?" The reason is that the particle dispersal is not entropy, any more than sound waves literally "move air". In both cases the displacement of the particles is no more than necessary to move the energy to the adjacent sets of particles. In the mixing example there is both particle dispersal and entropy, or energy dispersal. That's why a special formula is used to describe the idealized mixing process, in order to account for both aspects of the mixing process. ... Kenosis 15:53, 10 October 2006 (UTC)
And, "their respective energy seeking equilibrium among the two sets of particles now mixed" is in fact a spatial dispersion of energy. It's just more difficult to pinpoint when two substances are mixed together. That's why the two phenomena of particles mixing and energy being exchanged need to be described by a formula that factors in both of these phenomena in the idealized situation of two separate fluids combined into a homogeneous mix. ... Kenosis 19:06, 10 October 2006 (UTC)
Before I respond, please, this is perhaps the tenth time I have asked this question on this page, and I have never gotten a clear cut response, just more hand-waving and pointers to hand-waving web pages. Please define quantitatively the concept "spatial dispersal of energy" for the mixing example. What energy is being dispersed, and how is "dispersal" into physical space defined? Can you define it in terms of physical concepts? PAR 16:57, 10 October 2006 (UTC)
This is the first time I've seen the question. My response just above speaks to the special case of two substances mixing. And, it is not an energy dispersal into physical space, but through physical space, passed along via successive particles to one another. ... Kenosis 19:12, 10 October 2006 (UTC)
Ok, through. But do you see what my problem is? Repeatedly, over and over again, I ask "please define energy dispersal in a quantitative way" and over and over again, there is no clear answer. If you look at your above response, you have not answered the question. Please, just explain to me why. Is it because you believe you have answered the question? Is it because you forgot to answer the question? Is it because you are not clear what I mean when I ask the question? What? PAR 20:41, 10 October 2006 (UTC)

Energy dispersal is quite well quantified by the formula S = k_B ln Ω, where Ω is the number of microscopic configurations and k_B is Boltzmann's constant. It is quantified as S in a thermodynamic system (the current state of energy dispersal throughout a defined environment), and changes in energy dispersal are given by the entropy change ΔS. ... Kenosis 21:22, 10 October 2006 (UTC)
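To put one number on the Boltzmann relation under discussion here, a standard textbook case (my choice of example, not drawn from this thread) is free expansion: when N ideal-gas particles are let into twice the volume, each particle's accessible positions double, so Ω grows by 2^N and ΔS = k_B ln(2^N) = N k_B ln 2.

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number

# Free expansion of one mole of ideal gas into twice the volume:
# Omega -> Omega * 2**N, so Delta_S = N * k_B * ln 2 = R ln 2.
N = N_A
dS = N * k_B * math.log(2)
print(dS)  # ~5.76 J/K
```

Whether that ΔS is best described to beginners as "energy dispersal" is of course the point being argued; the arithmetic itself is uncontroversial.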

What you have given is the definition of entropy change, not of energy dispersal. The whole point of this discussion is whether the above defined entropy change is equivalent to energy dispersal. You cannot just say its true by definition, you have to prove it. To prove it you must define energy dispersal and then show that it is equivalent to entropy change.
The definition of energy dispersal should sound something like "if some energy is concentrated at some point in time, and is more spread out at a later point in time, then that energy has dispersed". I mean, that's not a very rigorous definition, but it's a huge advance over anything I have heard from proponents of the energy dispersal idea, or read on any web page. Please give me your definition of energy dispersal, so we can actually talk about whether it is equivalent to entropy change. PAR 00:03, 11 October 2006 (UTC)
The description User:PAR just gave is a reasonable description of heat entropy, or extent-of-energy-dispersal, in a solid. Heckuva lot more understandable than the classical approach of referring to it as "disorder". And Delta-S would be the energy dispersal in progress. ... Kenosis 00:35, 11 October 2006 (UTC)
Which part of what you're doing is OR are you not understanding? We are not writing a thesis, we are writing an encyclopedia. •Jim62sch• 00:29, 11 October 2006 (UTC)

Folks, Please keep in mind that there are plenty of systems which have no concept or definition of energy or temperature, but for which one can explicitly and precisely define entropy. This is generally the case for information theory (information entropy), and for many cases studied in dynamical systems (Kolmogorov-Sinai entropy). Insofar as energy is a kind of constant of motion that is kind-of-like conjugate to time, then maybe the change of entropy over time is kind-of-like energy dispersal... Hmmm. I'm not really aware of any way to take the formal mathematical definition for a system with dissipation (wandering set), and turn it into some theorem involving entropy. But this could be an interesting topic to explore ... Is entropy kind-of-like the dissipation of constants of motion over time? Hmmm. linas 06:24, 10 October 2006 (UTC)

Let me know what you come up with. I've been trying to very clearly understand why entropy increase is (or is not) spatial energy dispersal, particularly in the entropy of mixing case. It's not as simple to conceptualize as it sounds. PAR 07:38, 10 October 2006 (UTC)
linas, you're right to point out that this introductory approach is specific to thermodynamic entropy: it gives concrete examples to assist students who have great difficulty in grasping mathematical abstractions, in the same way that the bouncing balls blown by air often used in a lottery can help such students conceptualise the movement of molecules in a gas. This vision of continuous movement and molecular collisions can then lead to showing the possibilities of many Boltzmann distributions and a continually changing "distribution of the instant". Hence on to the idea that when the system changes, dynamic molecules will disperse into a greater number of accessible microstates. PAR, I think that may be the point you're looking for: the analogy of dispersal being used to put across the concept at a molecular level as well as describing macro scale spatial dispersal, always remembering that this is just a way of describing an abstract concept to more literal-minded beginners and not a new theory in any way. It's not an entirely new term for this idea, having been used in 1997 and 1999, but the significance of Lambert's approach is that its use in chemical education and the accessibility of his websites mean that many students are now starting off with this simple analogy. The relationship of thermodynamic entropy to abstractions covering many situations is complex and interesting, as indicated by various controversies about Boltzmann's ideas, but this is really a separate topic which is already mentioned here and could be developed further in appropriate articles. ..dave souza, talk 09:37, 10 October 2006 (UTC)
Yes, Dave, the intro is overly specific to Carnot's original approach in my judgment, using as it does the "work" example. Dissipation and entropy are phenomena not limited to that energy which is unavailable to do a certain kind of "work" intended to be done by those applying a given technology or mechanics to a task. It is far more universal than that. Perhaps the "work" example currently in the second introductory paragraph should be presented in the "history" section instead. ... Kenosis 16:34, 10 October 2006 (UTC)
(edit conflict) Note the lead of this article: "In thermodynamics, entropy, symbolized by S..." [emphasis added] Note also the disambig: For entropy in information theory, see information entropy. For connections between the two, see Entropy in thermodynamics and information theory. For other uses of the term, see Entropy (disambiguation), and further articles in Category:Entropy. Thus, I fail to see how Folks, Please keep in mind that there are plenty of systems which have no concept or definition of energy or temperature, but for which one can explicitly and precisely define entropy. This is generally the case for information theory (information entropy), and for many cases studied in dynamical systems (Kolmogorov-Sinai entropy). is relevant.
Also, determining the validity of the theory (not that it is a theory per se; it's simply a better way of explaining entropy, the ever-popular "disorder" and "chaos" related definitions being simply absurd definitions of thermodynamic entropy) is not our job, and falls into the realm of OR. What is our job is to determine if the sources provided, at your request, by Frank Lambert meet WP:V and WP:RS. They do. Hence the definition of entropy equaling energy dispersal needs to be included.
I also find all of this hand-wringing over a definition of "dispersal" to be ironic, given that the definition of "entropy" itself wanders all over the place, and that seems not to bother a soul here. Remember, this is an encyclopedia, not a doctoral thesis.
More later. •Jim62sch• 09:51, 10 October 2006 (UTC)

Removed sentence from section on "Thermodynamic system"

I've removed the following sentence, because its phrasing and placement in the section were confusing if not incorrect. It was impossible to tell what the "converse" was here. ... Kenosis 20:15, 10 October 2006 (UTC)

  • "Although the converse is not necessarily true, for example in the entropy of mixing." ... 20:15, 10 October 2006 (UTC)

I replaced the reference to "entropy of mixing" at the end of that section, explaining that it's a specialized case of entropy described by its own formula. ... Kenosis 20:25, 10 October 2006 (UTC)

I just read the article on entropy of mixing and double checked a couple of independent sources, and now understand that this formula for mixing presumes equal temperature of the two substances being mixed. Didn't know that before now. Thus, the entropy of mixing refers to distribution of particles into a homogeneous particle mix. This is an entirely different use of the word entropy which does not involve thermodynamics. Learning new things here. Nonetheless, as to thermodynamic entropy, it remains that dispersal of energy from warmer particles to cooler particles always represents an increase of entropy in a closed system, consistently with the Second Law. ... Kenosis 21:53, 10 October 2006 (UTC) I just gave this last comment a strike-through, as I've found the formula for entropy of mixing does not necessarily assume a starting point of equal temperature, but merely a homogeneous mix at the conclusion of the mixing process. My mistake. ... Kenosis 22:39, 10 October 2006 (UTC)
There was quite a discussion about this in the first section of Talk:Entropy/Archive4 with an interesting thought experiment set out by Jheald at 01:51, 29 September 2006. It's an issue which Frank Lambert touches on in his Notes for a “Conversation About Entropy” from a seminar on November 8, 2005 at California State Polytechnic University, Pomona. ...dave souza, talk 22:36, 10 October 2006 (UTC)
In other words, we are talking about two separate applications of the term "entropy" here, thermodynamic energy dispersal, and physical particle dispersal, each with its own characteristics. I hadn't known the term was accepted in such an application to the physical mixing process. Very interesting. ... Kenosis 23:29, 10 October 2006 (UTC)
Of course, if they're not at an equal temp, energy dispersal still applies, no? •Jim62sch• 23:55, 10 October 2006 (UTC)
Evidently yes. ... Kenosis 00:27, 11 October 2006 (UTC)

The definition of physical entropy (S = k ln W) takes into account both "thermodynamic entropy" and "configurational entropy". It's an unfortunate choice of names - it implies that configurational entropy is somehow "outside" of thermodynamics, which it is not. Jheald's thought experiment was an excellent way of seeing that. Also, you are disagreeing with Frank Lambert when you call configurational entropy a dispersal of particles and thermodynamic entropy a dispersal of energy. According to the reference provided by Dave Souza, his position is that even configurational entropy constitutes a dispersal of energy (with which I disagree). PAR 01:25, 11 October 2006 (UTC)

I'm strongly displeased with the notion of "thermodynamic entropy" and with attempts to separate specific applications of entropy from a narrower concept of "thermodynamic entropy". Whether mixing entropy or black holes, it's all the same physical quantity, named entropy in my books.
Also, mixing entropy would be included even in this narrower definition.
Pjacobi 08:23, 11 October 2006 (UTC)
Agree with Pjacobi: "thermodynamic entropy" means physical entropy. So-called "configurational entropy" is part of thermodynamic entropy. Redefining thermodynamic entropy as anything different from this is deeply misguided and likely to confuse. Jheald 17:18, 11 October 2006 (UTC).
Which seems to be very much what the linked "Conversation about entropy" says: "A designation of ‘configurational’ or ‘positional’ entropy is an unfortunate artifact from poor communication by statistical mechanics experts." ..dave souza, talk 17:39, 11 October 2006 (UTC)

The basics

Regarding the insistence on stating that there is a net "decrease" of entropy in the hypothetical room in which ice is placed into a glass of water, please let's at least get the basics straight. When a room (a closed system) gives up some of its available heat energy to a glass of ice water within it, the room's entropy has increased in between the time the ice was placed into the glass and the time that the contents of the glass has returned to room temperature. That is, there is a net loss of available energy left to dissipate within that room. That is a most basic principle of entropy. The Second Law tells us that this room, having introduced the added component of the ice, will seek such an increase of entropy until it arrives at a steady-state maximum value of dispersal of its available heat within its boundaries. ... Kenosis 00:46, 11 October 2006 (UTC)

Here is the sentence in question:

"the entropy of the system of ice and water has increased more than the entropy of the surrounding room has increased."

It is extremely clear that there are two systems here - the glass of ice water and the surrounding room. Each is an open system, both taken together form a closed system. Any object which simply grows colder, loses entropy. If it is cooled to absolute zero, its entropy is zero. (or really small anyway). The room grows colder, therefore its entropy decreases. The glass of ice water grows warmer. Any object which simply grows warmer, gains entropy. The second law says that the total entropy of the closed system - that is the sum of the entropy of the ice water and the surrounding room - increases as the energy flows out of the room and into the ice water. That means that the increase in entropy of the ice water is greater than the loss of entropy by the surrounding room. The correct statement is:

"the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased."

PAR 01:15, 11 October 2006 (UTC)
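The bookkeeping PAR describes can be sketched numerically. A minimal Python check, in which all numbers (the heat transferred and both temperatures) are assumed purely for illustration:

```python
# Sketch of the second-law bookkeeping for the ice-water example:
# Q joules flow from the warm room (T_room) into the ice water
# (T_glass). Temperatures must be absolute (kelvin). All values
# here are illustrative assumptions, not measured data.
Q = 1000.0        # heat transferred, J (assumed)
T_room = 298.0    # room temperature, K (assumed)
T_glass = 273.0   # ice-water temperature, K (assumed)

dS_room = -Q / T_room    # room loses entropy (negative change)
dS_glass = Q / T_glass   # ice water gains entropy (positive change)
dS_total = dS_room + dS_glass

# Because T_glass < T_room, the gain exceeds the loss, so the
# total entropy of the combined system rises, as the second law demands.
print(dS_room, dS_glass, dS_total)
```

The sign pattern, not the particular numbers, is the point: the same Q divided by a lower temperature yields the larger entropy change.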
No. The room is a closed system for purposes of this analysis, with the glass of ice water a component within that closed system. A closed system does not mean a homogeneous isolated system with no heat variations within it. That's what happens when entropy reaches its maximum value within the closed system. (Recall, of course, that no system is entirely closed no matter how well you insulate it. If the argument is that the room is not truly a closed system, then the whole explanation should be eliminated. But that's not what I'm advocating.) ... Kenosis 01:24, 11 October 2006 (UTC)

Moreover, several statements made just above betray a very fundamental misunderstanding of what entropy is: User:PAR's statements "Any object which simply grows colder, loses entropy. If it is cooled to absolute zero, its entropy is zero. (or really small anyway). The room grows colder, therefore its entropy decreases." are simply incorrect, and confuse entropy (the quantity S) with heat itself. ... Kenosis 01:28, 11 October 2006 (UTC)

Thermodynamically, by the definition of entropy, a small change in entropy is

:<math>dS = \frac{\delta Q}{T}</math>

where <math>\delta Q</math> is the amount of heat added (or subtracted if it is negative) and T is the temperature. In the above case, <math>\delta Q</math> for the room (without the ice water) is negative, since heat is being lost by the room to the ice water. In other words, the entropy loss by the room is equal to the heat energy lost by the room to the ice water, divided by the temperature of the room. PAR 02:04, 11 October 2006 (UTC)


No! The formula is:

:<math>\Delta S = \frac{Q}{T_2} - \frac{Q}{T_1}</math>; or

:<math>\Delta S = Q \left( \frac{1}{T_2} - \frac{1}{T_1} \right)</math>, where the SI unit of entropy is "joule per kelvin" (J·K−1), which is the same as the unit of heat capacity.

PAR, I believe you're confusing entropy with heat itself. Entropy is the dispersal of heat energy throughout a medium or into surrounding media. Delta-S is the change in dispersal of heat energy across a medium or into surrounding media, and not the change in temperature. ... Kenosis 02:39, 11 October 2006 (UTC)

The formula I gave was for a small change, so small that T2=T1. The formula you have given is derived from the formula I gave, and it holds for larger changes. There is no disagreement between the two. Your formula states that the entropy of the room (minus the ice water) decreases. Suppose the temperature at the earlier time is T1=30 degrees C. Then, at a later time, the room has cooled to T2=29 degrees. The (1/T2-1/T1) term equals 1/870, a positive number. The heat term Q is negative, because heat is being lost by the room, and the subsequent entropy loss by the room is then Q/870. I am not confusing heat and entropy; it's just that in this particular case they are closely related mathematically. PAR 02:59, 11 October 2006 (UTC)
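The arithmetic step above can be verified exactly with Python's `fractions` module (30 and 29 are the illustrative temperature values from the comment, not physical recommendations):

```python
from fractions import Fraction

# Exact check of the "(1/T2 - 1/T1) = 1/870" step, using the
# illustrative values T1 = 30 and T2 = 29 from the discussion.
T1, T2 = 30, 29
diff = Fraction(1, T2) - Fraction(1, T1)
print(diff)  # 1/870, a positive number, as stated
```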
Closely related maybe, but it's the difference between expressing entropy to the reader effectively or not, because it's the difference between a positive and a negative number. If you define the room as a closed system, after you add several ice cubes to its environs, its entropy gradually increases as the room's energy is dissipated into the ice and it melts into the glass, which in turn is brought back to room temperature. The process of the ice melting (the one described in that section and in the illustration) is one of entropy increasing in the room between the starting time of adding the ice and the ending time of maximum entropy in the room. ... Kenosis 03:11, 11 October 2006 (UTC)


Ok, regarding the sentence, just so we don't get mixed up, we agree that there are three systems here:

  • The glass of water (g)
  • The room without the glass of water (r)
  • The room with the glass of water (R= r and g taken together)
I think we can agree that R is a closed system. I'm ok with that. We agree that r and g are open systems, right? There's energy flowing from the room r to the ice water g  as the ice water melts, so they are both open systems. I am saying that as the situation goes to equilibrium, the entropy of g increases, the entropy of r decreases, and the entropy of R, which is the sum of both of the above entropies, increases, as demanded by the second law, since R is closed. Maybe the sentence could read:

"the entropy of the system of ice and water has increased more than the entropy of the room and ice water together have INcreased."

or maybe:

"the entropy of the system of ice and water has increased more than the entropy of the room (excluding the ice water) has DEcreased."

PAR 03:14, 11 October 2006 (UTC)

The former of the two options just presented is definitely preferable. Please just go ahead and implement it in the article. ... Kenosis 03:22, 11 October 2006 (UTC) And, just to clarify a very basic aspect, I do not agree that r and g are "open systems" when taken together. If that were the case, the second law of thermodynamics would be a useless curiosity. The second law presumes that some element of non-equilibrium has been introduced into the closed system (imperfectly insulated as the boundaries always are); otherwise there's no point in analyzing it because it'd already be in thermoequilibrium. ... Kenosis 03:51, 11 October 2006 (UTC)
Maybe a bit later we can say it so it's yet more explanatory for the reader of the article. I'm getting to appreciate the value of FrankLambert's pedagogical approach more and more as each hour passes (Q: is that a form of ideological entropy? ;-) . ... Kenosis 03:35, 11 October 2006 (UTC)
I agree that r and g are not "open systems" when taken together. When taken together, they form R, the closed system. I rewrote the section to make it clear that the "room" is separate from the ice water, so the problem is addressed. PAR 11:49, 11 October 2006 (UTC)


Wish I'd seen this before now....maybe could have saved people some time :-). (I greatly admire that beautiful illustration of glass, ice, water, lemon, obvious nice warm room.) But an 18g ice-cube in a styrofoam cup would be a simple system (usable for quant talk to kids, next). Then when this 'system' is quickly inserted in the warm surroundings of an isolated room, the modern 2LOT that predicts motional energy of faster moving molecules will spread out if they are not constrained and leads pretty quickly to what happens. So, because q energy from the warmer room spreads out in the cooler space of the ice cube and [q/'warmer room' T] is less than [q/'lesser than room'T], the entropy of the system increases more than the room surroundings decreases. At equilib then the total entropy of this sys + surr increases. As you have concluded. FrankLambert 20:14, 11 October 2006 (UTC)
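FrankLambert's q/T comparison can be made concrete with rough numbers. The enthalpy of fusion (~334 J/g for ice) and the room temperature below are assumed for illustration only:

```python
# Numeric sketch of the q/T argument for an 18 g ice cube in a warm
# room, with assumed textbook-style values: enthalpy of fusion
# ~334 J/g, ice melting at 273 K, a warm room at ~293 K.
q = 334.0 * 18.0               # heat absorbed by the melting ice, J
T_ice, T_room = 273.0, 293.0   # K (assumed)

dS_system = q / T_ice            # entropy gained by the ice (q over the cooler T)
dS_surroundings = -q / T_room    # entropy lost by the room (q over the warmer T)
dS_universe = dS_system + dS_surroundings

# q/273 > q/293, so the system gains more entropy than the
# surroundings lose, and the total increases at equilibrium.
print(dS_system, dS_surroundings, dS_universe)
```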
In other words, as S reaches a maximum value in the room after the ice melts and equilibrium is again achieved, ΔS rolls off to zero. The entropy of the entire system (room with ice water) has increased, in keeping with the second law. But when taken as separate systems, entropy increases in the glass and decreases very slightly in the room (surrounding environment). PAR, the current version is much, much clearer than before in my opinion. Maybe at some point it can be improved further yet to include something that explains this combination of systems in keeping with the basic principle expressed by 2LOT? For the moment, thanks very much to both PAR and FrankLambert. ... Kenosis 23:49, 11 October 2006 (UTC)
Yes, thanks for your input on this FrankLambert - also, please note a consistent error in the above discussion, when we were saying "closed" we should have been saying "isolated". A closed system allows energy but not particles to flow into it. An isolated system allows neither energy nor particles in. PAR 20:04, 13 October 2006 (UTC)
Ahh. Now that reconciles the math to be in keeping with 2LOT. OK then. so the Delta-S of this isolated system is slightly negative upon insertion of the ice, some of the room's available energy dissipated into the ice water (assuming here that a thermostat's not causing a heating system to inject more energy into the room to make up the difference, of course). Still a couple of gaps in the analysis, but it makes some better sense now. ... Kenosis 23:05, 13 October 2006 (UTC)

NPOV

Has nobody noticed that putting a particular theory on a separate page is completely in contradiction to NPOV? The only way to do something of the sort within the rules might be to link to an external source. It is really just the same as Origin of species (Biology) and origin of species (biblical) DGG 05:03, 13 October 2006 (UTC)

As stated at #Intelligent design above by KillerChihuahua at 19:00, 9 October 2006, discussion here and mention in the article of a simplified teaching approach for beginners (NOT a theory) was moved by an opponent of these ideas to a new "attack article and POV split article", which is jargon for the point you're making: see Wikipedia:Neutral point of view#POV forks. However, as Wikipedia:Content forking#Article spinouts - "Summary style" articles points out, it is often good practice to have a brief summary on the main article linked to a spinoff article giving additional detail. By the way, origin of species is a redirect to an article about the book On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life, the various theories and religious faiths all have their own articles: Wikipedia is not a paper encyclopedia. ...dave souza, talk 10:24, 13 October 2006 (UTC)


The “Entropy of Mixing “

Professor emeritus Harvey Leff (Am. J. Phys. 1996, 64, 1261-1271; "Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing", IOP – now Taylor & Francis, 2003) has read the many requests that PAR posted to Talk:Entropy for a mathematical quantification of the spatial dispersal of molecules in mixing. Leff stated that, although noble in principle, a more detailed quantitative treatment of molecular behavior in mixing is probably impossible, but also unnecessary.

This statement of Leff is augmented here by quotations from the literature and an illustration of how readily the necessary quantification of molecular behavior in mixing is obtained.

Levine (“Physical Chemistry”, 5th Edition, 2002, McGraw-Hill; p. 90) states: “The entropy of a perfect gas mixture is equal to the sum of the entropies each pure gas would have if it alone occupied the volume of the mixture” (temperature unchanged). Meyer (J. Chem. Educ. 1987, 64, 676 ) stated, ”…processes generally described as ‘mixing’, that is, a combination of two different gases …has absolutely nothing to do with the mixing itself of either” (italics in the original). Craig ("Entropy Analysis", 1992, VGH Publishers; p. 92) says about the "entropy of mixing" (quotes are his), "It is the spreading out of the molecules in space that is crucial, not the intermixing of them."

No “special equation” is necessary to explain the results of mixing, i.e., of spontaneous gas expansion at T. The classic macro expression for that expansion per mole at constant T is ΔS = R ln(VFinal/VInitial), i.e., ΔS = R ln(V2/V1).

When A and B are mixed, their groups of different molecules can move to spread their “A” or “B” internal energy throughout the larger combined volume. From classic considerations of quantum mechanics of a moving particle in a box, energy levels of any Boltzmann distribution in a larger box are closer or ‘denser’ than in a smaller volume. Thus, if energy levels are closer in the final volume of A and B, their respective initial internal energies now have many more energy levels for their individual molecular energies (alternatively, ‘for their moving molecules having particular energies’). The result is an increase in accessible microstates for A and for B in the final mixture at equilibrium – just as there is the same increase in either A or B alone expanding to that final volume.

Qualitatively then, because entropy increase reflects an increase in the number of accessible microstates, the entropy for A and for B clearly will be greater in the final volume of the mixture than when they were separate in their smaller original volume.

Using semipermeable membranes, a possible reversible process for changing the unmixed initial to the mixed final state is illustrated in Levine (Fig. 3.9). The macro equation of ΔS = R ln V2/V1 , for one mole of either A or B, is ΔS = R ln 2V1 / V1 = R ln 2 = 5.76 J/K. Thus, the total entropy increase for A and B in the final mixture is 11.5 J/K. The conceptual meaning of this entropy increase is identical to that of any substance’s internal energy becoming more dispersed by its molecules colliding and moving into a greater volume of three-dimensional space, if it is not constrained.

Of course, the same results come from using mole fractions, xA and xB (as does the example in the Entropy of Mixing). In the present example of A and B, both xA and xB are 1 initially, and after mixing are 0.5. Thus, for either A or B, the result of mixing is an entropy change of ΔS = - R ln 0.5/1 = 5.76 J/K and the total entropy change for both is 11.5 J/K. FrankLambert 14:16, 17 October 2006 (UTC)
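Both routes to the mixing-entropy figures above can be checked in a few lines of Python (taking R = 8.314 J/(K·mol)):

```python
import math

# Check of the mixing-entropy figures in the example above: one mole
# expanding from V1 to 2*V1 gives dS = R ln 2, and the mole-fraction
# route (x = 0.5 after mixing) gives dS = -R ln 0.5 per mole.
R = 8.314  # gas constant, J/(K·mol)

dS_per_gas = R * math.log(2)           # per mole of A (or B), ~5.76 J/K
dS_total = 2 * dS_per_gas              # A and B together, ~11.5 J/K
dS_mole_fraction = -R * math.log(0.5)  # identical to dS_per_gas

print(round(dS_per_gas, 2), round(dS_total, 1))  # 5.76 11.5
```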

"... gulp ..." ... Kenosis 14:39, 17 October 2006 (UTC)

Discussion of 'entropy of mixing'

I of course have to respond -

"The entropy of a perfect gas mixture is equal to the sum of the entropies each pure gas would have if it alone occupied the volume of the mixture” (temperature unchanged)"

Exactamundo - this is Gibbs' theorem, and it deserves a place in a Wikipedia article if not its own article.

"It is the spreading out of the molecules in space that is crucial, not the intermixing of them."

By Gibbs' theorem, it's hard to argue against this. These statements neither support nor deny the "entropy change is energy dispersal" contention, but they are useful tools to bring to the discussion of energy dispersal in the mixing case.

"When A and B are mixed, their groups of different molecules can move to spread their “A” or “B” internal energy throughout the larger combined volume."

Here is where the problem lies. The quantity defined as "the internal energy of the A molecules" does disperse under any reasonable definition, but I still object. For technically minded, quantitatively inclined people, I have another argument which I hope will be helpful in understanding my objection. To others, I apologize. For the technically minded, let me just write down an expression for entropy from the first and second laws.
:<math>dS = \frac{1}{T}\,dU + \frac{P}{T}\,dV - \sum_i \frac{\mu_i}{T}\,dN_i</math>
Nobody disagrees that "energy dispersal" occurs in the first two terms (heat dissipation and expansion of a gas). It's that third term, which applies to the mixing case, that I am using to object to the above quote. Suppose I put forward a "mirror image" theory that all entropy change is particle dispersal, not energy dispersal. Then no one would disagree that "particle dispersal" occurs in the second and third cases. But for that first case, there would be an argument (the mirror image of the entropy of mixing argument) saying "what if you had a hot iron bar and a cold iron bar connected - there would be entropy increase with no particle dispersal". But I could respond - let's call the system composed of both iron bars the "whole system". Consider the hot particles in the whole system - those half of the particles that are above the average energy per particle of the whole system. They will be concentrated in the hot bar at the beginning, much less so in the cold bar. As time goes on, the density of the hot (and cold) particles will even out; they will disperse throughout the whole system. Thus there is hot and cold particle dispersal.
Now the obvious objection to this scenario is the same that I have for the idea that the separate energies are dispersed in the entropy of mixing case. There is no "tracking of the identity" of the energy in the energy dispersal argument any more than there is any "tracking of the identity" of the particles defined as "hot" in the mirror image argument. PAR 15:31, 17 October 2006 (UTC)
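The two-iron-bars scenario in the argument above can be quantified under the usual constant-heat-capacity idealization; every number below (heat capacity, initial temperatures) is an assumption for illustration. Each bar's entropy change is C ln(Tf/Ti), and the total comes out positive:

```python
import math

# Sketch of the "hot bar + cold bar" equilibration: two identical
# bars with constant heat capacity C (assumed) at Th and Tc reach
# the common final temperature Tf = (Th + Tc)/2.
C = 450.0              # assumed heat capacity per bar, J/K
Th, Tc = 373.0, 273.0  # assumed initial temperatures, K
Tf = (Th + Tc) / 2

dS_hot = C * math.log(Tf / Th)   # negative: the hot bar cools
dS_cold = C * math.log(Tf / Tc)  # positive: the cold bar warms
dS_total = dS_hot + dS_cold      # positive, per the second law

print(dS_hot, dS_cold, dS_total)
```

The positivity of the total is guaranteed for any Th > Tc, since ln(Tf²/(Th·Tc)) > 0 when Tf is the arithmetic mean of the two temperatures.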
In rereading the response of Frank Lambert, I realize now the reason for stating Gibbs' theorem in support of the energy dispersal idea. I think it is being used to demonstrate that mixing is essentially equivalent to the free expansion of two different gases. The final entropy of the mixture is the same as the sum of both gases if they freely expanded, yes, but there is a difference, and it's crucial. In free expansion, there is no transfer of energy between the A and B gases. In free expansion, all collisions are of the AA type or BB type, while in mixing there are AB collisions as well. Energy is moved back and forth between the A and B gases, and has no tight connection to either the A or B gas. PAR 16:35, 17 October 2006 (UTC)
So what if who hits whom!! In common language, not a bunch of QM talk: the mass and other characteristics of A only allow its energy to be transferred and accepted on SPECIFIC energy levels; i.e., the initial "group" of A molecules, after infinite collisions with B or itself or the walls -- even spread out in the final larger volume -- will have the same internal energy characteristics of A at T; i.e., the group of A will have the SAME ENTROPY finally, OR the measured entropy change due to mixing would be different from simple expansion of each component. It isn't!!! FrankLambert 17:29, 17 October 2006 (UTC)
An issue I see here is a lack of agreement on operational definitions of the term "entropy". That is, to what extent, beyond the measurable quantification of entropy as S on a cardinal scale, is entropy also properly taken to refer to the process of energy dispersal, the results of which are quantified as ΔS? ... Kenosis 16:52, 17 October 2006 (UTC)
Ok, FrankLambert has responded to my second post, but what about the first? That is the main argument. If there is a telling response to the first, then I concede your point on the second. And to Kenosis, I think entropy is very well defined; it is energy dispersal that is not being adequately defined. S stands for entropy, and ΔS is a measure, a direct measure, of the change in entropy. It cannot be defined as the degree of energy dispersal; it already has a definition. The idea that it is equivalent to energy dispersal is the very point under discussion. Energy dispersal has to be independently defined, as Dr. Leff's article has done, and then shown to be equivalent to entropy increase ΔS. (Not that I agree with his definition). PAR 18:22, 17 October 2006 (UTC)
Well, yes PAR. The question I'm posing is: "To what extent, if any, is the process itself properly termed 'entropy', or is the term strictly limited to the quantity S?" If the term "entropy" is properly extended to include the process quantifiable as ΔS, then the description of the process as "energy dispersal" is very reasonably applied in explaining the process in the article! And, if not, then the use of a descriptor such as "energy dispersal" must be limited to explaining that particular aspect of the process which results in the quantification of S or ΔS. Either way, it appears to me that FrankLambert's basic pedagogical approach is an extremely reasonable one which can be sourced in accordance with WP:VER and WP:RS. ... Kenosis 20:32, 17 October 2006 (UTC)
To cite, "Entropy measures the spontaneous dispersal of energy... — at a specific temperature." Which is now used by a number of authors as a much more useful description than it being "a measurement of the disorder or randomness of a system", for example. Neither of which deviates from the definition of ΔS ... dave souza, talk 21:22, 17 October 2006 (UTC)
Well, I should think it's time to begin including a modified version of FrankLambert's earlier explanation in the article, properly sourced to some of these modern texts, so that readers who are not so technically inclined might have an opportunity to better understand this subject. Any controversy about the use of the words "energy dispersal" can be readily mentioned to the extent that it reflects current controversy between the traditional approach and this particular modern approach, in keeping with WP:VER and WP:NPOV#Undue_weight of course. ... Kenosis 21:49, 17 October 2006 (UTC)
Boo-booing the loose speak of "disorder or randomness" is a red herring. "Dispersal of energy" is still less exact (and only questionably more intuitive) than <math>dS = \delta Q / T</math>, as can be found in Becker's Theorie der Wärme and any number of other classical treatments of the subject. --Pjacobi 21:46, 17 October 2006 (UTC)
Kenosis: Entropy is the quantity S. Entropy change is the difference ΔS. Processes can be associated with an entropy change. But entropy is a property of the system at a given moment - a state function. Jheald 21:59, 17 October 2006 (UTC)
Sure, and that state function is the result of a process, not just something divorced from the dynamics of the natural world. It is that process that needs some reasonable explaining in the article. ... Kenosis 22:11, 17 October 2006 (UTC)

S, S0, ΔS

Sorry, I’ve been away and have to go quickly. Norman Craig (text: Entropy Analysis) says: “Entropy in thermodynamics always means entropy change.” S lacks meaning without its reference to a substance/system at T. S0 refers to a change from 0 K – the dispersal of thermal energy from the surroundings/T to a substance beginning at 0 K for incremental values of T up to 298.15 K. Specifically, the process involves calculation from 0-10 K, then summation from 10 K to 298.15 K of incremental/reversible measurement of heat transferred/T (shown in most phys chem. texts as Cp/T versus T, with the area under the curve to 298.15 K as the summation of ∫ Cp/T dT). S0 of course is the most telling example of the importance of viewing energy dispersal as integral to any consideration of entropy change.FrankLambert 23:06, 17 October 2006 (UTC)
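The S0 integration FrankLambert outlines can be sketched numerically. The Cp(T) model below is entirely hypothetical: a Debye-like a·T³ term below 10 K joined continuously to a constant Cp above 10 K (a real calculation would integrate measured Cp/T data over the whole range):

```python
import math

# Illustrative sketch of S0 = ∫ Cp/T dT from ~0 K to 298.15 K,
# split at 10 K as described above. All parameters are assumed.
a = 0.025        # hypothetical Debye coefficient, J/(K^4·mol)
Cp_high = 25.0   # hypothetical constant Cp above 10 K, J/(K·mol)

# Below 10 K, with Cp = a*T^3, the integral of Cp/T dT is a*T^3 / 3.
S_low = a * 10.0**3 / 3.0

# Above 10 K, with constant Cp, the integral is Cp * ln(T2/T1).
S_high = Cp_high * math.log(298.15 / 10.0)

S0 = S_low + S_high
print(round(S0, 1))  # total standard entropy, J/(K·mol), for this toy model
```

Note that a·10³ = 25 here, so the two Cp pieces join continuously at 10 K, mirroring the incremental summation described in the comment.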

Possibly a different perspective

When there is unhappiness about a piece of writing, I find it useful to ask the questions, "Who will read this?" and "Who is the intended reader?" Doing this about this article, it seems to me that very few people will read it usefully. There will be a group of well-trained physicists who might have a look at it, but they will learn very little because they already know it. Someone who knows nothing about entropy, but wants to know, will read the first paragraph and run away screaming. I think the same is likely to be true about chemistry students (I have taught Physical Chemistry for many years although my expertise is in quantum chemistry). I suspect that more students come across entropy in chemistry classes than in physics classes. It has been suggested, and I note a recent more detailed discussion on the citizendium forums, that all articles should appeal to everybody by being written at three levels - first a very general introduction for someone who knows nothing about the topic, second a discussion appropriate to a beginning undergraduate meeting the topic for the first time, and third a detailed discussion of the whole topic. It has to be said that this article fails these criteria.

What is to be done? I welcome input from others as I only have a few ideas. First, there should be an introductory paragraph that contains no equations and no mathematics that says what entropy is and how it fits into the general picture. Second, the article and this discussion are full of very rigorous statements and equations. This is the way physicists think, and rightly so. Of course they have a place. However, chemists often do not think in this way, and I suggest we need to add some more vague points that appeal to chemists and perhaps to the general reader. Of course we can prefix them with a statement such as "a rough way of looking at this is ...". The suggestions of Frank Lambert fit in here. I do not think that energy dispersal is general. I prefer to think of entropy change on mixing as dispersal of matter. Nevertheless it is a good way of getting across what entropy is all about, and it is proving useful to chemists. It needs to be added, because quite frankly this article is very rigorous and exact but gives no real idea of what entropy is all about. --Bduke 23:09, 17 October 2006 (UTC)

I would certainly appreciate an integration of these levels of explanation into the article in some reasonable form, including material more accessible to the general reader. Or alternately, lacking an eventual realization of a consensus for such an approach, the participating editors could instead agree to create a topic fork dealing with basic principles of Entropy which avoids the more technical specifics and deals with the conceptual aspects related to entropy. This basic approach has been utilized in a number of topics in the wiki, some of which are included in a list put together by another editor who is no longer a Wikipedia user. That list can be seen here. ... Kenosis 04:52, 18 October 2006 (UTC)
"Introduction to Entropy" might be one way to go, as similar to your list. However I do not think that is the way an encyclopedia should be written. If we do go that way, this article ("Entropy") should at least have a simple first paragraph which could point to the "Introduction to Entropy" article. BTW, I would drop "Biology" and "Theoretical Biology" from your list. They do not match the other cases. --Bduke 06:05, 18 October 2006 (UTC)
Biology example removed as suggested. Here's the link to the list of split topics we're referring to. ... Kenosis 07:35, 18 October 2006 (UTC)

I am surprised that the regular editors of this article have not responded. Have you thought about who actually reads this article and who should read it? Do you realise just how off-putting it is to the newcomer to entropy who just wants a simple introduction to the concept?

A new introduction is needed. I suggest that we add a new first paragraph that might read something like this:-

"While the concept of energy is central to the first law of thermodynamics, which deals with the conservation of energy, the concept of entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. In simple terms, entropy change has been related to a change to a more disordered state, and to the idea of dispersion of energy or matter. However, care has to be used with these simple ideas, and a deeper study of the concept of entropy is necessary."

Could we delete this proposal here? --Bduke 00:15, 24 October 2006 (UTC)

I assume you meant "debate" not "delete". I fully agree that the explanations of entropy as "disorder", "energy dispersal" and/or "matter dispersal" are all imperfect, and that the bottom line is that entropy is an inherently difficult concept, and so I look favorably on your introduction. PAR 01:21, 24 October 2006 (UTC)
Thanks. Yes, I did of course mean "debate". It was too early in the morning here! --Bduke 01:36, 24 October 2006 (UTC)
I agree too, so I put it in the article, with slight modifications. Yevgeny Kats 02:24, 24 October 2006 (UTC)
My opinion is that this revisiting of the introduction is a definite step forward towards integrating verbal insights with mathematical or other formulaic insights into entropy for the readers. I removed the first clause of the just-added sentence with the edit summary "Removing first clause of newly placed sentence. NEVER start a WP article with the word "while" or "although". Since this clause is already documented in Talk." I meant to say "Since this clause is already documented in Talk, there's no need to post a talk comment", but I inadvertently hit the "enter/return" key. Sorry about both the overemphasis on "never" and the mistake in my edit summary. Nevertheless, I trust that the clause I removed (which read: "While the concept of energy is central to the first law of thermodynamics, which deals with the conservation of energy..."), or some other expression of this basic point, will find its way back into the article in another place. It seems to me it's a very central and useful point. More importantly to me, I wanted to express my gratitude for the recent movement towards "plain-English" explanations of the idea of entropy to the WP reader. ... Kenosis 03:51, 24 October 2006 (UTC)

I looked at this again, and replaced the first paragraph of the intro with the following, pending the weigh-in of other knowledgeable editors:

  • "Entropy describes the amount of energy in a given place at a given instant in time. In simpler terms, entropy change is related to a change to a more disordered state, and/or to the dispersion of energy or matter."
...Kenosis 04:19, 24 October 2006 (UTC) I do apologize again, as this replacement is no doubt just as confusing as the previous version, but it is more concise and arguably equally explanatory to the previously uninitiated reader. Plainly this will require intensive cooperation among the knowledgeable editors. ... Kenosis 04:31, 24 October 2006 (UTC)

I'm afraid it is. That is, "confusing". What is wanted in the introduction is some notion of what entropy is for or what it is about, before we start to define what it is. I do not understand your reasons for removing the first sentence. Why "never start with 'while'"? Would something like this be better:-

"The concept of entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. This is in contrast to the first law of thermodynamics, where the concept of energy is central and energy is conserved. The second law deals with the flow of energy: for example, heat flows spontaneously from a hot body to a cold body, but not from a cold body to a hot body."

I am not so sure that conciseness is a virtue here. We want to ease someone into the article by giving them an idea of where it is all going. Let us try to imagine someone who has no real idea of what entropy is about or what it is. --Bduke 08:24, 24 October 2006 (UTC)