Book and Media Review pubs.acs.org/jchemeduc

Review of Entropy and the Second Law: Interpretation and Misss-Interpretationsss

Hal H. Harris*
Department of Chemistry and Biochemistry, University of Missouri–St. Louis, St. Louis, Missouri 63121, United States

Entropy and the Second Law: Interpretation and Misss-Interpretationsss, by Arieh Ben-Naim. World Scientific Publishing Company: Singapore, 2012. 263 pp. ISBN 9789814374897 (paper). $18.00.

This book is a debate, or perhaps more accurately, half of a debate. Arieh Ben-Naim is arguing against Frank Lambert1 and many other JCE and textbook authors who “explain” entropy on the molecular level as “lack of order”, “freedom”, or “the spread of energy”. Ben-Naim has written on this subject before, both in this Journal2 and in a book3 that I reviewed for JCE. He is especially critical of explanations of why a system evolves from one state of entropy to another, unless it is simply because the state with the larger number of configurations is more probable. Ben-Naim’s exposition in Entropy and the Second Law: Interpretation and Misss-Interpretationsss is based on the groundbreaking 1948 paper by C. E. Shannon,4 which formally deals with information theory in communication, yet which also has far-reaching implications for statistical mechanics and thermodynamics. That entropy was related to information (or the lack thereof) was presaged in comments by G. N. Lewis nearly 20 years earlier,5 but Shannon’s measure of information (SMI) provided a mathematically rigorous treatment of the subject. The application to a molecular treatment of thermodynamics was a kind of side product of his work. For a general distribution of probabilities pi, the SMI (symbol H) is

Cover image provided by World Scientific Publishing Company and reproduced with permission.


H = −K ∑ pi log(pi)

where K is a positive constant that fixes the unit of measure and the pi values are the probabilities of occurrence of the i events. The SMI is used in decision theory, communications theory, transport theory, and reliability engineering, and it provides the basis for what is called the maximum entropy formalism.6

Most teachers of chemical thermodynamics or even statistical mechanics will not be plowing through Shannon’s dense 44 pages, but perhaps we should be more careful about the descriptions we use for entropy, lest they mislead more than they enlighten. Ben-Naim is disdainful of many of the popular metaphors for entropy (S), and that is the reason for the extra S’s in his subtitle. I find Entropy and the Second Law provocative, which is entirely the author’s intent. I also find it quite readable, informative, and largely persuasive.

As an example, Ben-Naim re-examines the perennial “entropy of mixing” of ideal gases. The way this is nearly always presented is as process I below, in which a partition between two different gases at the same pressure is removed, and the entropy change is called ΔmixS. Ben-Naim claims that this is incorrect, because the parallel process II below, in which two identical gases are “mixed”, has no net change and therefore “ΔmixS” is zero. The only difference between the processes is that in the first case the particles are distinguishable and in the second they are not.

© 2014 American Chemical Society and Division of Chemical Education, Inc.
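Shannon’s measure defined above is simple to compute directly. The sketch below is my own illustration, not the book’s: it takes K = 1 and base-2 logarithms, so that H comes out in bits.

```python
import math

def smi(probs, K=1.0):
    """Shannon's measure of information, H = -K * sum(p_i * log2 p_i).

    With K = 1 and base-2 logarithms, H is measured in bits.
    Zero-probability events contribute nothing (lim p log p = 0).
    """
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -K * sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain over two outcomes: 1 bit.
print(smi([0.5, 0.5]))                 # 1.0
# Four equally likely outcomes: 2 bits.
print(smi([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A biased distribution carries less uncertainty.
print(smi([0.9, 0.1]))                 # ≈ 0.469
```

The uniform distribution maximizes H for a given number of outcomes, which is the sense in which the state with the most configurations is the most probable.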

Ben-Naim says that the calculated entropy of mixing in process I is actually just a consequence of the fact that the volume available to molecules of each type is doubled, giving ΔS = NkB ln 2, where N is the number of molecules and kB is the Boltzmann constant. His preferred model for the entropy of mixing is shown below, in which blue molecules and red molecules in equal-sized containers are mixed and then recompressed back to their original volume.

In this circumstance, the only thing that has changed is that the gases have been mixed. Should not the entropy of this change be called the “entropy of mixing”, instead of that of the process first described?

Published: February 6, 2014
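Whatever one chooses to call it, the number attached to process I is easy to check. A quick numeric sketch (mine, not the book’s) of ΔS = NkB ln 2 for one mole of each gas:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
N_A = 6.02214076e23  # Avogadro constant, 1/mol (exact in the 2019 SI)

def delta_S_doubling(n_molecules):
    """Entropy change when the volume available to n molecules doubles:
    dS = N * kB * ln 2."""
    return n_molecules * K_B * math.log(2)

per_gas = delta_S_doubling(N_A)        # one mole of each gas in process I
print(f"per gas:    {per_gas:.3f} J/K")      # 5.763 J/K, i.e. R ln 2
print(f"both gases: {2 * per_gas:.3f} J/K")  # 11.526 J/K
```

The value R ln 2 per mole appears whether a gas expands into twice the volume or not at all in process II, which is Ben-Naim’s point: the entropy change tracks the accessible volume per distinguishable species, not “mixedness”.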

dx.doi.org/10.1021/ed500035f | J. Chem. Educ. 2014, 91, 310−311



Ben-Naim asks those who identify changes in entropy with order and disorder, “spread of energy”, or “freedom”, to consider the racemization of alanine by a catalyst at constant volume, temperature, and pressure. If we start with the pure d-alanine isomer and convert half of it to the equilibrium racemate, the entropy clearly increases, but it is not easy to see how this can be described by any of the three popular metaphors, although the SMI statistical measure does so straightforwardly.
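A sketch of how the statistical treatment handles this case (my own illustration, assuming the ideal entropy-of-mixing formula ΔS = −NkB Σ xi ln xi): the pure enantiomer has zero mixing entropy, while the 50/50 racemate gains NkB ln 2, with no appeal to order, freedom, or energy spread.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol

def mixing_entropy(fractions, n_molecules=N_A):
    """Ideal entropy of mixing for mole fractions x_i:
    dS = -N * kB * sum(x_i * ln x_i)."""
    return n_molecules * K_B * sum(-x * math.log(x) for x in fractions if x > 0)

# Pure d-alanine: a single species, so no mixing entropy.
print(f"pure d:   {mixing_entropy([1.0]):.3f} J/(K mol)")       # 0.000
# Equilibrium racemate: x_d = x_l = 0.5 gives R ln 2.
print(f"racemate: {mixing_entropy([0.5, 0.5]):.3f} J/(K mol)")  # 5.763
```

Nothing about spatial order enters the calculation; only the probability distribution over the two isomers does.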

This is a challenging book, not because it is particularly mathematical or difficult, but because it challenges us to re-examine the molecular-level rationalizations that are often used for entropy changes and the second law of thermodynamics. Although they are popular and seductively truthy, teachers of chemistry should not be “explaining” molecular-level processes using language that includes the word “entropy” without statistical justification.

This book has the most uninformative index I have ever seen. It is only one page in length and includes entries like “A (the first letter of the alphabet): every page of the book” and “Third Law of Thermodynamics: nowhere in this book”. On the other hand, the bibliography is pretty good. It includes references to all the folks that the author is scolding, as well as his own books and JCE articles on entropy and thermodynamics.

It is not possible to do justice in this review to the thoroughly statistical exposition of entropy and the second law that the author presents. I think that one can learn a great deal by listening to a debate, or even half of a debate. Ben-Naim has a point worth making, and he has done it both passionately and effectively. I will attempt to be more circumspect about my own language describing entropy.



AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

Notes

Figures adapted from Entropy and the Second Law: Interpretation and Misss-Interpretationsss, by Arieh Ben-Naim, copyright 2012, World Scientific Publishing Company, and used with permission.



REFERENCES

(1) Lambert, F. L. J. Chem. Educ. 1999, 76, 1385. Lambert, F. L. J. Chem. Educ. 2002, 79, 187. Lambert, F. L. J. Chem. Educ. 2002, 79, 1241. Lambert, F. L. Chemistry 2006, 15, 13. Lambert, F. L. J. Chem. Educ. 2007, 84, 1548.
(2) Ben-Naim, A. J. Chem. Educ. 2009, 86, 99. Ben-Naim, A. J. Chem. Educ. 2011, 88, 594.
(3) Ben-Naim, A. Entropy Demystified: The Second Law Reduced to Plain Common Sense; World Scientific Publishing Company: Singapore, 2007.
(4) Shannon, C. E. Bell Syst. Tech. J. 1948, 27, 379−423.
(5) Lewis, G. N. Science 1930, 71, 569.
(6) Levine, R. D.; Tribus, M. The Maximum Entropy Formalism; MIT Press: Cambridge, MA, 1979.
