In the Classroom

Entropy, Disorder, and Freezing

Brian B. Laird
Department of Chemistry, University of Kansas, Lawrence, KS 66045; *[email protected]

The entropy, S, is one of the most important concepts in the thermodynamics of chemical systems, but it is also the most difficult one to understand and to teach. Typically, in introductory chemistry courses, entropy is introduced as a measure of “disorder” or “randomness”. Later, this view is often justified by an appeal to the Boltzmann view of entropy as a measure of the number of microstates, Ω, consistent with a given macroscopic state; that is,

S = k ln Ω      (1)

where the constant of proportionality, k, is Boltzmann’s constant.
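
To make the microstate-counting picture concrete, consider a toy system (an illustrative example, not part of the formal development) of N distinguishable particles, each of which may occupy the left or right half of a box. The macrostate with n particles in the left half is consistent with Ω = N!/[n!(N − n)!] microstates, and a few lines of Python suffice to evaluate eq 1:

```python
# A minimal sketch of eq 1 for a toy model: N distinguishable particles,
# each in the left or right half of a box. The macrostate "n particles
# in the left half" is consistent with Omega = C(N, n) microstates.
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(omega):
    """Entropy from a microstate count via eq 1, S = k ln(Omega)."""
    return k_B * log(omega)

N = 100  # total number of particles
for n in (0, 10, 50):  # particles in the left half
    omega = comb(N, n)  # microstate count for this macrostate
    print(f"n = {n:3d}  Omega = {float(omega):10.3e}  S = {boltzmann_entropy(omega):.4e} J/K")
```

The evenly divided macrostate (n = N/2) maximizes Ω and therefore S, in accord with the intuition that the “most disordered” arrangement is the most probable.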
The argument is then made that increasing the “disorder” in a system corresponds to an increase in Ω, often with appeals to illustrative examples from the macroscopic world, such as the probability of various poker hands and the tendency of children’s rooms to become messy.

The Boltzmann entropy can be interpreted within the context of information theory as a measure of “lack of information”: the greater the number of microstates consistent with a system in a given macroscopic state, the less information is available about the precise microscopic state of that system at a given instant. (If there is only one microscopic state, say for a perfect crystal at T = 0, then there is no lack of information about the precise microscopic state and the entropy is zero.) This is, of course, a statistical mechanical view. Within the context of thermodynamics, one can similarly define entropy in terms of “lack of constraint” (1) (here a constraint is any restriction, internal or external, placed on the number or ranges of the independent macroscopic variables that are necessary to describe the thermodynamic state of the system).

One could, of course, define disorder as “lack of information” or “lack of constraint”, but this is (i) unnecessary, given that more precise descriptors exist (as discussed above), and (ii) undesirable, since the concept of disorder is so closely associated with structural disorder that attempting to generalize the concept can lead to significant misunderstanding in the classroom, for both students and educators. Some of the problems associated with the use of entropy as a measure of structural disorder have been discussed earlier in these pages (2).

The problems in using the term disorder to describe entropy are perhaps most acute in the study of the freezing transition. Namely, there exist situations in which, given a liquid and a crystal of the same material subject to the same thermodynamic conditions, the crystal phase has a higher entropy than the (metastable) liquid. This directly contradicts the commonly held concept of disorder (i.e., structural or topological disorder), because the crystal exhibits long-range order that is absent in the fluid phase. In such cases, structural disorder is not merely irrelevant in determining entropy differences; it gives the wrong answer! These situations can indeed arise at high density, when packing considerations dominate the thermodynamics. The conventional wisdom about the relationship of entropy to disorder is violated because our intuition about this relationship has been developed primarily at low density, where packing is not important. In the next section, the issues of packing, structural disorder, and entropy at high density are discussed.

Packing at High Density: Suitcases and Hard Spheres

A familiar macroscopic example of a system in which the maximum-entropy state at high density is a spatially ordered state, as opposed to a disordered one, is the suitcase-packing problem. Imagine packing a suitcase for a trip. If the trip is a short one, relatively few items are required, and the total volume of the items to be packed is much less than the volume of the suitcase. From experience, we know that in this case the easiest way to pack is to randomly toss the items in and shut the suitcase; putting the items in some “ordered” arrangement requires extra work, and the order would most probably be destroyed during transport. In statistical mechanical terms, there are far more ways in which the suitcase can be packed in a “disordered” arrangement than in an “ordered” one; that is, the disordered state has a higher entropy than the ordered one. This low-density case is consistent with the standard paradigm.

Now suppose instead that the trip is a long one and many items must be packed into the same suitcase, so that the volume of the objects to be packed is on the order of the total volume of the suitcase. This is a high-density system. From experience, one knows that if the items are randomly tossed into the suitcase, it is impossible to shut the case: such a configuration is incompatible with the volume constraint of the suitcase. On the other hand, if the contents are packed in a neat and ordered arrangement, the suitcase can be closed without difficulty. The number of ordered “microstates” (i.e., arrangements of suitcase contents) consistent with the “macroscopic” constraint (the fixed volume of the suitcase) is therefore greater than the number of corresponding disordered states, so an ordered arrangement in our high-density suitcase has a higher entropy, in the Boltzmann sense, than a structurally disordered one. The disorder-to-order transition as the suitcase density is increased is purely entropy driven, since energetic concerns do not enter into the analogy. (The concept of purely entropy-driven transitions is not new and has been discussed in the physics literature by Frenkel (3).)

A classic molecular model that illustrates this effect is a system consisting of hard spheres of diameter σ. The interatomic interaction energy, v(r), of two such particles separated by a distance r is given by

v(r) = ∞,   r < σ
v(r) = 0,   r ≥ σ      (2)
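
In code, this model reduces to a geometric overlap test. The following minimal Python sketch (an illustration; the function names and example coordinates are arbitrary choices, not from any standard library) encodes the pair potential and rejects configurations containing overlapping spheres:

```python
# A minimal sketch of the hard-sphere pair potential (eq 2) and the
# resulting overlap test; example coordinates are chosen arbitrarily.
import math

def v(r, sigma):
    """Hard-sphere pair potential: infinite for r < sigma, zero otherwise."""
    return math.inf if r < sigma else 0.0

def has_overlap(positions, sigma):
    """Return True if any pair of sphere centers is closer than sigma,
    i.e., if the configuration has infinite energy and is forbidden."""
    n = len(positions)
    return any(
        math.dist(positions[i], positions[j]) < sigma
        for i in range(n) for j in range(i + 1, n)
    )

sigma = 1.0  # sphere diameter
print(v(0.8, sigma))  # inf: the spheres overlap
print(v(1.2, sigma))  # 0.0: no interaction beyond contact
# Three spheres on a line; the first two overlap, so the configuration is forbidden.
print(has_overlap([(0.0, 0.0, 0.0), (0.9, 0.0, 0.0), (2.5, 0.0, 0.0)], sigma))  # True
```

Because every allowed configuration has exactly zero potential energy, energy differences can play no role in the thermodynamics of this model; any phase transition of the hard-sphere system must be purely entropy driven, just as in the suitcase analogy above.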