Ind. Eng. Chem. Res. 2009, 48, 166–171

Completing Irving and Kirkwood's Molecular Theory of Transport Processes: Nonequilibrium Entropy Conservation

Michael H. Peters*

Department of Chemical and Life Science Engineering and Center for the Study of Biological Complexity, Virginia Commonwealth University, Richmond, Virginia 23284

Following Irving and Kirkwood's (J. Chem. Phys. 1950, 18, 817) classical approach to the statistical mechanics of transport processes and the conservation equations for mass, momentum, and energy, we introduce a particular dynamical variable for entropy and derive the general nonequilibrium entropy conservation equation. This particular formalism is shown to encompass both Boltzmann's and Gibbs' entropy definitions as special cases. Entropy generation is shown to follow from phase-space dimensionality loss and truncations or approximations in higher-order space, the latter of which is consistent with the thesis of Jaynes (Am. J. Phys. 1965, 33, 391). The general approach to entropy conservation given here not only completes Irving and Kirkwood's treatment of the transport equations but also allows for a consistent analysis of all transport equations for any given system. Following standard perturbation expansion methods about local equilibrium states, we derive the closed form of the entropy conservation equation for isolated systems, which is shown to be in agreement with well-known phenomenological results and the principles of irreversible thermodynamics. In addition, the generalized nonequilibrium entropy developed here is fully consistent with its equilibrium counterpart. As an example, our formalism allows the analysis of entropy changes in dense gases and liquids through the introduction of a nonequilibrium Green's entropy. This study provides a firm molecular basis of entropy conservation by consistent methods across the transport equations, allowing ready extensions to complex systems. Such foundations are of contemporary importance in designing energy-efficient or minimum-entropy-generating engineering systems.

1. Introduction

In general, nonequilibrium systems of gases, liquids, and solids may be classically described by the so-called conservation equations for mass, momentum, energy, and entropy.
The specific irreversible nature or "time's arrow" of nonequilibrium systems is entailed in the so-called phenomenological laws, i.e., Fick's Law of Diffusion, Newton's Law of Viscosity, Fourier's Law of Heat Conduction, and the Law of Entropy Increase. From classical molecular theory, the phenomenological laws of mass, momentum, and energy have been shown to follow from the condition of small perturbations from local equilibrium states.1 Molecular-based derivations of the conservation equations for mass, momentum, and energy and the associated phenomenological laws have been well-developed based on Irving and Kirkwood's2 classical paper on the statistical mechanics of transport processes. However, a classical molecular theory derivation of the nonequilibrium entropy conservation equation and the Law of Entropy Increase has remained somewhat elusive or, at best, lacking comparable clarity and consistency.

As reviewed by Gaspard,3 a number of different classical molecular theory approaches to the nonequilibrium entropy problem have been proposed, including coarse-graining and thermostatted-system approaches. Recently, a unified approach to nonequilibrium entropy based on coarse-grained methods has been given in the book by Ottinger.4 Other studies based on Bayesian or information-theoretic approaches have directly associated entropy with "missing information" and entropy increases with "system information loss";5,6 for example, Boltzmann's equation "loses information" about correlations between particles.6 As is well-known, Boltzmann's gas kinetic molecular theory analysis leads to the Law of Entropy Increase, as entailed in Boltzmann's H-theorem, whereas the classical Gibbs' definition of entropy, celebrated in the equilibrium thermodynamic analysis of dense gases and liquids, does not.

In general, classical molecular theory approaches are fundamentally based on the Liouville equation. The Liouville equation, however, can be readily shown to fundamentally lack finite entropy generation.3 This seeming contradiction in classical approaches can be traced to the use of approximations, such as in Boltzmann's analysis or in coarse-graining methods.7 From an information-theoretic point of view, entropy is a direct measure of "uncertainty", and approximations will always lead to an increase in uncertainty or, in other words, entropy generation. As we show here, entropy generation is formally a part of the "closure" problem of the classical statistical mechanics of the transport equations, and "closure" is always based on truncations or approximations. The exact origin of the Second Law is still an outstanding theoretical question, and we will instead focus here on the completion of the classical statistical mechanics of the transport equations based on Irving and Kirkwood's methods. Our analysis has a more practical bent in achieving results that not only are fully consistent and agree with phenomenology but also provide general entropy expressions based on intermolecular interaction forces that, in turn, allow for system-to-system comparisons and rational engineering systems analysis. This latter concept is key to designing systems for minimizing energy waste and improving the efficiencies of energy conversion systems.

* Corresponding author e-mail: [email protected]. Phone: (804) 828-7789. Fax: (804) 828-3846.

In Irving and Kirkwood's2 original approach to the nonequilibrium statistical mechanics of gases, liquids, and solids, they first considered a general conservation equation, obtained directly from the Liouville equation, for a general dynamical variable. A particular selection of this dynamical variable was shown to lead to the conservation equations of either mass, momentum, or energy in a straightforward fashion; a dynamic variable for

10.1021/ie800170s CCC: $40.75 © 2009 American Chemical Society. Published on Web 07/23/2008.


entropy and entropy conservation was not attempted. So, we will first begin with the Liouville equation itself.

2. Liouville Equation and Reduced Forms

In Cartesian vector notation, the Liouville equation for particle-number-preserving systems reads8

∂f_N/∂t = -∑_{i=1}^{N} [(p_i/m_i) · ∂f_N/∂r_i + F_i(r^N) · ∂f_N/∂p_i]    (1)

where f_N is the N-particle density function, r_i and p_i are the position and momentum coordinates for the ith particle of mass m_i, and we have assumed that the force acting on particle i is only a function of the position coordinates r^N ≡ (r_1, r_2, ..., r_N). Now consider a reduced form of the Liouville equation for a set of molecules {s} = {1, 2, 3, ..., s} that can be obtained by integrating the Liouville equation over the phase space of the other {N - s} set of molecules. Following standard procedures,8 we can integrate eq 1 over dr^{N-s} dp^{N-s} space to obtain the reduced Liouville equation

∂f_s/∂t + ∑_{i=1}^{s} (p_i/m_i) · ∂f_s/∂r_i + [1/(N - s)!] ∑_{i=1}^{s} (∂/∂p_i) · ∫∫ F_i f_N dr^{N-s} dp^{N-s} = 0    (2)

The force acting on the ith molecule can be written as the gradient of a potential,

F_i = -∂Φ^N(r^N)/∂r_i    (3)

and, for the purposes of the specific applications below, the potential will be approximated by a sum over the pair interaction potential (the "pairwise additivity assumption"),

F_i = -∂Φ^N(r^N)/∂r_i ≈ -∑_{j=1, j≠i}^{N} ∂φ(r_ij)/∂r_i    (4)

where the pair potential φ(r_ij) is the interaction potential between any two molecules in the system; i.e., the effects of three- or more-body interactions on the pair potential expression are neglected. Note that this assumption does not limit the generality of our results given below. Substituting eq 4 into eq 2 gives

∂f_s/∂t + ∑_{i=1}^{s} [(p_i/m_i) · ∂f_s/∂r_i - ∑_{j=1, j≠i}^{s} (∂φ(r_ij)/∂r_i · ∂f_s/∂p_i)] = [1/(N - s)!] ∑_{i=1}^{s} (∂/∂p_i) · ∑_{j=s+1}^{N} ∫∫ [∂φ(r_ij)/∂r_i] f_N dr^{N-s} dp^{N-s}    (5)

The integration on the right-hand side can be performed over all the (r^{N-s}, p^{N-s}) space except the (s + 1) molecule. Thus, eq 5 becomes

∂f_s/∂t + ∑_{i=1}^{s} [(p_i/m_i) · ∂f_s/∂r_i - ∑_{j=1, j≠i}^{s} (∂φ(r_ij)/∂r_i · ∂f_s/∂p_i)] = ∑_{i=1}^{s} ∫∫ ∂φ(r_{i,s+1})/∂r_i · ∂f_{s+1}/∂p_i dr_{s+1} dp_{s+1}    (6)

Equation 6 is the reduced Liouville equation for pairwise additive interaction forces, and it is an integro-differential equation where the evolution of the f_s density depends on the next higher-order density, f_{s+1}. This nonhomogeneous dimensionality feature is the so-called BBGKY hierarchy, named after its originators: Bogoliubov, Born, Green, Kirkwood, and Yvon.8 For the sake of simplicity, we will henceforth consider only a single- or pure-component system, although the results given here can be readily extended to multicomponent systems as well.

3. Entropy Conservation

As noted in the Introduction, there are two seemingly different definitions of entropy, i.e., Boltzmann's entropy, celebrated in dilute-gas kinetic theory,

S_B(r, t) = -(k/m) ∫ f_1(r, p, t) ln f_1(r, p, t) dp    (7)

where k is Boltzmann's constant and m is the molecular mass, and Gibbs' entropy, celebrated in equilibrium thermodynamic systems,

S_G = -[k/(N! p^{3N})] ∫∫ f′_N ln f′_N dr^N dp^N    (8)

where f′_N = p^{3N} f_N and p is Planck's constant. Boltzmann's entropy definition leads to the "Law of Entropy Increase" in dilute-gas systems, whereas the seemingly more general Gibbs' entropy definition does not.3 However, we now show that it is possible to introduce a more general definition of entropy that encompasses both Boltzmann's and Gibbs' definitions as limiting cases. This general definition will allow us to extend molecular-based derivations of entropy conservation to dense gases and liquids, as well as to complex fluids, in a manner entirely consistent between all transport equations. Our results will also be shown to be consistent with phenomenology and the particular expressions of entropy increase or generation. We begin by defining a general quantity α as

α = -k [N!/(s!(N - s)!)] (1/s) ∑_{j=1}^{s} δ(r_j - r) ln[p^{3s} f_s(r^s, p^s, t)]    (9)

where f_s is an sth-order, reduced-density function. The phase-space average of α, ⟨α⟩, is computed by integrating α over all s phase-space coordinates as

⟨α⟩ = nS̄_BG = -[k/(s!s)] ∑_{j=1}^{s} ∫∫ δ(r_j - r) f_s ln(p^{3s} f_s) dr^s dp^s    (10)

We have called this average the Boltzmann–Gibbs entropy, where S̄_BG is the entropy per molecule and n is the local molecular number density. For s = 1, we recover Boltzmann's definition,

nS̄_B(r, t) = -k ∫ f_1(r, p′, t) ln(p^3 f_1) dp′    (11)

and for s = N, we obtain Gibbs' entropy,

nS̄_G(r, t) = -(k/N!)(1/N) ∑_{j=1}^{N} ∫∫ δ(r_j - r) f_N ln(p^{3N} f_N) dr^N dp^N    (12)
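As a quick numerical sanity check of the s = 1 case (a sketch, not part of the original derivation): for a Maxwellian at rest, eq 11 should reproduce the ideal-gas entropy in closed form. The nondimensional units m = k = T = 1 with Planck's constant p = 1, the density n = 0.7, and the grid below are illustrative assumptions.

```python
import numpy as np

# Quadrature check of eq 11 for a Maxwellian at rest: nS_B should equal
# the ideal-gas closed form n k [ln((2*pi*m*k*T)^(3/2)/(n*p^3)) + 3/2].
# Nondimensional units m = k = T = 1 and Planck's constant p = 1;
# the density n = 0.7 is an arbitrary illustrative choice.
n = 0.7
ax = np.linspace(-8.0, 8.0, 97)
PX, PY, PZ = np.meshgrid(ax, ax, ax, indexing="ij")
p2 = PX**2 + PY**2 + PZ**2
dp = (ax[1] - ax[0])**3

f1 = n * (2.0*np.pi)**-1.5 * np.exp(-p2/2.0)   # Maxwellian momentum density
nSB = -np.sum(f1 * np.log(f1)) * dp            # eq 11 with p^3 = 1
nSB_exact = n * (np.log((2.0*np.pi)**1.5 / n) + 1.5)
print(nSB, nSB_exact)                          # the two values agree closely
```

The Gaussian decays fast enough that a plain Riemann sum over a truncated momentum box is effectively exact here.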

We note that, for systems at equilibrium, f_N is independent of the locator vector r and, utilizing the fact that n ≡ N/V = N/∫dr at equilibrium, eq 12 becomes equal to Gibbs' entropy for equilibrium systems, eq 8. Also, note that the introduction of Planck's constant in the logarithmic term of Boltzmann's


entropy, eq 11, is necessary on account of dimensional arguments.

The dynamical variable for entropy given above depends explicitly on time through the s-order distribution function, whereas the dynamical variables for mass, momentum, and energy introduced by Irving and Kirkwood2 have no explicit time dependence. Now, to obtain an entropy-conservation equation following Irving and Kirkwood's approach, we can work with the Liouville equation and the dynamic variable modified to include the explicit time dependence in α, or it is somewhat easier and equivalent to work directly with the reduced Liouville equation, eq 6; we choose the latter approach. Multiplying eq 6 by

-[k/(s!s)] ∑_{j=1}^{s} ln(p^{3s} f_s) δ(r_j - r)

and integrating over all (r^s, p^s) space gives the following term-by-term analysis. For the time-dependent term, we have

-[k/(s!s)] ∑_{j=1}^{s} ∫∫ (∂f_s/∂t) ln(p^{3s} f_s) δ(r_j - r) dr^s dp^s = -[k/(s!s)] ∑_{j=1}^{s} ∫∫ (∂/∂t)[f_s ln(p^{3s} f_s)] δ(r_j - r) dr^s dp^s + [k/(s!s)] ∑_{j=1}^{s} ∫∫ f_s (∂/∂t)[ln(p^{3s} f_s)] δ(r_j - r) dr^s dp^s = (∂/∂t)(nS̄_BG) + (k/s) ∂n(r, t)/∂t    (13)

For the second term in eq 6, we have

-[k/(s!s)] ∑_{i=1}^{s} ∑_{j=1}^{s} ∫∫ (p_i/m_i) · (∂f_s/∂r_i) ln(p^{3s} f_s) δ(r_j - r) dr^s dp^s = -[k/(s!s)] ∑_{i=1}^{s} ∑_{j=1}^{s} ∫∫ (p_i/m_i) · (∂/∂r_i)[f_s ln(p^{3s} f_s) δ(r_j - r)] dr^s dp^s + [k/(s!s)] ∑_{i=1}^{s} ∑_{j=1}^{s} ∫∫ {(p_i/m_i) · f_s ln(p^{3s} f_s) ∂δ(r_j - r)/∂r_i + (p_i/m_i) · f_s δ(r_j - r) (∂/∂r_i)[ln(p^{3s} f_s)]} dr^s dp^s    (14)

The first term on the right-hand side of eq 14 vanishes by virtue of Gauss' theorem and the properties of the density function for isolated systems,8–10 also employed by Irving and Kirkwood in their derivation of the transport equations,

f_s ln(p^{3s} f_s) → 0 as |r_i|, |p_i| → ∞    (15)

The second term on the right-hand side of eq 14 expands into two terms

[k/(s!s)] ∑_{i=1}^{s} ∑_{j=1}^{s} ∫∫ {(p_i/m_i) · f_s ln(p^{3s} f_s) ∂δ(r_j - r)/∂r_i + (p_i/m_i) · f_s δ(r_j - r) (∂/∂r_i)[ln(p^{3s} f_s)]} dr^s dp^s = -[k/(s!s)] (∂/∂r) · ∑_{i=1}^{s} ∫∫ (p_i/m_i) f_s ln(p^{3s} f_s) dr^{s-1} dp^s + [k/(s!s)] ∑_{i=1}^{s} ∑_{j=1}^{s} ∫∫ (p_i/m_i) · (∂f_s/∂r_i) δ(r_j - r) dr^s dp^s    (16)

Substituting p_i/m_i = p′_i/m_i + v_0, where p′_i ≡ p_i - m_i v_0 is the molecular momentum relative to v_0, into the first term on the right-hand side of eq 16 and reducing the second term using the definition of bulk average velocity v_0,

v_0(r, t) ≡ (1/n) ∫ (p/m) f_1(r, p, t) dp    (17)

gives

-[k/(s!s)] ∑_{i=1}^{s} ∑_{j=1}^{s} ∫∫ (p_i/m_i) · (∂f_s/∂r_i) ln(p^{3s} f_s) δ(r_j - r) dr^s dp^s = (∂/∂r) · (nS̄_BG v_0) + (∂/∂r) · s + (k/s)(∂/∂r) · (nv_0)    (18)

where we have introduced the entropy flux vector s defined by

s(r, t) = -[k/(s!s)] ∑_{i=1}^{s} ∫∫ (p′_i/m_i) f_s ln(p^{3s} f_s) dr^{s-1} dp^s    (19)

which is the flux of entropy relative to the mass-average velocity, v_0. Note that the last term in eq 18 cancels with the last term in eq 13, by virtue of the continuity equation

∂n/∂t + (∂/∂r) · (nv_0) = 0    (20)

Now it can be readily shown from Gauss' theorem that the third term in eq 6 integrates to zero. Finally, the last term in eq 6 becomes

-[k/(s!s)] ∑_{i=1}^{s} ∑_{j=1}^{s} ∫∫ ln(p^{3s} f_s) δ(r_j - r) [∂φ(r_{i,s+1})/∂r_i · ∂f_{s+1}/∂p_i] dr^s dr_{s+1} dp^s dp_{s+1} ≡ ns̄_g    (21)

where s̄_g is the entropy generation per molecule. For s = 1, we obtain the well-known Boltzmann entropy-generation term,8,9

ns̄_g(r, t) = -k ∫∫ ln[p^3 f_1(r, p_1, t)] [∂φ(|r - r_2|)/∂r · ∂f_2(r, r_2, p_1, p_2, t)/∂p_1] dr_2 dp_1 dp_2    (22)

More generally, it is clear from eq 21 that, anytime we truncate the BBGKY hierarchy by neglecting the f_{s+1} term, the entropy-generation term, with entropy defined in the s-space, identically vanishes by virtue of Gauss' theorem and the properties2,10

f_{s+1} → 0 as |p_i| → ∞    (23)

The Gibbs' entropy, defined in the total N-space, represents a trivial case; i.e., there are no f_{N+1} terms by definition and, therefore, no entropy generation. Another example of zero entropy generation would be a dilute gas with entropy defined in (s = 2)-space. Thus, entropy generation can exist, according to this paradigm, whenever the number of variables or degrees of freedom s as specified by f_s ln f_s is less than the number of variables required to specify the dynamic state of the system. In the subject of information theory and statistics,5 the entropy definition, eq 10, is also known as a "measure of uncertainty". When this measure of uncertainty "misses" important dynamic variables ("hidden" variables), entropy as defined in the s-space


can be generated. In the case of Gibbs’ entropy, there is no missing information and, hence, no entropy generation. In Boltzmann’s dilute-gas analysis, considered in more detail below, entropy is defined in the (s ) 1)-space, yet two-body interactions or collisions are necessary to account for the overall gas dynamics. Although not proven here, eq 21 or 22 may be zero in a more general sense if all variables and distribution functions involved are accurately determined without any approximations or truncations. Furthermore, we still have not shown that the general form of the entropy-generation term is sufficient to guarantee nonzero, positive entropy generation. In the case of Boltzmann’s analysis, further approximations of eq 22 will be shown below to lead to the Law of Entropy Increase for dilute gases. More generally, truncations of the BBKGY hierarchy are practically necessary for resolving higher-order distribution functions. Jaynes7 has argued that only through such deliberate truncations and/or approximations is it possible to obtain positive entropy generation, which establishes a sufficient condition for positive, nonzero entropy generation. Regardless of the arguments as to the exact origin of positive, nonzero entropy generation and the Second Law, the above analysis provides a general, practical framework upon which entropy generation can be developed for both gas and liquid systems in a manner consistent across the transport equations. Equally important, it allows for a bridge between equilibrium entropy analysis, such as Green’s dense gas and liquid equilibrium state entropy defined in the (s ) 2)-space, and nonequilibrium entropy, introduced here. Specifically, the nonequilibrium, two-space Green’s entropy would be defined according to eq 10 as

Now, we introduce the Chapman-Enskog8,9 perturbation expansion f1 ) f(0)(1 + φ), where f(0) is the local equilibrium distribution function

∑ ∫ ∫ δ(r - r)f

]

(28)

and the perturbation function is expressed in terms of bulk velocity and temperature gradients as8,9 ∂ ∂ ln T v -A· ∂r 0 ∂r

φ ) -B :

(29)

We further note that ln[p3f(0)(1 + φ)] ≈ ln(p3f(0)) + φ

(30)

Thus, from eq 27, we obtain to linear order in φ s ) -k

∫ (p ⁄ m - v )φf 0

(0)

ln(p3f(0)) dp

(31)

Using eq 28 for f(0), we further have

[

]

n1p3 (p - mv0)2 (32) 3⁄2 (2πmkT) 2mkT When substituted into eq 31, the first term of the right-hand side of eq 32 vanishes by virtue of one of the so-called auxiliary conditions on φ8,9 ln(p3f(0)) ) ln

∫ pf

φ dp ) 0

(33)

(0)

leaving us with s)

2

k nS¯Green(r, t) ) 4 j)1

[

n1(r, t) (p - mv0)2 exp (2πmkT)3⁄2 2mkT

f(0) )

∫ (p - mv )(p - mv ) f

1 2m2T

0

0

2

φ dp ) qk ⁄ T

(0)

(34)

ln(p6f2) dr2 dp2 (24)

where the kinetic contribution to the energy flux vector is defined by8,9

which, under equilibrium conditions, can readily be shown to reduce to Green’s equilibrium expression.11 Thus, the generalized Boltzmann-Gibbs entropy expression, eq 10, provides a very simple and straightforward method of obtaining higherorder expressions for nonequilibrium entropy that are fully consistent with their equilibrium counterparts. Summarizing the results of eqs 13, 18, and 21, we have the following general entropy-conservation equation

1 (p - mv0)2(p - mv0)f1(r, p, t) dp 2m2 1 ) 2 (p - mv0)2(p - mv0)f(0)(1 + φ) dp (35) 2m Thus, we have derived the well-known thermodynamic argument that the local entropy flux is equal to the local energy flux divided by the local temperature. Note that, for dilute gases, there is no potential contribution to the energy flux vector and the subscript k can be dropped.8,9 For the entropy-generation term in dilute gases, we have from eq 21

j

2

∂ (nS¯BG) ∂ ∂ + · (nS¯BGv0) ) - · s + ns¯g (25) ∂t ∂r ∂r Expanding the left-hand side and eliminating the equation of continuity, we get the equivalent form of entropy conservation ∂S¯BG ∂S¯BG ∂ + nv0 · ) - · s + ns¯g (26) ∂t ∂r ∂r where s and jsg are defined by eqs 19 and 21, respectively. n



s ) -k

∫ (p⁄m - v )f

0 1

3

ln(p f1) dp

(27)

∫ ∫ ln(p f )[

]

∂φ(r, r2) ∂f2 · dr2 dp dp2 (36) ∂r ∂p Now, expanding as before, retaining terms to linear order in φ, and using eq 6 for s ) 1, it can be readily shown that ns¯g(r, t) ) -k

ns¯g ) k

4. Entropy Flux and Entropy Generation in Gases and Liquids Finally, we turn to the equation of entropy conservation, eq 26, and look at the specific expressions for the entropy-flux and entropy-generation terms for gases and liquids based on regular perturbation expansions about local equilibrium states. First, considering a dilute-gas system with entropy defined in the s ) 1 space, the entropy flux from eq 19 is



qk )

∫ [ln(p f 3

3

1

) + φ]f 0

(0)

{

( )

b:

∂v0 5 - - w2 ∂r 2

(

[(p - mv ) · 0

)} × ∂ ln T dp(37) ∂r

]

where 1 b ) 2 ww - w2I 3

[

]

(38)

and w)

(p - mv0) √2mkT

(39)
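The chain from eq 28 to eq 35 can be spot-checked numerically before carrying out the full integrations. The sketch below (not from the paper) takes nondimensional units m = k = T = 1 with Planck's constant p = 1, v_0 = 0, n = 1, and a heat-conduction perturbation of the lowest Sonine form A ∝ (w² − 5/2)w; the amplitude c, gradient g, and grid are illustrative assumptions. It confirms the auxiliary condition eq 33 and the flux identity s = q_k/T of eq 34.

```python
import numpy as np

# Numerical check of eqs 31-35: for a Maxwellian (eq 28) at rest perturbed
# by a heat-flux mode phi = -c (w^2 - 5/2) p_x g (lowest Sonine shape for
# the vector A; amplitude c and gradient g are illustrative assumptions),
# the entropy flux of eq 31 equals q_k/T from eqs 34-35.
# Nondimensional units m = k = T = 1, v0 = 0, n = 1, Planck's constant = 1.
n, c, g = 1.0, 0.1, 0.2                          # density, amplitude, d(ln T)/dx
ax = np.linspace(-8.0, 8.0, 97)
PX, PY, PZ = np.meshgrid(ax, ax, ax, indexing="ij")
p2 = PX**2 + PY**2 + PZ**2
dp = (ax[1] - ax[0])**3

f0 = n * (2.0*np.pi)**-1.5 * np.exp(-p2/2.0)     # eq 28 at rest
phi = -c * (p2/2.0 - 2.5) * PX * g               # eq 29, heat-flux part only

aux = np.sum(PX * f0 * phi) * dp                 # eq 33: should vanish
s_x = -np.sum(PX * phi * f0 * np.log(f0)) * dp   # eq 31, x-component
q_x = 0.5 * np.sum(p2 * PX * f0 * (1.0 + phi)) * dp   # eq 35, x-component
print(aux, s_x, q_x)                             # aux ~ 0 and s_x = q_x (T = 1)
```

Because the auxiliary condition holds, the constant part of ln(p³f^(0)) in eq 32 drops out of the quadrature exactly as the derivation claims.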


Substituting our expression for φ from eq 29 and carrying out the integrations, we are left with

ns̄_g = k ∫ (B : ∂v_0/∂r) f^(0) [2(ww - (1/3)w^2 I) : ∂v_0/∂r] dp - k ∫ (A · ∂ ln T/∂r)(5/2 - w^2)[(p - mv_0) · ∂ ln T/∂r] f^(0) dp    (40)

Now, utilizing well-known results in gas kinetic theory,8,9 it can be readily shown that eq 40 yields the well-known phenomenological result13 for entropy generation,

ns̄_g = (1/T)(∂v_0/∂r : 2μS) - (1/T^2)(q · ∂T/∂r)    (41)

where μ is the gas viscosity and S is the rate-of-strain tensor defined as

S = (1/2)[∂v_0/∂r + (∂v_0/∂r)^t - (2/3)(∂/∂r · v_0) I]    (42)

Note that entropy generation, via eq 41, is necessarily a positive quantity. Although it is beyond the scope of the current paper, using the same type of perturbation expansion about an (s = 2) local equilibrium state, it is possible to extend these results to dense gases and liquids and show the general equivalency to the phenomenological results.11
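The sign of eq 41 can be exercised numerically. In the sketch below (not from the paper), arbitrary velocity and temperature gradients are drawn at random, the heat flux is closed with Fourier's law q = -λ ∂T/∂r, and eq 41 with the rate-of-strain tensor of eq 42 is confirmed to be non-negative in every trial; the transport coefficients μ and λ and the temperature T are illustrative values only.

```python
import numpy as np

# eq 41: n*s_g = (1/T)(grad v : 2 mu S) - (1/T^2) q . grad T.
# With the Newtonian stress and Fourier's law q = -lam * grad T, both
# contributions are non-negative for any gradients, since grad v : S
# reduces to S : S (S is symmetric and traceless, eq 42).
# mu, lam, T below are illustrative positive values, not data.
rng = np.random.default_rng(0)
mu, lam, T = 1.8e-5, 2.6e-2, 300.0
for _ in range(1000):
    gradv = rng.normal(size=(3, 3))      # arbitrary velocity gradient tensor
    gradT = rng.normal(size=3)           # arbitrary temperature gradient
    S = 0.5*(gradv + gradv.T) - (np.trace(gradv)/3.0)*np.eye(3)   # eq 42
    q = -lam * gradT                     # Fourier's law closure
    sg = (1.0/T)*np.sum(gradv*(2.0*mu*S)) - (1.0/T**2)*np.dot(q, gradT)
    assert sg >= 0.0
print("entropy generation non-negative in all trials")
```

The antisymmetric and dilatational parts of the velocity gradient are orthogonal to S, which is why the viscous term collapses to 2μ S : S ≥ 0.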

5. Discussion

Within the context of Irving and Kirkwood's2 classical paradigm on the statistical mechanics of transport processes, we constructed a dynamical variable for entropy that recovers Boltzmann's and Gibbs' entropy as special cases. The table below summarizes each dynamic variable and the associated conservation relations that complete Irving and Kirkwood's statistical mechanical treatment of the transport equations.

α | ⟨α⟩ | conservation relation
∑_{k=1}^{N} δ(r_k - r) | n(r, t) | mass
∑_{k=1}^{N} p_k δ(r_k - r) | ρv_0(r, t) | momentum
(1/2m) ∑_{k=1}^{N} p_k^2 δ(r_k - r) + ∑_{k=1}^{N} φ_ext(r, t) δ(r_k - r) + (1/2) ∑_j ∑_{k≠j} φ_int(|r_j - r_k|) δ(r_k - r) | n[Ū_k(r, t) + Ū_φ(r, t) + (1/2)mv_0^2 + φ_ext(r, t)] | energy
-k[N!/(s!(N - s)!)](1/s) ∑_{j=1}^{s} δ(r_j - r) ln[p^{3s} f_s(r^s, p^s, t)] | nS̄_BG(r, t) | entropy

Our connection of entropy generation to phase-space dimensionality loss (hidden variables) or, in a Bayesian context, to a generator of missing information is interesting, but it remains doubtful whether any classical approach can lead to the origins of the Second Law, as discussed here and elsewhere. It was shown that, for perturbations (approximations) about local equilibrium states, the entropy-flux and entropy-generation terms yielded well-known phenomenological results and the Law of Entropy Increase.

Our analysis provides a consistent framework to study entropy behavior in any given system as part of a complete transport-phenomena analysis. No coarse graining or any special treatment is needed for an entropy analysis. In addition, our general nonequilibrium entropy definition is fully consistent with its equilibrium counterpart, and the expressions of Boltzmann and Gibbs are both contained within it. The solution to the closure problem of transport phenomena can be done in a fully

consistent manner across the transport equations, including the conservation of entropy. This self-consistency is believed to be of heightened importance in the treatment of complex fluids and systems, where errors in self-consistency would appear to be exacerbated. In addition, complex fluids are likely to be those involved in engineering systems designed specifically for energy efficiency or, in other words, minimum entropy production. The study of entropy appears to be of the utmost contemporary importance given the current depletion of natural resources and the recognition of the social and environmental preciousness of energy sources.

6. Conclusions

We have succeeded in developing a dynamic variable for entropy in the context of Irving and Kirkwood's classical paradigm on the statistical mechanics of transport processes. We have shown that both Boltzmann's and Gibbs' expressions are derivable from this generalized entropy function, and we have derived a general entropy-conservation equation that completes the molecular-based set of transport equations. Using the same perturbative approaches as in the closure treatment of mass, momentum, and energy conservation, we derived forms of the entropy flux and entropy generation that were shown to be fully consistent with phenomenology. Our results are believed to be important for the inclusion of entropy conservation in the design of energy-conversion systems, which requires consideration of all transport equations in a self-consistent fashion. Applications and extensions of our analysis to complex systems and alternative closure methods are ongoing.

Acknowledgment

I wish to gratefully thank my Ph.D. advisor, counselor, and friend Dr. L.-S. Fan for his many years of support and guidance.
Nomenclature

A = vector defined by perturbation expansion, eq 29
b = tensor defined by eq 38
B = tensor defined by perturbation expansion, eq 29
f_N = N-particle density function
f_s = reduced s-particle density function
p = Planck's constant
k = Boltzmann's constant
m_i = mass of ith particle
n = local molecular number density
p_i = momentum coordinates for the ith particle
q_k = kinetic contribution to the energy flux vector
r_i = position coordinates for the ith particle
s = entropy flux vector
s̄_g = entropy generation per molecule
S = rate-of-strain tensor defined by eq 42
S_B(r, t) = Boltzmann's entropy
S̄_BG = Boltzmann–Gibbs entropy per molecule
S_G = Gibbs' entropy
S̄_Green(r, t) = nonequilibrium, two-space Green's entropy

Greek Symbols

α = general dynamical variable
⟨α⟩ = phase-space average of α
Φ^N(r^N) = total intermolecular potential function
φ(r_ij) = interaction potential between any two molecules in the system
v_0 = bulk average velocity


Literature Cited

(1) For a more contemporary view, see, e.g., Prigogine, I. From Being to Becoming: Time and Complexities in the Physical Sciences; W. H. Freeman: New York, 1980.
(2) Irving, J. H.; Kirkwood, J. G. The Statistical Mechanical Theory of Transport Processes. IV. The Equations of Hydrodynamics. J. Chem. Phys. 1950, 18, 817.
(3) Gaspard, P. Entropy Production in Open Volume-Preserving Systems. J. Stat. Phys. 1997, 88, 1215, and the references cited therein.
(4) Ottinger, H. C. Beyond Equilibrium Thermodynamics; Wiley: New York, 2005.
(5) (a) Jaynes, E. T. Phys. Rev. 1957, 106, 620; 108, 171. (b) Katz, A. Principles of Statistical Mechanics: The Information Theory Approach; W. H. Freeman: San Francisco, CA, 1967.
(6) Schack, R.; Caves, C. M. Chaos for Liouville Probability Densities. Phys. Rev. E 1996, 53, 3387.
(7) Jaynes, E. T. Gibbs vs. Boltzmann Entropies. Am. J. Phys. 1965, 33, 391.
(8) Hirschfelder, J. O.; Curtiss, C. F.; Bird, R. B. Molecular Theory of Gases and Liquids; Wiley: New York, 1964.
(9) Chapman, S.; Cowling, T. G. The Mathematical Theory of Non-Uniform Gases; Cambridge University Press: New York, 1970.
(10) Abramowitz, M.; Stegun, I. A. Handbook of Mathematical Functions; Dover: New York, 1972; in particular, see Sec. 4.1.31 for the asymptotic behavior of x ln x.
(11) Peters, M. H. Molecular Thermodynamics and Transport Phenomena: Complexities of Scales in Space and Time; McGraw-Hill: New York, 2005.
(12) Dense-gas and liquid perturbation expansions were first treated in a series of papers by Born, M.; Green, H. S. Proc. R. Soc. London, Ser. A 1947, 189, 27; ibid. 1946, 188, 10; ibid. 1947, 189, 103. See, e.g., ref 8 for more extensive citations and discussions.
(13) Bird, R. B.; Stewart, W. E.; Lightfoot, E. N. Transport Phenomena; Wiley: New York, 1960.

Received for review January 30, 2008. Revised manuscript received April 25, 2008. Accepted April 25, 2008.

IE800170S