J. Phys. Chem. 1996, 100, 19187-19191


Nonequilibrium Thermodynamics of Dynamical Systems

G. Nicolis* and D. Daems

Center for Nonlinear Phenomena and Complex Systems, Université Libre de Bruxelles, Campus Plaine, CP 231, 1050 Brussels, Belgium

Received: August 30, 1996. Abstract published in Advance ACS Abstracts, November 15, 1996.

The balance equation of the information entropy of a dynamical system subjected to noise is derived. Entropy flow and entropy production terms are identified, featuring such quantities as the sum of the Lyapunov exponents and the noise strength. A comparison with irreversible thermodynamics is carried out, showing that the (information) entropy production is related to the excess entropy production around a reference state averaged over the probability density, rather than to the full thermodynamic entropy production.

1. Introduction

One of the aims of thermodynamics is to provide a characterization of the states of macroscopic systems in terms of state functionals depending on a limited number of observables. It is well-known that at equilibrium entropy (for an isolated system) and the Helmholtz or Gibbs free energies (for systems at constant temperature) provide an elegant description of this sort, largely independent of the details of the processes going on at the microscopic level. Away from equilibrium these functionals no longer follow universal trends. In this range the key quantity becomes, then, the entropy production P = d_iS/dt, which enjoys the following properties:1,2 (i) It enters the entropy balance through

$$\frac{dS}{dt} = \frac{d_e S}{dt} + P \qquad \text{(1)}$$

with d_eS/dt being the entropy flow. (ii) It is a nonnegative quantity, vanishing only in the state of equilibrium,

$$P \ge 0, \qquad P_{eq} = 0 \qquad \text{(2)}$$

(iii) In the range of local formulation of irreversible processes it takes the bilinear form

$$P = \int d\mathbf{r}\, \sum_k J_k X_k \qquad \text{(3)}$$

where the integral runs over the volume occupied by the system, J_k are the flows of the irreversible processes present, and X_k are the associated generalized forces. It is by now well established that large classes of dynamical systems can present, under nonequilibrium conditions, complex behaviors associated with bifurcations, culminating in some cases in deterministic chaos.3,4 A great deal of work has been devoted to the characterization of this complexity. A variety of quantities related to the dynamics, including entropy-like ones, have been introduced and have turned out to provide a rather successful description: Lyapunov exponents, Kolmogorov-Sinai entropy, and block entropies are some representative examples.5,6 The objective of the present work is to explore the possibility of introducing entropy production-like quantities related directly

to the dynamics and to assess their status with respect to the thermodynamic entropy production, eqs 1-3. This will be achieved by adopting a probabilistic formulation. Our main interest will be in dissipative systems. These are known to attain, in the course of time, an invariant set of zero phase space volume referred to as the attractor.3,4 It follows that, as a rule, the probability densities will become singular in the limit t → ∞.7 To cope with this difficulty, we shall include in the description a weak-amplitude stochastic forcing and take the noiseless limit only at the very final stage of the analysis. Notice that such forcings are always present in a real-world system, because of the ubiquity of thermodynamic fluctuations. The probabilistic formulation is laid down in section 2. In section 3 the entropy balance induced by the evolution equation for the probability density is derived. It allows one to identify the entropy flow and entropy production terms, whose explicit form is given for a simple model system. In section 4 the traditional thermodynamic formalism is worked out, and the results are compared with those of the purely dynamical formulation. The main conclusions are drawn in section 5.


2. Continuous Time Dynamical Systems: Probabilistic Description

The evolution of a stochastically forced continuous time dynamical system is given by a set of coupled first-order Langevin equations of the form8,9

$$\dot{x} = F(x,\mu) + R(t) \qquad \text{(4)}$$

Here x is the state vector, F is the vector field, µ is a set of control parameters, and R(t) stands for the effect of fluctuations or external noise on the macroscopic dynamics. This effect will be modeled as an additive multi-Gaussian white noise,

$$\langle R_i(t)\, R_j(t')\rangle = Q_{ij}\,\delta(t - t') \qquad \text{(5)}$$

The structure of the covariance matrix Q_ij (a positive definite matrix) is imposed in the case of external noise but follows from fluctuation-dissipation types of relationships in the case of thermodynamic fluctuations.2,9 As is well-known, eqs 4 and 5 define a Markov process of the diffusion type and induce a Fokker-Planck equation for the evolution of the probability density ρ(x,t)8,9

$$\frac{\partial\rho}{\partial t} = \sum_{i=1}^{n} \frac{\partial}{\partial x_i}(-F_i\rho) + \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} Q_{ij}\,\frac{\partial^2\rho}{\partial x_i\,\partial x_j} = L\rho + \frac{1}{2}\sum_{ij} Q_{ij}\,\frac{\partial^2\rho}{\partial x_i\,\partial x_j} \qquad \text{(6)}$$

Here n denotes the total number of variables present, and L stands for the Liouville operator, which would fully describe the evolution of ρ in the absence of noise. In this limit the solution of eq 6 reduces to10

$$\rho(x,t) = \rho_0[f^{-t}(x,\mu)]\,\left|\frac{\partial f^{-t}(x,\mu)}{\partial x}\right| \qquad \text{(7)}$$

where f^t(x_0,µ) provides the formal solution of the noiseless limit of eq 4,

$$\dot{x} = F(x,\mu) \;\Leftrightarrow\; x(t) = f^t(x_0,\mu) \qquad \text{(8)}$$

and ρ_0 is the initial probability density. A simple, yet nontrivial, example of eq 4 is provided by the normal form of the supercritical pitchfork bifurcation4 in the presence of noise,

$$\dot{x} = \mu x - x^3 + R(t) \qquad \text{(9)}$$

As is well-known, in the noiseless limit and for µ < 0, this equation admits a single stable fixed point, x = 0, whose Lyapunov exponent is σ = µ. This fixed point becomes repelling for µ > 0. In this range two new, simultaneously stable branches, x_± = ±√µ, emerge from x = 0. The Lyapunov exponents associated with these new attractors are σ = −2µ. The Fokker-Planck equation associated with (9) reads (cf. eq 6)

$$\frac{\partial\rho}{\partial t} = -\frac{\partial}{\partial x}\left[(\mu x - x^3)\rho\right] + D\,\frac{\partial^2\rho}{\partial x^2} \qquad \text{(10a)}$$

where we have set

$$D = \tfrac{1}{2}Q_{11} > 0 \qquad \text{(10b)}$$

It admits the time-independent solution

$$\rho_s = Z^{-1}\,\exp\left[\frac{1}{D}\left(\mu\,\frac{x^2}{2} - \frac{x^4}{4}\right)\right] \qquad \text{(11a)}$$

where Z stands for the normalization constant

$$Z = \int_{-\infty}^{\infty} dx\, \exp\left[\frac{1}{D}\left(\mu\,\frac{x^2}{2} - \frac{x^4}{4}\right)\right] \qquad \text{(11b)}$$
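As a consistency check (an aside, not part of the original text), one can verify symbolically that the density (11a) annihilates the right-hand side of eq 10a. A minimal sympy sketch:

```python
import sympy as sp

x = sp.symbols('x', real=True)
mu = sp.symbols('mu', real=True)
D = sp.symbols('D', positive=True)

# Unnormalized stationary density of eq 11a (the constant Z drops out)
rho = sp.exp((mu*x**2/2 - x**4/4)/D)

# Right-hand side of the Fokker-Planck equation (10a)
rhs = -sp.diff((mu*x - x**3)*rho, x) + D*sp.diff(rho, x, 2)

print(sp.simplify(rhs))  # prints 0: the density is indeed time-independent
```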

3. Entropy Balance

Having mapped the dynamics into a stochastic process, it is natural to inquire about the properties of information (Shannon-like) entropies.11 Specifically, we consider the one-time entropy

$$S_I = -\int dx\, \rho(x,t)\,\ln\rho(x,t) \qquad \text{(12)}$$

and derive its balance using the evolution eq 6 for ρ. Differentiating with respect to time and using the normalization of ρ, we obtain

$$\frac{dS_I}{dt} = -\int dx\,\left[L\rho + \frac{1}{2}\sum_{ij} Q_{ij}\,\frac{\partial^2\rho}{\partial x_i\,\partial x_j}\right]\ln\rho = \int dx\,\left[\sum_i \frac{\partial}{\partial x_i}(F_i\rho) - \frac{1}{2}\sum_{ij} Q_{ij}\,\frac{\partial^2\rho}{\partial x_i\,\partial x_j}\right]\ln\rho$$

or, after some straightforward manipulations,

$$\frac{dS_I}{dt} = \int dx\,\sum_i \frac{\partial}{\partial x_i}\left[F_i\rho\,(\ln\rho - 1) - \frac{1}{2}\sum_j Q_{ij}\,\frac{\partial\rho}{\partial x_j}\,\ln\rho\right] + \int dx\,\rho\,\operatorname{div} F + \frac{1}{2}\sum_{ij} Q_{ij}\int dx\,\frac{1}{\rho}\,\frac{\partial\rho}{\partial x_i}\,\frac{\partial\rho}{\partial x_j} \qquad \text{(13)}$$

The first term on the right-hand side can be converted to a surface integral using Gauss' divergence theorem. In a dissipative system this surface integral vanishes, since the probability density goes rapidly to zero as |x| → ∞. Noticing that the third term is positive definite and comparing with eq 1, we are then led to identify the (information) entropy flux and (information) entropy production as

$$\frac{d_e S_I}{dt} = \int dx\,\rho\,\operatorname{div} F \qquad \text{(14a)}$$

$$P_I = \frac{1}{2}\sum_{ij} Q_{ij}\int dx\,\frac{1}{\rho}\left(\frac{\partial\rho}{\partial x_i}\right)\left(\frac{\partial\rho}{\partial x_j}\right) \ge 0 \qquad \text{(14b)}$$

In the absence of noise, expression 14a reduces to the sum of the Lyapunov exponents, a negative quantity for a dissipative dynamical system. In the weak noise limit there will be an additional contribution which will tend to zero as |Q_ij| → 0. One may thus write

$$\frac{d_e S_I}{dt} = \sum_{i=1}^{n} \sigma_i + O(|Q_{ij}|) \le 0 \qquad \text{(weak noise limit)} \qquad \text{(15a)}$$

On the other hand, dS_I/dt must tend to zero in the long time limit, t → ∞. We conclude that expression 15a must be canceled in this limit by the contribution of the entropy production, eq 14b. Despite its apparent linear dependence on Q_ij, the latter must therefore tend, in the double limit of long times and weak noise, to a finite quantity displaying the parameters appearing in the deterministic equations of evolution and depending only weakly on the noise strength,

$$P_I = -\sum_{i=1}^{n} \sigma_i + O(|Q_{ij}|) \ge 0 \qquad \text{(weak noise limit and long time limit)} \qquad \text{(15b)}$$

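Before turning to the analytic illustration below, it may help to see the balance at work numerically. The following sketch (ours, not part of the paper) integrates eq 9 by the Euler-Maruyama scheme and estimates the stationary entropy flow and production from eqs 14a and 14b; for this model div F = µ − 3x², and at the stationary density (11a) eq 14b reduces to P_I = ⟨(µx − x³)²⟩/D. The two estimates should cancel, since dS_I/dt vanishes at stationarity. Parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, D = 1.0, 0.05                 # beyond the bifurcation, weak noise
dt, n_steps, n_burn = 2e-3, 500_000, 50_000

x = 0.1
xs = np.empty(n_steps)
for k in range(n_steps):
    # Euler-Maruyama step for eq 9; Q_11 = 2D, so the noise increment is sqrt(2*D*dt)
    x += (mu*x - x**3)*dt + np.sqrt(2*D*dt)*rng.standard_normal()
    xs[k] = x
xs = xs[n_burn:]                  # discard the transient

entropy_flow = np.mean(mu - 3*xs**2)             # eq 14a: <div F>
entropy_prod = np.mean((mu*xs - xs**3)**2) / D   # eq 14b at the stationary density
print(entropy_flow, entropy_prod)                # approximately opposite numbers
```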

This conclusion is in qualitative agreement with a recent proposal by Ruelle;12 see also refs 13 and 14. It is instructive to illustrate the structure of d_eS_I/dt and P_I as well as their cancellation at the steady state on the example of the normal form of a pitchfork bifurcation, eq 9. Using (14a) and (14b) as well as eq 11 for the stationary density, one has

$$\frac{d_e S_I}{dt} = Z^{-1}\int_{-\infty}^{\infty} dx\,(\mu - 3x^2)\,\exp\left[\frac{1}{D}\left(\mu\,\frac{x^2}{2} - \frac{x^4}{4}\right)\right] \qquad \text{(16a)}$$

$$P_I = \frac{1}{D}\, Z^{-1}\int_{-\infty}^{\infty} dx\,(\mu x - x^3)^2\,\exp\left[\frac{1}{D}\left(\mu\,\frac{x^2}{2} - \frac{x^4}{4}\right)\right] \qquad \text{(16b)}$$


Integrating eq 16b by parts, one gets

$$P_I = -Z^{-1}\int_{-\infty}^{\infty} dx\,(\mu - 3x^2)\,\exp\left[\frac{1}{D}\left(\mu\,\frac{x^2}{2} - \frac{x^4}{4}\right)\right] \equiv -\frac{d_e S_I}{dt} \qquad \text{(17)}$$

so that, as expected, in the long time limit the flow and the production of information entropy cancel each other. We now turn to the explicit evaluation of P_I, which can be conveniently rewritten as

$$P_I = 6D\,\frac{d\ln Z}{d\mu} - \mu \qquad \text{(18)}$$

We consider separately the cases µ < 0, µ = 0, and µ > 0.

µ < 0 (Before Bifurcation). In this case Z is given by15

$$Z = \frac{1}{2}\sqrt{2|\mu|}\; e^{\mu^2/8D}\, K_{1/4}\!\left(\frac{\mu^2}{8D}\right) \qquad \text{(19)}$$

where K_ν denotes the modified Bessel function. A more explicit form can be obtained in the weak noise limit, D → 0 and finite |µ|. Using the asymptotic expansion15

$$K_\nu(x) \approx \sqrt{\frac{\pi}{2x}}\; e^{-x} \qquad \text{(20a)}$$

one gets

$$Z = \sqrt{\frac{2\pi D}{|\mu|}} \qquad \text{(20b)}$$

Substituting into eq 18 yields

$$P_I = -\mu + \frac{3D}{|\mu|} \qquad \text{(21)}$$

In the long time and weak noise limits, the information entropy production thus features the opposite of the Lyapunov exponent of the single attracting fixed point x = 0, plus a contribution that is linear in the noise strength. This is in full agreement with eq 15b. Notice that the D-independent part of this result is identical to that obtained from the linear version of eq 9.

µ = 0 (At Bifurcation). More generally, this is the limit of finite noise and increasingly small µ. Equation 18 no longer holds, and one has to resort to (17)

$$P_I = 3\,\frac{\displaystyle\int_{-\infty}^{\infty} dx\; x^2\,\exp\left(-\frac{x^4}{4D}\right)}{\displaystyle\int_{-\infty}^{\infty} dx\,\exp\left(-\frac{x^4}{4D}\right)} \qquad \text{(22a)}$$

This expression can be evaluated exactly to yield

$$P_I = 6\,\frac{\Gamma(3/4)}{\Gamma(1/4)}\, D^{1/2} \qquad \text{(22b)}$$

showing a nonanalytic behavior in D in the neighborhood of the bifurcation. As expected, information entropy production is dominated in this limit by the fluctuations. In actual fact, one expects that D itself is related to the quantifiers of microscopic chaos prevailing at the level of the dynamics of the underlying many-body system.16 This connection is, however, outside the scope of the present study.

µ > 0 (Beyond Bifurcation). In this case Z can be computed for small D using the steepest descent method around the new attractors x_± = ±√µ. One gets, after some standard manipulations,

$$Z = 2\, e^{\mu^2/4D}\,\sqrt{\frac{\pi D}{\mu}} \qquad \text{(23)}$$

which yields

$$P_I = 2\mu - \frac{3D}{\mu} \qquad \text{(24)}$$
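The three regimes can be checked against a direct quadrature of eq 17. A sketch (ours; parameter values arbitrary) comparing the exact P_I with the asymptotic forms 21, 22b, and 24:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def P_I(mu, D):
    """Information entropy production from eq 17, by numerical quadrature."""
    # Shift the exponent by its maximum to avoid overflow at small D
    vmax = mu**2/4 if mu > 0 else 0.0
    w = lambda x: np.exp((mu*x**2/2 - x**4/4 - vmax)/D)
    pts = [-np.sqrt(mu), 0.0, np.sqrt(mu)] if mu > 0 else [0.0]
    Z = quad(w, -4, 4, points=pts)[0]
    num = quad(lambda x: (mu - 3*x**2)*w(x), -4, 4, points=pts)[0]
    return -num/Z

D = 0.01
print(P_I(-1.0, D), 1.0 + 3*D)                             # eq 21: -mu + 3D/|mu|
print(P_I( 0.0, D), 6*gamma(0.75)/gamma(0.25)*np.sqrt(D))  # eq 22b
print(P_I( 1.0, D), 2.0 - 3*D)                             # eq 24:  2mu - 3D/mu
```

The paired numbers agree closely at this noise strength, the residual differences being of higher order in D.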

Beyond bifurcation, in the double limit of long times and weak noise, the information entropy production thus again reduces to the opposite of the Lyapunov exponent, associated this time with the new attractors, plus a contribution linear in the noise strength. Again, this is in full agreement with eq 15b.

4. Connection with Irreversible Thermodynamics

Although natural in many respects, the above formulation does not appeal in an explicit manner to the constraints acting on the system and, in particular, those that may be responsible for its maintenance in a nonequilibrium state. As a matter of fact, expressions 15a and 15b predict nontrivial values of entropy flux and entropy production in the state of thermodynamic equilibrium as well as in a nonequilibrium state, a result that is, clearly, at variance with the basic premises of irreversible thermodynamics.1,2 The main reason for this is that the dynamics of the macrovariables remains dissipative, irrespective of the nature (equilibrium or not) of the reference state around which it may take place. Stated differently, the control parameters µ appearing in the evolution laws (eqs 8 and 9) contain purely "kinetic" contributions (related, for instance, to reaction rate constants or to transport coefficients) surviving at equilibrium and giving rise to nontrivial dynamics in this limit. Our objective in this section will be to disentangle the effects of the nonequilibrium constraints and of the dynamics and to establish a connection between the information entropy balance and the entropy balance featured by irreversible thermodynamics. The principal idea is to expand the thermodynamic entropy and its rate of change around a reference state and subsequently average the resulting expressions over a suitable probability distribution. In the presence of an attractor the latter will most naturally be chosen to be the invariant distribution. This viewpoint is similar to that adopted in Onsager's theory of fluctuations around equilibrium.2

Let x̄ be a point on the system's attractor. Using x̄ as a reference state, we decompose x as

$$x = \bar{x} + \delta x(t) \qquad \text{(25)}$$

where δx represents the perturbation around x̄, arising from environmental or internal (thermodynamic fluctuation) mechanisms. In what follows we shall focus on the linearized equations of evolution of δx,

$$\frac{d\,\delta x}{dt} = J\cdot\delta x \qquad \text{(26)}$$

where J is the (generally time-dependent) Jacobian matrix. Equation 25 induces the following expansion for the entropy S,



$$S(x) = S(\bar{x} + \delta x) = S(\bar{x}) + \sum_i \left(\frac{\partial S}{\partial \bar{x}_i}\right)\delta x_i + \frac{1}{2}\sum_{ij}\left(\frac{\partial^2 S}{\partial \bar{x}_i\,\partial \bar{x}_j}\right)\delta x_i\,\delta x_j + \ldots = S(\bar{x}) + \sum_i \bar{h}_i\,\delta x_i + \frac{1}{2}\sum_{ij} \bar{g}_{ij}\,\delta x_i\,\delta x_j + \ldots \qquad \text{(27)}$$

Owing to the convexity properties of the entropy, the (symmetric) matrix ḡ_ij is negative definite. The coefficients h̄_i of the linear part vanish in an isolated system at equilibrium but are nonzero in a system subjected to nonequilibrium constraints. Differentiating both sides of eq 27 with respect to time and using eq 26 as well as the symmetry of ḡ_ij, one obtains

$$\frac{dS(x)}{dt} = \frac{dS(\bar{x})}{dt} + \sum_\alpha\left(\sum_i \bar{h}_i\, J_{i\alpha} + \frac{d\bar{h}_\alpha}{dt}\right)\delta x_\alpha + \sum_{i\alpha}\left(\frac{1}{2}\,\frac{d\bar{g}_{i\alpha}}{dt} + \sum_j \bar{g}_{ij}\, J_{j\alpha}\right)\delta x_i\,\delta x_\alpha + \ldots \qquad \text{(28)}$$
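For the scalar case (one variable, time-dependent coefficients) identity 28 can be verified symbolically. A minimal sympy sketch (ours):

```python
import sympy as sp

t = sp.symbols('t')
dx = sp.Function('dx')(t)     # the fluctuation delta-x (scalar case)
h, g, J, S0 = [sp.Function(nm)(t) for nm in ('h', 'g', 'J', 'S0')]

# Quadratic entropy expansion, eq 27 (scalar case)
S = S0 + h*dx + sp.Rational(1, 2)*g*dx**2

# Differentiate in time and impose the linearized dynamics d(dx)/dt = J*dx (eq 26)
dS = sp.diff(S, t).subs(sp.Derivative(dx, t), J*dx)

# Right-hand side of eq 28, scalar case
rhs = sp.diff(S0, t) + (h*J + sp.diff(h, t))*dx \
      + (sp.Rational(1, 2)*sp.diff(g, t) + g*J)*dx**2

print(sp.expand(dS - rhs))    # prints 0
```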

Expressions 27 and 28 are now to be averaged over the invariant distribution of the phase space variables. In the spirit of the decomposition of eq 25 the latter can be expressed as

$$\rho(x) = \rho(\bar{x})\; p(\delta x\,|\,\bar{x}) \qquad \text{(29)}$$

where ρ(x̄) is the invariant distribution on the attractor and p the conditional probability of fluctuations around a given attractor. As a result of the averaging, the linear term in eq 27 will give a vanishing contribution since, by definition, the average value of the fluctuation around the attractor is zero. The remaining parts yield

$$\bar{S} = \int d\bar{x}\,\rho(\bar{x})\, S(\bar{x}) + \frac{1}{2}\int d\bar{x}\,\rho(\bar{x})\sum_{ij} \bar{g}_{ij}(\bar{x})\,\langle\delta x_i\,\delta x_j\rangle \qquad \text{(30)}$$

where the brackets denote ensemble averaging over the distribution p(δx|x̄). The first term of eq 30 is the natural generalization of the thermodynamic (Gibbs) entropy when the system possesses an attractor not necessarily identical to a fixed point. The second term describes the variability around the attractor and can therefore be referred to as the (mean) "entropy of fluctuations".17 Owing to the convexity properties of entropy, this term is negative definite

$$\bar{S} = S_G + S_{fl}, \qquad S_{fl} \le 0 \qquad \text{(31)}$$
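The sign statement in eq 31 is a trace inequality: S_fl = (1/2)Σ_ij ḡ_ij⟨δx_i δx_j⟩ pairs a negative definite matrix with a covariance (positive semidefinite) matrix. A small numerical sketch (ours, with randomly generated matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))
g = -(A @ A.T + np.eye(n))     # a symmetric negative definite g-bar
B = rng.standard_normal((n, n))
C = B @ B.T                    # a covariance matrix <dx_i dx_j>

S_fl = 0.5 * np.einsum('ij,ij->', g, C)   # (1/2) sum_ij g_ij <dx_i dx_j>
print(S_fl)                    # negative, as eq 31 requires
```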

We next proceed to calculate the mean rate of change of entropy by averaging eq 28. The first term will obviously generate the entropy flux, deSG/dt, and entropy production, PG, of irreversible thermodynamics.1,2 The contribution of the linear term will again vanish, whereas the third term will represent the mean rate of change of entropy due to the fluctuations, Pfl:

$$\frac{d\bar{S}}{dt} = \frac{d_e S_G}{dt} + P_G + P_{fl} \qquad \text{(32a)}$$

$$P_{fl} = \int d\bar{x}\,\rho(\bar{x})\sum_{i\alpha}\left(\frac{1}{2}\,\frac{d\bar{g}_{i\alpha}}{dt} + \sum_j \bar{g}_{ij}(\bar{x})\, J_{j\alpha}\right)\langle\delta x_i\,\delta x_\alpha\rangle \qquad \text{(32b)}$$

When the system evolves around equilibrium,

$$\rho(\bar{x}) = \delta(\bar{x} - x_{eq}) \qquad \text{(33)}$$

the Gibbs entropy contributions vanish and only the last term survives in eq 32. Under this condition one also has2,9,18

$$\langle\delta x_i\,\delta x_\alpha\rangle = -k_B\,(\bar{g}^{-1})_{\alpha i}$$

entailing that

$$\frac{1}{k_B}\, P_{fl} = -\sum_{ij\alpha} \bar{g}_{ij}\,(\bar{g}^{-1})_{\alpha i}\, J_{j\alpha} = -\sum_{\alpha j} \delta^{Kr}_{\alpha j}\, J_{j\alpha} = -\sum_j J_{jj} \qquad \text{(34)}$$

The result is nothing but the negative sum of the Lyapunov exponents featured in eq 15b. This establishes the connection between information entropy production and irreversible thermodynamics, showing that the former is to be interpreted as an excess entropy production19 rather than as the Gibbs entropy production. Notice, however, that the excess entropy of fluctuations is negative whereas the information entropy is positive.
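The trace identity behind this statement is easy to confirm numerically. A sketch (ours) with a random negative definite ḡ, a random Jacobian J, ḡ taken time-independent, and the equilibrium fluctuations ⟨δx_i δx_α⟩ = −k_B(ḡ⁻¹)_αi:

```python
import numpy as np

rng = np.random.default_rng(1)
n, kB = 4, 1.0                     # kB set to 1 for the illustration

A = rng.standard_normal((n, n))
g = -(A @ A.T + n*np.eye(n))       # symmetric negative definite g-bar
J = rng.standard_normal((n, n))    # an arbitrary Jacobian

C = -kB * np.linalg.inv(g)         # equilibrium fluctuations <dx_i dx_a>

# Last term of eq 32b with g-bar time-independent:
# P_fl = sum_{i,j,a} g_ij J_ja <dx_i dx_a>
P_fl = np.einsum('ij,ja,ia->', g, J, C)
print(P_fl, -kB * np.trace(J))     # the two numbers coincide (eq 34)
```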

A class of systems away from equilibrium for which the above result carries through is provided by sequences of unimolecular reactions in an ideal system.20 In this case (µ_i denoting here the chemical potential of species i)

$$\bar{g}_{ij} = -\frac{1}{T}\,\frac{\partial \mu_i}{\partial \bar{x}_j} = -k_B\,\frac{\delta^{Kr}_{ij}}{\bar{x}_j}, \qquad \langle\delta x_i\,\delta x_j\rangle = \bar{x}_j\,\delta^{Kr}_{ij} \quad \text{(Poissonian fluctuations)} \qquad \text{(35)}$$

and eq 34 is again secured. The full entropy production contains, of course, an additional term of the form of eq 3, whose explicit expression is system-dependent. For more general systems away from equilibrium one has to resort to the full eq 32. The connection with the information entropy balance of section 3 then becomes more involved. It is worth stressing that chaotic dynamics plays no particular role in the developments outlined in sections 3 and 4. This is natural once one recognizes that dissipation, or excess dissipation, is related to the sum of all Lyapunov exponents rather than solely the positive ones, entailing that the "transverse" modes associated with the evolution toward the attractor should be dominant. These are the only ones to survive (barring the zero Lyapunov exponents associated with motion along the trajectory) in a nonchaotic system. The presence of chaos will merely add to these transverse Lyapunov exponents the contribution of the unstable modes, reflecting the sensitivity to initial conditions on the attractor.

5. Conclusions and Perspectives

In this paper the information entropy balance of a dissipative dynamical system subjected to weak noise has been derived. The entropy flow and entropy production terms featured in this balance reduce, in the long time limit, to a dominant contribution equal (up to a sign) to the sum of the Lyapunov exponents, plus a contribution vanishing with the noise strength. This result provides one with an intrinsic definition of the entropy production of a dynamical system. The structure of the evolution equations of a dynamical system is not altered in a direct manner when the constraints driving this system out of equilibrium are varied: the bifurcation sequences generated under these conditions may be deeply affected, but whatever the distance from equilibrium and the attractors present might be, a dissipative system will always exhibit a negative total sum of Lyapunov exponents and hence a positive information entropy production. This implies necessarily that information entropy production cannot be identical

to the Gibbs entropy production featured in irreversible thermodynamics, which is bound to vanish in the state of equilibrium and subsequently to vary smoothly with the distance from equilibrium. We have proposed a thermodynamic analog of information entropy production, namely, the thermodynamic entropy production generated by the fluctuations, averaged over the invariant distribution of the phase space variables. These two quantities are identical at least when the system operates around equilibrium or when its dynamics results from sequences of first-order reactions. In the most general case, however, the entropy production of the fluctuations contains more detailed, system-dependent information on the underlying dynamics. This happens to be the case for systems involving bifurcations and/or exhibiting chaos. The equivalence with the information entropy production is then lost. The entropy production of the fluctuations defined in eq 32 can be regarded as an extension of the excess entropy production introduced by Glansdorff and Prigogine,19 who derived a sufficient (linearized) stability criterion for nonequilibrium steady states based on the sign of this quantity. In the information-theoretic formulation of sections 2 and 3 there can be no change of sign of the information entropy production. The loss of stability of a reference state is then manifested through the different forms of dependence of the information entropy production, across a bifurcation, on the parameters determining the Lyapunov exponents, as seen explicitly on comparing eqs 21 and 24. The nonequilibrium thermodynamics of dynamical systems is still in its infancy. An extension of the formulation of sections 2 and 3 to spatially distributed systems would allow one to include transport processes in the formalism. It would also be desirable to analyze more closely the structure of the main expressions in the presence of low-dimensional and spatiotemporal chaos, where the existence of a smooth invariant distribution over the expanding directions allows one to work from the outset in the noiseless limit insofar as the dynamics on the attractor is concerned.

Acknowledgment. This work has been supported in part by the Belgian government under the Pôles d'Attraction Interuniversitaires program. D.D. is Aspirant Chercheur at the Fonds National de la Recherche Scientifique (Belgium).

References and Notes

(1) Prigogine, I. Introduction to Thermodynamics of Irreversible Processes; Wiley: New York, 1961.
(2) De Groot, S.; Mazur, P. Nonequilibrium Thermodynamics; North-Holland: Amsterdam, 1962.
(3) Ott, E. Chaos in Dynamical Systems; Cambridge University Press: Cambridge, 1993.
(4) Nicolis, G. Introduction to Nonlinear Science; Cambridge University Press: Cambridge, 1995.
(5) Beck, C.; Schlögl, F. Thermodynamics of Chaotic Systems; Cambridge University Press: Cambridge, 1993.
(6) Ebeling, W.; Nicolis, G. Chaos, Solitons and Fractals 1992, 2, 635.
(7) Eckmann, J. P.; Ruelle, D. Rev. Mod. Phys. 1985, 57, 617.
(8) Lasota, A.; Mackey, M. Probabilistic Properties of Deterministic Systems; Cambridge University Press: Cambridge, 1985.
(9) van Kampen, N. Stochastic Processes in Physics and Chemistry; North-Holland: Amsterdam, 1981.
(10) Gaspard, P.; Nicolis, G.; Provata, A.; Tasaki, S. Phys. Rev. E 1995, 51, 74.
(11) Khinchin, A. Mathematical Foundations of Information Theory; Dover: New York, 1959.
(12) Ruelle, D. Positivity of entropy production in nonequilibrium statistical mechanics; preprint, IHES/P/963.
(13) Evans, D.; Cohen, E.; Morriss, G. Phys. Rev. A 1990, 42, 5990.
(14) Gallavotti, G.; Cohen, E. Phys. Rev. Lett., in press.
(15) Abramowitz, M.; Stegun, I. Handbook of Mathematical Functions; Dover: New York, 1972.
(16) Gaspard, P.; Nicolis, G. Phys. Rev. Lett. 1990, 65, 1693.
(17) Luo, J. L.; Van den Broeck, C.; Nicolis, G. Z. Phys. B 1984, 56, 165.
(18) Nicolis, G.; Prigogine, I. Self-Organization in Nonequilibrium Systems; Wiley: New York, 1977.
(19) Glansdorff, P.; Prigogine, I. Thermodynamic Theory of Structure, Stability and Fluctuations; Wiley: London, 1971.
(20) Nicolis, G.; Babloyantz, A. J. Chem. Phys. 1969, 51, 2632.
