Article. pubs.acs.org/IECR
Cite This: Ind. Eng. Chem. Res. XXXX, XXX, XXX−XXX. DOI: 10.1021/acs.iecr.7b02700
Received: July 3, 2017. Revised: October 23, 2017. Accepted: October 23, 2017. Published: October 23, 2017.

Novel Causal Network Modeling Method Integrating Process Knowledge with Modified Transfer Entropy: A Case Study of Complex Chemical Processes

Qun-Xiong Zhu,†,‡ Qian-Qian Meng,†,‡ Ping-Jiang Wang,†,‡ and Yan-Lin He*,†,‡

†College of Information Science & Technology, Beijing University of Chemical Technology, Beijing 100029, China
‡Engineering Research Center of Intelligent PSE, Ministry of Education of China, Beijing 100029, China



ABSTRACT: With the continuing development of modern industries, ensuring the safety and reliability of production processes has become an increasingly urgent task. Alarm root cause analysis plays a significant role in preventing faults in complex industrial processes, and causal network modeling is an important part of alarm root cause analysis. Transfer entropy is usually adopted as an effective method for building a good causal network model; however, there are problems in determining the prediction horizons of transfer entropy. To solve these problems and further enhance the performance of the original transfer entropy method, a modified transfer entropy is proposed that simultaneously takes into consideration the prediction horizon from one variable to another and from a variable to itself. Moreover, drawing on both data-driven and process knowledge based modeling, an approach integrating transfer entropy with superficial process knowledge is designed to correct false transfer entropy calculations and thereby optimize the causal network model, improving the capacity of causality detection. To verify the effectiveness of the proposed approach, two case studies are carried out using a stochastic process and a complex chemical process, the Tennessee Eastman process. The simulation results show that the ability to discover causality is much improved.

1. INTRODUCTION

In modern process industries, distributed control systems (DCSs) are widely used. As a result, large numbers of process variables can be collected from DCSs, and the collected process data are useful for monitoring processes. Alarm systems play a fundamental role in ensuring the safety of modern industrial systems.1 In an alarm system, there are always some alarms due to process variables falling out of their normal scope. Generally speaking, there are four kinds of alarms: double high alarms, high alarms, low alarms, and double low alarms. With the increasing development of modern process industries, ensuring the safety and reliability of production processes becomes a more and more urgent task. An alarm system with good performance helps to avoid hazardous events by reminding operators to carry out suitable actions. However, alarm floods are nowadays one of the most common problems of industrial alarm systems because of large-scale and complicated processes, so it is necessary to avoid alarm floods to keep alarm systems performing well. Nuisance alarms and causal alarms are the main sources of alarm floods.2 Causal network modeling can be used to overcome the problem of causal alarms. With the help of a causal model, causality can be visualized by directed graphs, and alarm root causes can be easily diagnosed. As a result, alarm floods can be alleviated at the source. Therefore, it is important to handle causal alarms by capturing the causality between process variables and establishing an effective causal model.

A causal model can be used to describe the causality between process variables. Methods for modeling this causality can be divided into two kinds: knowledge based modeling methods and data-driven modeling methods. Knowledge based modeling methods are qualitative and lack quantitative information to determine the strength of causality; besides, it is hard to build knowledge based models without expertise. With the rapid development of computing and measurement techniques, data-driven modeling methods, such as time-delayed correlation analysis,3 Granger causality,4−6 the Bayesian network,7−12 the interpretive structural model,13 and transfer entropy, have been broadly explored and developed. Transfer entropy was proposed by Thomas Schreiber to quantify information exchange.14 Transfer entropy essentially describes the causality caused by information flow and has been widely applied in many fields, such as neurology and economics.15,16 Bauer et al. used transfer entropy to study disturbance propagation paths in chemical plants.17

Afterward, different kinds of modified transfer entropy were put forward. Staniek et al. proposed symbolic transfer entropy to reduce the influence of noise by replacing the original time series with symbolic time series.18 Duan et al. put forward a kind of direct transfer entropy to determine the direct causality between variables.19 To avoid kernel density estimation, Yu et al. calculated the transfer entropy using binary alarm data.20 Although these methods have been proven to be efficient, a presumption that the process is a static Markov process must be met; that is to say, the process dynamics should remain unchanged. However, not all real processes can be approximated as Markov processes, so the traditional transfer entropy analysis may be wrong.21

Motivated by the above considerations, in this paper a new kind of modified transfer entropy is proposed to overcome the shortcomings of parameter optimization. In the proposed algorithm, prediction horizons are introduced to fully consider the information transfer of a variable with itself. Besides, superficial process knowledge is used to rectify the results of the proposed algorithm, and then a causal network can be established. The feasibility and effectiveness of the proposed approach are validated by case studies using a stochastic process and the Tennessee Eastman (TE) process.

The rest of this paper is organized as follows. In section 2, a brief overview of two transfer entropies is given. In section 3, a detailed description of the proposed method is presented. Case studies using a stochastic process and the Tennessee Eastman process are provided and the simulation results are shown in section 4. Conclusions are given in section 5.

2. TRANSFER ENTROPY

Transfer entropy originates from the information entropy proposed by Shannon in 1948.22 Information entropy aims at quantifying process uncertainty: the higher the information entropy, the greater the uncertainty and the more information the process contains. On the basis of information entropy, Schreiber and Kaiser proposed transfer entropy to measure information exchange in 2002.23 The formula of transfer entropy is as follows:

T(X|Y) = T(X_{i+1}|X_i, Y_i) = \sum_{x_{i+1}, x_i^{(k)}, y_i^{(l)}} p(x_{i+1}, x_i^{(k)}, y_i^{(l)}) \log \frac{p(x_{i+1} | x_i^{(k)}, y_i^{(l)})}{p(x_{i+1} | x_i^{(k)})}    (1)

where X and Y denote two different process variables in a Markov process; x_i^{(k)} = [x_i, x_{i-1}, ..., x_{i-k+1}] and y_i^{(l)} = [y_i, y_{i-1}, ..., y_{i-l+1}] represent the measured values at time i; k and l denote the orders of X and Y, respectively; log means the logarithm with base 2; p(x_{i+1}, x_i^{(k)}, y_i^{(l)}) represents the joint probability and p(x_{i+1} | x_i^{(k)}, y_i^{(l)}) represents the conditional probability. Considering the time delay of the information exchange between two variables, Bauer changed the formula as follows:

T(X|Y) = T(X_{i+h}|X_i, Y_i) = \sum_{x_{i+h}, x_i^{(k)}, y_i^{(l)}} p(x_{i+h}, x_i^{(k)}, y_i^{(l)}) \log \frac{p(x_{i+h} | x_i^{(k)}, y_i^{(l)})}{p(x_{i+h} | x_i^{(k)})}    (2)

where h represents the prediction horizon. The interval between the current time and the future time should be constant, but in formula 2 this interval changes with h. This does not make sense, and in some cases the prediction horizon cannot be determined. So Shu and Zhao21 changed the formula as follows:

T(X|Y) = T(X_{i+h}|X_{i+h-1}, Y_i) = \sum_{x_{i+h}, x_{i+h-1}^{(k)}, y_i^{(l)}} p(x_{i+h}, x_{i+h-1}^{(k)}, y_i^{(l)}) \log \frac{p(x_{i+h} | x_{i+h-1}^{(k)}, y_i^{(l)})}{p(x_{i+h} | x_{i+h-1}^{(k)})}    (3)

where x_i^{(k)} in eq 2 is changed to x_{i+h-1}^{(k)} in eq 3. In this way, the interval between x_{i+h-1}^{(k)} and x_{i+h} stays constant. Since the effects that two variables have on each other are different, the transfer entropy is asymmetric. The causality between two variables can be measured as follows:

T_{y \to x} = T(X|Y) - T(Y|X)    (4)

If T_{y→x} is positive, Y is the cause of X; if it is negative, X is the cause of Y. If T_{y→x} is close to zero, then there is no causality between them.
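For concreteness, the following is a minimal sketch (ours, not the authors' implementation) of how eqs 1−4 can be estimated from sampled data with simple histogram binning, assuming first-order embeddings (k = l = 1); the bin count, the default horizon h, and the function names are illustrative choices.

import numpy as np

def _discretize(series, bins):
    # map a continuous series onto integer bin labels via a plain histogram partition
    edges = np.histogram_bin_edges(series, bins=bins)
    return np.clip(np.digitize(series, edges[1:-1]), 0, bins - 1)

def transfer_entropy(x, y, h=1, bins=8):
    # histogram estimate of eq 2 with k = l = 1: information y_i adds about x_{i+h} beyond x_i
    xs = _discretize(np.asarray(x, dtype=float), bins)
    ys = _discretize(np.asarray(y, dtype=float), bins)
    a, b, c = xs[h:], xs[:-h], ys[:-h]            # x_{i+h}, x_i, y_i
    joint = np.zeros((bins, bins, bins))
    np.add.at(joint, (a, b, c), 1.0)              # joint counts over (x_{i+h}, x_i, y_i)
    p_abc = joint / joint.sum()
    p_bc = p_abc.sum(axis=0)                      # p(x_i, y_i)
    p_ab = p_abc.sum(axis=2)                      # p(x_{i+h}, x_i)
    p_b = p_abc.sum(axis=(0, 2))                  # p(x_i)
    te = 0.0
    for ai, bi, ci in zip(*np.nonzero(p_abc)):    # sum only over occupied bins
        te += p_abc[ai, bi, ci] * np.log2(
            (p_abc[ai, bi, ci] / p_bc[bi, ci]) / (p_ab[ai, bi] / p_b[bi]))
    return te

def directed_causality(x, y, h=1, bins=8):
    # t_{y->x} of eq 4: positive values indicate that y is a cause of x
    return transfer_entropy(x, y, h=h, bins=bins) - transfer_entropy(y, x, h=h, bins=bins)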


3. MODIFIED TRANSFER ENTROPY AND CAUSAL NETWORK MODELING

3.1. Modified Transfer Entropy Algorithm. Transfer entropy can be expressed as a difference between conditional information entropies. Formula 3 can be rewritten as follows:

T(X_{i+h}|X_{i+h-1}, Y_i) = \sum_{x_{i+h}, x_{i+h-1}^{(k)}, y_i^{(l)}} p(x_{i+h}, x_{i+h-1}^{(k)}, y_i^{(l)}) \log p(x_{i+h} | x_{i+h-1}^{(k)}, y_i^{(l)}) - \sum_{x_{i+h}, x_{i+h-1}^{(k)}} p(x_{i+h}, x_{i+h-1}^{(k)}) \log p(x_{i+h} | x_{i+h-1}^{(k)}) = H(x_{i+h} | x_{i+h-1}^{(k)}) - H(x_{i+h} | x_{i+h-1}^{(k)}, y_i^{(l)})    (5)

where H(x_{i+h}|x_{i+h-1}) is the conditional information entropy of x_{i+h} given x_{i+h-1}, and H(x_{i+h}|x_{i+h-1}, y_i) is the conditional information entropy of x_{i+h} given x_{i+h-1} and y_i. From formula 5 we can see that information propagates not only between two different variables but also within the same variable, from the current time to a future time. In formula 5, the interval between the current time and the future time is one sample interval. Different processes may have different sampling times, so this interval cannot be adjusted automatically. According to the principle of maximum information entropy,24−26 the distribution is closest to the actual physical system when the conditional information entropy reaches its maximum value, and this maximum conditional information entropy is the most objective. In order to obtain a model that is most suitable to the actual system, we propose a new kind of modified transfer entropy by introducing hx, which fully accounts for the prediction horizon of a variable with respect to itself. The formula is as follows:

T(X_{i+h}|X_{i+h-hx}, Y_i) = \sum_{x_{i+h}, x_{i+h-hx}^{(k)}, y_i^{(l)}} p(x_{i+h}, x_{i+h-hx}^{(k)}, y_i^{(l)}) \log \frac{p(x_{i+h} | x_{i+h-hx}^{(k)}, y_i^{(l)})}{p(x_{i+h} | x_{i+h-hx}^{(k)})} = \sum_{x_{i+h}, x_{i+h-hx}^{(k)}, y_i^{(l)}} p(x_{i+h}, x_{i+h-hx}^{(k)}, y_i^{(l)}) \log p(x_{i+h} | x_{i+h-hx}^{(k)}, y_i^{(l)}) - \sum_{x_{i+h}, x_{i+h-hx}^{(k)}} p(x_{i+h}, x_{i+h-hx}^{(k)}) \log p(x_{i+h} | x_{i+h-hx}^{(k)}) = H(x_{i+h} | x_{i+h-hx}^{(k)}) - H(x_{i+h} | x_{i+h-hx}^{(k)}, y_i^{(l)})    (6)

H(x_{i+h} | x_{i+h-hx}) = - \sum_{x_{i+h}, x_{i+h-hx}^{(k)}} p(x_{i+h}, x_{i+h-hx}^{(k)}) \log p(x_{i+h} | x_{i+h-hx}^{(k)}) = - \sum_{x_{i+hx}, x_i^{(k)}} p(x_{i+hx}, x_i^{(k)}) \log p(x_{i+hx} | x_i^{(k)})    (7)

where hx denotes the prediction horizon between x_{i+h-hx} and x_{i+h}, and h denotes the prediction horizon between x_{i+h} and y_i. Using formula 6, h can be fixed and the interval between x_{i+h-hx} and x_{i+h} can be optimized by changing hx. The parameters k, l, h, and hx in formula 6 should be determined. According to the studies of Overbey and Todd27,28 and Nichols et al.,29 for simplicity, processes can be regarded as first-order Markov processes, so k = l = 1. hx and h can then be decided by maximizing the information entropies H(x_{i+h}|x_{i+h-hx}) and H(x_{i+h}|x_{i+h-hx}, y_i), respectively. After the calculations, a significance level test should be conducted to validate the causality. By creating N_s new time series (X_new, Y_new)30 and calculating the transfer entropy of each new series, λ_i = T_{y_new→x_new} (i = 1, ..., N_s), the new significance level can be formulated based on the 3σ rule:

s_λ = μ_λ + 3σ_λ    (8)

where μ_λ and σ_λ are the mean and standard deviation of λ_i, respectively. In our study, 10 new time series are created to calculate the significance level. If T_{y→x} is larger than s_λ, then y is the cause of x.
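As a companion to the derivation above, the following sketch (ours, not the authors' code) shows one way eqs 6−8 could be estimated with the same histogram approach: cond_self_entropy implements eq 7, modified_te implements eq 6, and significance_level builds the 3σ threshold of eq 8 from surrogate series. Using random shuffles of Y as the surrogate construction is an assumption on our part (ref 30 is cited only for creating the new time series), and the bin count and defaults are illustrative; the binning helper is repeated so the block stands alone.

import numpy as np

def _bin(series, bins=8):
    # integer bin labels for a continuous series (simple histogram partition)
    edges = np.histogram_bin_edges(series, bins=bins)
    return np.clip(np.digitize(series, edges[1:-1]), 0, bins - 1)

def cond_self_entropy(x, hx, bins=8):
    # H(x_{i+hx} | x_i) of eq 7; the hx maximizing it is taken as the self prediction horizon
    xs = _bin(np.asarray(x, dtype=float), bins)
    joint = np.zeros((bins, bins))
    np.add.at(joint, (xs[hx:], xs[:-hx]), 1.0)
    p_ab = joint / joint.sum()
    p_b = np.where(p_ab.sum(axis=0) > 0, p_ab.sum(axis=0), 1.0)   # p(x_i), guarded against empty bins
    nz = p_ab > 0
    return -float(np.sum(p_ab[nz] * np.log2((p_ab / p_b[None, :])[nz])))

def modified_te(x, y, h, hx, bins=8):
    # histogram estimate of eq 6, T(x_{i+h} | x_{i+h-hx}, y_i), with first-order embeddings
    xs, ys = _bin(np.asarray(x, dtype=float), bins), _bin(np.asarray(y, dtype=float), bins)
    i = np.arange(max(0, hx - h), len(xs) - h)    # keep all three sample indices inside the series
    joint = np.zeros((bins, bins, bins))
    np.add.at(joint, (xs[i + h], xs[i + h - hx], ys[i]), 1.0)
    p = joint / joint.sum()
    p_bc, p_ab, p_b = p.sum(axis=0), p.sum(axis=2), p.sum(axis=(0, 2))
    te = 0.0
    for a, b, c in zip(*np.nonzero(p)):
        te += p[a, b, c] * np.log2((p[a, b, c] / p_bc[b, c]) / (p_ab[a, b] / p_b[b]))
    return te

def significance_level(x, y, h, hx, n_surrogates=10, bins=8, seed=0):
    # s_lambda of eq 8: mean + 3*std of the TE over surrogate series (random shuffles assumed here)
    rng = np.random.default_rng(seed)
    lam = [modified_te(x, rng.permutation(np.asarray(y)), h, hx, bins) for _ in range(n_surrogates)]
    return float(np.mean(lam) + 3.0 * np.std(lam))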

3.2. Causal Network Modeling Integrating Process Knowledge with Modified Transfer Entropy. The transfer entropy method must meet the assumption that the process is a static Markov process; that is to say, the current state at time t is decided by the states at time t − 1 or t − k for a certain k, and the process dynamics should stay steady over a long enough time. However, these assumptions are hard to meet. Therefore, results based only on the data-driven transfer entropy may not be accurate. By combining superficial process knowledge with the proposed modified transfer entropy, the results of causal network modeling become more credible.

In a large-scale process, the whole system is composed of multiple subsystems.31,32 Each subsystem represents a specific unit in which there is more than one process variable. Through process division based on superficial process knowledge, we can first model each subsystem using the proposed modified transfer entropy, and then the whole causal model can be established by combining these subcausal models. In the division, all the process variables are allocated into different subsystems. The variables of each subsystem consist of the variables belonging to this subsystem, the inflow variables, the outflow variables, and the variables belonging to adjacent subsystems that have an influence on this subsystem. After division, the variables should be processed according to superficial process knowledge. The rules are listed as follows:

(1) According to the inflow and outflow relationships, define inflow variables as root variables and outflow variables as leaf variables.
(2) Determine the related variables between adjacent subsystems, such as variables with the same attribute or variables with obvious causality.
(3) Analyze the time order of each subsystem according to the process flowchart.
(4) Based on the time order, divide the related variables into a high level and a low level.

Using these rules, the process variables are endowed with the characteristics of input, output, and time order. From the perspective of information flow, the input variables are the causes and the output variables are the effects; from the perspective of time order, the cause precedes the effect. These findings from process knowledge can validate and rectify the results of the proposed modified transfer entropy. Besides, the system division largely decreases the computational complexity of the algorithm. The flowchart for causal network modeling integrating process knowledge with modified transfer entropy is shown in Figure 1.

Figure 1. Flowchart of causal network modeling based on process knowledge and modified transfer entropy.

4. CASE STUDY

In order to verify the superiority and efficiency of the proposed causal network modeling method, two case studies using a stochastic process and the TE process are provided.


4.1. Stochastic Process Case Study. The stochastic process is denoted by formula 9, which includes both linear and nonlinear relationships:

Y_{k+1} = 2X_k^2 + 2X_k + 5|Z_k + 2.5| + v_{1k}
Z_{k+1} = 0.3Z_k + 0.86(X_k - 2) + v_{2k}    (9)

where X_k, Y_k, and Z_k are three random variables, with X_k ∈ (0, 12); Y_k is nonlinearly related to X_k and Z_k, Z_k is linearly related to X_k, and v_{1k} ∈ (0, 0.12) and v_{2k} ∈ (0, 0.12) are white noise. The values of transfer entropy between these three variables are calculated according to the modified transfer entropy proposed in this paper (formula 6) and the transfer entropy modified by Shu and Zhao21 (formula 3), respectively, and the significance levels are calculated according to formula 8. The results of the two transfer entropies are shown in Tables 1 and 2, respectively.

Table 1. Values of the Modified Transfer Entropy Proposed in This Paper and the Corresponding Significance Levels

        Xk                Yk                Zk
Xk      N/A               0.0315 (0.0329)   0.0561 (0.0893)
Yk      0.3103 (0.0267)   N/A               0.0626 (0.0550)
Zk      0.5835 (0.0457)   0.0318 (0.0334)   N/A
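As a rough illustration only, the sketch below generates data from eq 9 as reconstructed above and evaluates the directed causality of eq 4 for the three variable pairs. It assumes that the quadratic term in eq 9 reads 2X_k^2, that X_k is drawn uniformly from (0, 12), and that the noise terms are zero-mean with a small standard deviation; it also reuses the directed_causality helper from the section 2 sketch. It is meant to indicate the signs of the relations, not to reproduce the exact values in Tables 1 and 2.

import numpy as np

rng = np.random.default_rng(1)
n = 3000
X = rng.uniform(0.0, 12.0, n)                     # X_k: assumed uniform on (0, 12)
Y = np.zeros(n)
Z = np.zeros(n)
for k in range(n - 1):
    # eq 9 as reconstructed above; the 2*X[k]**2 term and the noise scale are assumptions
    Y[k + 1] = 2.0 * X[k] ** 2 + 2.0 * X[k] + 5.0 * abs(Z[k] + 2.5) + rng.normal(0.0, 0.1)
    Z[k + 1] = 0.3 * Z[k] + 0.86 * (X[k] - 2.0) + rng.normal(0.0, 0.1)

# pairwise directed causality t_{cause->effect} (eq 4), using the section 2 helper
for cause, effect, label in [(X, Y, "X -> Y"), (X, Z, "X -> Z"), (Z, Y, "Z -> Y")]:
    print(label, round(directed_causality(effect, cause, h=1), 4))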

Figure 2. Causal network of the selected stochastic process: (a) causality between X, Y, and Z recognized by the modified transfer entropy proposed in this paper; (b) causality between X, Y, and Z recognized by the transfer entropy modified by Shu and Zhao.21

Table 2. Values of the Transfer Entropy Modified by Shu and Zhao21 and the Corresponding Significance Levels

        Xk                Yk                Zk
Xk      N/A               0.0299 (0.0316)   0.0533 (0.0856)
Yk      0.3150 (0.0442)   N/A               0.0460 (0.0625)
Zk      0.6435 (0.0611)   0.0282 (0.0324)   N/A

In Tables 1 and 2, the value outside the brackets is the transfer entropy from the variable listed in the column to the variable in the row, and the value inside the brackets is the corresponding significance level. According to Tables 1 and 2, the causalities between the three variables recognized by the two transfer entropies are shown in Figure 2. It can be seen in Figure 2a that X_k influences Z_k and that Y_k is influenced by X_k and Z_k; the causality shown in Figure 2a is consistent with that defined in formula 9. From Figure 2b, it can be seen that the causality between Y_k and Z_k cannot be recognized by the transfer entropy modified by Shu and Zhao.21 The comparison of the simulation results shows that the proposed causal network modeling method is more accurate and efficient.

4.2. TE Process Case Study. Proposed by Downs and Vogel in 1993, the Tennessee Eastman process is a simulation model of a real chemical process.33 The model mainly consists of five units: a reactor, a condenser, a separator, a stripper, and a compressor. The flowchart of the TE process is shown in Figure 3. The TE process contains 12 manipulated variables, 22 continuous process measurements, and 19 composition measurements. In this work, the 22 continuous process measurements (shown in Table 3) are chosen to establish the required model. The number of samples is 500, and the sampling time is 1.8 s. According to the flowchart, the TE process can be divided into five subunits. Let U = {u1, u2, u3, u4, u5} represent the whole TE process, where u1, u2, u3, u4, and u5 represent the reactor unit, condenser unit, separator unit, stripper unit, and compressor unit, respectively. The variables in each subunit are listed in Table 4, and the level of each variable in the subunits, determined based on the time order, is listed in Table 5.

After division, the transfer entropy between any two variables in each subunit can be calculated. The parameters hx and h should first be determined. Taking variables F5 and F6 in u1 as an example, based on formula 7, the trend of the conditional information entropy from F5 to itself with different intervals is shown in Figure 4. When hx = 6, H(F5(i + hx)|F5(i)) reaches its maximum value, so hx is set to 6. The transfer entropy T(F6|F5) is then calculated using formula 6; the trend with different h values is shown in Figure 5. When h = 9, T(F6|F5) reaches its maximum, so h is set to 9. In the same way, the transfer entropy between any two variables in u1 can be calculated. To illustrate the effectiveness of the improved method, we calculated H(x_{i+h}|x_{i+h-hx}) with different hx values for the different variables. The maximum and minimum values of H(x_{i+h}|x_{i+h-hx}), together with its value when hx = 1, are shown in Table 6. From Table 6, it is clear that the difference between the maximum and minimum of H(x_{i+h}|x_{i+h-hx}) is obvious; besides, there is an obvious difference between the maximum H(x_{i+h}|x_{i+h-hx}) and its value when hx = 1. Taking T11 as an example, Figure 6 shows the trend of H(x_{i+h}|x_{i+h-hx}) for T11 when hx takes different values.

The causality between two variables can be measured using formula 4 and determined by comparing with the corresponding significance level: if T_{y→x} is larger than s_λ, y is the cause of x; otherwise, there is no causality from y to x. The causal matrix for u1 is shown in Table 7, where "1" denotes that the variable in the column is a cause of the variable in the row and "0" denotes that it is not. The causal network of subunit u1 based on the transfer entropy proposed in this paper is shown in Figure 7.
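A compact sketch (ours) of the per-pair procedure just described: hx is chosen by maximizing eq 7, h by maximizing eq 6, and the resulting transfer entropy is compared with its significance level to fill a 0/1 causal matrix such as Table 7. It assumes the cond_self_entropy, modified_te, and significance_level helpers from the section 3 sketch are in scope, and the search ranges are illustrative.

def select_horizons(effect, cause, hx_max=20, h_max=20, bins=8):
    # pick hx by maximizing eq 7 for the effect variable, then h by maximizing eq 6 (as for F5 and F6 above)
    hx = max(range(1, hx_max + 1), key=lambda v: cond_self_entropy(effect, v, bins))
    h = max(range(1, h_max + 1), key=lambda v: modified_te(effect, cause, v, hx, bins))
    return hx, h

def causal_matrix(data):
    # data: dict mapping variable names to 1-D arrays for one subunit
    # entry [effect][cause] = 1 when the transfer entropy exceeds its significance level (cf. Table 7)
    names = list(data)
    mat = {r: {c: 0 for c in names} for r in names}
    for cause in names:
        for effect in names:
            if cause == effect:
                continue
            hx, h = select_horizons(data[effect], data[cause])
            te = modified_te(data[effect], data[cause], h, hx)
            if te > significance_level(data[effect], data[cause], h, hx):
                mat[effect][cause] = 1
    return mat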


Figure 3. Flowchart of TE process.

Table 3. Twenty-two Continuous Process Measurements of the TE Process

variable   description               variable   description
F1         A feed                    L12        separator level
F2         D feed                    P13        separator pressure
F3         E feed                    F14        separator underflow
F4         A and C feed              L15        stripper level
F5         recycle flow              P16        stripper pressure
F6         reactor feed rate         F17        stripper underflow
P7         reactor pressure          T18        stripper temperature
L8         reactor level             F19        stripper steam flow
T9         reactor temperature       J20        compressor work
F10        purge rate                T21        reactor cooling water outlet T
T11        separator temperature     T22        condenser cooling water outlet T

Figure 4. Conditional information entropy of variable F5 to itself.

Table 4. Result of Unit Division

unit    variables
u1      F1, F2, F3, F5, F6, P7, L8, T9, T21
u2      T9, T22
u3      P7, L8, T9, T11, L12, P13, F14, T22
u4      F4, T11, L12, P13, F14, L15, P16, F17, T18, F19
u5      F5, F10, T11, L12, P13, C20

Table 5. Result of Hierarchy Division

unit    levels
u1      lev{F1, F2, F3, F5, F6} > lev{F6, P7, L8, T9, T21}
u2      lev{T9} > lev{T22}
u3      lev{P7, L8, T9} > lev{T11, L12, P13, F14, T22}
u4      lev{F4, T11, L12, P13, F14} > lev{F4, L15, P16, F17, T18, F19}
u5      lev{T11, L12, P13} > lev{F10} > lev{C20} > lev{F5}

Figure 5. Transfer entropy from F5 to F6.

Afterward, process knowledge is used to validate and rectify the causal matrix. For subunit u1, the inflow variables are {F1, F2, F3, F5, F6} and are regarded as root variables. The direction of influence should be from {F1, F2, F3, F5, F6} to {P7, L8, T9, T21}; otherwise, the causality is removed. For example, in Figure 7, the yellow arrow should be removed. Besides, the flow rates of feeds A, D, and E are independent. Combining this with the flowchart of the TE process, the causalities F1 → F6, F2 → F6, and F3 → F6 (the red arrows in Figure 8) should be added. Based on this process knowledge, the optimized causal matrix of subunit u1 is shown in Table 8, and the causal network of subunit u1 is shown in Figure 8. Accordingly, the causal networks of the other four subunits can be built.
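The correction step for u1 can be phrased as a small routine. The sketch below (ours) removes arrows that point from the downstream variables back into the root variables and adds the flowsheet causalities F1 → F6, F2 → F6, and F3 → F6; the variable groupings simply restate the rule in the text, and the authors' exact pruning, which also draws on the time-order levels of Table 5, may differ in detail.

ROOT_VARS = {"F1", "F2", "F3", "F5", "F6"}                      # inflow (root) variables of u1
DOWNSTREAM_VARS = {"P7", "L8", "T9", "T21"}                     # variables that should only appear as effects
FLOWSHEET_EDGES = [("F1", "F6"), ("F2", "F6"), ("F3", "F6")]    # the feeds enter the reactor feed F6

def apply_process_knowledge(mat):
    # mat[effect][cause] is the 0/1 matrix from the data-driven step (cf. Table 7)
    for effect in ROOT_VARS:                                    # remove arrows pointing back into root variables
        for cause in DOWNSTREAM_VARS:
            if effect in mat and cause in mat[effect]:
                mat[effect][cause] = 0
    for cause, effect in FLOWSHEET_EDGES:                       # add causalities known from the flowchart
        mat[effect][cause] = 1
    return mat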


Table 6. Conditional Information Entropy H(x_{i+h}|x_{i+h-hx}) of Different Variables

           F1       P7       F10      T11      P13      P16      T18      T21      T22
minimum    2.1228   2.237    2.1501   2.2849   2.3427   2.2341   2.2863   2.5943   2.3427
maximum    2.4673   2.4947   2.3587   2.7873   2.6501   2.5451   2.4496   2.6591   2.6509
hx = 1     2.1228   2.2375   2.151    2.285    2.3427   2.235    2.2905   2.6046   2.3427
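For reference, a short sketch (ours) of how one column of Table 6 could be produced, reusing the cond_self_entropy helper from the section 3 sketch; the hx search range and bin count are illustrative assumptions.

def self_entropy_summary(series, hx_range=range(1, 21), bins=8):
    # minimum, maximum, and hx = 1 value of H(x_{i+hx} | x_i) from eq 7, as reported per variable in Table 6
    values = {hx: cond_self_entropy(series, hx, bins) for hx in hx_range}
    return {"minimum": min(values.values()), "maximum": max(values.values()), "hx = 1": values[1]}

# e.g., self_entropy_summary(data["T11"]) for the separator temperature series (hypothetical data access)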

Figure 6. Conditional information entropy of variable T11 to itself.

Figure 8. Causal network of subunit u1 optimized by this process knowledge.

Table 7. Causal Matrix of u1 Recognized by the Modified Transfer Entropy Proposed in This Paper

variable   F1   F2   F3   F5   F6   P7   L8   T9   T21
F1         0    0    0    1    0    1    0    0    0
F2         0    0    0    1    1    0    1    1    0
F3         0    1    0    1    1    0    1    1    0
F5         0    0    0    0    0    0    0    0    0
F6         0    0    0    1    0    0    0    0    0
P7         0    0    0    0    0    0    0    0    0
L8         0    0    0    1    1    1    0    1    1
T9         0    0    0    1    0    0    0    0    0
T21        0    0    0    1    1    1    0    1    0

Table 8. Optimized Causal Matrix of u1 Based on This Process Knowledge

variable   F1   F2   F3   F5   F6   P7   L8   T9   T21
F1         0    0    0    1    0    0    0    0    0
F2         0    0    0    1    0    0    0    0    0
F3         0    1    0    1    0    0    0    0    0
F5         0    0    0    0    0    0    0    0    0
F6         1    1    1    1    0    0    0    0    0
P7         0    0    0    0    0    0    0    0    0
L8         0    0    0    1    1    1    0    1    1
T9         0    0    0    1    0    0    0    0    0
T21        0    0    0    1    1    1    0    1    0

Figure 7. Causal network of subunit u1 captured by the proposed modified transfer entropy.

Combining these subcausal networks, the whole causal model of the TE process (shown in Figure 9) can be established. There are some obvious causalities (e.g., F1 → F6, F2 → F6, F3 → F6, F5 → F6, F6 → L8, T9 → T21, F14 → L15, F17 → L15, and F19 → T18) that cannot be recognized by the transfer entropy modified by Shu and Zhao.21 The modified transfer entropy proposed in this paper is used to recognize the causalities between the above variables, and comparisons of the different transfer entropies, together with the results of the proposed transfer entropy integrated with superficial process knowledge, are listed in Table 9. In Table 9, a dash ("−") represents that no causality is found between the corresponding variables; "YES" means the causality between the corresponding variables is recognized correctly; "reverse" represents that although the causality between the corresponding variables is recognized, the direction of the causality is reversed. Compared with the modified transfer entropy proposed by Shu and Zhao,21 our proposed modified transfer entropy can find some causalities that are hard to capture with that algorithm. For instance, the modified transfer entropy proposed in ref 21 cannot accurately recognize the causalities F6 → L8 and F17 → L15; however, these two causalities can be accurately recognized using our modified transfer entropy. That is because, in the proposed algorithm, the prediction horizon is introduced to fully consider the information transfer of a variable with itself. Combining with simple process knowledge, the causalities F1 → F6, F2 → F6, and F3 → F6 are added, and the causal network becomes more accurate. Besides, the subsystem division based on process knowledge greatly decreases the computational complexity.

5. CONCLUSIONS

A novel causal network modeling method integrating process knowledge with modified transfer entropy is proposed in this paper. The information transfer between a variable and itself and between different variables is taken into consideration by fully optimizing the prediction horizons, which considerably increases the ability to discover causality. What is more, superficial process knowledge is utilized to divide the whole process into multiple subsystems, which facilitates the calculation of transfer entropy between variables and validates the results of the data-driven models. Case studies of a stochastic process and the TE process confirm the feasibility and effectiveness of the proposed scheme for causal network modeling.


Figure 9. Causal network of the TE process.

Table 9. Causality Comparisons between the Proposed Algorithm and the Traditional Transfer Entropy

causality    modified transfer entropy      modified transfer entropy    proposed transfer entropy integrated
             proposed by Shu and Zhao21     proposed by this paper       with superficial process knowledge
F1 → F6      −                              −                            YES
F2 → F6      −                              −                            YES
F3 → F6      −                              −                            YES
F5 → F6      −                              YES                          YES
F6 → L8      −                              YES                          YES
T9 → T21     −                              YES                          YES
F14 → L15    −                              YES                          YES
F17 → L15    −                              YES                          YES
F19 → T18    −                              reverse                      reverse

AUTHOR INFORMATION

Corresponding Author
*Tel.: +86-10-64426960. Fax: +86-10-64437805. E-mail: heyl@mail.buct.edu.cn.

ORCID
Yan-Lin He: 0000-0002-0037-6871

Notes
The authors declare no competing financial interest.

ACKNOWLEDGMENTS
This work was supported by the National Natural Science Foundation of China under grant nos. 61473026 and 61573051 and by the Fundamental Research Funds for the Central Universities under grant nos. JD1708 and ZY1704.

REFERENCES

(1) Qin, Y.; Zhao, C. H.; Gao, F. R. An Iterative Two-Step Sequential Phase Partition (ITSPP) Method for Batch Process Modeling and Online Monitoring. AIChE J. 2016, 62 (7), 2358−2373.
(2) Zhu, Q. X.; Gao, H. H.; Liu, F. F. Research progress of alarm systems in the process industries. Comput. Appl. Chem. 2014, 31 (2), 129−134.
(3) Bauer, M.; Thornhill, N. F. A practical method for identifying the propagation path of plant-wide disturbances. J. Process Control 2008, 18 (7), 707−719.
(4) Bressler, S. L.; Seth, A. K. Wiener-Granger causality: A well established methodology. NeuroImage 2011, 58 (2), 323−329.
(5) Yuan, T.; Qin, S. J. Root cause diagnosis of plant-wide oscillations using Granger causality. J. Process Control 2014, 24 (2), 450−459.
(6) Friston, K.; Moran, R.; Seth, A. K. Analysing connectivity with Granger causality and dynamic causal modeling. Curr. Opin. Neurobiol. 2013, 23 (2), 172−178.


(7) Dey, S.; Stori, J. A. A Bayesian network approach to root cause diagnosis of process variations. Int. J. Mach. Tools Manuf. 2005, 45 (1), 75−91.
(8) Alaeddini, A.; Dogan, I. Using Bayesian networks for root cause analysis in statistical process control. Expert Syst. Appl. 2011, 38 (9), 11230−11243.
(9) Wee, Y. Y.; Cheah, W. P.; Tan, S. C.; Wee, K. K. A method for root cause analysis with a Bayesian belief network and fuzzy cognitive map. Expert Syst. Appl. 2015, 42 (1), 468−487.
(10) He, S.; Wang, Z.; Wang, Z.; Gu, X.; Yan, Z. Fault detection and diagnosis of chiller using Bayesian network classifier with probabilistic boundary. Appl. Therm. Eng. 2016, 107, 37−47.
(11) Zhao, Y.; Wen, J.; Wang, S. Diagnostic Bayesian networks for diagnosing air handling units faults. Part II: Faults in coils and sensors. Appl. Therm. Eng. 2015, 90, 145−157.
(12) Zhao, Y.; Wen, J.; Xiao, F.; Yang, X.; Wang, S. Diagnostic Bayesian networks for diagnosing air handling units faults. Part I: Faults in dampers, fans, filters and sensors. Appl. Therm. Eng. 2017, 111, 1272−1286.
(13) Gao, H. H.; Xu, Y.; Zhu, Q. X. Spatial Interpretive Structural Model Identification and AHP-Based Multimodule Fusion for Alarm Root-Cause Diagnosis in Chemical Processes. Ind. Eng. Chem. Res. 2016, 55 (12), 3641−3658.
(14) Schreiber, T. Measuring Information Transfer. Phys. Rev. Lett. 2000, 85 (2), 461−464.
(15) Choi, H. Localization and regularization of normalized transfer entropy. Neurocomputing 2014, 139, 408−414.
(16) Sensoy, A.; Sobaci, C.; Sensoy, S.; Alali, F. Effective transfer entropy approach to information flow between exchange rates and stock markets. Chaos, Solitons Fractals 2014, 68, 180−185.
(17) Bauer, M.; Cox, J. W.; Caveness, M. H.; Downs, J. J.; Thornhill, N. F. Finding the direction of disturbance propagation in a chemical process using transfer entropy. IEEE Trans. Control Syst. Technol. 2007, 15 (1), 12−21.
(18) Staniek, M.; Lehnertz, K. Symbolic transfer entropy. Phys. Rev. Lett. 2008, 100 (15), 158101.
(19) Duan, P.; Yang, F.; Chen, T.; Shah, S. L. Direct Causality Detection via the Transfer Entropy Approach. IEEE Trans. Control Syst. Technol. 2013, 21 (6), 2052−2066.
(20) Yu, W.; Yang, F. Detection of Causality between Process Variables Based on Industrial Alarm Data Using Transfer Entropy. Entropy 2015, 17 (8), 5868−5887.
(21) Shu, Y.; Zhao, J. Data-driven causal inference based on a modified transfer entropy. Comput. Chem. Eng. 2013, 57, 173−180.
(22) Shannon, C. E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379−423.
(23) Kaiser, A.; Schreiber, T. Information transfer in continuous processes. Phys. D 2002, 166, 43−62.
(24) Jaynes, E. T. Information Theory and Statistical Mechanics. Phys. Rev. 1957, 106 (4), 620−630.
(25) Jaynes, E. T. Information Theory and Statistical Mechanics. II. Phys. Rev. 1957, 108, 171−190.
(26) Otwinowski, H. Maximum Entropy Method in Comminution Modelling. Granular Matter 2006, 8 (3−4), 239−249.
(27) Overbey, L. A.; Todd, M. D. Dynamic system change detection using a modification of the transfer entropy. J. Sound Vib. 2009, 322 (1−2), 438−453.
(28) Overbey, L. A.; Todd, M. D. Effects of noise on transfer entropy estimation for damage detection. Mech. Syst. Signal Process. 2009, 23 (7), 2178−2191.
(29) Nichols, J. M.; Seaver, M.; Trickey, S. T.; Todd, M. D.; Olson, C.; Overbey, L. Detecting nonlinearity in structural systems using the transfer entropy. Phys. Rev. E 2005, 72, 046217.
(30) Duan, P.; Yang, F.; Shah, S. L.; Chen, T. Transfer Zero-Entropy and Its Application for Capturing Cause and Effect Relationship Between Variables. IEEE Trans. Control Syst. Technol. 2015, 23 (3), 855−867.
(31) Yin, X.; Arulmaran, K.; Liu, J.; et al. Subsystem decomposition and configuration for distributed state estimation. AIChE J. 2016, 62 (6), 1995−2003.
(32) Yin, X.; Liu, J. Input−output pairing accounting for both structure and strength in coupling. AIChE J. 2017, 63 (4), 1226−1235.
(33) Downs, J. J.; Vogel, E. F. A plant-wide industrial process control problem. Comput. Chem. Eng. 1993, 17 (3), 245−255.
