AN ADADELTA-ENHANCED ARTIFICIAL NEURAL NETWORK BASED FAULT DETECTION AND DIAGNOSIS IN A SERIES OF DISTILLATION COLUMNS Art Philippe Bucaneg Jerome Pablo

WHAT IS A FAULT? A fault is any "unpermitted deviation of at least one characteristic property of the system" (Chiang, Russell, and Braatz, 2001). Faults fall into four broad classes:
• Process parameter changes: catalyst poisoning, heat exchanger fouling
• Disturbance parameter changes: extreme changes in parameters
• Actuator problems: sticking valve
• Sensor problems: bias, complete failure, drifting, precision degradation

Production of Acrylic Acid from Propylene (Suo et al., 2015)

[Flowchart: process monitoring loop. Fault detection (No: continue monitoring; Yes: proceed) → fault identification → fault diagnosis → fault recovery (Chiang, Russell & Braatz, 2001)]


LEVELS OF FAULTS
• SINGLE FAULT, SINGLE EQUIPMENT • A single root fault that originates from one piece of equipment
• SIMULTANEOUS MULTIPLE FAULTS, SINGLE EQUIPMENT • More than one root fault occurring at the same time in one piece of equipment
• SIMULTANEOUS MULTIPLE FAULTS, MULTIPLE EQUIPMENT • More than one root fault occurring at the same time across more than one piece of equipment

LNG System

SOLUTION TO FAULT DETECTION AND DIAGNOSIS: BIG DATA

[Figure: decision tree for choosing a statistical or machine learning method]
• Supervised learning
• Classification
• Binary: logistic regression, SVM, ANN, decision tree
• Ordinal: ordered logit model, ANN, decision tree
• Nominal (multiclass): multinomial logit model, hierarchical logit model, ANN, decision tree, SVM, k-nearest neighbors
• Regression
• Continuous: multiple regression, ANN
• Discrete: ANN, decision tree
• Unsupervised learning
• Clustering
• Density: density-based clustering, gdbscan
• Known number of clusters: k-means
• Unknown structure: hierarchical clustering, self-organizing map
• Dimensionality reduction: principal component analysis, multi-dimensional scaling, self-organizing map

Simplified guide to statistical and machine learning choices (Beck et al., 2016)

BIG DATA INFORMATICS


COMPARISON OF VARIOUS DIAGNOSTIC METHODS
[Table: observer, digraphs, abstraction hierarchy, expert systems, QTA, PCA, and neural networks rated on ten criteria: quick detection and diagnosis, isolability, robustness, novelty identifiability, classification error, adaptability, explanation facility, modelling requirement, storage and computation, and multiple fault identifiability]


ARTIFICIAL NEURAL NETWORK Example of ANN accuracy


SIGNIFICANCE OF THE STUDY Artificial neural networks… • Do not require the use of design equations • Represent the system as a more compact model

We chose a series of distillation columns for our study because… • Distillation is the premier separation method in the chemical and petroleum industries, and distillation control has been studied extensively for many decades (Luyben, 2013). • Most of these studies have looked at individual columns in isolation (Luyben, 2013).

OBJECTIVES OF THE STUDY • To improve the existing mode of FDD through the use of an ANN • To utilize a training algorithm that gives the neural network optimal detection time • To simulate data for the series distillation column system • To provide insight into the potential of a neural network trained only on single faults for detecting multiple faults

SCOPE AND LIMITATION • For simulation, Aspen HYSYS will be used. All single faults and only some multiple faults will be simulated. • The ANN is meant to quicken fault detection and diagnosis in the said setup, not to serve as a predictive model of when faults might occur. • Fault recovery is not part of the study.

METHODOLOGY 1. Learn ANN algorithm from various sources • Video tutorials • Journals • Online lectures/Slides
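The core of step 1 can be sketched in a few lines. This is a minimal one-hidden-layer network trained by backpropagation with sigmoid activations; Python stands in for the study's MATLAB code, and the toy data, layer sizes, and learning rate are illustrative, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 samples, 3 features, 2 output classes (one-hot targets).
X = rng.random((4, 3))
Y = np.array([[1, 0], [0, 1], [1, 0], [0, 1]], dtype=float)

W1 = rng.normal(0, 0.5, (3, 5))   # input -> hidden weights
W2 = rng.normal(0, 0.5, (5, 2))   # hidden -> output weights
lr = 0.5

for epoch in range(3000):
    # Forward pass
    H = sigmoid(X @ W1)
    O = sigmoid(H @ W2)
    # Backward pass: chain rule through both sigmoid layers
    dO = (O - Y) * O * (1 - O)
    dH = (dO @ W2.T) * H * (1 - H)
    # Gradient-descent weight updates
    W2 -= lr * H.T @ dO
    W1 -= lr * X.T @ dH

print(np.mean((O - Y) ** 2))  # training error shrinks toward zero
```

The same forward/backward structure generalizes to more layers and to the softmax output used for multiclass fault labels.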

2. Study various optimization techniques for convergence • Gradient Descent Method • Batch • Stochastic • Mini-Batch
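The three gradient descent modes above differ only in how many samples feed each update. An illustrative contrast on a toy least-squares problem (the data, model, and learning rate are hypothetical, not from the study): batch size N gives batch gradient descent, 1 gives stochastic, and anything in between gives mini-batch.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200
X = rng.normal(size=(N, 2))
true_w = np.array([2.0, -3.0])
y = X @ true_w + 0.01 * rng.normal(size=N)

def train(batch_size, lr=0.1, epochs=50):
    """Least-squares fit by gradient descent with the given batch size."""
    w = np.zeros(2)
    for _ in range(epochs):
        idx = rng.permutation(N)          # reshuffle each epoch
        for start in range(0, N, batch_size):
            b = idx[start:start + batch_size]
            # Mean-squared-error gradient over the current batch
            grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w

print(train(N))    # batch gradient descent
print(train(1))    # stochastic gradient descent
print(train(32))   # mini-batch gradient descent
```

All three recover weights near (2, -3); mini-batch trades the smoothness of batch updates for the cheaper, noisier steps of SGD.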

• Optimization Techniques
• Momentum
• Nesterov Momentum
• RMSprop
• AdaGrad
• ADADELTA
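The ADADELTA update that names this study (Zeiler, 2012) keeps decaying averages of squared gradients and squared updates and scales each step by their ratio, so no global learning rate is tuned. A sketch on a toy quadratic follows; only the update rule comes from the paper, while the objective and hyperparameters are illustrative.

```python
import numpy as np

def adadelta_minimize(grad_fn, x0, rho=0.95, eps=1e-6, steps=10000):
    """Minimize via the ADADELTA update rule (Zeiler, 2012)."""
    x = np.asarray(x0, dtype=float)
    Eg2 = np.zeros_like(x)   # running average of squared gradients
    Edx2 = np.zeros_like(x)  # running average of squared updates
    for _ in range(steps):
        g = grad_fn(x)
        Eg2 = rho * Eg2 + (1 - rho) * g ** 2
        # Step scaled by RMS of past updates over RMS of past gradients
        dx = -np.sqrt(Edx2 + eps) / np.sqrt(Eg2 + eps) * g
        Edx2 = rho * Edx2 + (1 - rho) * dx ** 2
        x = x + dx
    return x

# Toy objective f(x) = (x0 - 1)^2 + 4*(x1 + 2)^2, minimum at (1, -2).
grad = lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] + 2)])
x_opt = adadelta_minimize(grad, [5.0, 5.0])
print(x_opt)
```

In the ANN, the same two accumulators are kept per weight matrix and `g` is the backpropagated gradient.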

3. Test code with given training sets • MNIST (handwritten digit recognition) • Wine classification • Iris flower classification • Tic-tac-toe simulation

METHODOLOGY 4. Simulation of faults using Aspen HYSYS • Values • Test at nominal value • 2.5-15% deviation, at 2.5% intervals, for single errors • 5% deviation for multiple errors

• For multiple columns in series, single and multiple errors

5. Train ANN with data from simulation • Using 5%, 10%, and 15% • Only single error

6. Test ANN for single error • Using 2.5%, 7.5%, and 12.5%

7. Test ANN for multiple errors

METHODOLOGY • Test for accuracy • Error value during iteration

• % accuracy = (# of correct classifications / # of data points) × 100
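The accuracy metric above, as a minimal sketch with made-up predictions and labels:

```python
# Hypothetical class predictions vs. true labels for 8 data points.
predicted = [0, 2, 1, 1, 0, 2, 2, 1]
actual    = [0, 2, 1, 0, 0, 2, 1, 1]

# Count matches, then express as a percentage of all data points.
correct = sum(p == a for p, a in zip(predicted, actual))
accuracy = 100.0 * correct / len(actual)
print(accuracy)  # 75.0 (6 of 8 correct)
```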

• Further optimization through ANN architecture • Number of hidden layers • Number of nodes per hidden layer • Minimize over- and underfitting

MILESTONES
1. Learn ANN
2. Generate general ANN MATLAB code
3. Implement several convergence optimization algorithms
4. Test on several data sets
5. Start simulation for the multiple distillation column system
6. Test on simulated faults
7. Optimize the code

GANTT CHART [July-December 2016: literature review, learn ANN, code writing, learn and apply optimization techniques, test code]

GANTT CHART [January-May 2017: simulation of data points, testing of data points, further optimization, literature review, paper writing]

IRIS PLANT CLASSIFICATION Error Value vs Epoch Graph and Accuracy per Epoch Graph

TIC TAC TOE Error Value vs Epoch Graph and Accuracy per Epoch Graph

WINE CLASSIFICATION Error Value vs Epoch Graph and Accuracy per Epoch Graph

SINGLE COLUMN (SHARMA, 2003) Error Value vs Epoch Graph and Accuracy per Epoch Graph

MNIST RESULTS

MNIST RESULTS

REFERENCES
1. Battiti, R. (1989). Accelerated Backpropagation Learning: Two Optimization Methods. Complex Systems Publications, Inc.
2. Behbahani, R. M., Jazayeri-Rad, H., and Hajimirzaee, S. (2009). Fault Detection and Diagnosis in a Sour Gas Absorption Column Using Neural Networks. Chemical Engineering and Technology.
3. Bottou, L. (2010). Large-Scale Machine Learning with Stochastic Gradient Descent.
4. Bottou, L. (2012). Stochastic Gradient Descent Tricks.
5. Darken, C., and Moody, J. (1991). Towards Faster Stochastic Gradient Search.
6. Eljack, F., and Kazi, M. (2016). Process safety and abnormal situation management. Current Opinion in Chemical Engineering.
7. Reising, D., Colgate, B., and Bullemer, P. (2013). SPE 166584: Abnormal Situation Management and Its Relevance to Process Safety for Offshore Operations.
8. Shu, Y., Ming, L., Cheng, F., Zhang, Z., and Zhao, J. (2015). Abnormal situation management: Challenges and opportunities in the big data era. Computers and Chemical Engineering.
9. Venkatasubramanian, V. (2003). Abnormal events management in complex process plants: Challenges and opportunities in intelligent supervisory control.
10. Ferreira, L., and Trierweiler, J. (2009). Modeling and simulation of the polymeric nanocapsule formation process. IFAC Proceedings Volumes (IFAC-PapersOnline).
11. Mohd Ali, J., Ha Hoang, N., Hussain, M., and Dochain, D. (2015). Review and classification of recent observers applied in chemical process systems. Computers and Chemical Engineering.
12. Mohd Ali, J., Hussain, M., Tade, M., and Zhang, J. (2015). Artificial intelligence techniques applied as estimator in chemical process systems: A literature survey. Expert Systems with Applications.
13. Himmelblau, D. (2000). Applications of artificial neural networks in chemical engineering. Korean Journal of Chemical Engineering.
14. Luyben, W. L. (2013). Control of a Train of Distillation Columns for the Separation of NGL. Industrial & Engineering Chemistry Research.
15. Manssouri, I., Chetouani, Y., and El Kihel, B. (2008). Using neural networks for fault detection in a distillation column. International Journal of Computer Applications in Technology.
16. Tonnang, H., and Olatunbosun, A. (2010). Neural Network Controller for a Crude Oil Distillation Column. Journal of Engineering and Applied Sciences.
17. Venkatasubramanian, V., Vaidyanathan, R., and Yamamoto, Y. (1990). Process fault detection and diagnosis using neural networks I: Steady-state processes. Computers & Chemical Engineering. Venkatasubramanian, V., Rengaswamy, R., Kavuri, S., and Yin, K. (2003). A review of process fault detection and diagnosis Part III: Process history based methods. Computers and Chemical Engineering.
18. Venkatasubramanian, V., Rengaswamy, R., Yin, K., and Kavuri, S. (2003). A review of process fault detection and diagnosis Part I: Quantitative model-based methods. Computers and Chemical Engineering.
19. Zhao, M., Adib, F., and Katabi, D. (2016). Emotion Recognition Using Wireless Signals.
20. Process Engineering Guide: Troubleshooting in Distillation Columns. GBH Enterprises, Ltd.

REFERENCES
21. de Canete, J., del Saz-Orozco, P., Gonzalez, S., and Garcia-Moral, I. (2012). Dual composition control and soft estimation for a pilot distillation column using a neurogenetic design. Computers and Chemical Engineering.
22. Frattini Fileti, A., Cruz, S., and Pereira, J. (2000). Control strategies analysis for a batch distillation column with experimental testing. Chemical Engineering and Processing: Process Intensification.
23. Gonzalez, J., Aguilar, R., Alvarez-Ramirez, J., Fernandez, G., and Barron, M. (1999). Linearizing control of a binary distillation column based on a neuro-estimator. Artificial Intelligence in Engineering.
24. Sharma, R., Singh, K., Singhal, D., and Ghosh, R. (2004). Neural network applications for detecting process faults in packed towers. Chemical Engineering and Processing: Process Intensification.
25. Watanabe, K., Hirota, S., Hou, L., and Himmelblau, D. (1994). Diagnosis of Multiple Simultaneous Fault via Hierarchical Artificial Neural Networks. AIChE Journal.
26. Zamprogna, E., Barolo, M., and Seborg, D. (2001). Composition Estimations in a Middle-Vessel Batch Distillation Column Using Artificial Neural Networks. Chemical Engineering Research and Design.
27. Jacobs, R. A. (1988). Increased Rates of Convergence Through Learning Rate Adaptation. Neural Networks, Vol. 1.
28. LeCun, Y., Bottou, L., Orr, G. B., and Muller, K. (1998). Efficient BackProp. Neural Networks: Tricks of the Trade.
29. Smith, L. N. (2015). No More Pesky Learning Rates.
30. Zeiler, M. D. (2012). ADADELTA: An Adaptive Learning Rate Method.

REFERENCES
31. Chiang, L. H., Russell, E. L., and Braatz, R. D. (2001). Fault Detection and Diagnosis in Industrial Systems. London: Springer.
32. Beck, D., Carothers, J., Venkatasubramanian, V., and Pfaendtner, J. (2016). Data Science: Accelerating Innovation and Discovery in Chemical Engineering. AIChE Journal.