Finite-sensor fault-diagnosis simulation study of gas turbine engine using information entropy and deep belief networks

De-long FENG, Ming-qing XIAO, Ying-xi LIU, Hai-fang SONG, Zhao YANG, Ze-wen HU

Front. Inform. Technol. Electron. Eng., 2016, 17(12): 1287-1304. DOI: 10.1631/FITEE.1601365

Abstract

Precise fault diagnosis is an important part of prognostics and health management. It helps avoid accidents, extends the service life of the machine, and reduces maintenance costs. For gas turbine engine fault diagnosis, only a limited number of sensors can be installed, because the operating environment of the engine is harsh and sensors cannot work at high temperature, high rotation speed, or high pressure. As a result, the sensory data available from a working engine are insufficient for diagnosing potential failures with existing approaches. In this paper, we consider the problem of engine fault diagnosis using finite sensory data under complicated circumstances, and propose deep belief networks based on information entropy (IE-DBNs) for engine fault diagnosis. We first introduce several information entropies and propose a joint complexity entropy built on single-signal entropy. Second, deep belief networks (DBNs) are analyzed and a logistic regression layer is added to the output of the DBNs. Then, information entropy is used for fault diagnosis and serves as the input to the DBNs. Comparison between the proposed IE-DBNs method and state-of-the-art machine learning approaches shows that the IE-DBNs method achieves higher accuracy.
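The abstract describes a pipeline of entropy-based feature extraction followed by a DBN with a logistic regression output layer. The sketch below is a minimal illustration of that idea, not the authors' implementation: it computes histogram-based Shannon entropy on synthetic signals (all names, sizes, and data here are hypothetical stand-ins for real engine sensor windows) and uses scikit-learn's BernoulliRBM plus LogisticRegression as a shallow proxy for the paper's multi-layer DBN.

import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

def shannon_entropy(signal, bins=32):
    """Histogram-based Shannon entropy (bits) of one sensor signal."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_features(signals):
    """Map an (n_sensors, n_points) signal window to a vector of per-sensor entropies."""
    return np.array([shannon_entropy(s) for s in signals])

# Synthetic noise standing in for engine data: 200 windows, 4 sensors, 3 fault classes.
# It only exercises the pipeline; real sensor data are needed for meaningful accuracy.
rng = np.random.default_rng(0)
n_windows, n_sensors, n_points, n_classes = 200, 4, 1024, 3
y = rng.integers(0, n_classes, size=n_windows)
X = np.array([
    entropy_features(rng.normal(0.0, 1.0 + 0.5 * y[i], size=(n_sensors, n_points)))
    for i in range(n_windows)
])

# Entropy features -> one pretrained RBM layer -> logistic regression output layer.
model = Pipeline([
    ("scale", MinMaxScaler()),  # BernoulliRBM expects inputs in [0, 1]
    ("rbm", BernoulliRBM(n_components=16, learning_rate=0.05,
                         n_iter=30, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X, y)
print("training accuracy:", model.score(X, y))

Stacking several BernoulliRBM steps before the logistic regression would more closely mimic the greedy layer-wise pretraining of a full DBN; a single RBM is used here only to keep the sketch short.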

Keywords

Deep belief networks (DBNs) / Fault diagnosis / Information entropy / Engine

Cite this article

De-long FENG, Ming-qing XIAO, Ying-xi LIU, Hai-fang SONG, Zhao YANG, Ze-wen HU. Finite-sensor fault-diagnosis simulation study of gas turbine engine using information entropy and deep belief networks. Front. Inform. Technol. Electron. Eng, 2016, 17(12): 1287‒1304 https://doi.org/10.1631/FITEE.1601365


RIGHTS & PERMISSIONS

© 2016 Zhejiang University and Springer-Verlag Berlin Heidelberg