Nonlinear industrial process fault diagnosis with latent label consistency and sparse Gaussian feature learning
Xian-ling Li, Jian-feng Zhang, Chun-hui Zhao, Jin-liang Ding, You-xian Sun
Journal of Central South University, 2023, Vol. 29, Issue 12: 3956-3973.
With the increasing complexity of industrial processes, high-dimensional industrial data exhibit strong nonlinearity, posing considerable challenges to industrial process fault diagnosis. To efficiently extract deep, meaningful features that are crucial for fault diagnosis, a sparse Gaussian feature extractor (SGFE) is designed to learn a nonlinear mapping that projects the raw data into a feature space whose dimension equals the number of fault classes. The feature space is described by the one-hot encodings of the fault category labels, which form an orthogonal basis. In this way, deep sparse Gaussian features related to fault categories can be gradually learned from the raw data by the SGFE. In the feature space, a sparse Gaussian (SG) loss function is designed to constrain the feature distribution to multiple sparse multivariate Gaussian distributions. The resulting sparse Gaussian features are linearly separable in the feature space, which improves the accuracy of the downstream fault classification task. The feasibility and practical utility of the proposed SGFE are verified on the MNIST handwritten-digits benchmark and the Tennessee-Eastman (TE) benchmark process, respectively.
nonlinear fault diagnosis / multiple multivariate Gaussian distributions / sparse Gaussian feature learning / Gaussian feature extractor
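To make the idea concrete, the following is a minimal sketch (not the authors' exact formulation) of an SG-style objective: each sample's feature vector is pulled toward the one-hot axis of its fault class, which acts as the Gaussian center in the label-dimension feature space, while an L1 penalty encourages sparsity. The function name `sparse_gaussian_loss` and the weight `lam` are illustrative assumptions.

```python
import numpy as np

def sparse_gaussian_loss(features, labels, n_classes, lam=0.1):
    """Toy SG-style loss on an (n_samples, n_classes) feature matrix.

    Center term: squared distance from each feature vector to the
    one-hot encoding of its fault class (the orthogonal basis axis).
    Sparsity term: L1 norm of the features, weighted by `lam`.
    """
    targets = np.eye(n_classes)[labels]  # one-hot class centers
    center_term = np.mean(np.sum((features - targets) ** 2, axis=1))
    sparsity_term = lam * np.mean(np.sum(np.abs(features), axis=1))
    return center_term + sparsity_term

# Example: 4 random feature vectors for a 3-class fault problem.
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 3))
labels = np.array([0, 1, 2, 1])
print(sparse_gaussian_loss(feats, labels, n_classes=3))
```

When the features sit exactly on their one-hot centers, the center term vanishes and only the sparsity penalty remains, which is why minimizing this objective drives features toward sparse, class-aligned, linearly separable clusters.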