Generative adversarial network based novelty detection using minimized reconstruction error

Huan-gang WANG , Xin LI , Tao ZHANG

Front. Inform. Technol. Electron. Eng ›› 2018, Vol. 19 ›› Issue (1) : 116-125. DOI: 10.1631/FITEE.1700786


Abstract

The generative adversarial network (GAN) is one of the most exciting machine learning breakthroughs of recent years; it trains the learning model by finding the Nash equilibrium of a two-player zero-sum game. A GAN is composed of a generator and a discriminator, both trained with the adversarial learning mechanism. In this paper, we introduce and investigate the use of GAN for novelty detection. In training, the GAN learns from ordinary data. Then, on previously unseen data, both the generator and the discriminator, equipped with the designed decision boundaries, can be used to separate novel patterns from ordinary ones. The proposed GAN-based novelty detection method demonstrates competitive performance on the MNIST digit database and the Tennessee Eastman (TE) benchmark process compared with PCA-based novelty detection methods using Hotelling's T2 and squared prediction error statistics.
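The detection-time idea described above — score a test sample by how well the trained generator can reconstruct it, and flag samples whose minimized reconstruction error exceeds a decision boundary — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the "generator" here is a fixed linear map `W` standing in for a trained neural network, the latent search uses plain gradient descent, and the threshold is hand-picked.

```python
import numpy as np

# Toy linear stand-in for a trained GAN generator: G(z) = W z.
# (The paper's generator is a neural network; this sketch only
# illustrates the minimized-reconstruction-error scoring idea.)
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])  # maps 2-D latent codes to a 4-D "data" space

def generate(z):
    return W @ z

def min_reconstruction_error(x, steps=200, lr=0.1):
    """Gradient-search the latent space for the code z that best
    reconstructs x, and return the minimized error ||G(z) - x||."""
    z = np.zeros(2)
    for _ in range(steps):
        residual = generate(z) - x   # G(z) - x
        z -= lr * (W.T @ residual)   # gradient of 0.5 * ||G(z) - x||^2
    return np.linalg.norm(generate(z) - x)

# An ordinary sample lies on the generator's manifold (range of W)...
x_normal = generate(np.array([1.0, -0.5]))
# ...while a novel sample has a component the generator cannot reproduce.
off_manifold = np.array([0.0, 0.0, 0.0, 1.0])  # orthogonal to range(W)
x_novel = x_normal + 5.0 * off_manifold

threshold = 1.0  # decision boundary (hand-picked for this toy example)
e_normal = min_reconstruction_error(x_normal)
e_novel = min_reconstruction_error(x_novel)
print(e_normal < threshold, e_novel < threshold)  # True (ordinary), False (novel)
```

Because the ordinary sample lies exactly in the range of `W`, its minimized reconstruction error vanishes, while the novel sample retains an irreducible error equal to its off-manifold component; thresholding that error is the generator-side decision boundary.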

Keywords

Generative adversarial network (GAN) / Novelty detection / Tennessee Eastman (TE) process

Cite this article

Huan-gang WANG, Xin LI, Tao ZHANG. Generative adversarial network based novelty detection using minimized reconstruction error. Front. Inform. Technol. Electron. Eng, 2018, 19(1): 116-125. DOI: 10.1631/FITEE.1700786



RIGHTS & PERMISSIONS

Zhejiang University and Springer-Verlag GmbH Germany, part of Springer Nature


Supplementary files

FITEE-0116-18011-HGW_suppl_1

FITEE-0116-18011-HGW_suppl_2
