Support vector machine incremental learning triggered by wrongly predicted samples

Ting-long Tang, Qiu Guan, Yi-rong Wu

Optoelectronics Letters, Vol. 14, Issue 3: 232–235. DOI: 10.1007/s11801-018-7254-3

Abstract

According to the classic Karush-Kuhn-Tucker (KKT) theorem, at every step of incremental support vector machine (SVM) learning, a newly added sample that violates the KKT conditions becomes a new support vector (SV) and may cause old samples to migrate between the SV set and the non-support vector (NSV) set; the learning model must then be updated based on the SVs. However, it is not clear in advance which of the old samples will migrate between the SV and NSV sets. Moreover, updating the learning model at every step is often unnecessary: it does little to improve accuracy while slowing down training. Therefore, how new SVs are chosen from the old sets during the incremental stages, and when the incremental steps are processed, greatly influence the accuracy and efficiency of incremental SVM learning. In this work, a new algorithm is proposed that selects candidate SVs and uses wrongly predicted samples to trigger the incremental processing. Experimental results show that the proposed algorithm achieves good performance with high efficiency, high speed and good accuracy.
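The idea described in the abstract can be illustrated with a minimal Python sketch. This is not the authors' exact algorithm: it emulates the incremental step by retraining scikit-learn's batch `SVC` on the retained support vectors plus the new sample, and the dataset, kernel, and SV-retention policy are illustrative assumptions. It does, however, show the two key points of the abstract: only a wrongly predicted (hence KKT-violating) sample triggers an update, and the old NSVs are discarded so the working set stays small.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Toy data split into an initial batch and a simulated stream.
X, y = make_classification(n_samples=400, n_features=4, random_state=0)
X_train, y_train = X[:100].copy(), y[:100].copy()
X_stream, y_stream = X[100:], y[100:]

model = SVC(kernel="linear", C=1.0).fit(X_train, y_train)
updates = 0

for xi, yi in zip(X_stream, y_stream):
    xi = xi.reshape(1, -1)
    # A wrongly predicted sample necessarily violates the KKT
    # conditions of the current model, so it triggers an update;
    # correctly predicted samples are skipped entirely.
    if model.predict(xi)[0] != yi:
        # Keep only the current support vectors plus the new sample
        # as the training set for the next incremental step.
        sv = model.support_                      # indices of current SVs
        X_train = np.vstack([X_train[sv], xi])
        y_train = np.append(y_train[sv], yi)
        model = SVC(kernel="linear", C=1.0).fit(X_train, y_train)
        updates += 1
```

Because most streamed samples are predicted correctly and skipped, the number of retraining steps stays far below the stream length, which is the efficiency gain the abstract claims over updating the model for every KKT-violating sample.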



This work has been supported by the National Natural Science Foundation of China (Nos. U1509207 and 61325019).
