Reduce Training Error of Extreme Learning Machine by Selecting Appropriate Hidden Layer Output Matrix

Yang Lv , Bang Li , Jinghu Yu , Yiming Ding

Journal of Systems Science and Systems Engineering, 2021, Vol. 30, Issue 5: 552-571. DOI: 10.1007/s11518-021-5502-8

Abstract

Extreme learning machine (ELM) is a feedforward neural network with a single layer of hidden nodes, where the weights and biases connecting the inputs to the hidden nodes are randomly assigned. The output weights between the hidden nodes and the outputs are learned by a linear model. It is natural to ask whether the training error of ELM is significantly affected by the hidden layer output matrix H, because a positive answer would enable us to obtain a smaller training error from a better H. For a single hidden layer feedforward neural network (SLFN) with one input neuron, there is a significant difference between the training errors of different Hs. We find a reliable, strong negative rank correlation between the training errors and some singular values of the Moore-Penrose generalized inverse of H. Based on this rank correlation, a selection algorithm is proposed to choose a robust, appropriate H among numerous candidate Hs so as to achieve a smaller training error. Extensive experiments, including tests on real data sets, are carried out to validate the selection algorithm. The results show that it achieves good performance in validity, speed, and robustness.
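The idea described in the abstract can be sketched in a few lines of NumPy: train a basic ELM via the Moore-Penrose pseudoinverse of H, then select among many random Hs using a singular value of pinv(H) as a score. This is only an illustrative sketch, not the authors' algorithm: the sigmoid activation, the choice of the largest singular value as the score, and the `select_H` scoring rule (larger score preferred, reflecting the reported negative correlation between errors and singular values) are all assumptions made here for concreteness.

```python
import numpy as np

def elm_train(x, t, n_hidden, rng):
    """Train a basic ELM (an SLFN with one input neuron).

    Input weights w and biases b are drawn at random; only the output
    weights beta are learned, via the Moore-Penrose generalized inverse
    of the hidden layer output matrix H.
    """
    w = rng.standard_normal(n_hidden)                 # input-to-hidden weights
    b = rng.standard_normal(n_hidden)                 # hidden biases
    H = 1.0 / (1.0 + np.exp(-(np.outer(x, w) + b)))   # sigmoid activations, shape (N, n_hidden)
    H_pinv = np.linalg.pinv(H)                        # Moore-Penrose generalized inverse of H
    beta = H_pinv @ t                                 # least-squares output weights
    err = np.linalg.norm(H @ beta - t)                # training error (residual norm)
    return H_pinv, beta, err

def select_H(x, t, n_hidden, n_candidates=20, seed=0):
    """Hypothetical selection step: among many random Hs, keep the one whose
    pinv(H) has the largest leading singular value.  Since training errors
    correlate negatively with these singular values, a larger score should
    tend to give a smaller training error.  The exact singular-value index
    and decision rule used by the authors may differ."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_candidates):
        H_pinv, beta, err = elm_train(x, t, n_hidden, rng)
        score = np.linalg.svd(H_pinv, compute_uv=False)[0]  # largest singular value
        if best is None or score > best[0]:
            best = (score, beta, err)
    return best
```

Selecting by a cheap proxy (a singular value) rather than by retraining and comparing errors directly is what makes this kind of procedure fast: the score is read off pinv(H), which the ELM training step computes anyway.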

Keywords

ELM / SLFNs / training error / Moore-Penrose generalized inverse / selection algorithm

Cite this article

Yang Lv, Bang Li, Jinghu Yu, Yiming Ding. Reduce Training Error of Extreme Learning Machine by Selecting Appropriate Hidden Layer Output Matrix. Journal of Systems Science and Systems Engineering, 2021, 30(5): 552-571. DOI: 10.1007/s11518-021-5502-8


