Forecasting foreign exchange rates with an improved back-propagation learning algorithm with adaptive smoothing momentum terms

Lean YU, Shouyang WANG, Kin Keung LAI

Front. Comput. Sci., 2009, Vol. 3, Issue 2: 167-176. DOI: 10.1007/s11704-009-0020-8
RESEARCH ARTICLE

Abstract

The slow convergence of the back-propagation neural network (BPNN) has become a challenge in data-mining and knowledge-discovery applications, owing to the drawbacks of the gradient descent (GD) optimization method widely adopted in BPNN learning. To address this problem, standard optimization techniques such as the conjugate gradient and Newton methods have been proposed to improve the convergence rate of the BP learning algorithm. This paper presents a heuristic method that adds an adaptive smoothing momentum term to the original BP learning algorithm to speed up convergence. In this improved BP learning algorithm, an adaptive smoothing technique automatically adjusts the momentum of the weight-updating formula according to the "3σ limits" theory. With the adaptive smoothing momentum terms, the improved BP learning algorithm trains and converges faster, and generalizes better, than the standard BP learning algorithm. To verify the effectiveness of the proposed BP learning algorithm, three typical foreign exchange rates, the British pound (GBP), the euro (EUR), and the Japanese yen (JPY), are chosen as forecasting targets for illustration. Experimental results from homogeneous algorithm comparisons reveal that the proposed BP learning algorithm outperforms comparable BP algorithms in both forecasting performance and convergence rate. Furthermore, empirical results from heterogeneous model comparisons also confirm the effectiveness of the proposed BP learning algorithm.
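The core mechanism described in the abstract is a momentum term μ(t) in the weight update Δw(t) = −η∇E(t) + μ(t)Δw(t−1) that is re-tuned from 3σ control limits on the training error, in the spirit of Shewhart control charts. The snippet below is a minimal Python sketch of that idea for a one-hidden-layer network; the function name train_bp_adaptive_momentum, the damping/boosting factors, and the window length are illustrative assumptions and not the paper's exact adaptation rule.

```python
import numpy as np

def train_bp_adaptive_momentum(X, y, hidden=8, epochs=500, eta=0.05,
                               mu_init=0.5, mu_min=0.1, mu_max=0.9, window=20):
    """One-hidden-layer BPNN whose momentum coefficient is re-tuned each epoch
    from 3-sigma control limits on recent training-error changes.
    Illustrative sketch only; the paper's exact adaptation rule may differ."""
    rng = np.random.default_rng(0)
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)
    dW1 = np.zeros_like(W1); dW2 = np.zeros_like(W2)
    mu, err_hist, prev_err = mu_init, [], None

    for _ in range(epochs):
        # forward pass: tanh hidden layer, linear output
        H = np.tanh(X @ W1 + b1)
        out = H @ W2 + b2
        err = out - y.reshape(-1, 1)
        mse = float(np.mean(err ** 2))

        # adapt the momentum from 3-sigma limits on recent error changes
        if prev_err is not None:
            err_hist.append(mse - prev_err)
            recent = err_hist[-window:]
            mean, sigma = np.mean(recent), np.std(recent)
            delta = err_hist[-1]
            if delta > mean + 3 * sigma:     # error jumped abnormally: damp momentum
                mu = max(mu_min, 0.5 * mu)
            elif delta < mean - 3 * sigma:   # error fell abnormally fast: boost momentum
                mu = min(mu_max, 1.1 * mu)
            # otherwise the process is "in control": keep mu unchanged
        prev_err = mse

        # backward pass with momentum-smoothed weight updates
        g_out = 2 * err / n                  # dMSE / d(out)
        gW2 = H.T @ g_out
        gH = (g_out @ W2.T) * (1 - H ** 2)   # tanh derivative
        gW1 = X.T @ gH
        dW2 = -eta * gW2 + mu * dW2
        dW1 = -eta * gW1 + mu * dW1
        W2 += dW2; b2 += -eta * g_out.sum(axis=0)
        W1 += dW1; b1 += -eta * gH.sum(axis=0)
    return W1, b1, W2, b2
```

The design intuition is that the sequence of epoch-to-epoch error changes is treated like a Shewhart control process: while the latest change stays inside the 3σ limits the learning process is "in control" and the momentum is left alone, and only statistically abnormal jumps or drops trigger an adjustment.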

Keywords

back-propagation neural network / adaptive smoothing momentum / heuristic method / foreign exchange rates forecasting

Cite this article

Lean YU, Shouyang WANG, Kin Keung LAI. Forecasting foreign exchange rates with an improved back-propagation learning algorithm with adaptive smoothing momentum terms. Front Comput Sci Chin, 2009, 3(2): 167‒176 https://doi.org/10.1007/s11704-009-0020-8


RIGHTS & PERMISSIONS

© 2014 Higher Education Press and Springer-Verlag Berlin Heidelberg