Fractional-order global optimal backpropagation machine trained by an improved fractional-order steepest descent method

Yi-fei PU , Jian WANG

Front. Inform. Technol. Electron. Eng., 2020, 21(6): 809-833. DOI: 10.1631/FITEE.1900593
Original Article

Abstract

We introduce the fractional-order global optimal backpropagation machine, which is trained by an improved fractional-order steepest descent method (FSDM). This is a fractional-order backpropagation neural network (FBPNN), a state-of-the-art fractional-order branch of the family of backpropagation neural networks (BPNNs), different from the majority of previous classic first-order BPNNs, which are trained by the traditional first-order steepest descent method. The reverse incremental search of the proposed FBPNN proceeds in the negative directions of the approximate fractional-order partial derivatives of the square error. First, the theoretical concept of an FBPNN trained by an improved FSDM is described mathematically. Then, the mathematical proof of fractional-order global optimal convergence, an assumption about the structure, and the fractional-order multi-scale global optimization of the FBPNN are analyzed in detail. Finally, we perform three types of experiments to compare the performances of an FBPNN and a classic first-order BPNN: example function approximation, fractional-order multi-scale global optimization, and comparison of global search and error-fitting abilities with real data. The higher optimal search ability of an FBPNN to determine the global optimal solution is the major advantage that makes the FBPNN superior to a classic first-order BPNN.
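To make the idea of searching along negative fractional-order partial derivatives concrete, the following is a minimal, illustrative Python sketch of fractional-order steepest descent on a one-dimensional square error. It assumes a common Caputo-type first-order approximation of the fractional derivative with a fixed lower terminal; it is not the authors' exact improved FSDM or their FBPNN, and the quadratic error E(w) = (w - 3)^2, the terminal c, the learning rate, and the step count are hypothetical choices for illustration only.

from math import gamma

# Illustrative sketch only -- NOT the authors' improved FSDM.
# Fractional-order steepest descent on the 1-D square error E(w) = (w - 3)^2,
# assuming a Caputo-type first-order approximation of the fractional
# partial derivative with fixed lower terminal c and order 0 < alpha < 2:
#     d^alpha E / dw^alpha  ~=  E'(w) * (w - c)^(1 - alpha) / Gamma(2 - alpha)

def fractional_steepest_descent(alpha=1.2, lr=0.1, w0=0.0, c=-1.0, steps=60):
    """Reverse incremental search along the negative approximate fractional derivative."""
    w = w0
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)                      # ordinary first-order derivative E'(w)
        frac_grad = grad * (w - c) ** (1.0 - alpha) / gamma(2.0 - alpha)
        w -= lr * frac_grad                         # step in the negative fractional direction
    return w

if __name__ == "__main__":
    # alpha = 1 recovers the classic first-order steepest descent update
    for a in (0.8, 1.0, 1.2):
        print(f"alpha = {a:.1f}  ->  w* ~= {fractional_steepest_descent(alpha=a):.4f}")

Setting alpha = 1 reduces the update to the traditional first-order steepest descent method, while alpha != 1 rescales the search direction by a power-law term in (w - c); this extra degree of freedom in the search direction is the kind of mechanism the abstract attributes to the FBPNN's reverse incremental search.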

Keywords

Fractional calculus / Fractional-order backpropagation algorithm / Fractional-order steepest descent method / Mean square error / Fractional-order multi-scale global optimization

Cite this article

Yi-fei PU, Jian WANG. Fractional-order global optimal backpropagation machine trained by an improved fractional-order steepest descent method. Front. Inform. Technol. Electron. Eng., 2020, 21(6): 809-833. DOI: 10.1631/FITEE.1900593

RIGHTS & PERMISSIONS

Zhejiang University and Springer-Verlag GmbH Germany, part of Springer Nature


Supplementary files

FITEE-0809-20001-YFP_suppl_1

FITEE-0809-20001-YFP_suppl_2
