ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units via Chebyshev Approximation

Shanshan Tang, Bo Li, Haijun Yu

Communications in Mathematics and Statistics, 1–27. DOI: 10.1007/s40304-023-00392-0

Abstract

In a previous study by Li et al. (Commun Comput Phys 27(2):379–411, 2020), it was shown that deep neural networks built with rectified power units (RePU) as activation functions can approximate sufficiently smooth functions better than those built with rectified linear units, by converting power series approximations into deep neural networks with optimal complexity and no approximation error. In practice, however, power series approximations are not easy to obtain because of their poor numerical stability. In this paper, we propose a new and more stable way to construct deep RePU neural networks based on Chebyshev polynomial approximations. By exploiting a hierarchical structure of Chebyshev polynomial approximation in the frequency domain, we obtain an efficient and stable deep neural network construction, which we call ChebNet. The approximation of smooth functions by ChebNets is no worse than that by deep RePU nets built from power series, while ChebNets are much more stable. Numerical results show that the constructed ChebNets can be further fine-tuned to obtain much better results than those obtained by tuning deep RePU nets constructed via the power series approach. Since spectral accuracy is hard to achieve by direct training of deep neural networks, ChebNets provide a practical way to obtain it and are expected to be useful in applications that require efficient approximations of smooth functions.
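
The abstract describes converting Chebyshev polynomial approximations of smooth functions into deep RePU networks. The Python snippet below is a minimal sketch of the two ingredients this relies on, not the paper's ChebNet construction: the Chebyshev coefficients of a smooth function decay spectrally fast (so truncated expansions are accurate and numerically stable, unlike raw power series), and the RePU activation sigma_s(x) = max(0, x)^s. The target function, the degree, and the helper name repu are illustrative assumptions.

# A minimal illustrative sketch, NOT the paper's ChebNet construction:
# approximate a smooth function by a truncated Chebyshev expansion and
# observe the fast (spectral) coefficient decay that ChebNet exploits.
import numpy as np
from numpy.polynomial import chebyshev as C

def f(x):
    # an arbitrary smooth (analytic) target on [-1, 1], chosen for illustration
    return np.exp(np.sin(np.pi * x))

n = 20                                                         # truncation degree
nodes = np.cos(np.pi * (np.arange(n + 1) + 0.5) / (n + 1))     # Chebyshev-Gauss points
coeffs = C.chebfit(nodes, f(nodes), n)                         # degree-n Chebyshev fit

x = np.linspace(-1.0, 1.0, 2001)
max_err = np.max(np.abs(f(x) - C.chebval(x, coeffs)))
print("last Chebyshev coefficients:", np.abs(coeffs[-4:]))     # tiny for analytic f
print("max approximation error on [-1, 1]:", max_err)

def repu(x, s=2):
    # rectified power unit sigma_s(x) = max(0, x)^s; s = 2 is the rectified quadratic unit
    return np.maximum(0.0, x) ** s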

Cite this article

Shanshan Tang, Bo Li, Haijun Yu. ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units via Chebyshev Approximation. Communications in Mathematics and Statistics, 1–27. DOI: 10.1007/s40304-023-00392-0

Funding

National Natural Science Foundation of China (12171467)
