Towards an Understanding of Residual Networks Using Neural Tangent Hierarchy (NTH)

Yuqing Li, Tao Luo, Nung Kwan Yip

CSIAM Trans. Appl. Math., 2022, Vol. 3, Issue 4: 692-760. DOI: 10.4208/csiam-am.SO-2021-0053

Research Article


Abstract

Gradient descent yields zero training loss in polynomial time for deep neural networks despite the non-convex nature of the objective function. The behavior of a network trained by gradient descent in the infinite-width limit can be described by the Neural Tangent Kernel (NTK) introduced in [25]. In this paper, we study the dynamics of the NTK for finite-width Deep Residual Networks (ResNets) using the neural tangent hierarchy (NTH) proposed in [24]. For a ResNet with a smooth and Lipschitz activation function, we reduce the requirement on the layer width m with respect to the number of training samples n from quartic to cubic. Our analysis strongly suggests that the particular skip-connection structure of ResNets is the main reason for their advantage over fully-connected networks.
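For context, the following is a minimal sketch (not taken from the paper itself) of the objects the abstract refers to: under gradient flow on the squared loss, the network outputs evolve according to the empirical NTK, and the NTH of [24] tracks how that kernel itself moves at finite width. The notation (f_t, \theta_t, K^{(2)}, K^{(3)}) is illustrative and may differ from the paper's.

% Sketch of NTK-driven dynamics and the first step of the NTH.
% f_t(x) = f(x; \theta_t): network output; \{(x_j, y_j)\}_{j=1}^n: training samples.
\begin{align*}
  % Empirical neural tangent kernel of [25]:
  K^{(2)}_t(x, x') &= \big\langle \nabla_\theta f_t(x),\, \nabla_\theta f_t(x') \big\rangle, \\
  % Output dynamics under gradient flow on the squared loss:
  \partial_t f_t(x) &= -\frac{1}{n} \sum_{j=1}^{n} K^{(2)}_t(x, x_j)\,\big(f_t(x_j) - y_j\big), \\
  % NTH: at finite width the kernel is not constant; its motion is governed by a
  % higher-order kernel K^{(3)}, whose motion in turn involves K^{(4)}, and so on.
  \partial_t K^{(2)}_t(x, x') &= -\frac{1}{n} \sum_{j=1}^{n} K^{(3)}_t(x, x', x_j)\,\big(f_t(x_j) - y_j\big).
\end{align*}

In the infinite-width limit K^{(2)}_t is constant in t and the output dynamics reduce to a linear ODE; the finite-width analysis proceeds by bounding the higher-order terms of this hierarchy.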

Keywords

Residual networks / training process / neural tangent kernel / neural tangent hierarchy

Cite this article

Yuqing Li, Tao Luo, Nung Kwan Yip. Towards an Understanding of Residual Networks Using Neural Tangent Hierarchy (NTH). CSIAM Trans. Appl. Math., 2022, 3(4): 692-760. DOI: 10.4208/csiam-am.SO-2021-0053


