Convergence of Hyperbolic Neural Networks Under Riemannian Stochastic Gradient Descent

Wes Whiting , Bao Wang , Jack Xin

Communications on Applied Mathematics and Computation, 2023, Vol. 6, Issue 2: 1175-1188. DOI: 10.1007/s42967-023-00302-9

Original Paper

Abstract

We prove, under mild conditions, the convergence of a Riemannian gradient descent method for a hyperbolic neural network regression model, in both the batch and stochastic gradient descent settings. We also discuss a Riemannian version of the Adam algorithm. We present numerical simulations of these algorithms on several benchmarks.
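The core update the abstract refers to can be illustrated with a minimal Riemannian SGD sketch on the Poincaré ball model of hyperbolic space: rescale the Euclidean gradient by the inverse conformal metric, then retract along the exponential map. The toy objective (squared geodesic distance to a fixed target), step size, and all function names below are illustrative assumptions, not the paper's actual model or hyperparameters.

```python
import numpy as np

def mobius_add(x, y):
    """Mobius addition (gyrovector sum) on the Poincare ball."""
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    return num / (1 + 2 * xy + x2 * y2)

def exp_map(x, v):
    """Exponential map exp_x(v): move from x along tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    lam = 2.0 / (1.0 - np.dot(x, x))  # conformal factor lambda_x
    return mobius_add(x, np.tanh(lam * nv / 2.0) * v / nv)

def poincare_dist(x, y):
    """Geodesic distance on the Poincare ball."""
    diff2 = np.dot(x - y, x - y)
    denom = (1 - np.dot(x, x)) * (1 - np.dot(y, y))
    return np.arccosh(1 + 2 * diff2 / denom)

def rsgd(x0, egrad, lr=0.1, steps=200):
    """Riemannian SGD step: Riemannian grad = (inverse metric) * Euclidean
    grad, followed by retraction via the exponential map."""
    x = x0.copy()
    for _ in range(steps):
        scale = (1 - np.dot(x, x)) ** 2 / 4.0  # inverse of lambda_x^2
        rgrad = scale * egrad(x)
        x = exp_map(x, -lr * rgrad)
    return x

# Toy objective: minimize squared geodesic distance to a target point,
# with a central-difference Euclidean gradient for simplicity.
target = np.array([0.3, -0.4])

def egrad(x, eps=1e-6):
    f = lambda z: poincare_dist(z, target) ** 2
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

x_final = rsgd(np.array([0.0, 0.0]), egrad)
print(poincare_dist(x_final, target))  # should be near zero
```

In this sketch the iterate stays inside the unit ball by construction of the exponential map, which is one practical reason Riemannian updates are preferred over projected Euclidean steps in hyperbolic models.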

Cite this article

Wes Whiting, Bao Wang, Jack Xin. Convergence of Hyperbolic Neural Networks Under Riemannian Stochastic Gradient Descent. Communications on Applied Mathematics and Computation, 2023, 6(2): 1175-1188. DOI: 10.1007/s42967-023-00302-9


Funding

Directorate for Mathematical and Physical Sciences (DMS-1854434)

Directorate for Mathematical and Physical Sciences (DMS-1952339)

U.S. Department of Energy (DE-SC0021142)
