Anderson Acceleration of Gradient Methods with Energy for Optimization Problems

Hailiang Liu, Jia-Hao He, Xuping Tian

Communications on Applied Mathematics and Computation ›› 2023, Vol. 6 ›› Issue (2): 1299-1318. DOI: 10.1007/s42967-023-00327-0
Original Paper


Abstract

Anderson acceleration (AA) is an extrapolation technique designed to speed up fixed-point iterations. For optimization problems, we propose a novel algorithm that combines AA with the energy adaptive gradient method (AEGD) [arXiv:2010.05109]. Although AEGD is not a fixed-point iteration, the feasibility of our algorithm is ensured by the convergence theory for AEGD. We provide rigorous convergence rates of AA applied to gradient descent (GD), with an acceleration factor given by the gain at each implementation of AA-GD. Our experimental results show that the proposed AA-AEGD algorithm requires little tuning of hyperparameters and exhibits superior convergence speed.
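To make the extrapolation concrete, the following is a minimal, illustrative sketch of AA applied to gradient descent (the AA-GD scheme whose rates the abstract refers to), using the standard windowed least-squares (type-II, Walker-Ni) formulation. This is not the authors' AA-AEGD implementation; the function name anderson_gd, the step size lr, the window size m, and the quadratic test problem are all assumptions chosen for illustration.

```python
import numpy as np

def anderson_gd(grad, x0, lr=0.1, m=5, max_iter=200, tol=1e-8):
    """Anderson-accelerated gradient descent (AA-GD): a minimal sketch.

    One GD step g(x) = x - lr * grad(x) is treated as a fixed-point map;
    AA extrapolates over the last m iterates via a least-squares fit on
    the residuals f_k = g(x_k) - x_k = -lr * grad(x_k).
    """
    x = np.asarray(x0, dtype=float)
    xs = [x]                          # iterate history
    fs = [-lr * grad(x)]              # residual history
    for _ in range(max_iter):
        k = len(xs) - 1
        mk = min(m, k)
        if mk == 0:
            x_new = xs[-1] + fs[-1]   # plain GD step to seed the history
        else:
            # Difference matrices over the most recent mk steps
            dX = np.column_stack([xs[i + 1] - xs[i] for i in range(k - mk, k)])
            dF = np.column_stack([fs[i + 1] - fs[i] for i in range(k - mk, k)])
            # gamma minimizes || f_k - dF @ gamma ||_2
            gamma, *_ = np.linalg.lstsq(dF, fs[-1], rcond=None)
            x_new = xs[-1] + fs[-1] - (dX + dF) @ gamma
        f_new = -lr * grad(x_new)
        xs.append(x_new)
        fs.append(f_new)
        xs, fs = xs[-(m + 2):], fs[-(m + 2):]  # bounded-memory window
        if np.linalg.norm(f_new) < tol * lr:   # i.e., ||grad(x_new)|| < tol
            break
    return xs[-1]

# Hypothetical usage on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_star = anderson_gd(lambda x: A @ x - b, np.zeros(3), lr=1e-2, m=5)
```

The AA-AEGD method of the paper follows the same extrapolation pattern, but applies it to the AEGD update (which carries an additional energy variable) rather than to the plain GD map sketched here.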

Cite this article

Hailiang Liu, Jia-Hao He, Xuping Tian. Anderson Acceleration of Gradient Methods with Energy for Optimization Problems. Communications on Applied Mathematics and Computation, 2023, 6(2): 1299‒1318. https://doi.org/10.1007/s42967-023-00327-0
Funding
Directorate for Mathematical and Physical Sciences (1812666)
