Anderson Acceleration of Gradient Methods with Energy for Optimization Problems
Hailiang Liu, Jia-Hao He, Xuping Tian
Communications on Applied Mathematics and Computation, 2023, Vol. 6, Issue 2: 1299-1318.
Anderson acceleration (AA) is an extrapolation technique designed to speed up fixed-point iterations. For optimization problems, we propose a novel algorithm that combines AA with the energy-adaptive gradient method (AEGD) [arXiv:2010.05109]. Although AEGD is not a fixed-point iteration, the feasibility of our algorithm is ensured by the convergence theory for AEGD. We also provide rigorous convergence rates of AA for gradient descent (GD), quantified by an acceleration factor given by the gain at each implementation of AA-GD. Our experimental results show that the proposed AA-AEGD algorithm requires little tuning of hyperparameters and exhibits fast convergence.
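To make the AA step concrete, the following is a minimal sketch of a standard (Type-II) Anderson acceleration loop for a generic fixed-point map `g`, as commonly described in the literature; it is an illustration of the AA extrapolation only, not the paper's AA-AEGD scheme, and the function name, window size `m`, and tolerances are illustrative choices.

```python
import numpy as np

def anderson_fixed_point(g, x0, m=5, max_iter=100, tol=1e-10):
    """Anderson-accelerated iteration for the fixed-point problem x = g(x).

    Keeps a window of the last m residuals f_k = g(x_k) - x_k and extrapolates
    the next iterate using mixing weights from a small least-squares problem.
    """
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    G, F = [], []  # histories of g-evaluations and residuals
    for _ in range(max_iter):
        gx = np.atleast_1d(np.asarray(g(x), dtype=float))
        f = gx - x
        if np.linalg.norm(f) < tol:
            return gx
        G.append(gx)
        F.append(f)
        if len(F) > m + 1:           # keep at most m+1 past evaluations
            G.pop(0)
            F.pop(0)
        if len(F) == 1:
            x = gx                   # plain fixed-point step to start
        else:
            # Columns of dF (dG) are consecutive residual (iterate) differences.
            dF = np.stack([F[i + 1] - F[i] for i in range(len(F) - 1)], axis=1)
            dG = np.stack([G[i + 1] - G[i] for i in range(len(G) - 1)], axis=1)
            # Least-squares mixing coefficients: minimize ||f - dF @ gamma||.
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = gx - dG @ gamma      # Anderson extrapolation step
    return x
```

For example, applying this to the contraction `g(x) = cos(x)` drives the iterates to the fixed point of `cos` near 0.739 in far fewer steps than plain fixed-point iteration would need.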