A distributed stochastic optimization algorithm with gradient-tracking and distributed heavy-ball acceleration

Bihao SUN, Jinhui HU, Dawen XIA, Huaqing LI

Front. Inform. Technol. Electron. Eng., 2021, Vol. 22, Issue 11: 1463-1476. DOI: 10.1631/FITEE.2000615
Original Article


Abstract

Distributed optimization has been well developed in recent years because of its wide applications in machine learning and signal processing. In this paper, we investigate distributed optimization to minimize a global objective that is the sum of smooth and strongly convex local cost functions distributed over an undirected network of n nodes. In contrast to existing works, we combine a distributed heavy-ball (momentum) term with a gradient-tracking technique to accelerate existing distributed stochastic first-order gradient methods. It is shown that the proposed algorithm accelerates convergence more effectively than GT-SAGA without increasing the computational complexity. Extensive experiments on real-world datasets verify the effectiveness and correctness of the proposed algorithm.
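
The abstract describes an update that augments distributed gradient tracking with a heavy-ball momentum term built from the difference between consecutive iterates. The following minimal NumPy sketch illustrates that structure under illustrative assumptions: least-squares local costs, a ring network with Metropolis weights, plain stochastic gradients in place of the paper's SAGA-type variance-reduced gradients, and hand-picked values for the step size alpha and momentum weight beta. It is a sketch of the general technique, not the authors' exact algorithm.

import numpy as np

# Sketch of distributed gradient tracking with a heavy-ball momentum term.
# The network, cost functions, and the parameters alpha and beta below are
# illustrative assumptions; a plain stochastic gradient stands in for the
# paper's SAGA-type variance-reduced gradient.

rng = np.random.default_rng(0)
n, m, d = 4, 20, 3                        # nodes, samples per node, dimension
A = rng.normal(size=(n, m, d))            # local data matrices
b = rng.normal(size=(n, m))               # local targets

# Doubly stochastic mixing matrix for a ring graph (Metropolis weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
    W[i, i] = 1.0 / 3.0

def stoch_grad(i, x):
    """Stochastic gradient of node i's least-squares cost at x."""
    s = rng.integers(m)                   # sample one local data point
    return (A[i, s] @ x - b[i, s]) * A[i, s]

alpha, beta = 0.02, 0.4                   # step size and heavy-ball weight
x = np.zeros((n, d))                      # current iterates, one row per node
x_prev = np.zeros((n, d))                 # previous iterates (momentum term)
g = np.array([stoch_grad(i, x[i]) for i in range(n)])
y = g.copy()                              # gradient trackers

for k in range(5000):
    # Consensus mixing + descent along the tracked gradient + momentum.
    x_new = W @ x - alpha * y + beta * (x - x_prev)
    g_new = np.array([stoch_grad(i, x_new[i]) for i in range(n)])
    # Gradient tracking: y_i asymptotically tracks the average gradient.
    y = W @ y + g_new - g
    x_prev, x, g = x, x_new, g_new

x_star = np.linalg.lstsq(A.reshape(n * m, d), b.reshape(n * m), rcond=None)[0]
print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to minimizer:", np.linalg.norm(x.mean(axis=0) - x_star))

With beta set to zero, the loop reduces to a standard stochastic gradient-tracking method. The heavy-ball term reuses the previous iterate and thus adds no extra gradient evaluations or communication, which is consistent with the abstract's claim of acceleration without increased complexity.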

Keywords

Distributed optimization / High-performance algorithm / Multi-agent system / Machine-learning problem / Stochastic gradient

Cite this article

Bihao SUN, Jinhui HU, Dawen XIA, Huaqing LI. A distributed stochastic optimization algorithm with gradient-tracking and distributed heavy-ball acceleration. Front. Inform. Technol. Electron. Eng., 2021, 22(11): 1463-1476. https://doi.org/10.1631/FITEE.2000615

RIGHTS & PERMISSIONS

© 2021 Zhejiang University Press