Review of Mathematical Optimization in Federated Learning

Shusen Yang, Fangyuan Zhao, Zihao Zhou, Liang Shi, Xuebin Ren, Zongben Xu

CSIAM Trans. Appl. Math., 2025, Vol. 6, Issue (2): 207-249. DOI: 10.4208/csiam-am.SO-2024-0023

Abstract

Federated learning (FL) has become a popular interdisciplinary research area in both applied mathematics and information sciences. Mathematically, FL aims to collaboratively optimize aggregate objective functions over distributed datasets while satisfying a variety of privacy and system constraints. Unlike conventional distributed optimization methods, FL needs to address several specific issues (e.g. non-i.i.d. data and differentially private noise), which pose a set of new challenges in problem formulation, algorithm design, and convergence analysis. In this paper, we systematically review existing FL optimization research, including its assumptions, formulations, methods, and theoretical results. Potential future directions are also discussed.
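
For orientation, the optimization problem at the core of FL is typically formulated as the minimization of a weighted aggregate of local objectives. The sketch below uses standard notation from FedAvg-style analyses and is illustrative; it is not necessarily the paper's exact formulation:

\min_{w \in \mathbb{R}^d} \; F(w) = \sum_{k=1}^{K} p_k F_k(w), \qquad F_k(w) = \frac{1}{n_k} \sum_{\xi \in \mathcal{D}_k} \ell(w; \xi), \qquad p_k = \frac{n_k}{\sum_{j=1}^{K} n_j},

where K is the number of clients, \mathcal{D}_k is the local dataset of client k with size n_k, and \ell is the per-sample loss. Under non-i.i.d. data the local data distributions differ across clients, so the minimizers of the local objectives F_k generally do not coincide with the minimizer of the global objective F; this mismatch is a central source of difficulty in FL algorithm design and convergence analysis.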

Keywords

Federated learning / distributed optimization / convergence analysis / error bounds

Cite this article

Shusen Yang, Fangyuan Zhao, Zihao Zhou, Liang Shi, Xuebin Ren, Zongben Xu. Review of Mathematical Optimization in Federated Learning. CSIAM Trans. Appl. Math., 2025, 6(2): 207-249. DOI: 10.4208/csiam-am.SO-2024-0023
