A Novel Greedy Block Gauss-Seidel Method for Solving Large Linear Least-Squares Problems

Chao Sun, Xiao-Xia Guo

Communications on Applied Mathematics and Computation, 2025, Vol. 7, Issue 5: 1959-1976. DOI: 10.1007/s42967-024-00417-7

Original Paper

Abstract

In this paper, we present a new convergence upper bound for the greedy Gauss-Seidel (GGS) method proposed by Zhang and Li [38]; the new bound is tighter than the original bound for the GGS method. In addition, we propose a novel greedy block Gauss-Seidel (RDBGS) method, built on the greedy strategy of the GGS method, for solving large linear least-squares problems. We prove that the RDBGS method converges to the unique solution of the linear least-squares problem. Numerical experiments demonstrate that the RDBGS method requires fewer iteration steps and less computing time than competing methods.
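For context, the Gauss-Seidel (coordinate-descent) approach to min_x ||Ax - b||_2 updates one entry of x at a time, selecting the working column by a greedy criterion on the normal-equation residual A^T(b - Ax). The sketch below illustrates only this standard single-coordinate greedy idea underlying GGS-type methods; the selection rule and step size shown are the textbook ones, not the paper's RDBGS block method, whose block selection rule is given in the full text.

```python
import numpy as np

def greedy_gauss_seidel(A, b, max_iter=2000, tol=1e-10):
    """Illustrative greedy coordinate-descent sketch for min ||Ax - b||_2.

    At each step, pick the column j maximizing |A_j^T (b - Ax)|^2 / ||A_j||^2
    (the standard greedy criterion), then minimize exactly along coordinate j.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_norms2 = np.sum(A * A, axis=0)   # ||A_j||^2 for each column j
    r = b - A @ x                        # residual b - Ax
    for _ in range(max_iter):
        g = A.T @ r                      # normal-equation residual A^T(b - Ax)
        scores = g * g / col_norms2      # greedy selection criterion
        j = int(np.argmax(scores))
        if scores[j] < tol:              # stop once all coordinates are (nearly) optimal
            break
        alpha = g[j] / col_norms2[j]     # exact minimizer along coordinate j
        x[j] += alpha
        r -= alpha * A[:, j]             # cheap O(m) residual update
    return x
```

For a full-column-rank A, this iteration converges to the unique least-squares solution; the block variants studied in the paper update several coordinates per step to reduce the iteration count further.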

Keywords

Greedy strategy / Linear least-squares problem / Block Gauss-Seidel method / Convergence property / 15A24 / 15A06 / 65F10 / 65F45

Cite this article

Chao Sun, Xiao-Xia Guo. A Novel Greedy Block Gauss-Seidel Method for Solving Large Linear Least-Squares Problems. Communications on Applied Mathematics and Computation, 2025, 7(5): 1959-1976. DOI: 10.1007/s42967-024-00417-7


References

[1] Bai, Z.-Z., Liu, X.-G.: On the Meany inequality with applications to convergence analysis of several row-action iteration methods. Numer. Math. 124, 215-236 (2013)

[2] Bai, Z.-Z., Pan, J.-Y.: Matrix Analysis and Computations. SIAM, Philadelphia, PA (2021)

[3] Bai, Z.-Z., Wang, L.: On convergence rates of Kaczmarz-type methods with different selection rules of working rows. Appl. Numer. Math. 186, 289-319 (2023)

[4] Bai, Z.-Z., Wang, L., Wu, W.-T.: On convergence rate of the randomized Gauss-Seidel method. Linear Algebra Appl. 611, 237-252 (2021)

[5] Bai, Z.-Z., Wu, W.-T.: On greedy randomized Kaczmarz method for solving large sparse linear systems. SIAM J. Sci. Comput. 40, A592-A606 (2018)

[6] Bai, Z.-Z., Wu, W.-T.: On relaxed greedy randomized Kaczmarz methods for solving large sparse linear systems. Appl. Math. Lett. 83, 21-26 (2018)

[7] Bai, Z.-Z., Wu, W.-T.: On greedy randomized coordinate descent methods for solving large linear least-squares problems. Numer. Linear Algebra Appl. 26(4), 1-15 (2019)

[8] Bai, Z.-Z., Wu, W.-T.: Randomized Kaczmarz iteration methods: algorithmic extensions and convergence theory. Jpn. J. Indust. Appl. Math. 40(3), 1421-1443 (2023)

[9] Björck, Å.: Numerical Methods for Least Squares Problems. SIAM, Philadelphia (1996)

[10] Bouman, C.A., Sauer, K.: A unified approach to statistical tomography using coordinate descent optimization. IEEE Trans. Image Process. 5(3), 480-492 (1996)

[11] Canutescu, A.A., Dunbrack, R.L.: Cyclic coordinate descent: a robotics algorithm for protein loop closure. Protein Sci. 12(5), 963-972 (2003)

[12] Chang, K.W., Hsieh, C.J., Lin, C.J.: Coordinate descent method for large-scale L2-loss linear support vector machines. J. Mach. Learn. Res. 9, 1369-1398 (2008)

[13] Chen, J.-Q., Huang, Z.-D.: A fast block coordinate descent method for solving linear least squares problems. East Asian J. Appl. Math. 12, 406-420 (2022)

[14] Davis, T.A., Hu, Y.: The University of Florida sparse matrix collection. ACM Trans. Math. Softw. 38(1), 1-25 (2011)

[15] Demmel, J.W.: Applied Numerical Linear Algebra. Tsinghua University Press, Beijing (1997)

[16] Drineas, P., Mahoney, M.W., Muthukrishnan, S.: Relative-error CUR matrix decompositions. SIAM J. Matrix Anal. Appl. 30(2), 844-881 (2008)

[17] Du, K., Ruan, C.-C., Sun, X.-H.: On the convergence of a randomized block coordinate descent algorithm for a matrix least-squares problem. Appl. Math. Lett. 124, 107689 (2022)

[18] Duan, L.-X., Zhang, G.-F.: Variant of greedy randomized Gauss-Seidel method for ridge regression. Numer. Math. Theor. Meth. Appl. 14, 714-737 (2021)

[19] Golub, G.: Numerical methods for solving linear least-squares problems. Numer. Math. 7(3), 206-216 (1965)

[20] Griebel, M., Oswald, P.: Greedy and randomized versions of the multiplicative Schwarz method. Linear Algebra Appl. 437, 1596-1610 (2012)

[21] Jin, L.-L., Li, H.-B.: Greedy double subspaces coordinate descent method via orthogonalization. arXiv:2203.02153v2 (2022)

[22] Leventhal, D., Lewis, A.S.: Randomized methods for linear constraints: convergence rates and conditioning. Math. Oper. Res. 35, 641-654 (2010)

[23] Li, H.-Y., Zhang, Y.-J.: Greedy block Gauss-Seidel methods for solving large least-squares problem. arXiv:2004.02476v1 (2020)

[24] Lin, Q., Lu, Z., Xiao, L.: An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization. SIAM J. Optim. 25, 2244-2273 (2015)

[25] Liu, Y., Jiang, X.-L., Gu, C.-Q.: On maximum residual block and two-step Gauss-Seidel algorithms for linear least-squares problems. Calcolo 58(2), 1-32 (2021)

[26] Ma, A., Needell, D., Ramdas, A.: Convergence properties of the randomized extended Gauss-Seidel and Kaczmarz methods. SIAM J. Matrix Anal. Appl. 36, 1590-1604 (2015)

[27] Niu, Y.-Q., Zheng, B.: A new randomized Gauss-Seidel method for solving linear least-squares problems. Appl. Math. Lett. 116, 107057 (2021)

[28] Nutini, J., Sepehry, B., Laradji, I., Virani, A., Schmidt, M., Koepke, H.: Convergence rates for greedy Kaczmarz algorithms, and faster randomized Kaczmarz rules using the orthogonality graph. arXiv:1612.07838 (2016)

[29] Quarteroni, A., Sacco, R., Saleri, F.: Numerical Mathematics. Springer-Verlag, New York (2002)

[30] Ruhe, A.: Numerical aspects of Gram-Schmidt orthogonalization of vectors. Linear Algebra Appl. 52(1), 591-601 (1983)

[31] Saad, Y.: Iterative Methods for Sparse Linear Systems, 2nd edn. SIAM, Philadelphia (2003)

[32] Sorensen, D.C., Embree, M.: A DEIM induced CUR factorization. SIAM J. Sci. Comput. 38(3), A1454-A1482 (2016)

[33] Thoppe, G., Borkar, V.S., Garg, D.: Greedy block coordinate descent (GBCD) method for high dimensional quadratic programs. arXiv:1404.6635v3 (2014)

[34] Wright, S.J.: Coordinate descent algorithms. Math. Program. 151(1), 3-34 (2015)

[35] Wu, W.-M.: Convergence of the Randomized Block Gauss-Seidel Method. Claremont Colleges, Los Angeles (2018)

[36] Ye, J.C., Webb, K.J., Bouman, C.A., Millane, R.P.: Optical diffusion tomography by iterative-coordinate-descent optimization in a Bayesian framework. J. Opt. Soc. Am. A 16(10), 2400-2412 (1999)

[37] Zhang, J.-H., Guo, J.-H.: On relaxed greedy randomized coordinate descent methods for solving large linear least-squares problems. Appl. Numer. Math. 157, 372-384 (2020)

[38] Zhang, Y.-J., Li, H.-Y.: A novel greedy Gauss-Seidel method for solving large linear least-squares problem. arXiv:2004.03692v1 (2020)

RIGHTS & PERMISSIONS

Shanghai University
