Joint Spectral Regression Methods for Large-Scale Discriminant Analysis

Gang Wu, Wen Yang

Communications on Applied Mathematics and Computation ›› 2025, Vol. 7 ›› Issue (5): 1791-1814. DOI: 10.1007/s42967-024-00402-0

Original Paper

Abstract

Spectral regression discriminant analysis (SRDA) is one of the most popular methods for large-scale discriminant analysis. It is a two-step algorithm: first, the response vectors are obtained by solving an eigenvalue problem; second, the projection vectors are computed by solving a least-squares problem. However, performing the two steps independently cannot guarantee the joint optimality of the two terms. In this paper, we propose a unified framework that computes the response matrix and the projection matrix in SRDA simultaneously, so that the discriminant information of classification tasks can be extracted more effectively. The convergence of the proposed method is discussed. Moreover, we shed light on how to choose the joint parameter adaptively, and propose a parameter-free joint spectral regression discriminant analysis (JointSRDA-PF) method. Numerical experiments on several real-world databases demonstrate the numerical behavior of the proposed methods and the effectiveness of our strategies.
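The two-step SRDA procedure described in the abstract can be sketched in a few lines. The following is a minimal illustration under assumptions, not the authors' code: it uses the LDA-style setting of spectral regression, in which the response vectors reduce to class-indicator vectors orthogonalized against the all-ones vector, and the names `srda_fit` and the ridge weight `alpha` are chosen here for illustration.

```python
import numpy as np

def srda_fit(X, y, alpha=0.01):
    """Two-step SRDA sketch: responses from class indicators, then ridge LS."""
    n, d = X.shape
    classes = np.unique(y)
    c = len(classes)
    # Step 1 (eigenproblem step): for the LDA-style class graph, the leading
    # eigenvectors of the graph Laplacian span the class-indicator vectors;
    # orthogonalizing the indicators against the all-ones vector via QR
    # recovers the c-1 useful response vectors without an explicit eigensolve.
    Y = np.zeros((n, c))
    for k, cls in enumerate(classes):
        Y[y == cls, k] = 1.0
    Q, _ = np.linalg.qr(np.column_stack([np.ones(n), Y]))
    R = Q[:, 1:c]                        # (n, c-1) response vectors
    # Step 2 (least-squares step): solve the ridge-regularized problem
    # min_W ||X_c W - R||_F^2 + alpha ||W||_F^2 on centered data.
    Xc = X - X.mean(axis=0)
    A = Xc.T @ Xc + alpha * np.eye(d)
    W = np.linalg.solve(A, Xc.T @ R)     # (d, c-1) projection matrix
    return W

# usage: two well-separated Gaussian classes in 5 dimensions
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
W = srda_fit(X, y)
print(W.shape)  # (5, 1)
```

Because the two steps are solved one after the other, the responses `R` are fixed before `W` is computed; the joint framework proposed in the paper instead couples the two subproblems, which is exactly the gap the abstract points out.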

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Keywords

Dimension reduction / Spectral regression discriminant analysis (SRDA) / Joint principal component and discriminant analysis (JPCDA) algorithm / Joint spectral regression discriminant analysis (JointSRDA) / 65F10 / 65F15

Cite this article

Gang Wu, Wen Yang. Joint Spectral Regression Methods for Large-Scale Discriminant Analysis. Communications on Applied Mathematics and Computation, 2025, 7(5): 1791-1814. DOI: 10.1007/s42967-024-00402-0



Funding

National Natural Science Foundation of China(12271518)

RIGHTS & PERMISSIONS

Shanghai University
