Linear discriminant analysis with worst between-class separation and average within-class compactness

Leilei YANG , Songcan CHEN

Front. Comput. Sci., 2014, 8(5): 785-792. DOI: 10.1007/s11704-014-3337-x
RESEARCH ARTICLE

Abstract

Linear discriminant analysis (LDA) is one of the most popular supervised dimensionality reduction (DR) techniques and obtains discriminant projections by maximizing the ratio of average-case between-class scatter to average-case within-class scatter. Two recent discriminant analysis (DA) algorithms, minimal distance maximization (MDM) and worst-case LDA (WLDA), obtain projections by optimizing worst-case scatters. In this paper, we develop a new LDA framework called LDA with worst between-class separation and average within-class compactness (WSAC) by maximizing the ratio of worst-case between-class scatter to average-case within-class scatter. This can be achieved by relaxing the trace ratio optimization to a distance metric learning problem. Comparative experiments demonstrate its effectiveness. In addition, DA counterparts using the local geometry of data and the kernel trick can likewise be embedded into our framework and solved in the same way.
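
A minimal sketch of the criterion described above, assuming the usual LDA-style scatter definitions: the NumPy code below evaluates, for a candidate projection W, the worst-case between-class scatter (the smallest pairwise scatter over class pairs) against the average within-class scatter. The function name wsac_objective, the pairwise scatter form (m_i - m_j)(m_i - m_j)^T, the toy data, and the random orthonormal W are illustrative assumptions, not the paper's actual optimization procedure.

```python
# Illustrative sketch (assumed scatter definitions), not the paper's algorithm.
import numpy as np

def wsac_objective(X, y, W):
    """Ratio of worst-case between-class scatter to average within-class
    scatter for data X (n, d), integer labels y (n,), projection W (d, k)."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}

    # Average (pooled) within-class scatter: S_w = sum_c sum_{x in c} (x - m_c)(x - m_c)^T
    d = X.shape[1]
    S_w = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c] - means[c]
        S_w += Xc.T @ Xc
    within = np.trace(W.T @ S_w @ W)

    # Worst-case between-class scatter: minimum over class pairs (i, j) of
    # tr(W^T (m_i - m_j)(m_i - m_j)^T W)
    worst_between = min(
        np.trace(W.T @ np.outer(means[i] - means[j], means[i] - means[j]) @ W)
        for idx, i in enumerate(classes) for j in classes[idx + 1:]
    )
    return worst_between / within

# Usage on toy data: three 5-D Gaussian classes and a random orthonormal projection.
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 5)) + np.repeat(np.eye(5)[:3] * 3, 30, axis=0)
y = np.repeat(np.arange(3), 30)
W, _ = np.linalg.qr(rng.normal(size=(5, 2)))
print(wsac_objective(X, y, W))
```

Maximizing this ratio over W (rather than the average-case between-class scatter used by standard LDA) is the WSAC criterion; per the abstract, the paper solves it by relaxing the trace ratio optimization to a distance metric learning problem rather than by the naive evaluation shown here.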

Keywords

dimensionality reduction / linear discriminant analysis / the worst separation / the average compactness

Cite this article

Leilei YANG, Songcan CHEN. Linear discriminant analysis with worst between-class separation and average within-class compactness. Front. Comput. Sci., 2014, 8(5): 785-792. DOI: 10.1007/s11704-014-3337-x

RIGHTS & PERMISSIONS

Higher Education Press and Springer-Verlag Berlin Heidelberg
