Distance metric learning guided adaptive subspace semi-supervised clustering

Xuesong YIN , Enliang HU

Front. Comput. Sci. ›› 2011, Vol. 5 ›› Issue (1): 100-108. DOI: 10.1007/s11704-010-0376-9
RESEARCH ARTICLE


Abstract

Most existing semi-supervised clustering algorithms are not designed to handle high-dimensional data. Semi-supervised dimensionality reduction methods, on the other hand, do not necessarily improve clustering performance, because they ignore the inherent relationship between subspace selection and clustering. To mitigate these problems, we present a semi-supervised clustering algorithm using adaptive distance metric learning (SCADM), which performs semi-supervised clustering and distance metric learning simultaneously. SCADM uses the clustering results to learn a distance metric and then projects the data onto a low-dimensional space in which the separability of the data is maximized. Experimental results on real-world data sets show that the proposed method effectively handles high-dimensional data and yields appealing clustering performance.
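The abstract describes an alternating scheme: cluster, use the cluster assignments to learn a projection that maximizes separability, re-cluster in the projected space, and repeat. The paper's actual SCADM formulation (including how pairwise constraints enter) is not reproduced here; the sketch below only illustrates the general alternate-then-project idea, using plain k-means and a Fisher-style (LDA-like) projection as stand-ins. All function names and parameters are our own.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    # Plain Lloyd's k-means on the (projected) data.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        dist = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = dist.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

def lda_projection(X, labels, dim):
    # Fisher-style projection: maximize between-cluster scatter
    # relative to within-cluster scatter, given current labels.
    mu = X.mean(0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for j in np.unique(labels):
        Xj = X[labels == j]
        mj = Xj.mean(0)
        Sw += (Xj - mj).T @ (Xj - mj)
        diff = (mj - mu)[:, None]
        Sb += len(Xj) * (diff @ diff.T)
    # Regularize Sw for stability, then solve Sw^{-1} Sb eigenproblem.
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-vals.real)
    return vecs.real[:, order[:dim]]

def adaptive_subspace_cluster(X, k, dim, rounds=5):
    # Alternate clustering and subspace learning until labels settle.
    W = np.eye(X.shape[1])[:, :dim]   # start from the first `dim` coordinates
    labels = None
    for _ in range(rounds):
        new = kmeans(X @ W, k)
        if labels is not None and (new == labels).all():
            break
        labels = new
        W = lda_projection(X, labels, dim)
    return labels, W
```

The key design point mirrored from the abstract is that the projection is re-learned from the current clustering at every round, so subspace selection and clustering inform each other instead of being run as two independent stages.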

Keywords

semi-supervised clustering / pairwise constraint / distance metric learning / data mining

Cite this article

Xuesong YIN, Enliang HU. Distance metric learning guided adaptive subspace semi-supervised clustering. Front. Comput. Sci., 2011, 5(1): 100-108. DOI: 10.1007/s11704-010-0376-9



RIGHTS & PERMISSIONS

Higher Education Press and Springer-Verlag Berlin Heidelberg
