Distance metric learning guided adaptive subspace semi-supervised clustering

Xuesong YIN, Enliang HU

Front. Comput. Sci. ›› 2011, Vol. 5 ›› Issue (1) : 100-108. DOI: 10.1007/s11704-010-0376-9
RESEARCH ARTICLE



Abstract

Most existing semi-supervised clustering algorithms are not designed for handling high-dimensional data. On the other hand, semi-supervised dimensionality reduction methods may not necessarily improve clustering performance, because the inherent relationship between subspace selection and clustering is ignored. To mitigate these problems, we present a semi-supervised clustering algorithm using adaptive distance metric learning (SCADM), which performs semi-supervised clustering and distance metric learning simultaneously. SCADM applies the clustering results to learn a distance metric and then projects the data onto a low-dimensional space where the separability of the data is maximized. Experimental results on real-world data sets show that the proposed method can effectively deal with high-dimensional data and achieves appealing clustering performance.
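The abstract describes an alternating scheme: cluster in the current subspace, use the resulting partition to learn a distance metric (projection), re-project the data, and repeat. The following Python sketch illustrates one plausible reading of that loop, not the authors' exact formulation; the function name scadm_sketch, the LDA-style scatter-based metric update, and the omission of explicit pairwise constraints are all simplifying assumptions for illustration.

```python
# Hypothetical sketch of the alternating scheme described in the abstract.
# Names and the metric-update rule are illustrative, not the paper's method.
import numpy as np
from sklearn.cluster import KMeans

def scadm_sketch(X, n_clusters, n_dims, n_iters=10, seed=0):
    """Alternate clustering in a subspace with learning of that subspace."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Random orthonormal projection as a starting point.
    W = np.linalg.qr(rng.standard_normal((d, n_dims)))[0]

    for _ in range(n_iters):
        Z = X @ W                                   # project to current subspace
        labels = KMeans(n_clusters, n_init=10,
                        random_state=seed).fit_predict(Z)

        # Use the current partition as weak supervision: maximize
        # between-cluster scatter against within-cluster scatter
        # (an LDA-like stand-in for the metric-learning step).
        mean = X.mean(axis=0)
        Sw = np.zeros((d, d))
        Sb = np.zeros((d, d))
        for k in range(n_clusters):
            Xk = X[labels == k]
            mk = Xk.mean(axis=0)
            Sw += (Xk - mk).T @ (Xk - mk)
            Sb += len(Xk) * np.outer(mk - mean, mk - mean)

        # Leading eigenvectors of pinv(Sw) @ Sb define the new projection,
        # i.e., the Mahalanobis-style metric M = W W^T.
        evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
        order = np.argsort(-evals.real)[:n_dims]
        W = evecs[:, order].real

    return labels, W
```

The point of the sketch is the coupling the abstract emphasizes: the clustering result drives the metric/subspace update, and the updated subspace in turn reshapes the next clustering step, rather than treating dimensionality reduction as a fixed preprocessing stage.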

Keywords

semi-supervised clustering / pairwise constraint / distance metric learning / data mining

Cite this article

Xuesong YIN, Enliang HU. Distance metric learning guided adaptive subspace semi-supervised clustering. Front Comput Sci Chin, 2011, 5(1): 100‒108 https://doi.org/10.1007/s11704-010-0376-9


Acknowledgements

We would like to thank the anonymous reviewers for their valuable comments and suggestions, which significantly improved the quality of this paper. We would also like to thank Professor Songcan Chen for useful discussions. This work was supported by the National Natural Science Foundation of China (Grant No. 60773061), the Foundation for University Young Teachers of Zhejiang Province (2008), and the Foundation of the Zhejiang Educational Committee (Y200803408).

RIGHTS & PERMISSIONS

© 2014 Higher Education Press and Springer-Verlag Berlin Heidelberg