Co-metric: a metric learning algorithm for data with multiple views

Qiang QIAN, Songcan CHEN

Front. Comput. Sci. ›› 2013, Vol. 7 ›› Issue (3) : 359-369. DOI: 10.1007/s11704-013-2110-x
RESEARCH ARTICLE


Abstract

We address the problem of metric learning for multi-view data. Many metric learning algorithms have been proposed, but most of them consider only the single-view setting, and only a few deal with multi-view data. In this paper, motivated by the co-training framework, we propose an algorithm-independent framework, named co-metric, to learn Mahalanobis metrics in multi-view settings. In its implementation, an off-the-shelf single-view metric learning algorithm is used to learn a metric in each individual view from a few labeled examples. Then the most confidently labeled examples chosen from the unlabeled set are used to guide the metric learning in the next loop. This procedure is repeated until some stop criteria are met. The framework can accommodate most existing metric learning algorithms, whether they use side-information or example labels. In addition, it can naturally handle semi-supervised settings with more than two views. Our comparative experiments demonstrate its competitiveness and effectiveness.
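The loop described above can be made concrete with a short sketch. The Python/NumPy code below is only an illustration under our own assumptions: the names (co_metric, base_metric_learner, knn_predict), the stand-in base learner (inverse of the pooled within-class covariance), and the k-NN vote-margin confidence are not taken from the paper, whose framework plugs in any off-the-shelf single-view metric learner together with that learner's own confidence measure.

```python
# Hedged sketch of the co-metric loop: per-view metric learning on a small labeled pool,
# followed by moving the most confidently labeled unlabeled examples into the labeled pool.
import numpy as np

def base_metric_learner(X, y):
    """Stand-in single-view learner: Mahalanobis matrix = inverse pooled within-class covariance."""
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c] - X[y == c].mean(axis=0)
        Sw += Xc.T @ Xc
    return np.linalg.pinv(Sw / max(len(X) - 1, 1) + 1e-6 * np.eye(d))

def knn_predict(M, X_train, y_train, X_test, k=3):
    """k-NN under Mahalanobis metric M; returns predicted labels and a vote-margin confidence."""
    diff = X_test[:, None, :] - X_train[None, :, :]
    dist = np.einsum('ijd,de,ije->ij', diff, M, diff)      # squared Mahalanobis distances
    idx = np.argsort(dist, axis=1)[:, :k]
    labels, conf = [], []
    for row in idx:
        vals, counts = np.unique(y_train[row], return_counts=True)
        labels.append(vals[counts.argmax()])
        conf.append(counts.max() / k)                      # fraction of agreeing neighbours
    return np.array(labels), np.array(conf)

def co_metric(views_L, y_L, views_U, n_rounds=5, n_add=2):
    """views_L / views_U: one (n, d_v) array per view; y_L: labels of the labeled pool."""
    y_L = y_L.copy()
    views_L = [V.copy() for V in views_L]
    views_U = [V.copy() for V in views_U]
    for _ in range(n_rounds):
        if len(views_U[0]) == 0:
            break
        metrics = [base_metric_learner(V, y_L) for V in views_L]
        # Each view labels the unlabeled pool; the most confident view nominates examples.
        preds, confs = zip(*[knn_predict(M, V, y_L, U)
                             for M, V, U in zip(metrics, views_L, views_U)])
        best_view = int(np.argmax([c.mean() for c in confs]))
        pick = np.argsort(-confs[best_view])[:n_add]
        # Move the picked examples, with predicted labels, into every view's labeled set.
        y_L = np.concatenate([y_L, preds[best_view][pick]])
        for v in range(len(views_L)):
            views_L[v] = np.vstack([views_L[v], views_U[v][pick]])
            views_U[v] = np.delete(views_U[v], pick, axis=0)
    return [base_metric_learner(V, y_L) for V in views_L]
```

In practice the stand-in learner would be replaced by a real single-view algorithm such as ITML or LMNN, and the vote-margin confidence by whatever criterion that learner supports, which is what makes the framework algorithm-independent.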

Keywords

multi-view learning / metric learning / algorithm-independent framework

Cite this article

Qiang QIAN, Songcan CHEN. Co-metric: a metric learning algorithm for data with multiple views. Front Comput Sci, 2013, 7(3): 359‒369. https://doi.org/10.1007/s11704-013-2110-x


RIGHTS & PERMISSIONS

2014 Higher Education Press and Springer-Verlag Berlin Heidelberg