Tangent space learning and generalization
Received date: 14 Jul 2010
Accepted date: 07 Oct 2010
Published date: 05 Mar 2011
Manifold learning has attracted considerable attention over the last decade, and exploring the geometry and topology of the manifold is its central problem. The tangent space is a fundamental tool for discovering the geometry of the manifold. In this paper, we first review canonical manifold learning techniques and then discuss two fundamental problems in tangent space learning: how to estimate the tangent space from random samples, and how to generalize the tangent space to the ambient space. Previous studies in tangent space learning have mainly focused on how to fit the tangent space, and one has to solve a global equation to obtain the tangent spaces. Unlike these approaches, we introduce a novel method, called persistent tangent space learning (PTSL), which estimates the tangent space at each local neighborhood while ensuring that the tangent spaces vary smoothly on the manifold. A tangent space can be viewed as a point on the Grassmann manifold. Inspired by statistics on the Grassmann manifold, we use the intrinsic sample total variance to measure the variation of the estimated tangent spaces at a single point; the generalization problem can thus be solved by estimating the intrinsic sample mean on the Grassmann manifold. We validate our methods with various experimental results on both synthetic and real data.
Key words: tangent space learning; machine learning; manifold learning
Xiaofei HE, Binbin LIN. Tangent space learning and generalization[J]. Frontiers of Electrical and Electronic Engineering, 2011, 6(1): 27-42. DOI: 10.1007/s11460-011-0124-4
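The abstract describes two computational ingredients: estimating a tangent space from the samples in each local neighborhood, and combining several tangent-space estimates at a point by averaging them on the Grassmann manifold. The sketch below illustrates these two steps under stated assumptions; it is not the authors' PTSL algorithm, the function names are ours, and it replaces the intrinsic (Karcher) mean with a common extrinsic surrogate that averages projection matrices and keeps the dominant eigenvectors.

```python
import numpy as np

def local_tangent_space(X, idx, k, d):
    """Estimate the d-dimensional tangent space at sample X[idx]
    from its k nearest neighbors via local PCA (SVD of the
    centered neighborhood). Requires d <= min(D, k + 1)."""
    dists = np.linalg.norm(X - X[idx], axis=1)
    nbrs = np.argsort(dists)[:k + 1]           # include the point itself
    Z = X[nbrs] - X[nbrs].mean(axis=0)         # center the neighborhood
    # Leading left singular vectors span the estimated tangent space.
    U, _, _ = np.linalg.svd(Z.T, full_matrices=False)
    return U[:, :d]                            # D x d orthonormal basis

def extrinsic_grassmann_mean(bases, d):
    """Average a set of d-dimensional subspaces of R^D, each given as a
    D x d orthonormal basis, by averaging their projection matrices and
    taking the top-d eigenvectors (an extrinsic surrogate for the
    intrinsic sample mean on the Grassmann manifold)."""
    D = bases[0].shape[0]
    P = np.zeros((D, D))
    for U in bases:
        P += U @ U.T                           # projection onto span(U)
    P /= len(bases)
    vals, vecs = np.linalg.eigh(P)             # eigenvalues in ascending order
    return vecs[:, np.argsort(vals)[::-1][:d]] # D x d mean basis
```

Here X is an n x D data matrix; local_tangent_space fits a d-dimensional basis at sample idx from its k nearest neighbors, and extrinsic_grassmann_mean combines several such estimates (e.g., from overlapping neighborhoods of the same point) into a single representative subspace.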