Advances in adaptive nonlinear manifolds and dimensionality reduction

Hujun YIN

Front. Electr. Electron. Eng., 2011, 6(1): 72-85. DOI: 10.1007/s11460-011-0131-5

RESEARCH ARTICLE

Abstract

Recent decades have witnessed a greatly increased demand for advanced, effective and efficient methods and tools for analyzing, understanding and dealing with data of increasing complexity, dimensionality and volume. Whether in biology, neuroscience, modern medicine and the social sciences, or in engineering and computer vision, data are being sampled, collected and accumulated at an unprecedented speed. It is no longer a trivial task to analyze huge amounts of high-dimensional data, and a systematic, automated way of interpreting and representing such data has become a great challenge facing almost all fields; research in this emerging area has flourished. Several lines of research have embarked on this timely challenge, and tremendous progress has been made recently, with traditional and linear methods being extended or enhanced to meet the new demands. This paper elaborates on these recent advances and discusses various state-of-the-art algorithms arising from statistics, geometry and adaptive neural networks. The developments mainly follow three lines: multidimensional scaling, eigen-decomposition and principal manifolds. Neural approaches and adaptive or incremental methods are also reviewed. In the first line, traditional multidimensional scaling (MDS) has been extended not only to more adaptive, online-learning forms such as NeuroScale, curvilinear component analysis (CCA) and the visualization induced self-organizing map (ViSOM), but also to local-scaling variants such as Isomap, which offer enhanced flexibility for nonlinear data sets. The second line extends linear principal component analysis (PCA); it has attracted a huge amount of interest and enjoyed flourishing advances with methods such as kernel PCA (KPCA), locally linear embedding (LLE) and the Laplacian eigenmap. The advantage is obvious: a nonlinear problem is transformed into a linear one, for which a unique solution can then be sought. The third line starts with nonlinear principal curves and surfaces and links up with adaptive neural network approaches such as the self-organizing map (SOM) and ViSOM. Many of these frameworks have been further improved and enhanced for incremental learning and mapping-function generalization. This paper discusses these recent advances and their connections; their application and implementation issues are also briefly commented on.
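To make the first two lines above concrete, the following is a minimal illustrative sketch, not taken from the paper: it runs metric MDS and its geodesic variant Isomap alongside the eigen-decomposition methods kernel PCA, LLE and Laplacian eigenmap (spectral embedding) on a synthetic "swiss roll" manifold. It assumes the scikit-learn library; the estimator names and parameters are scikit-learn's, and the neighborhood sizes are arbitrary choices for the example.

# Illustrative sketch only (not from the paper): representatives of the first two
# lines surveyed, applied to a toy 3-D "swiss roll" manifold. Requires scikit-learn.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.decomposition import KernelPCA
from sklearn.manifold import MDS, Isomap, LocallyLinearEmbedding, SpectralEmbedding

X, color = make_swiss_roll(n_samples=1000, random_state=0)

embeddings = {
    # Line 1: multidimensional scaling and its geodesic, locally scaled variant Isomap
    "MDS": MDS(n_components=2, random_state=0).fit_transform(X),
    "Isomap": Isomap(n_neighbors=10, n_components=2).fit_transform(X),
    # Line 2: eigen-decomposition methods -- kernel PCA, LLE, Laplacian eigenmap
    "KPCA": KernelPCA(n_components=2, kernel="rbf", gamma=0.01).fit_transform(X),
    "LLE": LocallyLinearEmbedding(n_neighbors=10, n_components=2).fit_transform(X),
    "Laplacian eigenmap": SpectralEmbedding(n_components=2, n_neighbors=10).fit_transform(X),
}

for name, Y in embeddings.items():
    print(f"{name:20s} -> embedding shape {Y.shape}")

Each estimator returns a two-dimensional embedding of the 3-D data; plotting the embeddings colored by the roll parameter typically shows the neighborhood-based methods unrolling the manifold, whereas plain metric MDS, which preserves Euclidean rather than geodesic distances, does not.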
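The third line (principal manifolds and adaptive neural approaches such as the SOM and ViSOM) is not covered by scikit-learn. As a hedged illustration of the basic mechanism only, here is a self-contained online SOM in plain NumPy with a Gaussian neighborhood on a rectangular grid; the function and parameter names (train_som, grid, lr0, sigma0) are ad hoc choices for this sketch, not from the paper.

# Minimal illustrative sketch of an online self-organizing map (SOM), the starting
# point of the adaptive principal-manifold line discussed above. NumPy only.
import numpy as np

def train_som(X, grid=(10, 10), n_iter=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Fit a rectangular SOM to data X of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    # Grid coordinates of each map node, shape (rows*cols, 2)
    coords = np.array([(i, j) for i in range(rows) for j in range(cols)], dtype=float)
    # Initialize codebook vectors from randomly drawn data points
    W = X[rng.integers(0, len(X), rows * cols)].astype(float).copy()

    for t in range(n_iter):
        x = X[rng.integers(0, len(X))]                      # random sample
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))         # best-matching unit
        lr = lr0 * np.exp(-t / n_iter)                      # decaying learning rate
        sigma = sigma0 * np.exp(-t / n_iter)                # shrinking neighborhood width
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)      # squared grid distance to BMU
        h = np.exp(-d2 / (2 * sigma ** 2))                  # Gaussian neighborhood function
        W += lr * h[:, None] * (x - W)                      # SOM weight update
    return W, coords

if __name__ == "__main__":
    from sklearn.datasets import make_swiss_roll
    X, _ = make_swiss_roll(n_samples=1000, random_state=0)
    W, coords = train_som(X)
    # Project each sample to the grid coordinates of its best-matching unit
    bmus = np.argmin(((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2), axis=1)
    print("2-D SOM coordinates of first 5 samples:\n", coords[bmus[:5]])

The codebook vectors W approximate a manifold through the data, and each sample's best-matching unit gives its discrete low-dimensional projection; ViSOM refines this scheme by regularizing inter-neuron distances so that the resulting map preserves distances and thus serves as a smooth, metric-preserving principal manifold.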

Keywords

dimensionality reduction / multidimensional scaling / nonlinear principal component analysis (PCA) / principal manifold / neural networks / self-organizing maps (SOM) / biologically inspired models / data projection / embedding and visualization

Cite this article

Hujun YIN. Advances in adaptive nonlinear manifolds and dimensionality reduction. Front. Electr. Electron. Eng., 2011, 6(1): 72-85. DOI: 10.1007/s11460-011-0131-5

RIGHTS & PERMISSIONS

Higher Education Press and Springer-Verlag Berlin Heidelberg
