%A Hujun Yin
%T Advances in adaptive nonlinear manifolds and dimensionality reduction
%0 Journal Article
%D 2011
%J Frontiers of Electrical and Electronic Engineering
%@ 2095-2732
%R 10.1007/s11460-011-0131-5
%P 72-85
%V 6
%N 1
%U https://journal.hep.com.cn/fee/EN/10.1007/s11460-011-0131-5
%8 2011-03-05
%X

Recent decades have witnessed a greatly increased demand for advanced, effective and efficient methods and tools for analyzing, understanding and dealing with data of increasing complexity, dimensionality and volume. Whether in biology, neuroscience, modern medicine and the social sciences or in engineering and computer vision, data are being sampled, collected and accumulated at an unprecedented speed. It is no longer a trivial task to analyze huge amounts of high-dimensional data. A systematic, automated way of interpreting and representing data has become a great challenge facing almost all fields, and research in this emerging area has flourished. Several lines of research have embarked on this timely challenge, and tremendous progress has been made recently. Traditional and linear methods are being extended or enhanced in order to meet the new challenges. This paper elaborates on these recent advances and discusses various state-of-the-art algorithms proposed from statistics, geometry and adaptive neural networks. The developments mainly follow three lines: multidimensional scaling, eigen-decomposition, and principal manifolds. Neural approaches and adaptive or incremental methods are also reviewed. In the first line, traditional multidimensional scaling (MDS) has been extended not only to be more adaptive, as in NeuroScale, curvilinear component analysis (CCA) and the visualization induced self-organizing map (ViSOM) for online learning, but also to use more local scaling, as in Isomap, for enhanced flexibility on nonlinear data sets. The second line extends linear principal component analysis (PCA); it has attracted a huge amount of interest and enjoyed flourishing advances with methods such as kernel PCA (KPCA), locally linear embedding (LLE) and Laplacian eigenmaps. The advantage is obvious: a nonlinear problem is transformed into a linear one, and a unique solution can then be sought. The third line starts with nonlinear principal curves and surfaces and links up with adaptive neural network approaches such as the self-organizing map (SOM) and ViSOM. Many of these frameworks have been further improved and enhanced for incremental learning and generalization of the mapping function. This paper discusses these recent advances and their connections. Their application and implementation issues are also briefly highlighted and commented on.
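As a minimal sketch of the first two lines surveyed in the abstract (not part of the paper itself), the snippet below applies metric MDS, its geodesic extension Isomap, and the eigen-decomposition methods LLE, Laplacian eigenmaps and KPCA to a toy Swiss-roll data set, assuming the scikit-learn implementations of these algorithms; all parameter choices are illustrative.

```python
# Illustrative comparison of the reviewed embedding families on a toy
# Swiss roll: a 2-D nonlinear manifold embedded in R^3.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import MDS, Isomap, LocallyLinearEmbedding, SpectralEmbedding
from sklearn.decomposition import KernelPCA

X, color = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

embeddings = {
    # Line 1: metric MDS preserves pairwise Euclidean distances.
    "MDS": MDS(n_components=2, random_state=0),
    # Line 1 (local-scaling extension): Isomap replaces Euclidean with
    # geodesic distances estimated on a k-nearest-neighbour graph.
    "Isomap": Isomap(n_neighbors=10, n_components=2),
    # Line 2: LLE and Laplacian eigenmaps reduce the nonlinear problem to
    # a sparse linear eigen-decomposition with a unique solution.
    "LLE": LocallyLinearEmbedding(n_neighbors=10, n_components=2),
    "Laplacian eigenmap": SpectralEmbedding(n_neighbors=10, n_components=2),
    # Line 2: KPCA performs linear PCA in an implicit kernel feature space.
    "KPCA": KernelPCA(n_components=2, kernel="rbf", gamma=0.01),
}

for name, model in embeddings.items():
    Y = model.fit_transform(X)
    print(f"{name:20s} -> embedding shape {Y.shape}")
```

Plotting each `Y` coloured by the roll coordinate `color` makes the qualitative difference visible: the geodesic and spectral methods unroll the manifold, whereas plain MDS, preserving Euclidean distances, does not.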
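For the third line, the adaptive principal-manifold idea behind SOM and ViSOM can be illustrated with a bare-bones NumPy SOM; this is a generic sketch under assumed settings (grid size, decay schedules), not the paper's ViSOM algorithm. Prototypes on a grid are pulled toward the data while Gaussian neighbourhood coupling keeps the grid topologically ordered.

```python
# Minimal SOM: grid-coupled prototypes adapt online to sampled data points.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 2))            # toy 2-D data
rows, cols = 10, 10                        # assumed map size
W = rng.uniform(size=(rows * cols, 2))     # prototype (weight) vectors
grid = np.array([(i, j) for i in range(rows) for j in range(cols)], float)

n_iter = 10000
for t in range(n_iter):
    x = X[rng.integers(len(X))]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
    # Exponentially decaying learning rate and neighbourhood radius
    # (assumed schedules, chosen for illustration only).
    lr = 0.5 * np.exp(-t / n_iter)
    sigma = max(rows, cols) / 2 * np.exp(-t / n_iter)
    # Gaussian neighbourhood on the grid, centred at the BMU.
    d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-d2 / (2 * sigma ** 2))
    W += lr * h[:, None] * (x - W)                # adapt all prototypes
```

After training, the prototype grid `W` approximates a principal manifold of the data; ViSOM additionally regularizes inter-prototype distances so that the map preserves metric structure, which this plain SOM sketch does not attempt.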