RESEARCH ARTICLE

An investigation of several typical model selection criteria for detecting the number of signals

  • Shikui TU,
  • Lei XU
  • Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China

Received date: 28 Feb 2011

Accepted date: 24 Mar 2011

Published date: 05 Jun 2011

Copyright © 2014 Higher Education Press and Springer-Verlag Berlin Heidelberg

Abstract

For the problem of detecting the number of signals, this paper provides a systematic empirical investigation of the model selection performance of several classical criteria and recently developed methods, including Akaike’s information criterion (AIC), Schwarz’s Bayesian information criterion, Bozdogan’s consistent AIC, the Hannan-Quinn information criterion, Minka’s (MK) principal component analysis (PCA) criterion, Kritchman & Nadler’s hypothesis tests (KN), Perry & Wolfe’s minimax rank estimation thresholding algorithm (MM), and Bayesian Ying-Yang (BYY) harmony learning, under varying signal-to-noise ratio (SNR) and training sample size N. A family of model selection indifference curves is defined by the contour lines of model selection accuracy, so that the joint effect of N and SNR can be examined, rather than merely the effect of SNR or N with the other fixed, as is usually done in the literature. The indifference curves visually reveal that all methods show their relative advantages most clearly within a region of moderate N and SNR. The importance of studying this region is further confirmed by an alternative reference criterion that maximizes the testing likelihood. Extensive simulations show that AIC and BYY harmony learning, as well as MK, KN, and MM, are relatively more robust than the others against decreasing N and SNR, and that BYY is superior for small sample sizes.

Cite this article

Shikui TU, Lei XU. An investigation of several typical model selection criteria for detecting the number of signals[J]. Frontiers of Electrical and Electronic Engineering, 2011, 6(2): 245-255. DOI: 10.1007/s11460-011-0146-y

References

1
Wax M, Kailath T. Detection of signals by information theoretic criteria. IEEE Transactions on Acoustics, Speech and Signal Processing, 1985, 33(2): 387-392

2
Schmidt R. Multiple emitter location and signal parameter estimation. IEEE Transactions on Antennas and Propagation, 1986, 34(3): 276-280

3
Tu S, Xu L. Theoretical analysis and comparison of several criteria on linear model dimension reduction. In: Proceedings of the 8th International Conference on Independent Component Analysis and Signal Separation. 2009, 154-162

4
Anderson T, Rubin H. Statistical inference in factor analysis. In: Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability. 1956, 5: 111-150

5
Tipping M E, Bishop C M. Mixtures of probabilistic principal component analyzers. Neural Computation, 1999, 11(2): 443-482

6
Akaike H. A new look at the statistical model identification. IEEE Transactions on Automatic Control, 1974, 19(6): 716-723

7
Minka T P. Automatic choice of dimensionality for PCA. Advances in Neural Information Processing Systems, 2001, 13: 598-604

8
Xu L. Bayesian-Kullback coupled Ying-Yang machines: unified learnings and new results on vector quantization. In: Proceedings of International Conference on Neural Information Processing. 1995, 977-988

9
Xu L. Bayesian Ying Yang learning. Scholarpedia, 2007, 2(3): 1809

10
Baik J, Silverstein J W. Eigenvalues of large sample covariance matrices of spiked population models. Journal of Multivariate Analysis, 2006, 97(6): 1382-1408

11
Johnstone I M. High dimensional statistical inference and random matrices. In: Proceedings of International Congress of Mathematicians. 2006, 1-28

12
Paul D. Asymptotics of sample eigenstructure for a large dimensional spiked covariance model. Statistica Sinica, 2007, 17(4): 1617-1642

13
Kritchman S, Nadler B. Determining the number of components in a factor model from limited noisy data. Chemometrics & Intelligent Laboratory Systems, 2008, 94(1): 19-32

14
Perry P O, Wolfe P J. Minimax rank estimation for subspace tracking. IEEE Journal of Selected Topics in Signal Processing, 2010, 4(3): 504-513

15
Hu X, Xu L. A comparative investigation on subspace dimension determination. Neural Networks, 2004, 17(8-9): 1051-1059

16
Chen P, Wu T J, Yang J. A comparative study of model selection criteria for the number of signals. IET Radar, Sonar and Navigation, 2008, 2(3): 180-188

17
Zhang Q T, Wong K, Yip P, Reilly J. Statistical analysis of the performance of information theoretic criteria in the detection of the number of signals in array processing. IEEE Transactions on Acoustics, Speech and Signal Processing, 1989, 37(10): 1557-1567

18
Xu W, Kaveh M. Analysis of the performance and sensitivity of eigendecomposition-based detectors. IEEE Transactions on Signal Processing, 1995, 43(6): 1413-1426

19
Liavas A, Regalia P. On the behavior of information theoretic criteria for model order selection. IEEE Transactions on Signal Processing, 2001, 49(8): 1689-1695

20
Fishler E, Grosmann M, Messer H. Detection of signals by information theoretic criteria: general asymptotic performance analysis. IEEE Transactions on Signal Processing, 2002, 50(5): 1027-1036

21
Fishler E, Poor H. Estimation of the number of sources in unbalanced arrays via information theoretic criteria. IEEE Transactions on Signal Processing, 2005, 53(9): 3543-3553

22
Nadakuditi R, Edelman A. Sample eigenvalue based detection of high-dimensional signals in white noise using relatively few samples. IEEE Transactions on Signal Processing, 2008, 56(7): 2625-2638

23
Rissanen J. Modelling by the shortest data description. Automatica, 1978, 14(5): 465-471

24
Hoyle D C. Automatic PCA dimension selection for high dimensional data and small sample sizes. Journal of Machine Learning Research, 2008, 9(12): 2733-2759

25
Bishop C M. Variational principal components. In: Proceedings of the Ninth International Conference on Artificial Neural Networks. 1999, 1: 509-514

26
Schwarz G. Estimating the dimension of a model. Annals of Statistics, 1978, 6(2): 461-464

27
Bozdogan H. Model selection and Akaike’s Information Criterion (AIC): the general theory and its analytical extensions. Psychometrika, 1987, 52(3): 345-370

28
Hannan E, Quinn B. The determination of the order of an autoregression. Journal of the Royal Statistical Society, Series B, 1979, 41(2): 190-195

29
Johnstone I M. On the distribution of the largest eigenvalue in principal components analysis. Annals of Statistics, 2001, 29(2): 295-327

30
Xu L. Bayesian Ying-Yang system, best harmony learning, and five action circling. Frontiers of Electrical and Electronic Engineering in China, 2010, 5(3): 281-328

31
Tu S, Xu L. Parameterizations make different model selections: empirical findings from factor analysis. Frontiers of Electrical and Electronic Engineering in China (in press)

32
Xu L. Codimensional matrix pairing perspective of BYY harmony learning: hierarchy of bilinear systems, joint decomposition of data-covariance, and applications of network biology. Frontiers of Electrical and Electronic Engineering in China, 2011, 6(1): 86-119
