SRMD: Sparse Random Mode Decomposition
Nicholas Richardson, Hayden Schaeffer, Giang Tran
Signal decomposition and multiscale signal analysis provide many useful tools for time-frequency analysis. We propose a random feature method for analyzing time-series data by constructing a sparse approximation to the spectrogram. The randomization is in both the time window locations and the frequency sampling, which lowers the overall sampling and computational cost. The sparsification of the spectrogram leads to a sharp separation between time-frequency clusters, which makes it easier to identify intrinsic modes and thus yields a new data-driven mode decomposition. Applications include signal representation, outlier removal, and mode decomposition. On benchmark tests, we show that our approach outperforms other state-of-the-art decomposition methods.
Keywords: Sparse random features / Signal decomposition / Short-time Fourier transform
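To make the pipeline in the abstract concrete, here is a minimal sketch of the idea: build a dictionary of windowed sinusoids with randomized time shifts and frequencies, fit a sparse linear model to the signal, then cluster the surviving features in the time-frequency plane to recover modes. The Gaussian-windowed cosine features, the scikit-learn Lasso solver, the DBSCAN clustering step, and all parameter values below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# Synthetic two-mode signal: a 2 Hz tone plus a weaker 5 Hz tone.
t = np.linspace(0, 1, 512)
y = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 5 * t)

# Random time-frequency features: Gaussian windows at random centers tau
# with random frequencies omega and phases (Gabor-like atoms).
N = 2000                       # number of random features (assumed)
tau = rng.uniform(0, 1, N)     # random window locations
omega = rng.uniform(0, 10, N)  # random frequencies (Hz)
phase = rng.uniform(0, 2 * np.pi, N)
sigma = 0.1                    # window width (assumed tuning parameter)
Phi = np.exp(-((t[:, None] - tau) ** 2) / (2 * sigma**2)) \
      * np.cos(2 * np.pi * omega * t[:, None] + phase)

# Sparse regression selects the few features that explain the signal,
# giving a sparse surrogate of the spectrogram.
lasso = Lasso(alpha=1e-3, fit_intercept=False, max_iter=10000).fit(Phi, y)
active = np.flatnonzero(lasso.coef_)

# Cluster the surviving features in the (tau, omega) plane; summing the
# contribution of each cluster reconstructs one intrinsic mode.
labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(
    np.column_stack([tau[active], omega[active]]))
modes = [Phi[:, active[labels == k]] @ lasso.coef_[active][labels == k]
         for k in np.unique(labels) if k != -1]
print(f"{len(active)} active features grouped into {len(modes)} modes")
```

Because the time shifts and frequencies are drawn at random rather than laid on a fixed grid, the dictionary can be much smaller than a full short-time Fourier frame, which is the source of the sampling and computational savings the abstract describes.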