SRMD: Sparse Random Mode Decomposition

Nicholas Richardson, Hayden Schaeffer, Giang Tran

Communications on Applied Mathematics and Computation ›› 2023, Vol. 6 ›› Issue (2) : 879-906. DOI: 10.1007/s42967-023-00273-x
Original Paper


Abstract

Signal decomposition and multiscale signal analysis provide many useful tools for time-frequency analysis. We propose a random feature method for analyzing time-series data by constructing a sparse approximation to the spectrogram. The randomization is in both the time window locations and the frequency sampling, which lowers the overall sampling and computational cost. The sparsification of the spectrogram leads to a sharp separation between time-frequency clusters, which makes it easier to identify intrinsic modes and thus leads to a new data-driven mode decomposition. Applications include signal representation, outlier removal, and mode decomposition. On benchmark tests, we show that our approach outperforms other state-of-the-art decomposition methods.
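The idea described in the abstract can be sketched in a few lines: build a dictionary of windowed cosines with randomly drawn time-window centers and frequencies, fit the signal with a sparse (LASSO) regression so that only features lying near the signal's time-frequency ridges survive, and group the surviving features into modes. This is a minimal illustrative sketch, not the paper's implementation: the Gaussian window width `sigma`, the feature count, the regularization strength, and the crude frequency-threshold clustering are all assumed values chosen for a toy two-tone signal.

```python
# Minimal sketch of the sparse-random-feature idea (illustrative assumptions:
# Gaussian windows, sklearn Lasso, a simple frequency split for "clustering").
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512)
# Toy signal: two well-separated tones standing in for intrinsic modes.
y = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 80 * t)

K = 4000                          # number of random features (assumed)
taus = rng.uniform(0.0, 1.0, K)   # random time-window centers
freqs = rng.uniform(0.0, 128.0, K)  # random frequencies
phases = rng.uniform(0.0, 2 * np.pi, K)
sigma = 0.05                      # Gaussian window width (assumed)

# Feature matrix: Gaussian-windowed cosines (a randomized Gabor-like dictionary).
A = np.exp(-((t[:, None] - taus[None, :]) ** 2) / (2 * sigma**2)) \
    * np.cos(2 * np.pi * freqs[None, :] * t[:, None] + phases[None, :])

# Sparse fit: most coefficients vanish, leaving a sparse time-frequency picture.
model = Lasso(alpha=1e-3, max_iter=10000).fit(A, y)
active = np.flatnonzero(model.coef_)

# Surviving features cluster in (tau, freq); here a crude threshold at 50 Hz
# separates the two modes, which are then resynthesized from their features.
low = active[freqs[active] < 50]
high = active[freqs[active] >= 50]
mode1 = A[:, low] @ model.coef_[low]
mode2 = A[:, high] @ model.coef_[high]
print(len(active), "active features out of", K)
```

In practice the clustering step would operate on the 2-D point cloud of active `(tau, freq)` pairs rather than a fixed threshold, since real signals have time-varying instantaneous frequencies.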

Keywords

Sparse random features / Signal decomposition / Short-time Fourier transform

Cite this article

Nicholas Richardson, Hayden Schaeffer, Giang Tran. SRMD: Sparse Random Mode Decomposition. Communications on Applied Mathematics and Computation, 2023, 6(2): 879‒906 https://doi.org/10.1007/s42967-023-00273-x

Funding
Natural Sciences and Engineering Research Council of Canada (RGPIN 50503-10842); Air Force Office of Scientific Research (MURI FA9550-21-1-0084); Directorate for Mathematical and Physical Sciences (1752116); Natural Sciences and Engineering Research Council of Canada
