The Coefficient Estimation of Tensor Autoregression Based on TR Decomposition

Yu-Hang Li, Ju-Li Zhang

Communications on Applied Mathematics and Computation ›› 2025, Vol. 7 ›› Issue (5): 1940-1958. DOI: 10.1007/s42967-024-00424-8

Original Paper

Abstract

With the advent of tensor-valued time series data, tensor autoregression arises in many fields, where coefficient estimation faces the curse of dimensionality. Based on the tensor ring (TR) decomposition, this paper proposes a first-order autoregression model for tensor-valued responses. A randomized method, TensorSketch, is applied to the TR autoregression model to estimate the coefficient tensor. Convergence results and further properties of the proposed methods are established. Finally, numerical experiments on synthetic and real data illustrate the effectiveness of the proposed method.
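The abstract names three ingredients (the TR format for the coefficient tensor, alternating least squares, and TensorSketch) without giving code. The following NumPy sketch is an illustrative reconstruction under assumptions, not the authors' implementation: tr_entry shows how an entry is read out of TR cores, and count_sketch_map applies a plain CountSketch, the building block that TensorSketch composes for Khatri-Rao structured matrices, to compress a least-squares problem of the kind ALS produces. The autoregression is demonstrated on the simplest case, a vectorized first-order model; all sizes and names are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    # Tensor ring (TR) format: a d-way tensor X of shape (I_1, ..., I_d) is
    # stored as cores G_k of shape (R_k, I_k, R_{k+1}) with R_{d+1} = R_1, and
    # X[i_1, ..., i_d] = trace(G_1[:, i_1, :] @ ... @ G_d[:, i_d, :]).
    def tr_entry(cores, idx):
        prod = cores[0][:, idx[0], :]
        for G, i in zip(cores[1:], idx[1:]):
            prod = prod @ G[:, i, :]
        return np.trace(prod)

    def count_sketch_map(n, m):
        """Return a function applying one random CountSketch S (m x n) to rows."""
        buckets = rng.integers(0, m, size=n)      # hash each row to a bucket
        signs = rng.choice([-1.0, 1.0], size=n)   # random sign per row
        def apply(M):
            SM = np.zeros((m, M.shape[1]))
            np.add.at(SM, buckets, signs[:, None] * M)
            return SM
        return apply

    # Sketch-and-solve for a first-order autoregression y_t = A y_{t-1} + e_t:
    # estimate A from min_B || S (X B - Z) ||_F with lagged design X.
    p, T, m = 30, 2000, 400
    A_true = 0.5 * np.eye(p) + (0.1 / np.sqrt(p)) * rng.standard_normal((p, p))
    Y = np.zeros((T, p))
    for t in range(1, T):
        Y[t] = A_true @ Y[t - 1] + 0.1 * rng.standard_normal(p)

    X, Z = Y[:-1], Y[1:]                          # lagged design and response
    S = count_sketch_map(T - 1, m)
    A_hat = np.linalg.lstsq(S(X), S(Z), rcond=None)[0].T
    print("relative error:",
          np.linalg.norm(A_hat - A_true) / np.linalg.norm(A_true))

Sketching shrinks the T - 1 rows of the subproblem to m rows before solving, which is where the speedup comes from when the sample size, or the Khatri-Rao dimension inside TR-ALS, is large.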

Keywords

Tensor autoregression / Tensor ring (TR) decomposition / TensorSketch / Alternating least squares method / Randomized algorithm

Mathematics Subject Classification

65F10 / 65N22

Cite this article

Yu-Hang Li, Ju-Li Zhang. The Coefficient Estimation of Tensor Autoregression Based on TR Decomposition. Communications on Applied Mathematics and Computation, 2025, 7(5): 1940-1958. DOI: 10.1007/s42967-024-00424-8



RIGHTS & PERMISSIONS

Shanghai University
