Deep learning for time series forecasting: a survey of recent advances

Kaiyuan LIAO, Xiwei XUAN, Kwan-Liu MA

Front. Comput. Sci. ›› 2026, Vol. 20 ›› Issue (11): 2011359 DOI: 10.1007/s11704-025-50947-3

Artificial Intelligence
REVIEW ARTICLE


Abstract

Time series forecasting plays a critical role in numerous real-world applications, such as finance, healthcare, transportation, and scientific computing. In recent years, deep learning has become a powerful tool for modeling complex temporal patterns and improving forecasting accuracy. This survey provides an overview of recent deep learning approaches for time series forecasting, covering architectures including RNNs, CNNs, GNNs, transformers, large language models, MLP-based models, and diffusion models. We first identify key challenges in the field, such as temporal dependency, efficiency, and cross-variable dependency, which drive the development of forecasting techniques. Then, the general advantages and limitations of each architecture are discussed to contextualize their adaptation to time series forecasting. Furthermore, we highlight promising design trends, such as multi-scale modeling, decomposition, and frequency-domain techniques, which are shaping the future of the field. This paper serves as a compact reference for researchers and practitioners seeking to understand the current landscape and future trajectory of deep learning in time series forecasting.
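Among the design trends named above, decomposition is the simplest to illustrate: many forecasters first split a series into a smooth trend and a residual seasonal component, then model each part separately. The sketch below shows the common moving-average variant of this idea; the function name, kernel size, and edge-padding choice are illustrative assumptions, not details taken from this survey.

```python
import numpy as np

def decompose(series, kernel_size=25):
    """Split a 1-D series into trend + seasonal parts via a moving average.

    This is the generic decomposition scheme used by many decomposition-based
    forecasters; kernel_size must be odd so the output matches the input length.
    Edges are padded by repeating the boundary values (an illustrative choice).
    """
    pad = kernel_size // 2
    padded = np.concatenate([
        np.repeat(series[0], pad),   # pad the left edge
        series,
        np.repeat(series[-1], pad),  # pad the right edge
    ])
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(padded, kernel, mode="valid")  # length == len(series)
    seasonal = series - trend  # residual after removing the smooth trend
    return trend, seasonal

# Usage: a linear trend plus a period-10 oscillation separates cleanly.
t = np.arange(100, dtype=float)
series = 0.1 * t + np.sin(2 * np.pi * t / 10)
trend, seasonal = decompose(series)
```

A model can then forecast `trend` and `seasonal` with separate (often much simpler) heads and sum the predictions, which is the core of several linear and transformer-based decomposition designs surveyed in the paper.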


Keywords

time series forecasting / deep learning / neural networks / survey / long-term forecasting

Cite this article

Kaiyuan LIAO, Xiwei XUAN, Kwan-Liu MA. Deep learning for time series forecasting: a survey of recent advances. Front. Comput. Sci., 2026, 20(11): 2011359 DOI:10.1007/s11704-025-50947-3



RIGHTS & PERMISSIONS

© The Author(s) 2025. This article is published with open access at link.springer.com and journal.hep.com.cn
