A survey and benchmark evaluation for neural-network-based lossless universal compressors toward multi-source data

Hui SUN , Huidong MA , Feng LING , Haonan XIE , Yongxia SUN , Liping YI , Meng YAN , Cheng ZHONG , Xiaoguang LIU , Gang WANG

Front. Comput. Sci. ›› 2025, Vol. 19 ›› Issue (7) : 197360

Artificial Intelligence
REVIEW ARTICLE


Abstract

As various types of data grow explosively, large-scale data storage, backup, and transmission become challenging, which motivates many researchers to propose efficient universal compression algorithms for multi-source data. In recent years, hardware accelerators such as GPUs, TPUs, DPUs, and FPGAs have removed the performance bottleneck of neural networks (NN), making NN-based compression algorithms increasingly practical and popular. However, no survey of NN-based universal lossless compressors has yet been conducted, and unified evaluation metrics are lacking. To address these problems, this paper presents a holistic survey together with benchmark evaluations. Specifically, i) we thoroughly investigate NN-based lossless universal compression algorithms for multi-source data and classify them into three types: static pre-training, adaptive, and semi-adaptive; ii) we unify 19 evaluation metrics to comprehensively assess the compression effect, resource consumption, and model performance of compressors; iii) we conduct more than 4,600 CPU/GPU hours of experiments to evaluate 17 state-of-the-art compressors on 28 real-world datasets spanning text, images, videos, audio, and other data types; iv) we summarize the strengths and drawbacks of NN-based lossless data compressors and discuss promising research directions. We publish the results as the NN-based Lossless Compressors Benchmark (NNLCB, see fahaihi.github.io/NNLCB website), which will be updated and maintained continuously in the future.
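The common core of the compressors surveyed here is the prediction-plus-entropy-coding loop: a model assigns a probability to each next symbol, and an arithmetic coder spends roughly -log2 p(symbol) bits on it, so a better predictor directly yields a smaller output. As a minimal sketch of this principle (not code from the paper; a count-based order-0 model stands in for the neural predictor), the following computes the ideal code length of a byte stream under a simple adaptive model:

```python
import math

def ideal_compressed_bits(data: bytes) -> float:
    """Ideal code length (in bits) of `data` under an adaptive order-0
    model: each byte is predicted from Laplace-smoothed counts of all
    previously seen bytes. An NN-based compressor replaces this counting
    model with a neural predictor; an arithmetic coder then emits about
    -log2 p(symbol) bits per symbol."""
    counts = [1] * 256   # Laplace smoothing: every byte starts with count 1
    total = 256
    bits = 0.0
    for b in data:
        bits += -math.log2(counts[b] / total)  # cost of coding this byte
        counts[b] += 1                         # adapt the model afterwards
        total += 1
    return bits
```

On repetitive input the adaptive model quickly concentrates probability mass and the ideal cost falls well below the 8 bits per byte of raw storage, whereas on uniformly distributed bytes it cannot drop below 8 — which is why stronger (e.g., neural) predictors are the lever for better lossless compression.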

Keywords

lossless compression / benchmark evaluation / universal compressors / neural networks / deep learning

Cite this article

Hui SUN, Huidong MA, Feng LING, Haonan XIE, Yongxia SUN, Liping YI, Meng YAN, Cheng ZHONG, Xiaoguang LIU, Gang WANG. A survey and benchmark evaluation for neural-network-based lossless universal compressors toward multi-source data. Front. Comput. Sci., 2025, 19(7): 197360 DOI:10.1007/s11704-024-40300-5


RIGHTS & PERMISSIONS

© The Author(s) 2025. This article is published with open access at link.springer.com and journal.hep.com.cn.
