Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation
Jiaqi SHI, Xulong ZHANG, Xiaoyang QU, Junfei XIE, Jianzong WANG
Front. Inform. Technol. Electron. Eng. 2025, Vol. 26, Issue (10): 1793-1808.
Financial large language models (FinLLMs) offer immense potential for financial applications. However, excessive deployment costs and considerable inference latency remain major obstacles. As a prominent model-compression technique, knowledge distillation (KD) offers an effective solution to these difficulties. This work conducts a comprehensive survey of how KD interacts with FinLLMs, covering three core aspects: strategy, application, and evaluation. At the strategy level, this review introduces a structured taxonomy to comparatively analyze existing distillation pathways. At the application level, it puts forward a logical upstream-midstream-downstream framework to systematically explain the practical value of distilled models in the financial field. At the evaluation level, to address the lack of standardized benchmarks in the financial domain, it constructs a comprehensive evaluation framework spanning multiple dimensions, including financial accuracy, reasoning fidelity, and robustness. In summary, this review aims to provide a clear roadmap for this interdisciplinary field and to accelerate the development of distilled FinLLMs.
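As background for the distillation strategies the survey catalogs, the sketch below illustrates the canonical soft-label KD objective (temperature-scaled teacher distribution plus hard-label cross-entropy). It is a minimal illustrative example, not a method from this article; the hyperparameters T and alpha are assumed placeholder values.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft-target term: KL divergence between the temperature-scaled teacher
    # and student distributions, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Weighted combination; alpha balances imitation of the teacher
    # against fitting the labeled data.
    return alpha * soft + (1.0 - alpha) * hard
```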
Financial large language models (FinLLMs) / Knowledge distillation / Model compression / Quantitative trading
Zhejiang University Press