Why not transform chat large language models to non-English?
Xiang GENG, Ming ZHU, Jiahuan LI, Zhejian LAI, Wei ZOU, Shuaijie SHE, Jiaxin GUO, Xiaofeng ZHAO, Yinglu LI, Yuang LI, Chang SU, Yanqing ZHAO, Xinglin LYU, Min ZHANG, Jiajun CHEN, Hao YANG, Shujian HUANG
Front. Comput. Sci., 2026, Vol. 20, Issue (7): 2007356
Higher Education Press