LTDDA: Large Language Model-Enhanced Text Truth Discovery with Dual Attention
Xiu FANG, Zhihong CUI, Guohao SUN, Jinhu LU
Journal of Donghua University (English Edition), 2025, Vol. 42, Issue 6: 699-710.
Existing text truth discovery methods fail to address two challenges: the inherent long-distance dependencies and thematic diversity of long texts, and the subjective sentiment that obscures objective evaluation of source reliability. To address these challenges, a novel truth discovery method named large language model (LLM)-enhanced text truth discovery with dual attention (LTDDA) is proposed. First, LLMs generate embedded representations of text claims and enrich the feature space to tackle long-distance dependencies and thematic diversity. Then, the complex relationship between source reliability and claim credibility is captured by integrating semantic and sentiment features. Finally, dual-layer attention is applied to extract key semantic information and assign consistent weights to similar sources, yielding accurate truth outputs. Extensive experiments on three real-world datasets demonstrate that LTDDA outperforms state-of-the-art methods, providing new insights for building more reliable and accurate text truth discovery systems.
large language model (LLM) / truth discovery / attention mechanism
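The abstract's dual-layer attention pipeline can be sketched at a high level: a first attention layer pools each claim's LLM token embeddings into a claim vector (extracting key semantic information), and a second layer assigns similar reliability weights to sources whose claim vectors agree. The sketch below is a minimal, hypothetical illustration only, not the paper's implementation: the query vector, the agreement-based source weighting, and all array shapes are assumptions, and random arrays stand in for real LLM embeddings.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(42)
n_sources, n_tokens, d = 3, 5, 8

# Stand-in for LLM token embeddings of each source's textual claim.
tokens = rng.normal(size=(n_sources, n_tokens, d))

# Layer 1 (semantic attention): pool each claim's tokens into one vector,
# emphasizing key tokens. In a trained model the query is learned; here
# it is a random stand-in.
q_tok = rng.normal(size=d)
alpha = softmax(tokens @ q_tok, axis=1)               # (n_sources, n_tokens)
claim_vecs = (alpha[..., None] * tokens).sum(axis=1)  # (n_sources, d)

# Layer 2 (source attention): sources whose claim vectors are similar
# (cosine similarity) receive consistent reliability weights.
norms = claim_vecs / np.linalg.norm(claim_vecs, axis=1, keepdims=True)
support = (norms @ norms.T).mean(axis=1)  # each source's average agreement
beta = softmax(support)                   # source weights, summing to 1

# Reliability-weighted aggregation gives the estimated truth representation.
truth_vec = beta @ claim_vecs             # (d,)
```

Sources that agree with the majority receive larger `beta` weights, so the aggregated `truth_vec` is dominated by mutually consistent claims, which is the intuition behind truth discovery's "reliable sources state the truth" principle.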