Continual document-level relation extraction with partial labeling compensation
Xinyi WANG, Zitao WANG, Yuan FENG, Wei HU
Front. Comput. Sci., 2027, Vol. 21, Issue (1): 2101305
Document-level relation extraction (RE) aims to identify the relations between entities across multiple sentences. In practice, new relations constantly emerge in new texts, raising the challenge of continually learning new relations while avoiding forgetting the learned ones. Previous continual RE works have primarily focused on continual learning for sentence-level RE, where each entity pair is associated with a single sentence and annotated with a single relation. However, emerging relations may exist between entity pairs spanning multiple sentences or between entity pairs that already hold relations, necessitating the application of continual learning to document-level RE. To this end, we consider continual document-level RE and propose a novel model named CDRE to alleviate the partial labeling problem that severely degrades the performance of RE models. Specifically, we propose multi-binary knowledge distillation to transfer the knowledge of learned relations from the previously trained model to the current model. We introduce asymmetric training to coordinate the influence of positive samples and samples with learned yet unannotated relations. Furthermore, we exploit the correlation between relations to augment label generation, re-annotating the learned relations in current samples and the newly emerging relations in memorized samples. To simulate real-world scenarios, we construct two benchmark datasets derived from two widely-used document-level RE datasets. Experimental results on these datasets validate the superiority of our model CDRE in coping with continual document-level RE.
document-level relation extraction / continual learning / knowledge distillation / asymmetric training / correlation / label generation
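The abstract describes multi-binary knowledge distillation, where each learned relation is treated as an independent binary decision and the previous model's soft predictions supervise the current one. The sketch below is a hypothetical illustration of that general idea, not the paper's actual loss: the function names, the dictionary-of-logits representation, and the uniform averaging over old relations are all assumptions made for clarity.

```python
import math

def sigmoid(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def multi_binary_distill_loss(student_logits, teacher_logits, old_relations):
    """Hypothetical sketch of multi-binary knowledge distillation.

    Each learned (old) relation is scored independently with a binary
    classifier; the teacher's sigmoid probability for that relation serves
    as a soft target for the student via binary cross-entropy. Only the
    previously learned relations contribute, so the new relations remain
    free to be fitted from the current task's labels.
    """
    loss, eps = 0.0, 1e-12
    for r in old_relations:
        p_t = sigmoid(teacher_logits[r])  # teacher's soft target for relation r
        p_s = sigmoid(student_logits[r])  # student's current prediction
        loss += -(p_t * math.log(p_s + eps) + (1.0 - p_t) * math.log(1.0 - p_s + eps))
    return loss / max(len(old_relations), 1)
```

A student that matches the teacher's logits minimizes this loss (it reduces to the teacher's per-relation entropy), while drifting away from the teacher on any old relation increases it, which is the forgetting signal the distillation term penalizes.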
Higher Education Press