Simplified multi-view graph neural network for multilingual knowledge graph completion
Bingbing DONG, Chenyang BU, Yi ZHU, Shengwei JI, Xindong WU
Front. Comput. Sci., 2025, Vol. 19, Issue (7): 197324
Knowledge graph completion (KGC) aims to fill in missing entities and relations within knowledge graphs (KGs) to address their incompleteness. Most existing KGC models suffer from limited knowledge coverage because they are designed to operate within a single KG. In contrast, multilingual KGC (MKGC) leverages seed alignment pairs between KGs in different languages to facilitate knowledge transfer and enhance completion of the target KG. Previous GNN-based studies on MKGC have primarily focused on relation-aware graph neural networks (GNNs) that capture the combined features of neighboring entities and relations. However, these studies still have shortcomings, particularly for multilingual KGs. First, the language-specific semantics, structures, and expressions of each KG increase the heterogeneity of the overall graph, so MKGC requires a thorough treatment of this heterogeneity and an effective integration of heterogeneous features. Second, multilingual KGs are typically large because they must store and manage information from multiple languages, yet current relation-aware GNNs often inherit complex GNN operations and thus introduce unnecessary complexity; simplifying these operations is therefore necessary. To address these limitations, we propose a Simplified Multi-view Graph Neural Network (SM-GNN) for MKGC. SM-GNN incorporates two simplified multi-view GNNs as components: one learns multi-view graph features to complete the KG, and the other generates new alignment pairs to facilitate knowledge transfer between different views of the KG. We simplify both multi-view GNNs by retaining feature propagation while discarding the linear transformation and nonlinear activation, which reduces unnecessary complexity and effectively leverages graph contextual information. Extensive experiments demonstrate that the proposed model outperforms competing baselines. The code and dataset are available at github.com/dbbice/SM-GNN.
multi-view / knowledge graph / graph neural network / multilingual knowledge graph completion
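
The simplification described in the abstract (keeping feature propagation while dropping the per-layer linear transformation and nonlinear activation) can be illustrated with a short, self-contained sketch. The following Python/NumPy snippet is only a generic illustration of that idea, in the spirit of simplified graph convolution; it is not the authors' implementation, and the function name, the toy adjacency matrix, and the feature dimensions are all hypothetical.

    # Minimal sketch (not the authors' code) of the "simplified GNN" idea:
    # keep k rounds of neighborhood feature propagation, drop the per-layer
    # linear transformation and nonlinear activation.
    import numpy as np

    def simplified_propagation(adj: np.ndarray, features: np.ndarray, k: int = 2) -> np.ndarray:
        """Propagate entity features over a symmetrically normalized adjacency
        matrix for k hops, with no weight matrices and no activations."""
        a_hat = adj + np.eye(adj.shape[0])            # add self-loops
        deg = a_hat.sum(axis=1)
        d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
        a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt      # D^-1/2 (A + I) D^-1/2
        h = features
        for _ in range(k):                            # pure feature smoothing
            h = a_norm @ h
        return h

    # Example: a toy graph with 4 entities and 8-dimensional initial embeddings
    adj = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
    x = np.random.randn(4, 8)
    z = simplified_propagation(adj, x, k=2)           # (4, 8) smoothed features

Because no trainable weights or activations sit between hops, the k propagation steps can be precomputed once, which is what makes this style of GNN attractive for large multilingual graphs.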