Exploring & exploiting high-order graph structure for sparse knowledge graph completion

Tao HE, Ming LIU, Yixin CAO, Zekun WANG, Zihao ZHENG, Bing QIN

Front. Comput. Sci., 2025, 19(2): 192306. DOI: 10.1007/s11704-023-3521-y
Artificial Intelligence
RESEARCH ARTICLE


Abstract

Sparse Knowledge Graph (KG) scenarios pose a challenge for previous Knowledge Graph Completion (KGC) methods: completion performance drops rapidly as graph sparsity increases. The problem is further exacerbated by the prevalence of sparse KGs in practical applications. To alleviate this challenge, we present a novel framework, LR-GCN, which automatically captures valuable long-range dependencies among entities to supplement insufficient structural features and distills logical reasoning knowledge for sparse KGC. The proposed approach comprises two main components: a GNN-based predictor and a reasoning path distiller. The path distiller explores high-order graph structures such as reasoning paths and encodes them as rich-semantic edges, explicitly incorporating long-range dependencies into the predictor. This step also densifies the KG, effectively alleviating the sparsity issue. In addition, the path distiller transfers logical reasoning knowledge from the mined reasoning paths into the predictor. The two components are jointly optimized with a carefully designed variational EM algorithm. Extensive experiments and analyses on four sparse benchmarks demonstrate the effectiveness of the proposed method.
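To make the alternating optimization described above concrete, the sketch below illustrates one plausible reading of the variational-EM-style loop: an E-step that mines reasoning paths to densify the sparse graph, and an M-step that fits the triple predictor on the densified graph. All names here (GNNPredictor, distill_paths, train_lr_gcn_sketch) are hypothetical placeholders, not the authors' LR-GCN implementation; the predictor is reduced to a trivial translational scorer for brevity.

```python
# Minimal, hypothetical sketch of the alternating E-step/M-step loop
# suggested by the abstract. Not the authors' code; names are placeholders.
import torch
import torch.nn as nn


class GNNPredictor(nn.Module):
    """Placeholder predictor scoring (head, relation, tail) triples.
    The actual LR-GCN predictor is a GNN over the densified graph."""

    def __init__(self, num_entities, num_relations, dim=64):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)

    def forward(self, heads, rels, tails):
        h, r, t = self.ent(heads), self.rel(rels), self.ent(tails)
        # Simple translational score; higher means more plausible.
        return -(h + r - t).norm(dim=-1)


def distill_paths(kg_edges, num_paths=10):
    """Placeholder E-step: mine multi-hop reasoning paths (e.g., with an RL
    walker) and return them as extra rich-semantic edges that densify the KG.
    Here it returns nothing; a real distiller would search the graph."""
    return []


def train_lr_gcn_sketch(kg_edges, num_entities, num_relations, epochs=3):
    predictor = GNNPredictor(num_entities, num_relations)
    opt = torch.optim.Adam(predictor.parameters(), lr=1e-3)

    for _ in range(epochs):
        # E-step: explore high-order structure and densify the sparse KG.
        extra_edges = distill_paths(kg_edges)
        train_edges = kg_edges + extra_edges

        # M-step: fit the predictor on the densified graph.
        # (A real loop would also draw negative samples; omitted for brevity.)
        for h, r, t in train_edges:
            score = predictor(torch.tensor([h]), torch.tensor([r]),
                              torch.tensor([t]))
            loss = -score.mean()
            opt.zero_grad()
            loss.backward()
            opt.step()

    return predictor


if __name__ == "__main__":
    toy_edges = [(0, 0, 1), (1, 1, 2)]  # (head, relation, tail) index triples
    train_lr_gcn_sketch(toy_edges, num_entities=3, num_relations=2)
```

The point of the sketch is only the control flow: path mining and predictor training alternate, so the denser edge set produced in each E-step directly shapes the next M-step, which is how the framework couples exploration of high-order structure with exploitation by the predictor.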

Keywords

knowledge graph completion / graph neural networks / reinforcement learning

Cite this article

Tao HE, Ming LIU, Yixin CAO, Zekun WANG, Zihao ZHENG, Bing QIN. Exploring & exploiting high-order graph structure for sparse knowledge graph completion. Front. Comput. Sci., 2025, 19(2): 192306 https://doi.org/10.1007/s11704-023-3521-y

Tao He is currently a PhD student in the Research Center for Social Computing and Information Retrieval, Harbin Institute of Technology, China. He received the BS and MS degrees from Harbin Institute of Technology, China. His research interests are knowledge reasoning and question answering, including knowledge graph completion, knowledge graph question answering, and video question answering.

Ming Liu received the PhD degree from the School of Computer Science and Technology, Harbin Institute of Technology, China, in 2010. He is a full professor in the Department of Computer Science and a faculty member of the Research Center for Social Computing and Information Retrieval (HIT-SCIR), Harbin Institute of Technology, China. His research interests include knowledge graphs and machine reading comprehension.

Yixin Cao is an assistant professor at Singapore Management University, Singapore. Before that, he was a research assistant professor at Nanyang Technological University, Singapore. He was also a research fellow with NExT++, National University of Singapore (NUS). He received his PhD degree in Computer Science from Tsinghua University, China, in 2018. His research areas span natural language processing, knowledge graphs, recommendation, and knowledge-patched LLMs.

Zekun Wang is currently a PhD student in the Research Center for Social Computing and Information Retrieval, Harbin Institute of Technology, China. He received the BS degree from Harbin Institute of Technology, China. His research interests include efficient pretrained models.

Zihao Zheng is currently a PhD student in the Research Center for Social Computing and Information Retrieval, Harbin Institute of Technology, China. He received the BS degree from Harbin Institute of Technology, China. His research interests are information extraction and multimodal learning, including relation extraction, named entity recognition, and multimodal extraction.

Bing Qin received the PhD degree from the School of Computer Science and Technology, Harbin Institute of Technology, China, in 2005. She is a full professor in the Department of Computer Science and the director of the Research Center for Social Computing and Information Retrieval (HIT-SCIR), Harbin Institute of Technology, China. Her research interests include natural language processing, information extraction, document-level discourse analysis, and sentiment analysis.

Acknowledgements

The research in this article was supported by the National Key R&D Program of China (2022YFF0903301), the National Natural Science Foundation of China (Grant Nos. U22B2059, 61976073, 62276083), the Shenzhen Foundational Research Funding (JCYJ20200109113441941), and the Major Key Project of PCL (PCL2021A06).

Competing interests

The authors declare that they have no competing interests or financial conflicts to disclose.

RIGHTS & PERMISSIONS

© 2025 Higher Education Press