Text-augmented long-term relation dependency learning for knowledge graph representation

Quntao Zhu , Mengfan Li , Yuanjun Gao , Yao Wan , Xuanhua Shi , Hai Jin

High-Confidence Computing ›› 2025, Vol. 5 ›› Issue (4) : 100315

Research Articles


Abstract

Knowledge graph (KG) representation learning aims to map entities and relations into a low-dimensional representation space and has shown significant potential in many tasks. Existing approaches fall into two categories: (1) graph-based approaches, which encode KG elements into vectors using structural score functions, and (2) text-based approaches, which embed textual descriptions of entities and relations via pre-trained language models (PLMs) that are further fine-tuned on triples. We argue that graph-based approaches struggle with sparse data, while text-based approaches face challenges with complex relations. To address these limitations, we propose a unified Text-Augmented Attention-based Recurrent Network that bridges the gap between graphs and natural language. Specifically, we employ a graph attention network based on local influence weights to model local structural information, and we use PLM-based prompt learning to learn textual information, enhanced by a mask-reconstruction strategy based on global influence weights and by textual contrastive learning for improved robustness and generalizability. Moreover, to effectively model multi-hop relations, we propose a novel semantic-depth-guided path extraction algorithm and integrate cross-attention layers into recurrent neural networks, which facilitates learning long-term relation dependencies and offers an adaptive attention mechanism for varied-length information. Extensive experiments demonstrate that our model outperforms existing models on KG completion and question-answering tasks.
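The abstract's core architectural idea, integrating cross-attention into a recurrent network so each step can re-read the whole relation path, can be illustrated with a minimal NumPy sketch. The paper's exact formulation is not given here, so the functions below (`cross_attention`, `gru_step`), the dimensions, and the random stand-in embeddings are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, keys, values):
    """Scaled dot-product cross-attention: the recurrent state attends
    over all relation embeddings on the path (handles varied lengths)."""
    scores = keys @ query / np.sqrt(d)   # one score per hop, shape (L,)
    weights = softmax(scores)            # adaptive weight per hop
    return weights @ values              # attended context, shape (d,)

def gru_step(h, x, W, U, b):
    """One vanilla GRU step with separate update/reset/candidate params."""
    z = 1 / (1 + np.exp(-(W[0] @ x + U[0] @ h + b[0])))  # update gate
    r = 1 / (1 + np.exp(-(W[1] @ x + U[1] @ h + b[1])))  # reset gate
    h_tilde = np.tanh(W[2] @ x + U[2] @ (r * h) + b[2])  # candidate state
    return (1 - z) * h + z * h_tilde

# A 3-hop relation path (random stand-ins for learned embeddings).
path = rng.normal(size=(3, d))
W = rng.normal(scale=0.1, size=(3, d, d))
U = rng.normal(scale=0.1, size=(3, d, d))
b = np.zeros((3, d))

h = np.zeros(d)
for rel in path:
    # Cross-attention lets each step look back over the entire path
    # before the recurrent update, easing long-term dependency learning.
    context = cross_attention(h, path, path)
    h = gru_step(h, rel + context, W, U, b)

print(h.shape)  # (8,)
```

The design point sketched here is that a plain recurrent update only sees one hop at a time, so gradients over long paths decay; letting the hidden state attend over the full path at every step gives each hop a direct, adaptively weighted shortcut to every other hop.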

Keywords

Knowledge graph representation / Graph attention network / Pre-trained language model / Attention-based recurrent network / Masked autoencoder / Contrastive learning

Cite this article

Quntao Zhu, Mengfan Li, Yuanjun Gao, Yao Wan, Xuanhua Shi, Hai Jin. Text-augmented long-term relation dependency learning for knowledge graph representation. High-Confidence Computing, 2025, 5(4): 100315. DOI: 10.1016/j.hcc.2025.100315


CRediT authorship contribution statement

Quntao Zhu: Methodology, Conceptualization. Mengfan Li: Software, Methodology. Yuanjun Gao: Resources. Yao Wan: Project administration. Xuanhua Shi: Supervision, Methodology. Hai Jin: Writing - review & editing.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This work was supported in part by the National Key R&D Program of China (2020AAA0108501).

