Extended context-based semantic communication system for text transmission

Yueling Liu , Shengteng Jiang , Yichi Zhang , Kuo Cao , Li Zhou , Boon-Chong Seet , Haitao Zhao , Jibo Wei

Research article ›› 2024, Vol. 10, Issue (3): 568-576. DOI: 10.1016/j.dcan.2022.09.023


Abstract

Context information is significant for the semantic extraction and recovery of messages in semantic communication. However, context information is not fully utilized in existing semantic communication systems, since relationships between sentences are often ignored. In this paper, we propose an Extended Context-based Semantic Communication (ECSC) system for text transmission, in which context information both within and between sentences is exploited for semantic representation and recovery. At the encoder, self-attention and segment-level relative attention are used to extract context information within and between sentences, respectively. In addition, a gate mechanism is adopted at the encoder to incorporate context information from different ranges. At the decoder, Transformer-XL is introduced to obtain additional semantic information from historical communication processes for semantic recovery. Simulation results show the effectiveness of the proposed model in improving the semantic accuracy between transmitted and recovered messages under various channel conditions.
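The gating idea described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the shapes, the single weight matrix `W_g`, and the sigmoid parameterization are assumptions made for illustration. It only shows how a learned gate can blend a within-sentence context vector (from self-attention) with a cross-sentence context vector (from segment-level relative attention), element by element.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 4, 8  # tokens per sentence, hidden dimension (illustrative values)

# Placeholder outputs of the two attention branches:
h_within = rng.standard_normal((n, d))    # within-sentence context (self-attention)
h_between = rng.standard_normal((n, d))   # cross-sentence context (segment-level relative attention)

# Hypothetical gate parameters; the paper's exact parameterization may differ.
W_g = rng.standard_normal((2 * d, d)) * 0.1

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# The gate decides, per token and per feature, how much cross-sentence
# context to mix into the within-sentence representation.
g = sigmoid(np.concatenate([h_within, h_between], axis=-1) @ W_g)
h_fused = g * h_within + (1.0 - g) * h_between

print(h_fused.shape)  # (4, 8)
```

Because `g` lies in (0, 1), each fused feature is a convex combination of the two context sources, so the encoder can smoothly favor short- or long-range context per dimension.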

Keywords

semantic communication / extended context / Transformer-XL

Cite this article

Yueling Liu, Shengteng Jiang, Yichi Zhang, Kuo Cao, Li Zhou, Boon-Chong Seet, Haitao Zhao, Jibo Wei. Extended context-based semantic communication system for text transmission. Digital Communications and Networks, 2024, 10(3): 568-576. DOI: 10.1016/j.dcan.2022.09.023


