Exploiting comments information to improve legal public opinion news abstractive summarization

Yuxin HUANG, Zhengtao YU, Yan XIANG, Zhiqiang YU, Junjun GUO

Front. Comput. Sci., 2022, 16(6): 166333. DOI: 10.1007/s11704-021-0561-z
Artificial Intelligence
RESEARCH ARTICLE

Abstract

Automatically generating a brief summary for legal-related public opinion news (LPO-news, i.e., news containing legal words or phrases) plays an important role in the rapid and effective handling of public opinion. In LPO-news, the critical case elements, which form significant parts of the summary, may be mentioned several times in the reader comments. We therefore investigate comment-aware abstractive text summarization for LPO-news, which generates salient summaries by learning pivotal case elements from the reader comments. In this paper, we present a hierarchical comment-aware encoder (HCAE) with four components: 1) a traditional sequence-to-sequence framework as the baseline; 2) a selective denoising module that filters noise in the comments and distinguishes the case elements; 3) a merge module that couples the source article and the comments to yield a comment-aware context representation; 4) a recoding module that captures the interaction among the source article words conditioned on the comments. Extensive experiments on a large dataset of legal public opinion news collected from micro-blogs show that the proposed model outperforms several existing state-of-the-art baselines on the ROUGE metrics.
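
To make the four components described above concrete, the following is a minimal PyTorch sketch of a comment-aware encoder in the spirit of HCAE: BiLSTM encoders for the article and the comments, a selective denoising gate, a bi-directional-attention merge of the two sequences, and a recoding layer whose output would feed a standard attention decoder. All module names, dimensions, and gating choices are illustrative assumptions, not the authors' implementation.

# Minimal sketch of a comment-aware encoder in the spirit of HCAE.
# All module names, dimensions, and gating choices are hypothetical;
# this is NOT the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CommentAwareEncoder(nn.Module):
    """Encodes an article conditioned on reader comments:
    1) BiLSTM encoders for article and comments,
    2) a selective gate to denoise comment states,
    3) bi-directional attention to merge article and comment context,
    4) a recoding BiLSTM over the merged representation."""

    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.art_enc = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.com_enc = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        d = 2 * hid_dim
        self.gate = nn.Linear(2 * d, d)    # selective denoising gate
        self.att_w = nn.Linear(3 * d, 1)   # similarity score for bi-directional attention
        self.recode = nn.LSTM(4 * d, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, article_ids, comment_ids):
        a, _ = self.art_enc(self.embed(article_ids))   # (B, La, d)
        c, _ = self.com_enc(self.embed(comment_ids))   # (B, Lc, d)

        # 2) selective denoising: gate each comment state with an article summary vector
        art_sum = a.mean(dim=1, keepdim=True).expand_as(c)
        c = c * torch.sigmoid(self.gate(torch.cat([c, art_sum], dim=-1)))

        # 3) bi-directional attention (article-to-comment and comment-to-article)
        B, La, d = a.size()
        Lc = c.size(1)
        a_exp = a.unsqueeze(2).expand(B, La, Lc, d)
        c_exp = c.unsqueeze(1).expand(B, La, Lc, d)
        sim = self.att_w(torch.cat([a_exp, c_exp, a_exp * c_exp], dim=-1)).squeeze(-1)  # (B, La, Lc)
        a2c = torch.bmm(F.softmax(sim, dim=-1), c)  # comment context for each article word
        c2a = torch.bmm(F.softmax(sim.max(dim=-1).values, dim=-1).unsqueeze(1), a)  # (B, 1, d)
        merged = torch.cat([a, a2c, a * a2c, a * c2a.expand_as(a)], dim=-1)  # (B, La, 4d)

        # 4) recode the comment-aware context before a standard attention decoder
        out, _ = self.recode(merged)
        return out  # (B, La, d)


if __name__ == "__main__":
    enc = CommentAwareEncoder(vocab_size=10000)
    article = torch.randint(0, 10000, (2, 50))
    comments = torch.randint(0, 10000, (2, 30))
    print(enc(article, comments).shape)  # torch.Size([2, 50, 512])
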

Keywords

legal public opinion news / abstractive summarization / comment / comment-aware context / case elements / bi-directional attention

Cite this article

Yuxin HUANG, Zhengtao YU, Yan XIANG, Zhiqiang YU, Junjun GUO. Exploiting comments information to improve legal public opinion news abstractive summarization. Front. Comput. Sci., 2022, 16(6): 166333. DOI: 10.1007/s11704-021-0561-z

RIGHTS & PERMISSIONS

Higher Education Press
