D2-GCN: a graph convolutional network with dynamic disentanglement for node classification

Shangwei WU, Yingtong XIONG, Hui LIANG, Chuliang WENG

Front. Comput. Sci., 2025, Vol. 19, Issue 1: 191305. DOI: 10.1007/s11704-023-3339-7
Artificial Intelligence
RESEARCH ARTICLE


Abstract

Classic Graph Convolutional Networks (GCNs) often learn node representations holistically, ignoring the distinct influence that different neighbors exert when their features are aggregated to update a node’s representation. Disentangled GCNs have been proposed to divide each node’s representation into several feature units. However, current disentangling methods do not determine how many inherent factors the model should assign in order to extract the best representation of each node. This paper therefore proposes D2-GCN, which provides dynamic disentanglement in GCNs and produces the most appropriate factorization of each node’s mixed features. The convergence of the proposed method is established both theoretically and experimentally. Experiments on real-world datasets show that D2-GCN outperforms baseline models on node classification in both single- and multi-label tasks.
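The disentangled convolution the abstract builds on can be sketched as follows. This is a simplified NumPy illustration of neighbor-routing disentanglement in the spirit of DisenGCN (Ma et al., 2019), not the paper’s actual D2-GCN layer: the channel count `K` is fixed here, and choosing it per node dynamically is precisely what D2-GCN contributes. All names and the routing details below are illustrative assumptions.

```python
import numpy as np

def disentangled_aggregate(features, adj, K, iters=3):
    """One disentangled graph-convolution layer via neighbor routing (sketch).

    Each node's d-dimensional feature vector is split into K channels of
    size d // K, one per latent factor. Routing iteratively soft-assigns
    each neighbor to the channel it matches best, so different neighbors
    contribute to different feature units instead of one holistic update.
    """
    n, d = features.shape
    assert d % K == 0, "feature dimension must be divisible by K"
    c = d // K

    # Split into K channels and L2-normalize each channel.
    z = features.reshape(n, K, c)
    z = z / (np.linalg.norm(z, axis=2, keepdims=True) + 1e-9)
    out = z.copy()

    for _ in range(iters):
        new = z.copy()
        for u in range(n):
            nbrs = np.nonzero(adj[u])[0]
            if nbrs.size == 0:
                continue  # isolated node keeps its own (normalized) features
            # Similarity of each neighbor's channel to u's current channel
            # center: sim[v, k] = z[nbrs][v, k] . out[u, k]
            sim = np.einsum('vkc,kc->vk', z[nbrs], out[u])
            # Softmax over channels: each neighbor routes itself to factors.
            p = np.exp(sim - sim.max(axis=1, keepdims=True))
            p /= p.sum(axis=1, keepdims=True)
            # Aggregate neighbors per channel, re-center, re-normalize.
            agg = z[u] + np.einsum('vk,vkc->kc', p, z[nbrs])
            new[u] = agg / (np.linalg.norm(agg, axis=1, keepdims=True) + 1e-9)
        out = new

    return out.reshape(n, d)
```

With a 5-node ring graph and `K = 4`, the layer returns unit-norm channels for every node; a dynamic variant in the paper’s sense would additionally adapt `K` per node instead of taking it as a hyperparameter.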


Keywords

graph convolutional networks / dynamic disentanglement / label entropy / node classification

Cite this article

Shangwei WU, Yingtong XIONG, Hui LIANG, Chuliang WENG. D2-GCN: a graph convolutional network with dynamic disentanglement for node classification. Front. Comput. Sci., 2025, 19(1): 191305 https://doi.org/10.1007/s11704-023-3339-7

Shangwei Wu is currently a PhD candidate at East China Normal University (ECNU), China. His research interests include graph neural networks and deep learning systems.

Yingtong Xiong received the ME degree from East China Normal University (ECNU), China, in 2023. Her research interests include graph neural networks and deep learning systems.

Hui Liang is pursuing the ME degree at East China Normal University (ECNU), China. His research interests include graph neural networks and deep learning systems.

Chuliang Weng received the PhD degree from Shanghai Jiao Tong University (SJTU), China, in 2004. He is currently a full professor at East China Normal University (ECNU), China. Before joining ECNU, he was an associate professor at SJTU and later worked at Huawei Central Research Institute, China. He was also a visiting research scientist at Columbia University, USA. His research interests include parallel and distributed systems, system virtualization and cloud computing, storage systems, operating systems, and system security.


Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant Nos. 62141214 and 62272171).

Competing interests

The authors declare that they have no competing interests or financial conflicts to disclose.

RIGHTS & PERMISSIONS

2025 Higher Education Press