A binary-domain recurrent-like architecture-based dynamic graph neural network

Zi-chao Chen , Sui Lin

Autonomous Intelligent Systems ›› 2024, Vol. 4 ›› Issue (1): 11 | DOI: 10.1007/s43684-024-00067-9
Original Article

Abstract

The integration of Dynamic Graph Neural Networks (DGNNs) with Smart Manufacturing is crucial because it enables real-time, adaptive analysis of complex data, improving predictive accuracy and operational efficiency in industrial environments. To address the poor integration of spatial and temporal information and the low prediction accuracy of current dynamic graph neural networks, as well as the over-smoothing caused by traditional graph neural networks, a dynamic graph prediction method based on a spatiotemporal binary-domain recurrent-like architecture is proposed: the Binary Domain Graph Neural Network (BDGNN). The proposed model first uses a modified Graph Convolutional Network (GCN) without an activation function to extract meaningful graph topology information and produce non-redundant embeddings. In the temporal domain, a Recurrent Neural Network (RNN) and residual connections transfer dynamic graph node information between learner weights, mitigating the impact of noise within the graph sequence. In the spatial domain, the AdaBoost (Adaptive Boosting) algorithm replaces the traditional approach of stacking layers in a graph neural network: multiple independent graph learners extract higher-order neighborhood information, alleviating over-smoothing. The efficacy of BDGNN is evaluated through a series of experiments, with Mean Average Precision (MAP) and Mean Reciprocal Rank (MRR) as metrics for link prediction and regression metrics for traffic speed prediction across diverse test sets. Compared with other models, the experimental results demonstrate that BDGNN not only better integrates temporal and spatial information, but also extracts higher-order neighbor information that alleviates the over-smoothing of the original GCN.
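The abstract describes BDGNN only at a high level. The sketch below, in plain PyTorch, is one way the three ideas it names could fit together: a GCN layer with no activation function, a spatial domain in which several independent graph learners are combined by per-learner weights in the spirit of AdaBoost rather than by stacking layers, and a temporal domain in which a GRU with a residual connection carries node embeddings across graph snapshots. All names here (LinearGCN, BDGNNSketch, num_learners, alpha) are illustrative assumptions rather than the authors' code, and the learner weights are simply learned parameters rather than coefficients fitted by the AdaBoost procedure used in the paper.

```python
# A minimal, hypothetical sketch of the ideas named in the abstract, written in plain
# PyTorch. Class and parameter names (LinearGCN, BDGNNSketch, num_learners, alpha) are
# illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn


def normalized_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    a_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = torch.diag(a_hat.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt


class LinearGCN(nn.Module):
    """A GCN layer with the activation function removed, as the abstract describes."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.weight = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        return a_norm @ self.weight(x)  # propagation only, no nonlinearity


class BDGNNSketch(nn.Module):
    """Spatial domain: several independent graph learners combined by weights (standing in
    for AdaBoost coefficients) instead of a deep stack of layers.
    Temporal domain: a GRU cell with a residual connection carries node embeddings across
    the snapshots of the dynamic graph."""

    def __init__(self, in_dim: int, hid_dim: int, num_learners: int = 3):
        super().__init__()
        self.learners = nn.ModuleList(
            [LinearGCN(in_dim if k == 0 else hid_dim, hid_dim) for k in range(num_learners)]
        )
        # Learner weights; learned end-to-end here rather than fitted by AdaBoost.
        self.alpha = nn.Parameter(torch.ones(num_learners) / num_learners)
        self.gru = nn.GRUCell(hid_dim, hid_dim)

    def forward(self, snapshots, features):
        """snapshots: list of [N, N] adjacency matrices; features: [N, in_dim] node features."""
        h = features.new_zeros(features.size(0), self.gru.hidden_size)
        for adj in snapshots:                       # iterate over the graph sequence in time
            a_norm = normalized_adj(adj)
            z, combined = features, 0.0
            for k, learner in enumerate(self.learners):
                z = learner(z, a_norm)              # the k-th learner sees (k+1)-hop neighbours
                combined = combined + self.alpha[k] * z
            h = self.gru(combined, h) + h           # residual link across time steps
        return h                                    # node embeddings for a downstream task head


if __name__ == "__main__":
    torch.manual_seed(0)
    snaps = [torch.bernoulli(torch.full((10, 10), 0.2)) for _ in range(4)]
    snaps = [(a + a.t()).clamp(max=1) for a in snaps]   # symmetric 0/1 snapshots
    x = torch.randn(10, 16)
    model = BDGNNSketch(in_dim=16, hid_dim=32)
    print(model(snaps, x).shape)                        # torch.Size([10, 32])
```

Running the small example at the bottom yields one 32-dimensional embedding per node; a link-prediction or traffic-speed regression head of the kind evaluated in the paper would sit on top of these embeddings.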

Keywords

Dynamic graph neural network / Smart manufacturing / Over-smoothing / Link prediction / Traffic prediction

Cite this article

Zi-chao Chen, Sui Lin. A binary-domain recurrent-like architecture-based dynamic graph neural network. Autonomous Intelligent Systems, 2024, 4(1): 11. DOI: 10.1007/s43684-024-00067-9



Funding

Natural Science Foundation of Guangdong Province (2021A1515011243)

Science and Technology Planning Project of Guangdong Province (2019B010139001)

Guangzhou Municipal Science and Technology Project (201902020016)

RIGHTS & PERMISSIONS

The Author(s)
