Heterogeneous-attributes enhancement deep framework for network embedding

Lisheng QIAO, Fan ZHANG, Xiaohui HUANG, Kai LI, Enhong CHEN

Front. Comput. Sci. ›› 2021, Vol. 15 ›› Issue (6) : 156616. DOI: 10.1007/s11704-021-9515-8
RESEARCH ARTICLE

Abstract

Network embedding, which aims to learn vector representations of vertices, has become a crucial issue in network analysis. However, given the complex structures and heterogeneous attributes of real-world networks, existing methods may fail to handle the inconsistencies between structural topology and attribute proximity. More comprehensive techniques are therefore required to capture the highly non-linear network structure and to resolve these inconsistencies while retaining more information. To that end, in this paper we propose a heterogeneous-attributes enhancement deep framework (HEDF), which better captures the non-linear structure and associated information in a deep-learning way, and effectively combines the structural information of multiple views through a combining layer. Along this line, the inconsistencies are handled to some extent and more structural information is preserved through a semi-supervised mode. Extensive experiments on several real-world datasets show that our model outperforms the baselines, especially in sparse and inconsistent settings with less training data.
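The abstract describes HEDF only at a high level: a deep model that captures non-linear structure, a combining layer that fuses multiple views, and a semi-supervised objective. As a rough illustration of that general idea, and not of the authors' actual HEDF architecture, the minimal PyTorch sketch below fuses per-view autoencoder codes with a learnable combining layer and trains them with a reconstruction loss plus a masked classification loss; every layer size, the fusion scheme, and the loss weight are hypothetical placeholders.

```python
# Illustrative sketch only: the abstract does not specify HEDF's layers,
# so all dimensions, the fusion scheme, and loss weights below are
# hypothetical placeholders, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiViewEmbedder(nn.Module):
    """Per-view autoencoders whose latent codes are fused by a combining layer."""
    def __init__(self, view_dims, embed_dim=64, num_classes=3):
        super().__init__()
        # One encoder/decoder pair per view (e.g., adjacency rows, attribute vectors).
        self.encoders = nn.ModuleList(
            nn.Sequential(nn.Linear(d, 128), nn.ReLU(), nn.Linear(128, embed_dim))
            for d in view_dims
        )
        self.decoders = nn.ModuleList(
            nn.Sequential(nn.Linear(embed_dim, 128), nn.ReLU(), nn.Linear(128, d))
            for d in view_dims
        )
        # Combining layer: learnable softmax weights over the views.
        self.view_logits = nn.Parameter(torch.zeros(len(view_dims)))
        # Semi-supervised head: predicts labels for the labeled subset of vertices.
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, views):
        codes = [enc(x) for enc, x in zip(self.encoders, views)]
        weights = torch.softmax(self.view_logits, dim=0)
        fused = sum(w * z for w, z in zip(weights, codes))
        recons = [dec(fused) for dec in self.decoders]
        return fused, recons, self.classifier(fused)

def loss_fn(views, recons, logits, labels, mask, alpha=1.0):
    # Reconstruction keeps non-linear structure/attribute information;
    # masked cross-entropy supplies the semi-supervised signal.
    recon = sum(F.mse_loss(r, x) for r, x in zip(recons, views))
    sup = F.cross_entropy(logits[mask], labels[mask])
    return recon + alpha * sup
```

In such a setup, the softmax fusion weights are one simple way to reconcile views that disagree (e.g., when topology and attributes are inconsistent, the model can down-weight the less reliable view); the paper's actual combining layer may differ.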

Keywords

network embedding / heterogeneous-attributes / deep framework / inconsistent

Cite this article

Lisheng QIAO, Fan ZHANG, Xiaohui HUANG, Kai LI, Enhong CHEN. Heterogeneous-attributes enhancement deep framework for network embedding. Front. Comput. Sci., 2021, 15(6): 156616 https://doi.org/10.1007/s11704-021-9515-8


RIGHTS & PERMISSIONS

2021 Higher Education Press