Estimating posterior inference quality of the relational infinite latent feature model for overlapping community detection

Qianchen YU, Zhiwen YU, Zhu WANG, Xiaofeng WANG, Yongzhi WANG

Front. Comput. Sci. ›› 2020, Vol. 14 ›› Issue (6) : 146323 DOI: 10.1007/s11704-020-9370-z
RESEARCH ARTICLE


Abstract

Overlapping community detection has been a hot research topic in recent decades, and a plethora of methods have been proposed. However, a common challenge in many existing overlapping community detection approaches is that the number of communities K must be predefined manually. We propose a flexible nonparametric Bayesian generative model for count-valued networks, which allows K to grow as more data are encountered rather than being fixed in advance. The Indian buffet process is used to model the community assignment matrix Z, and an uncollapsed Gibbs sampler is derived. However, because Z is a structured multi-variable parameter, summarizing the posterior inference results and estimating the inference quality of Z remains a considerable challenge in the literature. In this paper, a graph convolutional neural network based graph classifier is employed to help summarize the results and estimate the inference quality of Z. Extensive experiments on synthetic and real data show that, empirically, the traditional posterior summarization strategy is reliable.
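The key modeling choice in the abstract, letting the number of communities K grow with the data rather than fixing it in advance, comes from placing an Indian buffet process prior on the community assignment matrix Z. The sketch below is a minimal illustration of the standard IBP generative scheme, not the paper's implementation (the paper's actual inference is an uncollapsed Gibbs sampler over the full relational model); the function name sample_ibp and the use of NumPy are our assumptions.

```python
import numpy as np

def sample_ibp(n_nodes, alpha, rng=None):
    """Sample a binary community-assignment matrix Z ~ IBP(alpha).

    Rows index nodes ("customers") and columns index latent
    communities ("dishes"); the number of columns K is not set
    in advance but grows with the data.
    """
    rng = np.random.default_rng() if rng is None else rng
    columns = []  # columns[k] holds the 0/1 memberships of community k
    counts = []   # counts[k] = number of nodes so far in community k
    for i in range(n_nodes):
        # Node i joins each existing community k with probability m_k / (i + 1).
        for k in range(len(columns)):
            z = int(rng.random() < counts[k] / (i + 1))
            columns[k].append(z)
            counts[k] += z
        # Node i also opens Poisson(alpha / (i + 1)) brand-new communities.
        for _ in range(rng.poisson(alpha / (i + 1))):
            columns.append([0] * i + [1])  # earlier nodes are absent
            counts.append(1)
    K = len(columns)
    return np.array(columns, dtype=int).T if K else np.zeros((n_nodes, 0), dtype=int)

# Example: K is a random, data-driven quantity rather than a preset constant.
Z = sample_ibp(n_nodes=20, alpha=2.0, rng=np.random.default_rng(0))
print(Z.shape)  # (20, K), with K varying from run to run
```

Because a node may join several columns of Z at once, memberships naturally overlap, which is what makes this prior suitable for overlapping community detection.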

Keywords

graph convolutional neural network / graph classification / overlapping community detection / nonparametric Bayesian generative model / relational infinite latent feature model / Indian buffet process / uncollapsed Gibbs sampler / posterior inference quality estimation

Cite this article

Qianchen YU, Zhiwen YU, Zhu WANG, Xiaofeng WANG, Yongzhi WANG. Estimating posterior inference quality of the relational infinite latent feature model for overlapping community detection. Front. Comput. Sci., 2020, 14(6): 146323 DOI: 10.1007/s11704-020-9370-z



RIGHTS & PERMISSIONS

Higher Education Press and Springer-Verlag GmbH Germany, part of Springer Nature
