Efficient Online Inference for Bayesian Nonparametric Relational Models

Author(s): Kim, Dae Il; Gopalan, Prem; Blei, David M.; Sudderth, Erik B.

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr14g0d
Abstract: Stochastic block models characterize observed network relationships via latent community memberships. In large social networks, we expect entities to participate in multiple communities, and the number of communities to grow with the network size. We introduce a new model for these phenomena, the hierarchical Dirichlet process relational model, which allows nodes to have mixed membership in an unbounded set of communities. To allow scalable learning, we derive an online stochastic variational inference algorithm. Focusing on assortative models of undirected networks, we also propose an efficient structured mean field variational bound, and online methods for automatically pruning unused communities. Compared to state-of-the-art online learning methods for parametric relational models, we show significantly improved perplexity and link prediction accuracy for sparse networks with tens of thousands of nodes. We also showcase an analysis of LittleSis, a large network of who-knows-who at the heights of business and government.
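The assortative model described in the abstract can be illustrated with a small sketch. In an assortative mixed-membership block model, each node holds a distribution over communities, and two nodes link mainly when they select the same community; cross-community pairs fall back to a small background rate. The names (`pi`, `beta`, `eps`, `link_prob`) and the fixed community count `K` below are illustrative assumptions, not the paper's notation, and the paper's nonparametric model instead lets the number of communities grow with the data.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 4      # fixed community count for this sketch (the paper's model is unbounded)
N = 6      # number of nodes
eps = 1e-3 # small background link probability across different communities

# Hypothetical per-node mixed-membership vectors (each row sums to 1)
pi = rng.dirichlet(np.ones(K), size=N)
# Within-community link strengths, one per community
beta = rng.beta(2.0, 2.0, size=K)

def link_prob(i, j):
    """Assortative link probability between nodes i and j: the chance both
    nodes pick community k contributes beta[k]; all other community pairs
    contribute only the background rate eps."""
    same = pi[i] * pi[j]                     # P(both select community k), per k
    return same @ beta + (1.0 - same.sum()) * eps

p = link_prob(0, 1)
assert 0.0 <= p <= 1.0
```

The probability is symmetric in `i` and `j`, matching the undirected networks the paper focuses on; the variational algorithm in the paper learns approximate posteriors over quantities playing the roles of `pi` and `beta` from observed edges.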
Publication Date: 2013
Citation: Kim, Dae Il, Prem Gopalan, David M. Blei, and Erik B. Sudderth. "Efficient Online Inference for Bayesian Nonparametric Relational Models." In Advances in Neural Information Processing Systems 26 (2013): pp. 962-970.
ISSN: 1049-5258
Pages: 962 - 970
Type of Material: Conference Article
Journal/Proceeding Title: Advances in Neural Information Processing Systems
Version: Final published version. This is an open access article.
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.