
A Practical Algorithm for Topic Modeling with Provable Guarantees

Author(s): Arora, Sanjeev; Ge, Rong; Halpern, Yonatan; Mimno, David; Moitra, Ankur; Sontag, David; Wu, Yichen; Zhu, Michael

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1h27j
Abstract: Topic models provide a useful method for dimensionality reduction and exploratory data analysis in large text corpora. Most approaches to topic model learning have been based on a maximum likelihood objective. Efficient algorithms exist that attempt to approximate this objective, but they have no provable guarantees. Recently, algorithms have been introduced that provide provable bounds, but these algorithms are not practical because they are inefficient and not robust to violations of model assumptions. In this paper we present an algorithm for learning topic models that is both provable and practical. The algorithm produces results comparable to the best MCMC implementations while running orders of magnitude faster.
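As a rough illustration of the exploratory text analysis the abstract refers to, the sketch below fits a standard LDA topic model with scikit-learn on a toy corpus and prints the top words per topic. This is a generic baseline (scikit-learn's variational inference), not the provable algorithm presented in this paper; the toy documents, topic count, and parameter choices are assumptions made only for the example.

    # Minimal topic-modeling sketch (generic LDA baseline, not the paper's algorithm).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy corpus chosen for illustration only.
    docs = [
        "the cat sat on the mat",
        "dogs and cats are common pets",
        "stock markets fell sharply today",
        "investors sold shares amid market fears",
    ]

    # Topic models operate on the document-word count matrix.
    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(docs)

    # Two topics for this tiny corpus; real corpora use far more.
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(X)

    # Print the highest-weight words for each learned topic.
    terms = vectorizer.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        top_words = [terms[i] for i in topic.argsort()[::-1][:5]]
        print(f"topic {k}: {', '.join(top_words)}")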
Publication Date: 2013
Citation: Arora, Sanjeev, Rong Ge, Yonatan Halpern, David Mimno, Ankur Moitra, David Sontag, Yichen Wu, and Michael Zhu. "A Practical Algorithm for Topic Modeling with Provable Guarantees." In Proceedings of the 30th International Conference on Machine Learning, 2013, pp. 280-288.
ISSN: 2640-3498
Pages: 280 - 288
Type of Material: Conference Article
Journal/Proceeding Title: Proceedings of the 30th International Conference on Machine Learning
Version: Final published version. Article is made available in OAR by the publisher's permission or policy.



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.