|Abstract:||Topic models provide a useful method for dimensionality reduction and exploratory data analysis in large text corpora. Most approaches to topic model learning have been based on a maximum likelihood objective. Efficient algorithms exist that attempt to approximate this objective, but they have no provable guarantees. Recently, algorithms have been introduced that provide provable bounds, but these algorithms are not practical because they are inefficient and not robust to violations of model assumptions. In this paper we present an algorithm for learning topic models that is both provable and practical. The algorithm produces results comparable to the best MCMC implementations while running orders of magnitude faster.|
|Citation:||Arora, Sanjeev, Rong Ge, Yonatan Halpern, David Mimno, Ankur Moitra, David Sontag, Yichen Wu, and Michael Zhu. "A Practical Algorithm for Topic Modeling with Provable Guarantees." In Proceedings of the 30th International Conference on Machine Learning (2013): 280–288.|
|Pages:||280–288|
|Type of Material:||Conference Article|
|Journal/Proceeding Title:||Proceedings of the 30th International Conference on Machine Learning|
|Version:||Final published version. Article is made available in OAR by the publisher's permission or policy.|
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.