# A non-generative framework and convex relaxations for unsupervised learning

## Author(s): Hazan, Elad; Ma, T

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1vq24
| DC Field | Value | Language |
| --- | --- | --- |
| dc.contributor.author | Ma, T | - |
| dc.date.accessioned | 2018-07-20T15:08:04Z | - |
| dc.date.available | 2018-07-20T15:08:04Z | - |
| dc.date.issued | 2016 | en_US |
| dc.identifier.citation | Hazan, E., Ma, T. (2016). A non-generative framework and convex relaxations for unsupervised learning. 3314–3322 | en_US |
| dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1vq24 | - |
| dc.description.abstract | We give a novel formal theoretical framework for unsupervised learning with two distinctive characteristics. First, it does not assume any generative model and is based on a worst-case performance metric. Second, it is comparative: performance is measured with respect to a given hypothesis class. This allows us to avoid known computational hardness results and to use improper algorithms based on convex relaxations. We show how several families of unsupervised learning models, which were previously only analyzed under probabilistic assumptions and are otherwise provably intractable, can be efficiently learned in our framework by convex optimization. | en_US |
| dc.format.extent | 3314–3322 | en_US |
| dc.language.iso | en_US | en_US |
| dc.relation.ispartof | Advances in Neural Information Processing Systems | en_US |
| dc.rights | Author's manuscript | en_US |
| dc.title | A non-generative framework and convex relaxations for unsupervised learning | en_US |
| dc.type | Conference Article | en_US |
| pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US |

Files in This Item:

| File | Description | Size | Format |
| --- | --- | --- | --- |
| A non-generative framework and convex relaxations for unsupervised learning.pdf | | 229.1 kB | Adobe PDF |