A non-generative framework and convex relaxations for unsupervised learning
Author(s): Hazan, Elad; Ma, T
To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr1vq24
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hazan, Elad | - |
dc.contributor.author | Ma, T | - |
dc.date.accessioned | 2018-07-20T15:08:04Z | - |
dc.date.available | 2018-07-20T15:08:04Z | - |
dc.date.issued | 2016 | en_US |
dc.identifier.citation | Hazan, E., Ma, T. (2016). A non-generative framework and convex relaxations for unsupervised learning. 3314 - 3322 | en_US |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1vq24 | - |
dc.description.abstract | We give a novel formal theoretical framework for unsupervised learning with two distinctive characteristics. First, it does not assume any generative model and is based on a worst-case performance metric. Second, it is comparative; namely, performance is measured with respect to a given hypothesis class. This allows us to avoid known computational hardness results and to use improper algorithms based on convex relaxations. We show how several families of unsupervised learning models, which were previously only analyzed under probabilistic assumptions and are otherwise provably intractable, can be efficiently learned in our framework by convex optimization. | en_US |
dc.format.extent | 3314 - 3322 | en_US |
dc.language.iso | en_US | en_US |
dc.relation.ispartof | Advances in Neural Information Processing Systems | en_US |
dc.rights | Author's manuscript | en_US |
dc.title | A non-generative framework and convex relaxations for unsupervised learning | en_US |
dc.type | Conference Article | en_US |
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US |
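
The abstract above is high-level, so the following is only a minimal illustrative sketch, not the paper's construction: it assumes a PCA-style instantiation in which the hypothesis class consists of rank-k linear encoder/decoder pairs, the comparator is the best rank-k reconstruction of the data, and the improper candidate comes from a generic nuclear-norm (convex) surrogate. All function names, parameters, and data here are hypothetical.

```python
# Hypothetical sketch (not the paper's algorithm): comparative, non-generative
# evaluation of an unsupervised learner against the class of rank-k linear
# encoder/decoder pairs, using empirical reconstruction loss.
import numpy as np

def best_in_class_loss(X, k):
    """Average squared reconstruction loss of the best rank-k hypothesis
    (computed by truncated SVD); this plays the role of the comparator
    from the hypothesis class."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_k = (U[:, :k] * s[:k]) @ Vt[:k]
    return np.mean(np.sum((X - X_k) ** 2, axis=1))

def nuclear_norm_candidate(X, tau):
    """An improper candidate from a convex surrogate: soft-threshold the
    singular values (the proximal operator of the nuclear norm). Its rank
    is not constrained to k, which is what improper learning permits."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30)) @ rng.normal(size=(30, 50))  # roughly low-rank data

k, tau = 5, 10.0  # illustrative choices
X_hat = nuclear_norm_candidate(X, tau)
candidate_loss = np.mean(np.sum((X - X_hat) ** 2, axis=1))
comparator_loss = best_in_class_loss(X, k)

# Excess loss of the improper candidate relative to the best hypothesis in the class:
# the comparative quantity the framework measures, with no generative assumption on X.
print(f"candidate loss: {candidate_loss:.3f}")
print(f"best-in-class:  {comparator_loss:.3f}")
print(f"excess loss:    {candidate_loss - comparator_loss:.3f}")
```
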
Files in This Item:
File | Description | Size | Format
---|---|---|---
A non-generative framework and convex relaxations for unsupervised learning.pdf | | 229.1 kB | Adobe PDF
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.