
Online Agnostic Boosting via Regret Minimization

Author(s): Brukhim, Nataly; Chen, Xinyi; Hazan, Elad; Moran, Shay

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr18n9d
Full metadata record
dc.contributor.author: Brukhim, Nataly
dc.contributor.author: Chen, Xinyi
dc.contributor.author: Hazan, Elad
dc.contributor.author: Moran, Shay
dc.date.accessioned: 2021-10-08T19:50:49Z
dc.date.available: 2021-10-08T19:50:49Z
dc.date.issued: 2020
dc.identifier.citation: Brukhim, Nataly, Xinyi Chen, Elad Hazan, and Shay Moran. "Online Agnostic Boosting via Regret Minimization." Advances in Neural Information Processing Systems 33 (2020).
dc.identifier.issn: 1049-5258
dc.identifier.uri: https://papers.nips.cc/paper/2020/file/07168af6cb0ef9f78dae15739dd73255-Paper.pdf
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr18n9d
dc.description.abstract: Boosting is a widely used machine learning approach based on the idea of aggregating weak learning rules. While in statistical learning numerous boosting methods exist in both the realizable and agnostic settings, in online learning they exist only in the realizable case. In this work we provide the first agnostic online boosting algorithm; that is, given a weak learner with only marginally-better-than-trivial regret guarantees, our algorithm boosts it to a strong learner with sublinear regret. Our algorithm is based on an abstract (and simple) reduction to online convex optimization, which efficiently converts an arbitrary online convex optimizer into an online booster. Moreover, this reduction extends to the statistical as well as the online realizable settings, thus unifying the four cases of statistical/online and agnostic/realizable boosting.
dc.language.iso: en_US
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.rights: Final published version. Article is made available in OAR by the publisher's permission or policy.
dc.title: Online Agnostic Boosting via Regret Minimization
dc.type: Conference Article
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding
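The abstract's central idea, converting an online convex optimizer into an online booster that aggregates weak predictions, can be illustrated with a toy sketch. This is not the paper's algorithm: the synthetic data, the hand-built noisy weak learners, and the choice of hinge loss with online gradient descent as the OCO component are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5  # feature dimension

# Hypothetical weak online learners: each thresholds one feature and
# flips its vote with some probability. (Stand-ins for learners with
# marginally-better-than-trivial regret; not from the paper.)
def make_weak_learner(feature_idx, flip_prob):
    def predict(x):
        pred = 1.0 if x[feature_idx] > 0 else -1.0
        return -pred if rng.random() < flip_prob else pred
    return predict

weak_learners = [make_weak_learner(i, 0.3) for i in range(d)]

# Booster: combine the weak predictions with a weight vector maintained
# by online gradient descent (a simple online convex optimizer) on a
# hinge-loss surrogate, in the spirit of a boosting-to-OCO reduction.
weights = np.zeros(len(weak_learners))
eta = 0.1  # OGD step size
mistakes = 0
T = 2000
for t in range(T):
    x = rng.standard_normal(d)
    y = 1.0 if x[0] + 0.5 * x[1] > 0 else -1.0  # synthetic label
    preds = np.array([h(x) for h in weak_learners])
    margin = weights @ preds
    y_hat = 1.0 if margin >= 0 else -1.0
    if y_hat != y:
        mistakes += 1
    # OGD update: step along the negative gradient of the hinge loss
    # max(0, 1 - y * <weights, preds>) with respect to the weights.
    if y * margin < 1.0:
        weights += eta * y * preds

error_rate = mistakes / T
```

Only the first two weak learners carry signal here, so the OCD-learned weights concentrate on them and the aggregated predictor beats the trivial 50% error rate, loosely mirroring how the reduction turns weak regret guarantees into a sublinear-regret strong learner.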

Files in This Item:
BoostingRegretMin.pdf (347.41 kB, Adobe PDF)


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.