Gradients weights improve regression and classification

Author(s): Kpotufe, S; Boularias, A; Schultz, T; Kim, K

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1dw14
Full metadata record
DC Field | Value | Language
dc.contributor.author | Kpotufe, S | -
dc.contributor.author | Boularias, A | -
dc.contributor.author | Schultz, T | -
dc.contributor.author | Kim, K | -
dc.date.accessioned | 2021-10-11T14:17:11Z | -
dc.date.available | 2021-10-11T14:17:11Z | -
dc.date.issued | 2016 | en_US
dc.identifier.citation | Kpotufe, Samory, Abdeslam Boularias, Thomas Schultz, and Kyoungok Kim. "Gradients weights improve regression and classification." The Journal of Machine Learning Research 17, no. 22 (2016): pp. 1-34. | en_US
dc.identifier.issn | 1532-4435 | -
dc.identifier.uri | http://www.jmlr.org/papers/v17/13-351.html | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1dw14 | -
dc.description.abstract | In regression problems over ℝ^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i according to an estimate of the variation of f along coordinate i (e.g., the L1 norm of the ith directional derivative of f) is an efficient way to significantly improve the performance of distance-based regressors such as kernel and k-NN regressors. The approach, termed Gradient Weighting (GW), consists of a first-pass regression estimate f_n, which serves to evaluate the directional derivatives of f, and a second-pass regression estimate on the re-weighted data. The GW approach can be instantiated for both regression and classification, and is grounded in strong theoretical principles having to do with the way regression bias and variance are affected by a generic feature-weighting scheme. These theoretical principles provide further technical foundation for some existing feature-weighting heuristics that have proved successful in practice. We propose a simple estimator of these derivative norms and prove its consistency. The proposed estimator computes efficiently and easily extends to run online. We then derive a classification version of the GW approach which performs on real-world datasets with as much success as its regression counterpart. | en_US
dc.format.extent | 1 - 34 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | Journal of Machine Learning Research | en_US
dc.rights | Final published version. Article is made available in OAR by the publisher's permission or policy. | en_US
dc.title | Gradients weights improve regression and classification | en_US
dc.type | Journal Article | en_US
dc.identifier.eissn | 1533-7928 | -
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article | en_US
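
The abstract above outlines a two-pass procedure: a first-pass fit f_n is used to estimate each coordinate's directional-derivative norm, and a second distance-based fit runs on coordinates rescaled by those estimates. The following is a minimal sketch of that idea only, not the paper's estimator: it assumes a k-NN base regressor (scikit-learn's KNeighborsRegressor), approximates the derivative norms by averaged central finite differences of the first-pass fit at the training points, and uses hypothetical names (gradient_weights, gw_knn_fit_predict) and illustrative parameter choices (step t, neighbor count k).

```python
# Illustrative sketch of the two-pass Gradient Weighting (GW) idea described
# in the abstract above; not the authors' implementation. Assumptions: k-NN
# base regressor, central finite differences at the training points, and
# hypothetical function names and defaults.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def gradient_weights(X, y, k=5, t=0.1):
    """Estimate w_i ~ E|df/dx_i| by finite differences of a first-pass k-NN fit f_n."""
    first_pass = KNeighborsRegressor(n_neighbors=k).fit(X, y)
    d = X.shape[1]
    w = np.empty(d)
    for i in range(d):
        step = np.zeros(d)
        step[i] = t
        # average absolute central difference of f_n along coordinate i
        diffs = first_pass.predict(X + step) - first_pass.predict(X - step)
        w[i] = np.mean(np.abs(diffs)) / (2.0 * t)
    return w

def gw_knn_fit_predict(X_train, y_train, X_test, k=5, t=0.1):
    """Second pass: ordinary k-NN regression on gradient-weighted coordinates."""
    w = gradient_weights(X_train, y_train, k=k, t=t)
    second_pass = KNeighborsRegressor(n_neighbors=k).fit(X_train * w, y_train)
    return second_pass.predict(X_test * w)

# Toy usage: f varies mostly along coordinate 0, so that coordinate should
# receive the largest estimated weight.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 5))
y = np.sin(3.0 * X[:, 0]) + 0.05 * X[:, 1]
print(gradient_weights(X, y))        # weight for coordinate 0 should dominate
preds = gw_knn_fit_predict(X, y, X[:5])
```

Rescaling the inputs by the weights and then running a plain k-NN fit is equivalent to using a weighted distance in the original coordinates, which is how the sketch realizes the "second-pass regression estimate on the re-weighted data" described in the abstract.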

Files in This Item:
File | Description | Size | Format
GradientWeightRegressionClassification.pdf | - | 803.56 kB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.