
Generalized high-dimensional trace regression via nuclear norm regularization

Author(s): Fan, Jianqing; Gong, W; Zhu, Z

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1gg4h
Abstract: We study generalized trace regression with a near low-rank regression coefficient matrix, which extends the notion of sparsity for regression coefficient vectors. Specifically, given a matrix covariate X, the probability density function of the response Y is f(Y|X) = c(Y)exp(ϕ⁻¹(Yη∗ − b(η∗))), where η∗ = tr(Θ∗ᵀX). This model accommodates various types of responses and embraces many important problem setups, such as reduced-rank regression, matrix regression with a panel of regressors, and matrix completion, among others. We estimate Θ∗ by minimizing the empirical negative log-likelihood plus a nuclear norm penalty. We first establish a general theory and then, for each specific problem, derive explicitly the statistical rate of the proposed estimator. These rates all match the minimax rates for linear trace regression up to logarithmic factors. Numerical studies confirm the rates we establish and demonstrate the advantage of generalized trace regression over linear trace regression when the response is dichotomous. We also show the benefit of incorporating nuclear norm regularization in dynamic stock return prediction and in image classification.
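
For intuition about the estimator described in the abstract, the following is a minimal sketch of nuclear-norm-penalized trace regression fitted by proximal gradient descent with singular value soft-thresholding, specialized to a binary (logistic) response. The function names, step size, penalty level, and synthetic data below are illustrative assumptions, not taken from the paper.

    import numpy as np

    def svd_soft_threshold(M, tau):
        # Proximal operator of the nuclear norm: soft-threshold the singular values of M by tau.
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        s = np.maximum(s - tau, 0.0)
        return (U * s) @ Vt

    def fit_logistic_trace_regression(X, y, lam, step=0.1, n_iter=500):
        # Proximal gradient descent for nuclear-norm-penalized logistic trace regression (a sketch).
        # X: (n, d1, d2) matrix covariates; y: (n,) binary responses in {0, 1}; lam: penalty level.
        n, d1, d2 = X.shape
        Theta = np.zeros((d1, d2))
        for _ in range(n_iter):
            eta = np.einsum('ijk,jk->i', X, Theta)        # eta_i = tr(Theta^T X_i)
            p = 1.0 / (1.0 + np.exp(-eta))                # logistic mean b'(eta)
            grad = np.einsum('i,ijk->jk', p - y, X) / n   # gradient of the empirical negative log-likelihood
            Theta = svd_soft_threshold(Theta - step * grad, step * lam)
        return Theta

    # Illustrative usage on synthetic data with a rank-2 coefficient matrix.
    rng = np.random.default_rng(0)
    n, d1, d2, r = 500, 10, 10, 2
    Theta_star = rng.normal(size=(d1, r)) @ rng.normal(size=(r, d2)) / np.sqrt(d1)
    X = rng.normal(size=(n, d1, d2))
    eta = np.einsum('ijk,jk->i', X, Theta_star)
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))
    Theta_hat = fit_logistic_trace_regression(X, y, lam=0.05)

The soft-thresholding step is what induces a low-rank solution, mirroring the role of the nuclear norm penalty in the estimator studied in the paper; the specific solver and tuning choices here are only for illustration.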
Publication Date: 1-Sep-2019
Citation: Fan, J., Gong, W., & Zhu, Z. (2019). Generalized high-dimensional trace regression via nuclear norm regularization. Journal of Econometrics, 212(1), 177–202. doi:10.1016/j.jeconom.2019.04.026
DOI: 10.1016/j.jeconom.2019.04.026
ISSN: 0304-4076
EISSN: 1872-6895
Pages: 177–202
Type of Material: Journal Article
Journal/Proceeding Title: Journal of Econometrics
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.