The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences

Author(s): Chen, Yuxin; Candès, Emmanuel J.

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1cs1n
Abstract: Various applications involve assigning discrete label values to a collection of objects based on some pairwise noisy data. Due to the discrete (and hence nonconvex) structure of the problem, computing the optimal assignment (e.g., the maximum-likelihood assignment) becomes intractable at first sight. This paper makes progress towards efficient computation by focusing on a concrete joint alignment problem; that is, the problem of recovering n discrete variables x_i ∈ {1, …, m}, 1 ≤ i ≤ n, given noisy observations of their modulo differences {x_i − x_j mod m}. We propose a low-complexity and model-free nonconvex procedure, which operates in a lifted space by representing distinct label values in orthogonal directions and attempts to optimize quadratic functions over hypercubes. Starting with a first guess computed via a spectral method, the algorithm successively refines the iterates via projected power iterations. We prove that for a broad class of statistical models, the proposed projected power method makes no error, and hence converges to the maximum-likelihood estimate, in a suitable regime. Numerical experiments have been carried out on both synthetic and real data to demonstrate the practicality of our algorithm. We expect this algorithmic framework to be effective for a broad range of discrete assignment problems.
Publication Date: 2018
Citation: Chen, Y., & Candès, E. J. (2018). The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences. Communications on Pure and Applied Mathematics, 71, 1648–1714. doi:10.1002/cpa.21760
DOI: 10.1002/cpa.21760
Pages: 1648–1714
Type of Material: Journal Article
Journal/Proceeding Title: Communications on Pure and Applied Mathematics
Version: Author's manuscript
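
The abstract above outlines the two stages of the algorithm: a spectral initialization followed by projected power iterations in a lifted space where each label is represented as a one-hot vector. As an illustration only, the following is a minimal NumPy sketch of that recipe, not the authors' implementation. It makes simplifying assumptions beyond the abstract: observations are corrupted uniformly at random, the projection step is replaced by hard rounding of each length-m block to its largest coordinate (a simplification of the paper's projection onto hypercubes), and the initialization uses a dense eigendecomposition rather than a scalable variant. All function names here are hypothetical.

import numpy as np

def pairwise_score_matrix(y, m):
    # Block matrix L of size (n*m) x (n*m). Block (i, j) is the 0/1 cyclic
    # shift matrix of the observed difference y[i, j], so that
    # e_a^T L_ij e_b = 1 exactly when (a - b) mod m == y[i, j].
    # L is symmetric provided y[j, i] == (-y[i, j]) mod m for all pairs.
    n = y.shape[0]
    L = np.zeros((n * m, n * m))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            for b in range(m):
                L[i * m + (b + y[i, j]) % m, j * m + b] = 1.0
    return L

def round_blocks(z, n, m):
    # Round each length-m block of z to the one-hot vector of its largest
    # entry; this hard rounding stands in for the paper's projection step.
    Z = z.reshape(n, m)
    out = np.zeros_like(Z)
    out[np.arange(n), Z.argmax(axis=1)] = 1.0
    return out.reshape(-1)

def projected_power_method(y, m, iters=100):
    n = y.shape[0]
    L = pairwise_score_matrix(y, m)
    # Spectral initialization: round the leading eigenvector of L.
    _, vecs = np.linalg.eigh(L)
    v = vecs[:, -1]
    if v.sum() < 0:  # eigenvectors carry an arbitrary sign
        v = -v
    z = round_blocks(v, n, m)
    # Projected power iterations: z <- round(L z) until a fixed point.
    for _ in range(iters):
        z_new = round_blocks(L @ z, n, m)
        if np.array_equal(z_new, z):
            break
        z = z_new
    return z.reshape(n, m).argmax(axis=1)  # labels, up to a global shift

A small synthetic run, mirroring the kind of random-corruption model the abstract alludes to:

rng = np.random.default_rng(0)
n, m, noise = 50, 6, 0.3
x = rng.integers(0, m, size=n)        # ground-truth labels
y = (x[:, None] - x[None, :]) % m     # clean pairwise differences
for i in range(n):                    # corrupt a fraction of pairs while
    for j in range(i + 1, n):         # keeping y[j, i] == (-y[i, j]) mod m
        if rng.random() < noise:
            y[i, j] = rng.integers(0, m)
            y[j, i] = (-y[i, j]) % m
x_hat = projected_power_method(y, m)
print(np.all((x_hat - x_hat[0]) % m == (x - x[0]) % m))

Since the observations depend only on modulo differences, recovery is possible only up to a global shift of all labels, which is why the check compares (x_hat - x_hat[0]) mod m against (x - x[0]) mod m rather than x_hat against x directly.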



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.