Multireference Alignment using Semidefinite Programming

Author(s): Bandeira, Afonso S; Charikar, Moses; Singer, Amit; Zhu, Andy

Abstract: The multireference alignment problem consists of estimating a signal from multiple noisy shifted observations. Inspired by existing Unique-Games approximation algorithms, we provide a semidefinite program (SDP) based relaxation which approximates the maximum likelihood estimator (MLE) for the multireference alignment problem. Although we show this MLE problem is Unique-Games hard to approximate within any constant, we observe that our poly-time approximation algorithm for this problem appears to perform quite well in typical instances, outperforming existing methods. In an attempt to explain this behavior we provide stability guarantees for our SDP under a random noise model on the observations. This case is more challenging to analyze than traditional semi-random instances of Unique-Games: the noise model is on vertices of a graph and translates into dependent noise on the edges. Interestingly, we show that if certain positivity constraints in the relaxation are dropped, its solution becomes equivalent to performing phase correlation, a popular method used for pairwise alignment in imaging applications. Finally, we describe how symmetry reduction techniques from matrix representation theory can greatly decrease the computational cost of the SDP considered.
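The abstract notes that dropping certain positivity constraints from the SDP recovers phase correlation, the classical method for pairwise alignment. As a point of reference, the phase-correlation baseline itself (not the authors' SDP) can be sketched as follows; the function name and the synthetic noiseless example are illustrative assumptions, not from the paper:

```python
import numpy as np

def phase_correlation_shift(x, y):
    """Estimate the cyclic shift s with y ≈ np.roll(x, s) via phase
    correlation: keep only the phase of the cross-spectrum, then peak-pick.
    (Illustrative sketch of the classical pairwise-alignment baseline.)"""
    X = np.fft.fft(x)
    Y = np.fft.fft(y)
    cross = np.conj(X) * Y            # cross-spectrum; phase encodes the shift
    cross /= np.abs(cross) + 1e-12    # normalize magnitudes, keep phase only
    corr = np.fft.ifft(cross).real    # sharp peak at the shift for clean data
    return int(np.argmax(corr))

# Hypothetical example: recover a known cyclic shift from a noiseless copy.
rng = np.random.default_rng(0)
x = rng.standard_normal(64)
y = np.roll(x, 17)
print(phase_correlation_shift(x, y))  # → 17
```

In the noisy multi-observation setting the paper targets, such pairwise estimates become unreliable at low SNR, which motivates the SDP relaxation that aligns all observations jointly.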
Publication Date: 2014
Electronic Publication Date: 2014
Citation: Afonso S. Bandeira, Moses Charikar, Amit Singer, and Andy Zhu. 2014. Multireference alignment using semidefinite programming. In Proceedings of the 5th conference on Innovations in Theoretical Computer Science (ITCS '14). ACM, New York, NY, USA, 459-470. DOI: 10.1145/2554797.2554839
DOI: 10.1145/2554797.2554839
Type of Material: Conference Article
Journal/Proceeding Title: Proceedings of the 5th conference on Innovations in theoretical computer science (ITCS '14)
Version: Author's manuscript

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.