Unsupervised learning by a "softened" correlation game: duality and convergence

Author(s): Luther, Kyle L; Yang, Runzhe; Seung, H Sebastian

Abstract: Neural networks with Hebbian excitation and anti-Hebbian inhibition form an interesting class of biologically plausible unsupervised learning algorithms. It has recently been shown that such networks can be regarded as online gradient descent-ascent algorithms for solving min-max problems that are dual to unsupervised learning principles formulated with no explicit reference to neural networks. Here we generalize one such formulation, the correlation game, by replacing a hard constraint with a soft penalty function. Our "softened" correlation game contains the nonnegative similarity matching principle as a special case. For solving the primal problem, we derive a projected gradient ascent algorithm that achieves speed through sorting. For solving the dual problem, we derive a projected gradient descent-ascent algorithm, the stochastic online variant of which can be interpreted as a neural network algorithm. We prove strong duality when the inhibitory connection matrix is positive definite, a condition that also rules out multistability of the neural activity dynamics. We show empirically that the neural net algorithm can converge when inhibitory plasticity is faster than excitatory plasticity, and may fail to converge in the opposite case. We interpret this behavior intuitively using the structure of the min-max problem.
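To make the descent-ascent idea concrete, here is a minimal illustrative sketch of projected gradient descent-ascent on a toy min-max problem with nonnegativity constraints. The objective, step sizes, and variable names below are hypothetical choices for illustration; they are not the paper's softened correlation game, which operates on similarity matrices and synaptic weight matrices.

```python
# Illustrative sketch only: projected gradient descent-ascent for a toy
# saddle problem  min_{x >= 0} max_{y >= 0} f(x, y),
# with f(x, y) = 0.5*x**2 + x*y - 0.5*y**2 (convex in x, concave in y).
# All names and constants here are hypothetical, not from the paper.

def f_grad(x, y):
    """Return (df/dx, df/dy) for the toy objective."""
    return x + y, x - y

def projected_gda(x0=2.0, y0=1.5, eta_x=0.05, eta_y=0.1, steps=2000):
    x, y = x0, y0
    for _ in range(steps):
        gx, gy = f_grad(x, y)
        x = max(0.0, x - eta_x * gx)  # descent step on x, projected onto x >= 0
        y = max(0.0, y + eta_y * gy)  # ascent step on y, projected onto y >= 0
    return x, y

x, y = projected_gda()
# For this toy objective the iterates spiral in toward the saddle point (0, 0).
```

Running the max-player faster than the min-player (here `eta_y > eta_x`) loosely mirrors the abstract's empirical observation that convergence is favored when the plasticity of one set of connections outpaces the other, though the toy problem is far simpler than the neural network setting.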
Publication Date: 2019
Citation: Luther, Kyle L., Runzhe Yang, and H. Sebastian Seung. "Unsupervised learning by a 'softened' correlation game: duality and convergence." In 2019 53rd Asilomar Conference on Signals, Systems, and Computers (2019), pp. 876-883. doi:10.1109/IEEECONF44664.2019.9048957
DOI: 10.1109/IEEECONF44664.2019.9048957
ISSN: 1058-6393
EISSN: 2576-2303
Pages: 876 - 883
Type of Material: Conference Article
Journal/Proceeding Title: 2019 53rd Asilomar Conference on Signals, Systems, and Computers
Version: Author's manuscript

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.