
Self-sustaining iterated learning

Author(s): Chazelle, Bernard; Wang, C

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1qm2m
Abstract: An important result from psycholinguistics (Griffiths & Kalish, 2005) states that no language can be learned iteratively by rational agents in a self-sustaining manner. We show how to modify the learning process slightly in order to achieve self-sustainability. Our work is in two parts. First, we characterize iterated learnability in geometric terms and show how a slight, steady increase in the lengths of the training sessions ensures self-sustainability for any discrete language class. In the second part, we tackle the nondiscrete case and investigate self-sustainability for iterated linear regression. We discuss the implications of our findings for non-equilibrium dynamics in natural algorithms.
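To give a flavor of the two ideas the abstract combines, here is a hypothetical toy sketch (not the authors' construction): each generation of learners fits a one-parameter linear regression to noisy samples produced by the previous generation's hypothesis. With fixed-size training sessions the estimate performs a random walk and drifts away from the original hypothesis; if session lengths grow over the generations (quadratically here, purely for illustration), the per-generation estimation errors shrink fast enough that the total drift stays bounded, mirroring the self-sustainability phenomenon. All function names and parameter values below are invented for this illustration.

```python
import random

def teach(w, n, sigma, rng):
    """One training session: a teacher with slope w emits n noisy samples,
    and the learner fits a line through the origin by least squares."""
    xs = [rng.uniform(0.0, 1.0) for _ in range(n)]
    ys = [w * x + rng.gauss(0.0, sigma) for x in xs]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def iterated_learning(w0, generations, session_size, sigma=0.1, seed=0):
    """Pass the hypothesis down a chain of learners; session_size(t) is the
    number of training samples shown to the learner in generation t."""
    rng = random.Random(seed)
    w = w0
    for t in range(generations):
        w = teach(w, session_size(t), sigma, rng)
    return w

# Fixed-size sessions: estimation errors accumulate as a random walk.
w_fixed = iterated_learning(2.0, 40, lambda t: 10)

# Growing sessions: the error in generation t scales like 1/(t + 1),
# so the summed squared drift converges and w stays near its start.
w_grow = iterated_learning(2.0, 40, lambda t: 10 * (t + 1) ** 2)
```

The contrast between `w_fixed` and `w_grow` over many random seeds is the point: only the growing-session chain keeps the hypothesis concentrated around its initial value.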
Publication Date: 2017
Electronic Publication Date: 2017
Citation: Chazelle, B., & Wang, C. (2017). Self-sustaining iterated learning. In 8th Innovations in Theoretical Computer Science Conference (ITCS 2017), LIPIcs, Vol. 67. doi:10.4230/LIPIcs.ITCS.2017.17
DOI: 10.4230/LIPIcs.ITCS.2017.17
Type of Material: Conference Article
Journal/Proceeding Title: 8th Innovations in Theoretical Computer Science Conference, ITCS 2017
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.