
Self-sustaining iterated learning

Author(s): Chazelle, Bernard; Wang, C

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1qm2m
Full metadata record
DC Field | Value | Language
dc.contributor.author | Chazelle, Bernard | -
dc.contributor.author | Wang, C | -
dc.date.accessioned | 2018-07-20T15:08:42Z | -
dc.date.available | 2018-07-20T15:08:42Z | -
dc.date.issued | 2017 | en_US
dc.identifier.citation | Chazelle, B., Wang, C. (2017). Self-sustaining iterated learning. 67 (10.4230/LIPIcs.ITCS.2017.17) | en_US
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1qm2m | -
dc.description.abstract | An important result from psycholinguistics (Griffiths & Kalish, 2005) states that no language can be learned iteratively by rational agents in a self-sustaining manner. We show how to modify the learning process slightly in order to achieve self-sustainability. Our work is in two parts. First, we characterize iterated learnability in geometric terms and show how a slight, steady increase in the lengths of the training sessions ensures self-sustainability for any discrete language class. In the second part, we tackle the nondiscrete case and investigate self-sustainability for iterated linear regression. We discuss the implications of our findings to issues of non-equilibrium dynamics in natural algorithms. | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | 8th Innovations in Theoretical Computer Science Conference, ITCS 2017 | en_US
dc.rights | Author's manuscript | en_US
dc.title | Self-sustaining iterated learning | en_US
dc.type | Conference Article | en_US
dc.identifier.doi | doi:10.4230/LIPIcs.ITCS.2017.17 | -
dc.date.eissued | 2017 | en_US
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article | en_US
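
The abstract above describes iterated learning: each agent in a chain infers a hypothesis from examples produced by its predecessor and then generates examples for the next agent, and the paper's remedy for the resulting drift is a slight, steady increase in the lengths of the training sessions. The Python sketch below is purely illustrative and not the authors' construction: it runs a one-dimensional iterated linear regression (the paper's nondiscrete setting) under an assumed noise model and an arbitrary session-length schedule, to show how lengthening sessions slows the drift away from the original hypothesis.

```python
# Illustrative sketch only (not the paper's construction): one-dimensional
# iterated linear regression. Agent t fits a slope by least squares to n_t
# noisy samples generated from the previous agent's slope, then hands its
# estimate to agent t+1. With a fixed session length the slope performs a
# random walk; a slowly growing schedule (assumed here) dampens the drift.
import numpy as np

rng = np.random.default_rng(0)

def run_chain(generations, session_length, noise=0.5, true_slope=2.0):
    slope = true_slope
    for t in range(generations):
        n = session_length(t)                       # length of training session t
        x = rng.uniform(-1.0, 1.0, size=n)          # stimuli
        y = slope * x + rng.normal(0.0, noise, n)   # responses from previous agent
        slope = float(x @ y / (x @ x))              # least-squares slope through origin
    return slope

constant = run_chain(200, lambda t: 5)        # fixed-length sessions
growing = run_chain(200, lambda t: 5 + t)     # steadily lengthening sessions
print(f"final slope with constant sessions: {constant:.3f}")
print(f"final slope with growing sessions:  {growing:.3f}")
```

The linear schedule used for the growing sessions is an assumed choice for illustration; which growth rates actually suffice is the subject of the paper's analysis and is not reproduced by this toy example.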

Files in This Item:
File | Description | Size | Format
1609.03960v1.pdf | | 609.73 kB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.