To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr1qm2m
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chazelle, Bernard | - |
dc.contributor.author | Wang, C | - |
dc.date.accessioned | 2018-07-20T15:08:42Z | - |
dc.date.available | 2018-07-20T15:08:42Z | - |
dc.date.issued | 2017 | en_US |
dc.identifier.citation | Chazelle, B., & Wang, C. (2017). Self-sustaining iterated learning. 8th Innovations in Theoretical Computer Science Conference, ITCS 2017 (LIPIcs, vol. 67). doi:10.4230/LIPIcs.ITCS.2017.17 | en_US |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1qm2m | - |
dc.description.abstract | An important result from psycholinguistics (Griffiths & Kalish, 2005) states that no language can be learned iteratively by rational agents in a self-sustaining manner. We show how to modify the learning process slightly in order to achieve self-sustainability. Our work is in two parts. First, we characterize iterated learnability in geometric terms and show how a slight, steady increase in the lengths of the training sessions ensures self-sustainability for any discrete language class. In the second part, we tackle the nondiscrete case and investigate self-sustainability for iterated linear regression. We discuss the implications of our findings for issues of non-equilibrium dynamics in natural algorithms. | en_US |
dc.language.iso | en_US | en_US |
dc.relation.ispartof | 8th Innovations in Theoretical Computer Science Conference, ITCS 2017 | en_US |
dc.rights | Author's manuscript | en_US |
dc.title | Self-sustaining iterated learning | en_US |
dc.type | Conference Article | en_US |
dc.identifier.doi | doi:10.4230/LIPIcs.ITCS.2017.17 | - |
dc.date.eissued | 2017 | en_US |
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article | en_US |
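The abstract describes iterated learning, where each generation of learners is trained on data produced by the previous generation, and notes that a slight, steady increase in training-session lengths ensures self-sustainability. A minimal illustrative sketch of this idea for iterated linear regression is given below; it is not the paper's construction, and the one-dimensional model, noise level, and session-length schedules are assumptions chosen for illustration:

```python
import random

def fit_slope(xs, ys):
    # Ordinary least squares through the origin: w = sum(x*y) / sum(x*x).
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def iterated_regression(w0, generations, session_len, noise=0.05, seed=0):
    """Each generation learns a slope from noisy examples labelled by the
    previous generation's slope, then becomes the next teacher."""
    rng = random.Random(seed)
    w = w0
    for t in range(1, generations + 1):
        n = session_len(t)  # length of generation t's training session
        xs = [rng.uniform(-1.0, 1.0) for _ in range(n)]
        ys = [w * x + rng.gauss(0.0, noise) for x in xs]
        w = fit_slope(xs, ys)
    return w

# Constant-length sessions: the learned slope drifts like a random walk.
w_const = iterated_regression(1.0, 200, lambda t: 10)
# Steadily growing sessions damp the per-generation drift, so the
# learned slope stays close to the original one (self-sustainability).
w_grow = iterated_regression(1.0, 200, lambda t: 10 + t * t)
```

Under constant-length sessions each generation's estimation error accumulates, whereas with growing sessions the per-generation error variance shrinks fast enough for the cumulative drift to stay bounded, which is the intuition the sketch is meant to convey.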
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
1609.03960v1.pdf | | 609.73 kB | Adobe PDF | View/Download |
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.