
Calibration, Entropy Rates, and Memory in Language Models

Author(s): Braverman, Mark; Chen, X; Kakade, SM; Narasimhan, Karthik; Zhang, C; Zhang, Y

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr17z5t
Full metadata record
DC Field | Value | Language
dc.contributor.author | Braverman, Mark | -
dc.contributor.author | Chen, X | -
dc.contributor.author | Kakade, SM | -
dc.contributor.author | Narasimhan, Karthik | -
dc.contributor.author | Zhang, C | -
dc.contributor.author | Zhang, Y | -
dc.date.accessioned | 2021-10-08T19:47:10Z | -
dc.date.available | 2021-10-08T19:47:10Z | -
dc.date.issued | 2019-06-01 | en_US
dc.identifier.citation | Braverman, M, Chen, X, Kakade, SM, Narasimhan, K, Zhang, C, Zhang, Y. (2019). Calibration, Entropy Rates, and Memory in Language Models. eprint arXiv:1906.05664, arXiv - 1906.05664 | en_US
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr17z5t | -
dc.description.abstract | Building accurate language models that capture meaningful long-term dependencies is a core challenge in natural language processing. Towards this end, we present a calibration-based approach to measure long-term discrepancies between a generative sequence model and the true distribution, and use these discrepancies to improve the model. Empirically, we show that state-of-the-art language models, including LSTMs and Transformers, are miscalibrated: the entropy rates of their generations drift dramatically upward over time. We then provide provable methods to mitigate this phenomenon. Furthermore, we show how this calibration-based approach can also be used to measure the amount of memory that language models use for prediction. | en_US
dc.format.extent | arXiv - 1906.05664 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | eprint arXiv:1906.05664 | en_US
dc.rights | Author's manuscript | en_US
dc.title | Calibration, Entropy Rates, and Memory in Language Models | en_US
dc.type | Journal Article | en_US
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article | en_US
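
The abstract above centers on one measurement: the entropy of a model's next-token distribution along its own sampled generations, which the authors observe drifting upward over time. The following is a minimal, illustrative sketch of how such a per-step entropy curve could be estimated; it is not the paper's code, and the Hugging Face GPT-2 interface used here (GPT2LMHeadModel, GPT2Tokenizer) is an assumption made purely for the example.

# Illustrative sketch (not the authors' code): estimate the entropy of a causal
# LM's next-token distribution at each step of its own sampled generation.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def entropy_per_step(prompt, steps=50):
    """Sample a continuation token by token; return the entropy (in nats) of the
    model's conditional next-token distribution at each generation step."""
    ids = tokenizer.encode(prompt, return_tensors="pt")
    entropies = []
    with torch.no_grad():
        for _ in range(steps):
            logits = model(ids).logits[0, -1]               # logits for the next token
            logp = torch.log_softmax(logits, dim=-1)
            probs = logp.exp()
            entropies.append(float(-(probs * logp).sum()))  # entropy of the conditional
            next_id = torch.multinomial(probs, 1)           # sample from the model itself
            ids = torch.cat([ids, next_id.view(1, 1)], dim=1)
    return entropies

curve = entropy_per_step("The meaning of life is", steps=50)
print([round(h, 2) for h in curve])  # an upward trend is the drift the abstract describes

Averaging such curves over many prompts and checking whether they trend upward is one way to probe the kind of miscalibration the abstract refers to; the paper itself develops calibration-based corrections beyond this simple diagnostic.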

Files in This Item:
File | Description | Size | Format
CalibrationEntropyRatesMemoryLanguageModels.pdf | - | 674.74 kB | Adobe PDF
Calibration, Entropy Rates, and Memory in Language Models.pdf | - | 408.67 kB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.