Extractor-based time-space lower bounds for learning

Author(s): Garg, Sumegha; Raz, Ran; Tal, Avishay

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1fr61
Abstract: A matrix M: A × X → {−1, 1} corresponds to the following learning problem: an unknown element x ∈ X is chosen uniformly at random, and a learner tries to learn x from a stream of samples (a1, b1), (a2, b2), …, where for every i, ai ∈ A is chosen uniformly at random and bi = M(ai, x). Assume that k, l, r are such that any submatrix of M with at least 2−k · |A| rows and at least 2−l · |X| columns has a bias of at most 2−r. We show that any learning algorithm for the learning problem corresponding to M requires either a memory of size at least Ω(k · l) or at least 2Ω(r) samples. The result holds even if the learner has an exponentially small success probability (of 2−Ω(r)). In particular, this shows that for a large class of learning problems, any learning algorithm requires either a memory of size at least Ω((log |X|) · (log |A|)) or an exponential number of samples, achieving a tight Ω((log |X|) · (log |A|)) lower bound on the size of the memory, rather than the bound of Ω(min{(log |X|)2, (log |A|)2}) obtained in previous works by Raz [FOCS'17] and Moshkovitz and Moshkovitz [ITCS'18]. Moreover, our result implies all previous memory-samples lower bounds, as well as a number of new applications. Our proof builds on the work of Raz [FOCS'17], which gave a general technique for proving memory-samples lower bounds.
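As an illustration of the learning model in the abstract, the sketch below simulates the sample stream for parity learning, the canonical instance from Raz [FOCS'17], where A = X = {0, 1}^n and M(a, x) = (−1)^⟨a, x⟩ (inner product mod 2). The function and variable names here are illustrative, not from the paper.

```python
import random

def parity_sample_stream(n, x, num_samples, rng):
    """Yield samples (a_i, b_i) with a_i uniform over {0,1}^n and
    b_i = M(a_i, x) = (-1)^<a_i, x>, for the parity-learning matrix M."""
    for _ in range(num_samples):
        a = [rng.randrange(2) for _ in range(n)]           # a_i chosen uniformly at random
        inner = sum(ai * xi for ai, xi in zip(a, x)) % 2   # <a_i, x> mod 2
        yield a, -1 if inner else 1                        # b_i in {-1, 1}

rng = random.Random(0)
n = 8
x = [rng.randrange(2) for _ in range(n)]  # unknown element, uniform over X = {0,1}^n
for a, b in parity_sample_stream(n, x, 3, rng):
    print(a, b)
```

In this instance the theorem applies with k, l, r all Θ(n), so any learner needs either Ω(n²) bits of memory or 2^Ω(n) samples; a learner that could store all samples would recover x after O(n) samples by Gaussian elimination.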
Publication Date: Jun-2018
Citation: Garg, Sumegha, Ran Raz, and Avishay Tal. "Extractor-based time-space lower bounds for learning." Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing (2018), pp. 990-1002. doi:10.1145/3188745.3188962
DOI: 10.1145/3188745.3188962
Pages: 990 - 1002
Type of Material: Conference Article
Series/Report no.: STOC 2018
Journal/Proceeding Title: Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.