Matching the Statistical Query Lower Bound for K-Sparse Parity Problems with Stochastic Gradient Descent

arXiv (2024)

Abstract
The k-parity problem is a classical problem in computational complexity and algorithmic theory, serving as a key benchmark for understanding computational classes. In this paper, we solve the k-parity problem with stochastic gradient descent (SGD) on two-layer fully-connected neural networks. We demonstrate that SGD can efficiently solve the k-sparse parity problem on a d-dimensional hypercube (k ≤ O(√d)) with a sample complexity of Õ(d^(k-1)) using 2^Θ(k) neurons, thus matching the established Ω(d^k) lower bounds of Statistical Query (SQ) models. Our theoretical analysis begins by constructing a good neural network capable of correctly solving the k-parity problem. We then demonstrate how a neural network trained with SGD can effectively approximate this good network, solving the k-parity problem with small statistical errors. Our theoretical results and findings are supported by empirical evidence, showcasing the efficiency and efficacy of our approach.
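
To make the setup concrete, the following is a minimal sketch, not the authors' implementation, of the training problem the abstract describes: a two-layer fully-connected ReLU network trained with mini-batch SGD on the k-sparse parity problem, where inputs are uniform over the hypercube {-1, +1}^d and the label is the product of the k coordinates in a hidden subset S. The dimensions, width, learning rate, and squared loss below are illustrative assumptions rather than the specific scalings used in the paper's analysis.

```python
# A minimal sketch, NOT the paper's implementation: mini-batch SGD on a
# two-layer ReLU network for the k-sparse parity problem over {-1, +1}^d.
# All hyperparameters (d, k, width, lr, steps, batch, squared loss) are
# illustrative assumptions, not the scalings analyzed in the paper.
import numpy as np

rng = np.random.default_rng(0)
d, k = 20, 3                      # ambient dimension and sparsity (assumed)
width = 8 * 2 ** k                # hidden width, on the order of 2^Theta(k)
lr, steps, batch = 0.05, 20_000, 32

S = rng.choice(d, size=k, replace=False)        # hidden support of the parity

def sample(n):
    """Uniform hypercube inputs with labels y = prod_{i in S} x_i."""
    X = rng.choice([-1.0, 1.0], size=(n, d))
    y = np.prod(X[:, S], axis=1)                # k-sparse parity in {-1, +1}
    return X, y

W = rng.normal(scale=1.0 / np.sqrt(d), size=(width, d))  # first-layer weights
a = rng.choice([-1.0, 1.0], size=width) / width          # second-layer weights

for _ in range(steps):
    X, y = sample(batch)
    pre = X @ W.T                     # pre-activations, shape (batch, width)
    h = np.maximum(pre, 0.0)          # ReLU features
    g = h @ a - y                     # residual of squared loss, shape (batch,)
    # Gradients of (1/2)*mean((f - y)^2) with respect to both layers.
    grad_a = h.T @ g / batch
    grad_W = ((g[:, None] * (pre > 0)) * a[None, :]).T @ X / batch
    a -= lr * grad_a
    W -= lr * grad_W

X_test, y_test = sample(2000)
acc = np.mean(np.sign(np.maximum(X_test @ W.T, 0.0) @ a) == y_test)
print(f"hidden support {sorted(S.tolist())}, test accuracy {acc:.3f}")
```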
Keywords
Stochastic Gradient Descent,Approximation Algorithms,Coordinate Descent,Probabilistic Learning,Imprecise Probabilities