Speaker: Jonathan Siegel
Title: Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev Spaces
Abstract: Deep ReLU neural networks are among the most widely used classes of neural networks in practical applications. We consider the problem of determining optimal Lp-approximation rates for deep ReLU neural networks on the Sobolev class Ws(Lq) for all 1 ≤ p, q ≤ ∞ and s > 0. This problem is important for studying the application of neural networks in scientific computing, and existing sharp results are only available when q = ∞, i.e., when the derivatives are measured in L∞. In our work, we extend these results and determine the best possible rates for all p, q, and s for which a compact Sobolev embedding holds, i.e., when s/d > 1/q − 1/p, where d is the input dimension. We will discuss some of the technical details of the proof and conclude by giving a few open research directions.
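As a small illustration of the regime the abstract describes, the compact-embedding condition s/d > 1/q − 1/p can be checked numerically; the function below is an illustrative sketch (its name and interface are not from the talk), with p or q set to infinity to model the L∞ case.

```python
import math

def compact_embedding_holds(s, d, p, q):
    """Check the compact Sobolev embedding condition s/d > 1/q - 1/p
    for approximating W^s(L_q) functions in the L_p norm.

    Pass math.inf for p or q to model the L-infinity case
    (its reciprocal is taken to be 0).
    """
    def inv(x):
        return 0.0 if math.isinf(x) else 1.0 / x
    return s / d > inv(q) - inv(p)

# Smooth functions on a low-dimensional domain: s = 2, d = 2, q = 1, p = 2
# gives 1 > 1 - 1/2, so the condition holds.
print(compact_embedding_holds(2, 2, 2, 1))

# Borderline case: s = 1, d = 2, q = 2, p = infinity gives 1/2 > 1/2,
# which fails, so no compact embedding.
print(compact_embedding_holds(1, 2, math.inf, 2))
```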