**Speaker:** Jonathan Siegel

**Title:** Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev Spaces

**Abstract:** Deep ReLU neural networks are among the most widely used classes of neural networks in practical applications. We consider the problem of determining optimal $L_p$-approximation rates for deep ReLU neural networks on the Sobolev class $W^s(L_q)$ for all $1 \le p, q \le \infty$ and $s > 0$. This problem is important for studying the application of neural networks in scientific computing, and existing sharp results are only available when $q = \infty$, i.e., when the derivatives are measured in $L_\infty$. In our work, we extend these results and determine the best possible rates for all $p$, $q$, and $s$ for which a compact Sobolev embedding holds, i.e., when $s/d > 1/q - 1/p$. We will discuss some of the technical details of the proof and conclude by giving a few open research directions.