Preprints, Working Papers, ...

Approximation of Smoothness Classes by Deep ReLU Networks

Abstract: We consider approximation rates of sparsely connected deep rectified linear unit (ReLU) and rectified power unit (RePU) neural networks for functions in Besov spaces $B^\alpha_{q}(L^p)$ in arbitrary dimension $d$, on bounded or unbounded domains. We show that RePU networks with a fixed activation function attain optimal approximation rates for functions in the Besov space $B^\alpha_{\tau}(L^\tau)$ on the critical embedding line $1/\tau=\alpha/d+1/p$ for arbitrary smoothness order $\alpha>0$. Moreover, we show that ReLU networks attain near-optimal rates for any Besov space strictly above the critical line. By interpolation theory, it follows that the entire range of smoothness classes at or above the critical line is (near-)optimally approximated by deep ReLU/RePU networks.
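
As a quick illustration of the critical embedding line (an editorial worked instance, not taken from the record itself): choosing dimension $d=2$, integrability $p=2$, and smoothness $\alpha=1$ in the condition $1/\tau=\alpha/d+1/p$ gives

\[
% illustrative instance, not from the paper: d=2, p=2, alpha=1
\frac{1}{\tau} \;=\; \frac{\alpha}{d} + \frac{1}{p} \;=\; \frac{1}{2} + \frac{1}{2} \;=\; 1,
\qquad \text{so} \quad \tau = 1 .
\]

Thus $B^{1}_{1}(L^{1})$ lies exactly on the critical line for approximation in $L^{2}$, while spaces with smoothness $\alpha > d(1/\tau - 1/p)$ sit strictly above it and fall under the near-optimal ReLU rates.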

https://hal.archives-ouvertes.fr/hal-02909881
Contributor: Anthony Nouy
Submitted on: Friday, July 31, 2020 - 12:47:02 PM
Last modification on: Tuesday, August 25, 2020 - 11:26:36 AM

Identifiers

  • HAL Id: hal-02909881, version 1
  • arXiv: 2007.15645

Citation

Mazen Ali, Anthony Nouy. Approximation of Smoothness Classes by Deep ReLU Networks. 2020. ⟨hal-02909881⟩
