Approximation Error and Complexity Bounds for ReLU Networks on Low-Regular Function Spaces

Bibliographic Details
Title: Approximation Error and Complexity Bounds for ReLU Networks on Low-Regular Function Spaces
Authors: Davis, Owen; Geraci, Gianluca; Motamed, Mohammad
Publication Year: 2024
Collection: Computer Science; Statistics
Subject Terms: Statistics - Machine Learning, Computer Science - Machine Learning, 41A25, 41A30, 41A46, 68T07
Description: In this work, we consider the approximation of a large class of bounded functions, under minimal regularity assumptions, by ReLU neural networks. We show that the approximation error can be bounded from above by a quantity proportional to the uniform norm of the target function and inversely proportional to the product of network width and depth. We inherit this approximation error bound from Fourier features residual networks, a type of neural network that uses complex exponential activation functions. Our proof is constructive and proceeds by a careful complexity analysis of the approximation of a Fourier features residual network by a ReLU network.
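The bound described in the abstract can be written schematically as follows. The constant $C$, the error norm $\|\cdot\|$, and the notation $\Phi_{W,D}$ for a ReLU network of width $W$ and depth $D$ are illustrative assumptions, not the paper's exact statement:

```latex
% Schematic form of the claimed bound: the approximation error is
% proportional to the uniform norm of the target f and inversely
% proportional to the width-depth product W * D.
\left\| f - \Phi_{W,D} \right\| \;\le\; C \,\frac{\| f \|_{\infty}}{W \, D}
```

The precise function class, error norm, and constant are specified in the paper itself.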
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2405.06727
Accession Number: edsarx.2405.06727
Database: arXiv