Towards optimal hierarchical training of neural networks

Bibliographic Details
Title: Towards optimal hierarchical training of neural networks
Authors: Feischl, Michael; Rieder, Alexander; Zehetgruber, Fabian
Publication Year: 2024
Collection: Computer Science; Mathematics
Subject Terms: Mathematics - Numerical Analysis
Description: We propose a hierarchical training algorithm for standard feed-forward neural networks that adaptively extends the network architecture as soon as the optimization reaches a stationary point. By solving small (low-dimensional) optimization problems, the extended network provably escapes any local minimum or stationary point. Under some assumptions on the approximability of the data with stable neural networks, we show that the algorithm achieves an optimal convergence rate s, in the sense that the loss is bounded by the number of network parameters raised to the power -s. As a byproduct, we obtain computable indicators which judge the optimality of the training state of a given network, and we derive a new notion of generalization error.
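The description outlines the general idea of extending the architecture once training stalls. The sketch below is a hypothetical PyTorch illustration of that idea, not the authors' algorithm: the `widen` helper, the gradient-norm test for a stationary point, the zero-initialised output columns, and all hyperparameters and toy data are assumptions made for the example.

```python
# Minimal sketch: widen a one-hidden-layer network whenever training reaches an
# (approximate) stationary point, preserving the represented function.
# Illustrative only; thresholds, initialisation, and the stationarity test are assumptions.
import torch
import torch.nn as nn

def widen(layer_in: nn.Linear, layer_out: nn.Linear, extra: int):
    """Add `extra` hidden units. New output columns are zero, so the network
    function is unchanged, while new (randomly initialised) input rows give the
    optimizer fresh directions to escape the stationary point."""
    new_in = nn.Linear(layer_in.in_features, layer_in.out_features + extra)
    new_out = nn.Linear(layer_out.in_features + extra, layer_out.out_features)
    with torch.no_grad():
        new_in.weight[: layer_in.out_features] = layer_in.weight
        new_in.bias[: layer_in.out_features] = layer_in.bias
        new_out.weight[:, : layer_out.in_features] = layer_out.weight
        new_out.weight[:, layer_out.in_features:] = 0.0
        new_out.bias.copy_(layer_out.bias)
    return new_in, new_out

torch.manual_seed(0)
x = torch.linspace(-1, 1, 200).unsqueeze(1)
y = torch.sin(3 * x)                          # toy regression target

lin1, lin2 = nn.Linear(1, 4), nn.Linear(4, 1)
loss = torch.tensor(float("inf"))
for step in range(5000):
    loss = ((lin2(torch.tanh(lin1(x))) - y) ** 2).mean()
    params = list(lin1.parameters()) + list(lin2.parameters())
    grads = torch.autograd.grad(loss, params)
    grad_norm = torch.cat([g.flatten() for g in grads]).norm()
    if grad_norm < 1e-3:                      # crude proxy for "stationary point"
        lin1, lin2 = widen(lin1, lin2, extra=4)   # extend architecture, keep training
        continue
    with torch.no_grad():
        for p, g in zip(params, grads):
            p -= 0.05 * g                     # plain gradient-descent step
print(f"final width = {lin1.out_features}, loss = {loss.item():.4f}")
```

In this toy version the extension is cheap because only the new columns of the output layer start from zero; the paper instead solves small low-dimensional optimization problems to choose the extension, which is what yields its provable escape and convergence-rate guarantees.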
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2407.02242
Accession Number: edsarx.2407.02242
Database: arXiv