FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse optimisation on Measures

Bibliographic Details
Title: FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse optimisation on Measures
Authors: De Castro, Yohann; Gadat, Sébastien; Marteau, Clément
Publication Year: 2023
Collection: Mathematics; Statistics
Subject Terms: Mathematics - Optimization and Control, Statistics - Machine Learning
Description: This paper presents a novel algorithm that leverages Stochastic Gradient Descent strategies in conjunction with Random Features to augment the scalability of Conic Particle Gradient Descent (CPGD) specifically tailored for solving sparse optimisation problems on measures. By formulating the CPGD steps within a variational framework, we provide rigorous mathematical proofs demonstrating the following key findings: (i) The total variation norms of the solution measures along the descent trajectory remain bounded, ensuring stability and preventing undesirable divergence; (ii) We establish a global convergence guarantee with a convergence rate of $\mathcal{O}(\log(K)/\sqrt{K})$ over $K$ iterations, showcasing the efficiency and effectiveness of our algorithm; (iii) Additionally, we analyze and establish local control over the first-order condition discrepancy, contributing to a deeper understanding of the algorithm's behavior and reliability in practical applications.
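
As a rough illustration of the approach the abstract describes, the sketch below runs a conic particle gradient scheme with stochastic gradients computed from a random subset of feature coordinates, on a toy sparse-recovery problem over positive measures. The cosine features, step sizes, particle counts, and variable names are assumptions made for illustration only and are not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy objective (illustrative): recover a sparse positive measure mu = sum_i r_i * delta_{x_i}
# on [0, 1] by minimizing F(mu) = 1/(2D) * || sum_i r_i * phi(x_i) - y ||^2 with cosine features.
D = 200
freqs = np.arange(1, D + 1)

x_true = np.array([0.2, 0.5, 0.8])               # true spike locations
a_true = np.array([1.0, 0.7, 1.3])               # true spike weights
y = a_true @ np.cos(np.outer(x_true, freqs))     # observations, shape (D,)

# Over-parameterization: many more particles than true spikes
n = 64
x = rng.uniform(0.0, 1.0, n)                     # particle positions
r = np.full(n, a_true.sum() / n)                 # nonnegative particle weights

eta_r, eta_x = 0.2, 5e-4                         # step sizes (illustrative)
batch = 20                                       # random feature coordinates drawn per step

for k in range(5000):
    idx = rng.choice(D, size=batch, replace=False)     # sampled feature coordinates
    Phi = np.cos(np.outer(x, freqs[idx]))              # (n, batch) feature matrix at particle positions
    resid = r @ Phi - y[idx]                           # residual on the sampled coordinates
    grad_r = Phi @ resid / batch                       # stochastic gradient w.r.t. weights
    dPhi = -np.sin(np.outer(x, freqs[idx])) * freqs[idx]
    grad_x = (dPhi @ resid) * r / batch                # stochastic gradient w.r.t. positions
    r *= np.exp(-eta_r * grad_r)                       # multiplicative (conic/mirror) step keeps r >= 0
    x -= eta_x * grad_x                                # additive step on positions

# Rough sanity checks: total mass and locations of the heaviest particles
print("total mass:", round(r.sum(), 3), "target:", a_true.sum())
print("heaviest particles near:", np.sort(x[np.argsort(r)[-3:]]))

The multiplicative update on the weights is one standard way to realize the conic (mirror-descent) geometry, since it preserves nonnegativity without projection; subsampling feature coordinates stands in for the random-feature stochastic gradients discussed in the abstract.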
Comment: 41 pages
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2312.05993
Accession Number: edsarx.2312.05993
Database: arXiv