Minimizing Convex Functionals over Space of Probability Measures via KL Divergence Gradient Flow

Bibliographic Details
Title: Minimizing Convex Functionals over Space of Probability Measures via KL Divergence Gradient Flow
Authors: Yao, Rentian; Huang, Linjun; Yang, Yun
Publication Year: 2023
Collection: Mathematics; Statistics
Subject Terms: Mathematics - Statistics Theory
Description: Motivated by the computation of the non-parametric maximum likelihood estimator (NPMLE) and the Bayesian posterior in statistics, this paper explores the problem of convex optimization over the space of all probability distributions. We introduce an implicit scheme, called the implicit KL proximal descent (IKLPD) algorithm, for discretizing a continuous-time gradient flow relative to the Kullback-Leibler divergence for minimizing a convex target functional. We show that IKLPD converges to a global optimum at a polynomial rate from any initialization; moreover, if the objective functional is strongly convex relative to the KL divergence, for example, when the target functional itself is a KL divergence as in the context of Bayesian posterior computation, IKLPD exhibits globally exponential convergence. Computationally, we propose a numerical method based on normalizing flow to realize IKLPD. Conversely, our numerical method can also be viewed as a new approach that sequentially trains a normalizing flow for minimizing a convex functional with a strong theoretical guarantee.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2311.00894
Accession Number: edsarx.2311.00894
Database: arXiv
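Based on the abstract's description, the IKLPD iteration presumably takes the standard proximal-point form relative to the KL divergence; a minimal sketch follows, where the symbols $F$ (the convex target functional), $\mu_k$ (the current iterate), $\eta$ (the step size), and $\mathcal{P}$ (the space of probability measures) are notational assumptions not taken from the record itself:

```latex
% Hypothetical sketch of one implicit KL proximal descent step
% (assumed form; not quoted from the paper):
\mu_{k+1} \;=\; \operatorname*{arg\,min}_{\mu \in \mathcal{P}}
\Big\{ F(\mu) \;+\; \tfrac{1}{\eta}\,\mathrm{KL}\!\left(\mu \,\big\|\, \mu_k\right) \Big\}
```

Under this reading, each step balances decreasing the objective $F$ against staying KL-close to the previous iterate, which is the implicit (backward) discretization of the continuous-time gradient flow mentioned in the abstract.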