Non-saturating GAN training as divergence minimization

Bibliographic Details
Title: Non-saturating GAN training as divergence minimization
Authors: Shannon, Matt; Poole, Ben; Mariooryad, Soroosh; Bagby, Tom; Battenberg, Eric; Kao, David; Stanton, Daisy; Skerry-Ryan, RJ
Publication Year: 2020
Collection: Computer Science; Statistics
Subject Terms: Computer Science - Machine Learning; Statistics - Machine Learning
Description: Non-saturating generative adversarial network (GAN) training is widely used and has continued to obtain groundbreaking results. However, so far this approach has lacked strong theoretical justification, in contrast to alternatives such as f-GANs and Wasserstein GANs, which are motivated in terms of approximate divergence minimization. In this paper we show that non-saturating GAN training does in fact approximately minimize a particular f-divergence. We develop general theoretical tools to compare and classify f-divergences and use these to show that the new f-divergence is qualitatively similar to reverse KL. These results help to explain the high sample quality but poor diversity often observed empirically when using this training scheme.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2010.08029
Accession Number: edsarx.2010.08029
Database: arXiv
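
As a rough illustration of the description above, the LaTeX sketch below writes out the standard saturating and non-saturating generator losses from the GAN literature that the abstract refers to. The notation (p for the data distribution, q for the generator distribution, D for the discriminator) is standard but assumed here; the particular f-divergence derived in the paper is not reproduced.

\documentclass{article}
\usepackage{amsmath}
\usepackage{amssymb}
\begin{document}

% Standard GAN discriminator objective (Goodfellow et al., 2014):
% D is trained to maximize this, with p the data distribution and
% q the generator (model) distribution.
\[
  \mathcal{L}_D \;=\; \mathbb{E}_{x \sim p}\,[\log D(x)]
               \;+\; \mathbb{E}_{x \sim q}\,[\log(1 - D(x))].
\]

% Saturating generator loss: G minimizes the second term of the same
% objective, which yields vanishing gradients early in training.
\[
  \mathcal{L}_G^{\text{sat}} \;=\; \mathbb{E}_{x \sim q}\,[\log(1 - D(x))].
\]

% Non-saturating generator loss, the widely used variant this paper analyzes.
\[
  \mathcal{L}_G^{\text{NS}} \;=\; -\,\mathbb{E}_{x \sim q}\,[\log D(x)].
\]

% With the discriminator at its optimum D*(x) = p(x) / (p(x) + q(x)),
% the minimax objective equals 2 JS(p, q) - log 4, so the saturating
% game minimizes Jensen-Shannon divergence. The paper's claim is that
% the non-saturating update instead approximately minimizes a different
% f-divergence, one that behaves qualitatively like the reverse KL
% divergence KL(q || p); that specific divergence is not written out here.

\end{document}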