Variational Autoencoders and Nonlinear ICA: A Unifying Framework

Bibliographic Details
Title: Variational Autoencoders and Nonlinear ICA: A Unifying Framework
Authors: Khemakhem, Ilyes; Kingma, Diederik P.; Monti, Ricardo Pio; Hyvärinen, Aapo
Source: Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, pages 2207-2217, 2020
Publication Year: 2019
Collection: Computer Science; Statistics
Subject Terms: Statistics - Machine Learning; Computer Science - Machine Learning
Description: The framework of variational autoencoders allows us to efficiently learn deep latent-variable models, such that the model's marginal distribution over observed variables fits the data. Often, we are interested in going a step further and want to approximate the true joint distribution over observed and latent variables, including the true prior and posterior distributions over latent variables. This is known to be generally impossible due to the unidentifiability of the model. We address this issue by showing that for a broad family of deep latent-variable models, identification of the true joint distribution over observed and latent variables is actually possible up to very simple transformations, thus achieving a principled and powerful form of disentanglement. Our result requires a factorized prior distribution over the latent variables that is conditioned on an additionally observed variable, such as a class label or almost any other observation. We build on recent developments in nonlinear ICA, which we extend to the case of noisy, undercomplete, or discrete observations, integrated in a maximum likelihood framework. The result also trivially contains identifiable flow-based generative models as a special case. (A sketch of this conditional-prior structure follows the record below.)
Comment: Accepted for publication at AISTATS 2020. This is a slightly updated version of the published manuscript; see Corrigendum at the end of the paper
Document Type: Working Paper
Access URL: http://arxiv.org/abs/1907.04809
Accession Number: edsarx.1907.04809
Database: arXiv
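
Sketch of the conditional-prior structure described in the abstract (the notation below, including f, Q_i, Z_i, T_{i,j}, and \lambda_{i,j}, is assumed here for illustration rather than taken from the record): observations x are generated from latents z by a possibly noisy mixing f, and the prior over z is factorized conditionally on the auxiliary observed variable u, for instance in exponential-family form:

\[
p_\theta(\mathbf{x}, \mathbf{z} \mid \mathbf{u})
  = p_{\mathbf{f}}(\mathbf{x} \mid \mathbf{z})\,
    p_{\mathbf{T}, \boldsymbol{\lambda}}(\mathbf{z} \mid \mathbf{u}),
\qquad
p_{\mathbf{T}, \boldsymbol{\lambda}}(\mathbf{z} \mid \mathbf{u})
  = \prod_{i} \frac{Q_i(z_i)}{Z_i(\mathbf{u})}
    \exp\!\Big( \sum_{j=1}^{k} T_{i,j}(z_i)\, \lambda_{i,j}(\mathbf{u}) \Big).
\]

As the abstract indicates, it is this conditioning of the factorized latent prior on u (a class label or almost any other observation) that makes the otherwise unidentifiable deep latent-variable model identifiable up to simple transformations of z.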