Integrating Present and Past in Unsupervised Continual Learning

Bibliographic Details
Title: Integrating Present and Past in Unsupervised Continual Learning
Authors: Zhang, Yipeng; Charlin, Laurent; Zemel, Richard; Ren, Mengye
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning; Computer Science - Computer Vision and Pattern Recognition
Description: We formulate a unifying framework for unsupervised continual learning (UCL), which disentangles learning objectives that are specific to the present and the past data, encompassing stability, plasticity, and cross-task consolidation. The framework reveals that many existing UCL approaches overlook cross-task consolidation and try to balance plasticity and stability in a shared embedding space. This results in worse performance due to a lack of within-task data diversity and reduced effectiveness in learning the current task. Our method, Osiris, which explicitly optimizes all three objectives on separate embedding spaces, achieves state-of-the-art performance on all benchmarks, including two novel benchmarks proposed in this paper featuring semantically structured task sequences. Compared to standard benchmarks, these two structured benchmarks more closely resemble visual signals received by humans and animals when navigating real-world environments. Finally, we show some preliminary evidence that continual models can benefit from such realistic learning scenarios. (A hedged illustrative sketch of the three objectives follows the record below.)
Comment: CoLLAs 2024 (Oral)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2404.19132
Accession Number: edsarx.2404.19132
Database: arXiv
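
To make the three objectives in the abstract concrete, here is a minimal, hypothetical sketch, not the authors' released Osiris implementation: a shared backbone with separate projection heads, a plasticity term on current-task data, a stability term aligning replayed (memory) samples with a frozen past checkpoint, and a cross-task consolidation term contrasting current and memory samples in a shared space. All names (UCLModel, contrastive_pair_loss, combined_loss), dimensions, and the specific contrastive form are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class UCLModel(nn.Module):
    """Hypothetical backbone with two projection heads (illustrative only)."""

    def __init__(self, in_dim=784, feat_dim=128, proj_dim=64):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.head_current = nn.Linear(feat_dim, proj_dim)  # space for plasticity
        self.head_shared = nn.Linear(feat_dim, proj_dim)   # space for stability/consolidation

    def forward(self, x):
        h = self.backbone(x)
        return self.head_current(h), self.head_shared(h)


def contrastive_pair_loss(z1, z2, temperature=0.5):
    """NT-Xent-style loss treating row i of z1 and z2 as a positive pair."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)


def combined_loss(model, frozen_past, cur_v1, cur_v2, mem_v1, mem_v2):
    """Sum of plasticity, stability, and cross-task consolidation terms."""
    zc1, zs_c1 = model(cur_v1)   # two augmented views of the current batch
    zc2, zs_c2 = model(cur_v2)
    _, zs_m1 = model(mem_v1)     # two augmented views of the memory batch
    _, zs_m2 = model(mem_v2)

    # Plasticity: learn the current task in its own embedding space.
    loss_plasticity = contrastive_pair_loss(zc1, zc2)

    # Stability: keep memory samples aligned with a frozen past checkpoint.
    with torch.no_grad():
        _, zs_m_old = frozen_past(mem_v1)
    loss_stability = contrastive_pair_loss(zs_m1, zs_m_old)

    # Cross-task consolidation: contrast current and memory samples jointly
    # in the shared space, so samples from different tasks act as negatives.
    loss_consolidation = contrastive_pair_loss(
        torch.cat([zs_c1, zs_m1]), torch.cat([zs_c2, zs_m2]))

    return loss_plasticity + loss_stability + loss_consolidation


if __name__ == "__main__":
    model, frozen_past = UCLModel(), UCLModel()
    frozen_past.load_state_dict(model.state_dict())  # stand-in for a past checkpoint
    for p in frozen_past.parameters():
        p.requires_grad_(False)
    cur_v1, cur_v2 = torch.randn(32, 784), torch.randn(32, 784)
    mem_v1, mem_v2 = torch.randn(16, 784), torch.randn(16, 784)
    loss = combined_loss(model, frozen_past, cur_v1, cur_v2, mem_v1, mem_v2)
    loss.backward()
    print(float(loss))
```

The equal weighting of the three terms and the choice of which head each term uses are placeholders; the point of the sketch is only that present-specific, past-specific, and cross-task objectives can be computed on separate embedding spaces and summed into a single training loss.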