Continual Learning with Pretrained Backbones by Tuning in the Input Space

Bibliographic Details
Title: Continual Learning with Pretrained Backbones by Tuning in the Input Space
Authors: Marullo, Simone; Tiezzi, Matteo; Gori, Marco; Melacci, Stefano; Tuytelaars, Tinne
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning; Computer Science - Computer Vision and Pattern Recognition
Description: The intrinsic difficulty of adapting deep learning models to non-stationary environments limits the applicability of neural networks to real-world tasks. This issue is critical in practical supervised learning settings, such as those in which a pre-trained model computes projections into a latent space where different task predictors are learned sequentially over time. In fact, incrementally fine-tuning the whole model to better adapt to new tasks usually results in catastrophic forgetting, degrading performance on past experiences and losing valuable knowledge from the pre-training stage. In this paper, we propose a novel strategy that makes the fine-tuning procedure more effective by leaving the pre-trained part of the network untouched and learning not only the usual classification head, but also a set of newly introduced learnable parameters responsible for transforming the input data. This allows the network to effectively leverage the pre-training knowledge and to find a good trade-off between plasticity and stability with modest computational effort, making it especially suitable for on-the-edge settings. Our experiments on four image classification problems in a continual learning setting confirm the quality of the proposed approach when compared to several fine-tuning procedures and to popular continual learning methods.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2306.02947
Accession Number: edsarx.2306.02947
Database: arXiv
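
The abstract describes freezing the pre-trained backbone and training only a classification head plus learnable parameters that transform the input. The sketch below is one possible reading of that setup, assuming an additive per-pixel perturbation in the input space and a torchvision ResNet-18 backbone; the paper's actual parametrization and backbone may differ.

```python
# Minimal sketch (not the authors' code): freeze a pretrained backbone and learn
# only (i) an additive input-space transformation and (ii) a linear head.
import torch
import torch.nn as nn
import torchvision


class InputTunedClassifier(nn.Module):
    def __init__(self, num_classes: int, image_size: int = 224):
        super().__init__()
        # Pretrained backbone, kept frozen throughout continual learning.
        self.backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")
        feat_dim = self.backbone.fc.in_features
        self.backbone.fc = nn.Identity()
        for p in self.backbone.parameters():
            p.requires_grad = False
        # Learnable parameters acting in the input space
        # (an additive perturbation is assumed here for illustration).
        self.input_delta = nn.Parameter(torch.zeros(1, 3, image_size, image_size))
        # Task classification head.
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Transform the input, extract frozen features, then classify.
        z = self.backbone(x + self.input_delta)
        return self.head(z)


model = InputTunedClassifier(num_classes=10)
# Only the input-space parameters and the head are optimized;
# the backbone weights (and the pre-training knowledge) stay fixed.
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.01
)
```

With this kind of setup, the trainable parameter count is tiny compared to full fine-tuning (one input-sized tensor plus the head), which is consistent with the abstract's claim of modest computational effort for on-the-edge settings.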