Report
CLIPood: Generalizing CLIP to Out-of-Distributions
Title: CLIPood: Generalizing CLIP to Out-of-Distributions
Authors: Shu, Yang; Guo, Xingzhuo; Wu, Jialong; Wang, Ximei; Wang, Jianmin; Long, Mingsheng
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning; Computer Science - Computer Vision and Pattern Recognition
Description: Out-of-distribution (OOD) generalization, where the model must handle distribution shifts from training, is a major challenge in machine learning. Contrastive language-image pre-training (CLIP) models have shown impressive zero-shot ability, but further adaptation of CLIP to downstream tasks undesirably degrades OOD performance. This paper aims at generalizing CLIP to out-of-distribution test data on downstream tasks. We propose CLIPood, a fine-tuning method that adapts CLIP models to OOD situations where both domain shifts and open classes may occur in the unseen test data. To exploit the semantic relations between classes from the text modality, CLIPood introduces a new training objective, margin metric softmax (MMS), with class-adaptive margins for fine-tuning. To incorporate both the pre-trained zero-shot model and the fine-tuned task-adaptive model, CLIPood leverages a new optimization strategy, Beta moving average (BMA), which maintains a temporal ensemble weighted by a Beta distribution. Experiments on diverse datasets with different OOD scenarios show that CLIPood consistently outperforms existing generalization techniques. Comment: Accepted by ICML 2023
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2302.00864
Accession Number: edsarx.2302.00864
Database: arXiv
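The Beta moving average (BMA) described in the abstract, a temporal ensemble of model weights weighted by a Beta distribution, can be sketched as follows. This is a minimal illustration only, not the paper's exact formulation: the function name, the flattened-snapshot representation, and the default shape parameters `a` and `b` are all assumptions made here for clarity.

```python
import numpy as np

def beta_moving_average(snapshots, a=0.5, b=0.5):
    """Combine parameter snapshots from fine-tuning into one model.

    Hypothetical sketch of a Beta-weighted temporal ensemble: each
    snapshot is a flat NumPy array of model parameters saved at evenly
    spaced training steps, with snapshot 0 close to the pre-trained
    zero-shot model and the last snapshot fully fine-tuned.
    """
    T = len(snapshots)
    # Evaluate an (unnormalized) Beta(a, b) density at the midpoints of
    # T equal intervals on (0, 1), avoiding the endpoint singularities
    # that occur when a, b < 1.
    t = (np.arange(T) + 0.5) / T
    w = t ** (a - 1) * (1 - t) ** (b - 1)
    w /= w.sum()  # normalize so the ensemble weights sum to 1
    # Temporal ensemble: weighted average of the parameter snapshots.
    return sum(wi * si for wi, si in zip(w, snapshots))
```

With `a = b = 1` the Beta density is uniform and the ensemble reduces to a plain average of all snapshots; other shape parameters tilt the weighting toward the pre-trained or the fine-tuned end of the trajectory.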