AdapMTL: Adaptive Pruning Framework for Multitask Learning Model

Bibliographic Details
Title: AdapMTL: Adaptive Pruning Framework for Multitask Learning Model
Authors: Xiang, Mingcan; Tang, Steven Jiaxun; Yang, Qizheng; Guan, Hui; Liu, Tongping
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computer Vision and Pattern Recognition, Computer Science - Machine Learning
Description: In the domain of multimedia and multimodal processing, the efficient handling of diverse data streams such as images, video, and sensor data is paramount. Model compression and multitask learning (MTL) are crucial in this field, offering the potential to address the resource-intensive demands of processing and interpreting multiple forms of media simultaneously. However, effectively compressing a multitask model presents significant challenges due to the complexities of balancing sparsity allocation and accuracy performance across multiple tasks. To tackle these challenges, we propose AdapMTL, an adaptive pruning framework for MTL models. AdapMTL leverages multiple learnable soft thresholds independently assigned to the shared backbone and the task-specific heads to capture the nuances in different components' sensitivity to pruning. During training, it co-optimizes the soft thresholds and MTL model weights to automatically determine the suitable sparsity level at each component to achieve both high task accuracy and high overall sparsity. It further incorporates an adaptive weighting mechanism that dynamically adjusts the importance of task-specific losses based on each task's robustness to pruning. We demonstrate the effectiveness of AdapMTL through comprehensive experiments on popular multitask datasets, namely NYU-v2 and Tiny-Taskonomy, with different architectures, showcasing superior performance compared to state-of-the-art pruning methods.
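The abstract's core mechanism, per-component soft-threshold pruning, can be illustrated with a minimal sketch. Note this is not the paper's implementation: the component names, example weights, and threshold values below are hypothetical, and AdapMTL additionally learns the thresholds jointly with the model weights during training, which is omitted here. The sketch only shows how a standard soft-thresholding operator, applied with a separate threshold per component (shared backbone vs. task-specific head), yields different sparsity levels in each.

```python
import numpy as np

def soft_threshold(weights, threshold):
    # Standard soft-thresholding operator: shrink each weight's magnitude
    # by `threshold` and zero out weights whose magnitude falls below it.
    return np.sign(weights) * np.maximum(np.abs(weights) - threshold, 0.0)

# Hypothetical components with independently assigned thresholds:
# in AdapMTL these thresholds would be learnable, not fixed constants.
components = {
    "backbone": (np.array([0.8, -0.05, 0.3, -0.9]), 0.1),
    "task_head": (np.array([0.02, 0.5, -0.6]), 0.05),
}

for name, (w, t) in components.items():
    pruned = soft_threshold(w, t)
    sparsity = float(np.mean(pruned == 0.0))
    print(f"{name}: pruned={pruned}, sparsity={sparsity:.2f}")
```

Because each component gets its own threshold, a pruning-sensitive head can retain more weights than a more robust backbone, which is the intuition behind the per-component sparsity allocation described above.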
Comment: 13 pages, 9 figures, Published at ACM Multimedia (ACM MM) 2024
Document Type: Working Paper
DOI: 10.1145/3664647.3681426
Access URL: http://arxiv.org/abs/2408.03913
Accession Number: edsarx.2408.03913
Database: arXiv