CPT: Consistent Proxy Tuning for Black-box Optimization

Bibliographic Details
Title: CPT: Consistent Proxy Tuning for Black-box Optimization
Authors: He, Yuanyang, Huang, Zitong, Xu, Xinxing, Goh, Rick Siow Mong, Khan, Salman, Zuo, Wangmeng, Liu, Yong, Feng, Chun-Mei
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning
Description: Black-box tuning has attracted recent attention because the structure and inner parameters of advanced proprietary models are not accessible. Proxy-tuning provides a test-time output adjustment for tuning black-box language models: it applies the difference of the output logits before and after tuning a smaller white-box "proxy" model to improve the black-box model. However, this technique serves only as a decoding-time algorithm, leading to an inconsistency between training and testing that potentially limits overall performance. To address this problem, we introduce Consistent Proxy Tuning (CPT), a simple yet effective black-box tuning method. Unlike Proxy-tuning, CPT additionally exploits the frozen large black-box model and another frozen small white-box model during training, ensuring consistency between the training-stage optimization objective and the test-time proxy combination. This consistency benefits Proxy-tuning and enhances model performance. Note that our method focuses solely on logit-level computation, which makes it model-agnostic and applicable to any task involving logit classification. Extensive experimental results demonstrate the superiority of our CPT in black-box tuning of both Large Language Models (LLMs) and Vision-Language Models (VLMs) across various datasets. The code is available at https://github.com/chunmeifeng/CPT.
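Since the abstract states that CPT operates purely at the logit level, the idea can be illustrated with a small sketch. This is not the authors' implementation (see their repository for that); the function names and the toy logit vectors below are illustrative assumptions. It shows the proxy-tuning test-time rule (black-box logits shifted by the tuned-minus-frozen proxy difference) and how CPT, as described, reuses that same combined output inside the training loss so train and test stay consistent:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def proxy_adjusted_logits(black, proxy_tuned, proxy_frozen):
    """Proxy-tuning's test-time rule: shift the frozen black-box logits
    by the (tuned - frozen) logit difference of the small white-box proxy."""
    return [b + (t - f) for b, t, f in zip(black, proxy_tuned, proxy_frozen)]

def cpt_training_loss(black, proxy_tuned, proxy_frozen, target):
    """CPT, per the abstract: the same proxy-adjusted logits used at test
    time also define the training cross-entropy, so the optimization
    objective matches the test-time proxy combination. Only proxy_tuned
    would receive gradients; black and proxy_frozen stay frozen."""
    probs = softmax(proxy_adjusted_logits(black, proxy_tuned, proxy_frozen))
    return -math.log(probs[target])

# Toy 3-class example (illustrative values, not from the paper).
black = [1.0, 2.0, 3.0]          # frozen black-box logits
proxy_tuned = [2.0, 1.0, 0.0]    # small proxy after tuning
proxy_frozen = [1.0, 1.0, 1.0]   # small proxy before tuning

combined = proxy_adjusted_logits(black, proxy_tuned, proxy_frozen)
loss = cpt_training_loss(black, proxy_tuned, proxy_frozen, target=0)
```

In an actual training loop the same combination would be computed batch-wise on framework tensors, with gradients flowing only through the tuned proxy; the frozen black-box model only needs to expose its output logits.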
Comment: 10 pages, 2 figures plus supplementary materials
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2407.01155
Accession Number: edsarx.2407.01155
Database: arXiv