Memorization Capacity for Additive Fine-Tuning with Small ReLU Networks

Bibliographic Details
Title: Memorization Capacity for Additive Fine-Tuning with Small ReLU Networks
Authors: Sohn, Jy-yong; Kwon, Dohyun; An, Seoyeon; Lee, Kangwook
Publication Year: 2024
Collection: Computer Science; Statistics
Subject Terms: Computer Science - Machine Learning; Statistics - Machine Learning
Description: Fine-tuning large pre-trained models is a common practice in machine learning applications, yet its mathematical analysis remains largely unexplored. In this paper, we study fine-tuning through the lens of memorization capacity. Our new measure, the Fine-Tuning Capacity (FTC), is defined as the maximum number of samples a neural network can fine-tune, or equivalently, as the minimum number of neurons ($m$) needed to arbitrarily change $N$ labels among $K$ samples considered in the fine-tuning process. In essence, FTC extends the memorization capacity concept to the fine-tuning scenario. We analyze FTC for the additive fine-tuning scenario where the fine-tuned network is defined as the summation of the frozen pre-trained network $f$ and a neural network $g$ (with $m$ neurons) designed for fine-tuning. When $g$ is a ReLU network with either 2 or 3 layers, we obtain tight upper and lower bounds on FTC; we show that $N$ samples can be fine-tuned with $m=\Theta(N)$ neurons for 2-layer networks, and with $m=\Theta(\sqrt{N})$ neurons for 3-layer networks, no matter how large $K$ is. Our results recover the known memorization capacity results when $N = K$ as a special case.
Comment: 10 pages, 9 figures, UAI 2024
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2408.00359
Accession Number: edsarx.2408.00359
Database: arXiv
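
To make the additive fine-tuning setup described in the abstract concrete, the following is a minimal sketch (not the authors' code): the fine-tuned model is $f(x) + g(x)$, where $f$ is a frozen pre-trained network and $g$ is a small 2-layer ReLU network with $m$ hidden neurons whose parameters are the only ones updated. The dimensions, architectures, data, and training step below are hypothetical illustrations, written assuming a standard PyTorch environment.

```python
import torch
import torch.nn as nn

d, m = 16, 32  # input dimension and number of fine-tuning neurons (assumed values)

# Pre-trained network f: its parameters stay frozen during fine-tuning.
f = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, 1))
for p in f.parameters():
    p.requires_grad = False

# Small 2-layer ReLU network g (one hidden layer of m neurons) used only for fine-tuning.
g = nn.Sequential(nn.Linear(d, m), nn.ReLU(), nn.Linear(m, 1))

def fine_tuned(x: torch.Tensor) -> torch.Tensor:
    """Additive fine-tuning: frozen model output plus the correction g(x)."""
    return f(x) + g(x)

# Only g's parameters are updated to change the labels of the fine-tuning samples.
optimizer = torch.optim.SGD(g.parameters(), lr=1e-2)
x, y = torch.randn(8, d), torch.randn(8, 1)  # hypothetical fine-tuning samples
loss = nn.functional.mse_loss(fine_tuned(x), y)
loss.backward()
optimizer.step()
```

In this sketch the capacity question from the abstract is how large $m$ must be (relative to the number of relabeled samples $N$) for such a $g$ to fit the new labels exactly.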