Theoretical Analysis of Weak-to-Strong Generalization

Bibliographic Details
Title: Theoretical Analysis of Weak-to-Strong Generalization
Authors: Lang, Hunter; Sontag, David; Vijayaraghavan, Aravindan
Publication Year: 2024
Collection: Computer Science; Statistics
Subject Terms: Computer Science - Machine Learning, Computer Science - Computation and Language, Statistics - Machine Learning
Description: Strong student models can learn from weaker teachers: when trained on the predictions of a weaker model, a strong pretrained student can learn to correct the weak model's errors and generalize to examples where the teacher is not confident, even when these examples are excluded from training. This enables learning from cheap, incomplete, and possibly incorrect label information, such as coarse logical rules or the generations of a language model. We show that existing weak supervision theory fails to account for both of these effects, which we call pseudolabel correction and coverage expansion, respectively. We give a new bound based on expansion properties of the data distribution and student hypothesis class that directly accounts for pseudolabel correction and coverage expansion. Our bounds capture the intuition that weak-to-strong generalization occurs when the strong model is unable to fit the mistakes of the weak teacher without incurring additional error. We show that these expansion properties can be checked from finite data and give empirical evidence that they hold in practice.
Comment: 36 pages, 3 figures
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2405.16043
Accession Number: edsarx.2405.16043
Database: arXiv
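
As an illustration of the two effects the abstract names, pseudolabel correction and coverage expansion, the following is a minimal sketch and not the authors' code: the dataset, the choice of teacher and student models, and the confidence threshold are all illustrative assumptions. A weak linear teacher pseudolabels a nonlinear dataset, a stronger student trains only on the teacher's confident pseudolabels, and the student is then scored separately on the covered region and on the uncovered region it never saw labels for.

```python
# Illustrative weak-to-strong generalization sketch (not the paper's code).
# A weak teacher pseudolabels the data; the student trains ONLY on the
# teacher's confident pseudolabels, then is evaluated on (a) the covered
# region, where it can correct teacher errors (pseudolabel correction), and
# (b) the uncovered region excluded from training (coverage expansion).
import numpy as np
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.RandomState(0)
X, y = make_moons(n_samples=2000, noise=0.15, random_state=0)

# Weak teacher: a linear model fit on a small labeled sample. It cannot
# represent the moons boundary, so its pseudolabels contain systematic errors.
idx = rng.choice(len(X), size=50, replace=False)
teacher = LogisticRegression().fit(X[idx], y[idx])

proba = teacher.predict_proba(X)[:, 1]
pseudo = (proba > 0.5).astype(int)
# Coverage = the teacher's confident region; the 0.3 margin is an assumption.
confident = np.abs(proba - 0.5) > 0.3

# Strong student: a flexible model trained only on confident pseudolabels.
student = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=2000,
                        random_state=0).fit(X[confident], pseudo[confident])

pred = student.predict(X)
print("teacher pseudolabel acc (all): %.3f" % (pseudo == y).mean())
print("student acc (covered):         %.3f" % (pred[confident] == y[confident]).mean())
print("student acc (uncovered):       %.3f" % (pred[~confident] == y[~confident]).mean())
```

If the sketch behaves as the abstract describes, the student exceeds the teacher's pseudolabel accuracy on the covered region (pseudolabel correction) and performs well above chance on the uncovered region despite never training on it (coverage expansion); the paper's expansion-based bounds formalize when the student cannot fit the teacher's mistakes without incurring additional error, which is what forces this behavior.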