Provably Neural Active Learning Succeeds via Prioritizing Perplexing Samples

Bibliographic Details
Title: Provably Neural Active Learning Succeeds via Prioritizing Perplexing Samples
Authors: Bu, Dake, Huang, Wei, Suzuki, Taiji, Cheng, Ji, Zhang, Qingfu, Xu, Zhiqiang, Wong, Hau-San
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning
Description: Neural Network-based active learning (NAL) is a cost-effective data selection technique that utilizes neural networks to select and train on a small subset of samples. While existing work has successfully developed various effective or theory-justified NAL algorithms, the understanding of the two commonly used query criteria of NAL, uncertainty-based and diversity-based, remains in its infancy. In this work, we take one step forward by offering a unified explanation for the success of both query criteria-based NAL from a feature learning view. Specifically, we consider a feature-noise data model comprising easy-to-learn or hard-to-learn features disrupted by noise, and conduct analysis over 2-layer NN-based NALs in the pool-based scenario. We provably show that both uncertainty-based and diversity-based NAL are inherently amenable to one and the same principle, i.e., striving to prioritize samples that contain yet-to-be-learned features. We further prove that this shared principle is the key to their success: achieving a small test error within a small labeled set. In contrast, strategy-free passive learning exhibits a large test error due to inadequate learning of yet-to-be-learned features, necessitating a significantly larger label complexity for a sufficient test error reduction. Experimental results validate our findings.
Comment: Accepted by the 41st International Conference on Machine Learning (ICML 2024)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2406.03944
Accession Number: edsarx.2406.03944
Database: arXiv
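The abstract describes pool-based NAL with an uncertainty-based query criterion over a feature-noise data model. The following is a minimal illustrative sketch of that setup, not the paper's actual algorithm: a hypothetical toy pool with a few informative features plus noise dimensions, a 2-layer ReLU network, and a query loop that prioritizes the samples the network is least confident about (all names and hyperparameters are assumptions for illustration).

```python
# Hypothetical sketch of pool-based, uncertainty-based NAL (NOT the paper's
# exact method): a 2-layer NN scores the unlabeled pool each round, and the
# samples closest to the decision boundary are queried for labels.
import numpy as np

rng = np.random.default_rng(0)

# Toy "feature + noise" pool: 2 informative features, 8 noise dimensions.
n, d_noise = 400, 8
X_sig = rng.normal(size=(n, 2))
y = (X_sig[:, 0] + 0.5 * X_sig[:, 1] > 0).astype(float)
X = np.hstack([X_sig, 0.5 * rng.normal(size=(n, d_noise))])

def init_net(d, h=16):
    # Random 2-layer network: d -> h (ReLU) -> 1 (sigmoid).
    return {"W1": rng.normal(scale=0.5, size=(d, h)),
            "W2": rng.normal(scale=0.5, size=(h, 1))}

def forward(p, X):
    H = np.maximum(X @ p["W1"], 0.0)                      # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(H @ p["W2"]).ravel()))   # sigmoid output

def train(p, X, y, lr=0.1, steps=300):
    # Plain gradient descent on the logistic loss.
    for _ in range(steps):
        H = np.maximum(X @ p["W1"], 0.0)
        pr = 1.0 / (1.0 + np.exp(-(H @ p["W2"]).ravel()))
        g = (pr - y)[:, None] / len(y)                    # dLoss/dlogits
        p["W2"] -= lr * H.T @ g
        p["W1"] -= lr * X.T @ ((g @ p["W2"].T) * (H > 0))
    return p

# Start from a small random labeled seed, then query in rounds.
labeled = list(rng.choice(n, size=10, replace=False))
for _ in range(5):                                # 5 rounds, 10 queries each
    p = train(init_net(X.shape[1]), X[labeled], y[labeled])
    pool = np.setdiff1d(np.arange(n), labeled)
    conf = np.abs(forward(p, X[pool]) - 0.5)      # margin from p = 0.5
    labeled += list(pool[np.argsort(conf)[:10]])  # query most uncertain

acc = np.mean((forward(p, X) > 0.5) == y)
print(f"labels used: {len(labeled)}, pool accuracy: {acc:.2f}")
```

The queried low-margin samples play the role of the "perplexing" samples the abstract refers to: points whose features the current network has not yet learned. A diversity-based criterion would replace the `conf` ranking with, e.g., a clustering or coverage objective over the pool.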