Progressive Feedforward Collapse of ResNet Training

Bibliographic Details
Title: Progressive Feedforward Collapse of ResNet Training
Authors: Wang, Sicong; Gai, Kuo; Zhang, Shihua
Publication Year: 2024
Collection: Computer Science; Mathematics; Statistics
Subject Terms: Computer Science - Machine Learning; Computer Science - Artificial Intelligence; Mathematics - Optimization and Control; Mathematics - Statistics Theory; 68T07; I.2.0
Description: Neural collapse (NC) is a simple and symmetric phenomenon observed in deep neural networks (DNNs) at the terminal phase of training: the last-layer features collapse to their class means and form a simplex equiangular tight frame aligned with the classifier vectors. However, how the last-layer features relate to the input data and to the intermediate layers during training remains unexplored. To this end, we characterize the geometry of the intermediate layers of ResNet and propose a novel conjecture, progressive feedforward collapse (PFC), which claims that the degree of collapse increases during forward propagation through a DNN. We derive a transparent model for well-trained ResNets based on the observation that a ResNet trained with weight decay approximates a geodesic curve in Wasserstein space at the terminal phase. The PFC metrics indeed decrease monotonically with depth on various datasets. We further propose a new surrogate model, the multilayer unconstrained feature model (MUFM), which connects intermediate layers through an optimal transport regularizer. The optimal solution of MUFM is inconsistent with NC but is more concentrated relative to the input data. Overall, this study extends NC to PFC to model the collapse of intermediate layers and its dependence on the input data, shedding light on the theoretical understanding of ResNet in classification problems.
Comment: 14 pages, 5 figures
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2405.00985
Accession Number: edsarx.2405.00985
Database: arXiv
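
As a rough illustration of the kind of per-layer collapse measurement the abstract describes, the minimal Python sketch below computes an NC1-style variability ratio (average within-class covariance against between-class covariance) for a layer's features; PFC predicts this value decreases monotonically with depth. The function name, the metric's exact form, and the usage snippet are assumptions for illustration, not the paper's own definitions.

    import numpy as np

    def nc1_variability(features: np.ndarray, labels: np.ndarray) -> float:
        """NC1-style collapse metric: tr(Sigma_W @ pinv(Sigma_B)) / C.

        Smaller values mean features are more collapsed onto their class
        means. `features` has shape (n_samples, d); `labels` has shape
        (n_samples,). Assumed form, in the spirit of the NC1 statistic.
        """
        classes = np.unique(labels)
        C = len(classes)
        d = features.shape[1]
        global_mean = features.mean(axis=0)
        sigma_w = np.zeros((d, d))  # within-class covariance, averaged over classes
        sigma_b = np.zeros((d, d))  # between-class covariance of class means
        for c in classes:
            fc = features[labels == c]
            mu_c = fc.mean(axis=0)
            centered = fc - mu_c
            sigma_w += centered.T @ centered / len(fc)
            diff = (mu_c - global_mean)[:, None]
            sigma_b += diff @ diff.T
        sigma_w /= C
        sigma_b /= C
        return float(np.trace(sigma_w @ np.linalg.pinv(sigma_b)) / C)

    # Hypothetical usage: extract features at each ResNet block and track
    # the metric across depth; under PFC the sequence should decrease.
    # per_layer = [nc1_variability(f, y) for f in layer_features]

Evaluating such a metric at every block, rather than only at the last layer as in standard NC studies, is what lets the progressive (depth-wise) claim be checked empirically.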