Recursive Binary Neural Network Learning Model with 2.28b/Weight Storage Requirement

Bibliographic Details
Title: Recursive Binary Neural Network Learning Model with 2.28b/Weight Storage Requirement
Authors: Guan, Tianchan; Zeng, Xiaoyang; Seok, Mingoo
Publication Year: 2017
Collection: Computer Science
Subject Terms: Computer Science - Neural and Evolutionary Computing
Description: This paper presents a storage-efficient learning model, Recursive Binary Neural Networks, for sensing devices with a limited amount of on-chip data storage, such as less than a few hundred kilobytes. The main idea of the proposed model is to recursively recycle the data storage of synaptic weights (parameters) during training. This enables a device with a given storage constraint to train and instantiate a neural network classifier with a larger number of weights on chip and with fewer off-chip storage accesses, yielding higher classification accuracy, shorter training time, lower energy dissipation, and a smaller on-chip storage requirement. We verified the training model with deep neural network classifiers on the permutation-invariant MNIST benchmark. Our model uses only 2.28 bits/weight and, under the same data storage constraint, achieves ~1% lower classification error than the conventional binary-weight learning model, which still requires 8 to 16 bits of storage per weight. To achieve a similar classification error, the conventional binary model requires ~4x more data storage for weights than the proposed model.
Comment: 10 pages, 4 figures, 2 tables
Document Type: Working Paper
Access URL: http://arxiv.org/abs/1709.05306
Accession Number: edsarx.1709.05306
Database: arXiv
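The storage figures in the abstract rest on weight binarization: a binary weight needs only one bit, while a conventional learning model stores each weight at 8-16 bits. The following is a minimal, illustrative sketch of that storage gap (it does not reproduce the paper's recursive storage-recycling algorithm; the function names `binarize`, `pack_bits`, and `unpack_bits` are hypothetical helpers, not from the paper):

```python
def binarize(weights):
    """Map real-valued weights to {+1, -1} by sign (0 maps to +1)."""
    return [1 if w >= 0 else -1 for w in weights]

def pack_bits(binary_weights):
    """Pack {+1, -1} weights into bytes, 8 weights per byte (1 bit each)."""
    out = bytearray((len(binary_weights) + 7) // 8)
    for i, w in enumerate(binary_weights):
        if w == 1:
            out[i // 8] |= 1 << (i % 8)
    return bytes(out)

def unpack_bits(packed, n):
    """Recover the first n {+1, -1} weights from a packed byte string."""
    return [1 if (packed[i // 8] >> (i % 8)) & 1 else -1 for i in range(n)]

# 9 binarized weights fit in 2 bytes; at a conventional 8-16 bits/weight
# the same 9 weights would occupy 9 to 18 bytes.
weights = [0.3, -1.2, 0.0, 2.5, -0.7, 0.1, -0.2, 1.9, -3.0]
b = binarize(weights)
packed = pack_bits(b)
assert unpack_bits(packed, len(b)) == b
print(len(packed))  # 2
```

The paper's 2.28 bits/weight figure is higher than the 1 bit/weight of the packed representation above because training requires extra per-weight state beyond the final binary values; the recursive recycling scheme keeps that overhead small.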