Activations Through Extensions: A Framework To Boost Performance Of Neural Networks

Bibliographic Details
Title: Activations Through Extensions: A Framework To Boost Performance Of Neural Networks
Authors: Kamanchi, Chandramouli, Mukherjee, Sumanta, Sampath, Kameshwaran, Dayama, Pankaj, Jati, Arindam, Ekambaram, Vijay, Phan, Dzung
Publication Year: 2024
Collection: Computer Science, Mathematics
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Neural and Evolutionary Computing, Mathematics - Numerical Analysis
Description: Activation functions are non-linearities in neural networks that allow them to learn complex mappings between inputs and outputs. Typical choices for activation functions are ReLU, Tanh, Sigmoid, etc., where the choice generally depends on the application domain. In this work, we propose a framework/strategy that unifies several works on activation functions and theoretically explains the performance benefits of these works. We also propose novel techniques that originate from the framework and allow us to obtain "extensions" (i.e. special generalizations of a given neural network) of neural networks through operations on activation functions. We theoretically and empirically show that "extensions" of neural networks have performance benefits over vanilla neural networks, with insignificant space and time complexity costs, on standard test functions. We also show the benefits of neural network "extensions" in the time-series domain on real-world datasets.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2408.03599
Accession Number: edsarx.2408.03599
Database: arXiv
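
Note: the abstract does not spell out the exact construction of an "extension", but the general idea it describes, generalizing an activation function so that the vanilla network is recovered at a particular parameter value, can be illustrated with a short sketch. The following PyTorch example is a hypothetical illustration, not the paper's method: the class name ExtendedReLU, the parameter alpha, and the PReLU-style parameterization are all assumptions made here for concreteness. The activation equals plain ReLU at alpha = 0, so the extended network starts out functionally identical to the vanilla one and adds only a single scalar parameter, consistent with the abstract's claim of insignificant space and time overhead.

import torch
import torch.nn as nn

class ExtendedReLU(nn.Module):
    # Illustrative "extension" of ReLU (an assumption, not the paper's
    # exact construction): f(x) = x for x >= 0, and alpha * x otherwise.
    # At alpha = 0 this is exactly ReLU, so the extended network is a
    # strict generalization of the vanilla one at initialization.
    def __init__(self, init_alpha: float = 0.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(init_alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x >= 0, x, self.alpha * x)

# Usage: swap the activation into an otherwise unchanged network; the
# extra cost is one learnable scalar per activation module.
net = nn.Sequential(nn.Linear(8, 16), ExtendedReLU(), nn.Linear(16, 1))
print(net(torch.randn(4, 8)).shape)  # torch.Size([4, 1])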