A Simplistic Model of Neural Scaling Laws: Multiperiodic Santa Fe Processes

Bibliographic Details
Title: A Simplistic Model of Neural Scaling Laws: Multiperiodic Santa Fe Processes
Authors: Dębowski, Łukasz
Publication Year: 2023
Collection: Computer Science; Mathematics; Statistics
Subject Terms: Computer Science - Information Theory, Computer Science - Machine Learning, Mathematics - Statistics Theory, 60G10 (Primary) 62M20, 94A17 (Secondary)
Description: It was observed that large language models exhibit a power-law decay of cross entropy with respect to the number of parameters and training tokens. When extrapolated literally, this decay implies that the entropy rate of natural language is zero. To better understand this phenomenon -- or artifact -- we construct a simple stationary stochastic process and a memory-based predictor for it that exhibit a power-law decay of cross entropy with a vanishing entropy rate. Our example is based on previously discussed Santa Fe processes, which decompose a random text into a process of narration and time-independent knowledge. Previous discussions assumed that narration is a memoryless source with Zipf's distribution. In this paper, we propose a model of narration that has a vanishing entropy rate and relies on a randomly chosen deterministic sequence called a multiperiodic sequence. Under a suitable parameterization, multiperiodic sequences exhibit asymptotic relative frequencies given by Zipf's law. Remaining agnostic about the value of the entropy rate of natural language, we discuss the relevance of similar constructions for language modeling.
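The abstract contrasts the paper's construction with earlier Santa Fe discussions, where narration was a memoryless source with Zipf's distribution. A minimal sketch of that baseline assumption (the function name `zipf_sample` and all parameter values are illustrative, not from the paper) simply draws i.i.d. ranks with probability proportional to 1/rank:

```python
import random
from collections import Counter

def zipf_sample(n_symbols, n_draws, s=1.0, seed=0):
    """Draw i.i.d. tokens from a Zipf distribution P(k) proportional to k^(-s)
    over ranks 1..n_symbols (the memoryless narration assumed in prior work)."""
    rng = random.Random(seed)
    weights = [k ** (-s) for k in range(1, n_symbols + 1)]
    return rng.choices(range(1, n_symbols + 1), weights=weights, k=n_draws)

tokens = zipf_sample(n_symbols=1000, n_draws=100_000)
counts = Counter(tokens)
# For s = 1, the rank-1 symbol should appear roughly twice as often as rank-2.
ratio = counts[1] / counts[2]
```

The paper's multiperiodic sequences replace this memoryless source with a randomly chosen deterministic sequence whose asymptotic relative frequencies still follow Zipf's law, which is what drives the vanishing entropy rate.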
Comment: 27 pages; 1 figure
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2302.09049
Accession Number: edsarx.2302.09049
Database: arXiv