TSMixer: An All-MLP Architecture for Time Series Forecasting

Bibliographic Details
Title: TSMixer: An All-MLP Architecture for Time Series Forecasting
Authors: Chen, Si-An, Li, Chun-Liang, Yoder, Nate, Arik, Sercan O., Pfister, Tomas
Source: Transactions on Machine Learning Research (TMLR), 09/2023
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence
Description: Real-world time-series datasets are often multivariate with complex dynamics. To capture this complexity, high-capacity architectures like recurrent- or attention-based sequential deep learning models have become popular. However, recent work demonstrates that simple univariate linear models can outperform such deep learning models on several commonly used academic benchmarks. Extending them, in this paper, we investigate the capabilities of linear models for time-series forecasting and present Time-Series Mixer (TSMixer), a novel architecture designed by stacking multi-layer perceptrons (MLPs). TSMixer is based on mixing operations along both the time and feature dimensions to extract information efficiently. On popular academic benchmarks, the simple-to-implement TSMixer is comparable to specialized state-of-the-art models that leverage the inductive biases of specific benchmarks. On the challenging and large-scale M5 benchmark, a real-world retail dataset, TSMixer demonstrates superior performance compared to the state-of-the-art alternatives. Our results underline the importance of efficiently utilizing cross-variate and auxiliary information for improving the performance of time series forecasting. We present various analyses to shed light on the capabilities of TSMixer. The design paradigms utilized in TSMixer are expected to open new horizons for deep learning-based time series forecasting. The implementation is available at https://github.com/google-research/google-research/tree/master/tsmixer
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2303.06053
Accession Number: edsarx.2303.06053
Database: arXiv
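The abstract's core idea of mixing along both the time and feature dimensions can be illustrated with a minimal NumPy sketch. This is an illustrative approximation, not the paper's implementation (which is linked above): it shows one block with a time-mixing step (weights shared across channels, applied over time steps) and a feature-mixing step (a two-layer MLP over channels), each with a residual connection; the shapes, initializations, and function names are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Elementwise ReLU nonlinearity.
    return np.maximum(x, 0.0)

def time_mixing(x, w_time):
    # x: (L, C) — L time steps, C channels.
    # Mix information across time: one (L, L) linear map shared by
    # all channels, followed by a residual connection.
    return x + relu(w_time @ x)

def feature_mixing(x, w1, w2):
    # Mix information across channels: a two-layer MLP applied
    # independently at each time step, with a residual connection.
    return x + relu(x @ w1) @ w2

# Illustrative sizes: lookback L, channels C, hidden width H.
L, C, H = 8, 3, 16
x = rng.normal(size=(L, C))
w_time = 0.1 * rng.normal(size=(L, L))
w1 = 0.1 * rng.normal(size=(C, H))
w2 = 0.1 * rng.normal(size=(H, C))

# One mixer block: time-mixing, then feature-mixing; output keeps (L, C).
y = feature_mixing(time_mixing(x, w_time), w1, w2)
print(y.shape)  # (8, 3)
```

Because both steps preserve the (time, channel) shape, such blocks can be stacked, which is the "stacking multi-layer perceptrons" design the abstract describes; the real model also includes normalization and a final projection to the forecast horizon.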