Can a Transformer Represent a Kalman Filter?

Bibliographic Details
Title: Can a Transformer Represent a Kalman Filter?
Authors: Goel, Gautam, Bartlett, Peter
Publication Year: 2023
Collection: Computer Science; Statistics
Subject Terms: Computer Science - Machine Learning, Statistics - Machine Learning
Description: Transformers are a class of autoregressive deep learning architectures that have recently achieved state-of-the-art performance in various vision, language, and robotics tasks. We revisit the problem of Kalman Filtering in linear dynamical systems and show that Transformers can approximate the Kalman Filter in a strong sense. Specifically, for any observable LTI system we construct an explicit causally-masked Transformer which implements the Kalman Filter, up to a small additive error which is bounded uniformly in time; we call our construction the Transformer Filter. Our construction is based on a two-step reduction. We first show that a softmax self-attention block can exactly represent a Nadaraya-Watson kernel smoothing estimator with a Gaussian kernel. We then show that this estimator closely approximates the Kalman Filter. We also investigate how the Transformer Filter can be used for measurement-feedback control and prove that the resulting nonlinear controllers closely approximate the performance of standard optimal control policies such as the LQG controller.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2312.06937
Accession Number: edsarx.2312.06937
Database: arXiv
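The first step of the reduction described in the abstract — that a softmax self-attention block can exactly represent a Nadaraya-Watson kernel smoothing estimator with a Gaussian kernel — can be sketched numerically. The sketch below is an illustration of that equivalence, not the paper's construction: it relies on the keys having equal norms (here, unit norm), so the squared-norm terms of the Gaussian kernel cancel inside the softmax and dot-product attention weights coincide with the kernel weights. All variable names and the bandwidth choice are assumptions for this demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 4, 8

# Keys constrained to the unit sphere: with ||k_i||^2 constant across i,
# the ||k_i||^2 term in the Gaussian kernel cancels in the softmax.
K = rng.normal(size=(n, d))
K = K / np.linalg.norm(K, axis=1, keepdims=True)
V = rng.normal(size=(n, 1))   # values
q = rng.normal(size=(d,))     # query
h = 1.0                       # kernel bandwidth (assumed)

# Nadaraya-Watson estimate with a Gaussian kernel centered at q
w_nw = np.exp(-np.sum((K - q) ** 2, axis=1) / (2 * h ** 2))
nw_out = (w_nw / w_nw.sum()) @ V

# Softmax attention with logits q . k_i / h^2; the shared ||q||^2 factor
# and the constant ||k_i||^2 factor both cancel under normalization.
logits = (K @ q) / h ** 2
a = np.exp(logits - logits.max())
attn_out = (a / a.sum()) @ V

assert np.allclose(nw_out, attn_out)
```

Expanding the Gaussian kernel as exp(-(||q||² - 2q·kᵢ + ||kᵢ||²)/(2h²)) makes the cancellation explicit: only the exp(q·kᵢ/h²) factor differs across i, which is exactly the softmax attention weight.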