Formal Language Recognition by Hard Attention Transformers: Perspectives from Circuit Complexity

Bibliographic Details
Title: Formal Language Recognition by Hard Attention Transformers: Perspectives from Circuit Complexity
Authors: Hao, Yiding; Angluin, Dana; Frank, Robert
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Computational Complexity; Computer Science - Artificial Intelligence; Computer Science - Computation and Language; Computer Science - Formal Languages and Automata Theory; Computer Science - Machine Learning
Description: This paper analyzes three formal models of Transformer encoders that differ in the form of their self-attention mechanism: unique hard attention (UHAT); generalized unique hard attention (GUHAT), which generalizes UHAT; and averaging hard attention (AHAT). We show that UHAT and GUHAT Transformers, viewed as string acceptors, can only recognize formal languages in the complexity class AC$^0$, the class of languages recognizable by families of Boolean circuits of constant depth and polynomial size. This upper bound subsumes Hahn's (2020) results that GUHAT cannot recognize the DYCK languages or the PARITY language, since those languages are outside AC$^0$ (Furst et al., 1984). In contrast, the non-AC$^0$ languages MAJORITY and DYCK-1 are recognizable by AHAT networks, implying that AHAT can recognize languages that UHAT and GUHAT cannot.
Comment: To appear in Transactions of the Association for Computational Linguistics
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2204.06618
Accession Number: edsarx.2204.06618
Database: arXiv
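
The description above notes that AHAT networks can recognize MAJORITY, a language outside AC$^0$. The following minimal Python sketch illustrates the core idea: with a constant query, all positions tie for the maximal attention score, so averaging hard attention averages the value vectors over the entire string, and thresholding that average decides MAJORITY. The scalar embeddings and the 1/2 threshold are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def ahat_majority(s: str) -> bool:
    """Toy averaging-hard-attention (AHAT) recognizer for MAJORITY over
    {0, 1}*: accept iff strictly more than half the tokens are 1s.
    Embeddings and threshold are illustrative, not from the paper."""
    if not s:
        return False  # the empty string has no majority of 1s
    # Value embedding per position: '1' -> 1.0, '0' -> 0.0.
    values = np.array([1.0 if c == "1" else 0.0 for c in s])
    # With a constant (input-independent) query, every position receives
    # the same attention score, so averaging hard attention averages over
    # all maximal-scoring positions -- here, the whole string.
    attended = values.mean()
    # The attended value is exactly the fraction of 1s; comparing it to
    # 1/2 decides MAJORITY.
    return bool(attended > 0.5)

assert ahat_majority("11010")      # three 1s out of five: accept
assert not ahat_majority("10010")  # two 1s out of five: reject
```

No constant-depth, polynomial-size Boolean circuit family can compute this fraction-of-1s comparison (MAJORITY is outside AC$^0$), which is why this toy recognizer separates AHAT from UHAT and GUHAT in the sense of the paper's result.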