InfiniMotion: Mamba Boosts Memory in Transformer for Arbitrary Long Motion Generation

Bibliographic Details
Title: InfiniMotion: Mamba Boosts Memory in Transformer for Arbitrary Long Motion Generation
Authors: Zhang, Zeyu; Liu, Akide; Chen, Qi; Chen, Feng; Reid, Ian; Hartley, Richard; Zhuang, Bohan; Tang, Hao
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computer Vision and Pattern Recognition
Description: Text-to-motion generation holds potential for film, gaming, and robotics, yet current methods often prioritize short motion generation, making it challenging to produce long motion sequences effectively: (1) current methods struggle to handle long motion sequences as a single input due to prohibitively high computational cost; (2) breaking the generation of long motion sequences into shorter segments can result in inconsistent transitions and requires interpolation or inpainting, which lacks whole-sequence modeling. To address these challenges, we propose InfiniMotion, a method that generates continuous motion sequences of arbitrary length within an autoregressive framework. We highlight its groundbreaking capability by generating a continuous 1-hour human motion sequence of around 80,000 frames. Specifically, we introduce the Motion Memory Transformer with Bidirectional Mamba Memory, enhancing the transformer's memory to process long motion sequences effectively without overwhelming computational resources. Notably, our method achieves over 30% improvement in FID and 6 times longer demonstration compared to previous state-of-the-art methods, showcasing significant advancements in long motion generation. See project webpage: https://steve-zeyu-zhang.github.io/InfiniMotion/
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2407.10061
Accession Number: edsarx.2407.10061
Database: arXiv
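The abstract's key idea is that a Mamba-style state-space memory keeps per-step cost constant regardless of sequence length, and that running it in both directions gives each position context from the whole sequence. The sketch below is a deliberately simplified, pure-Python illustration of that principle (a scalar linear recurrence scanned forward and backward); the function names, the decay/input coefficients `a` and `b`, and the additive fusion of the two scans are assumptions for illustration, not the paper's actual Motion Memory Transformer.

```python
# Conceptual sketch (NOT the paper's implementation): a bidirectional
# linear recurrence, the core mechanism behind state-space (Mamba-style)
# memory. Each direction keeps a single decaying state
#   h_t = a * h_{t-1} + b * x_t,
# so memory per step is O(1) no matter how long the motion sequence is.

def linear_scan(xs, a=0.9, b=0.1):
    """Run a simple linear recurrence over xs; return the state at each step."""
    h, states = 0.0, []
    for x in xs:
        h = a * h + b * x  # decayed summary of the past + new input
        states.append(h)
    return states

def bidirectional_memory(xs, a=0.9, b=0.1):
    """Fuse a forward and a backward scan so every position sees context
    from both directions (here fused by simple addition, an assumption)."""
    fwd = linear_scan(xs, a, b)
    bwd = list(reversed(linear_scan(list(reversed(xs)), a, b)))
    return [f + r for f, r in zip(fwd, bwd)]

# Toy "motion" signal of 4 frames; real inputs would be pose vectors.
frames = [1.0, 0.0, 0.0, 2.0]
print(bidirectional_memory(frames))
```

Because each scan touches every frame exactly once with constant state, the total cost is linear in sequence length, which is why this family of models can in principle extend to arbitrarily long sequences where full self-attention (quadratic cost) cannot.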