UniT: Unified Tactile Representation for Robot Learning

Bibliographic Details
Title: UniT: Unified Tactile Representation for Robot Learning
Authors: Xu, Zhengtong; Uppuluri, Raghava; Zhang, Xinwei; Fitch, Cael; Crandall, Philip Glen; Shou, Wan; Wang, Dongyi; She, Yu
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Robotics
Description: UniT is a novel approach to tactile representation learning that uses a VQVAE to learn a compact latent space serving as the tactile representation. The representation is trained on tactile images of a single simple object, yet it transfers and generalizes: it can be zero-shot transferred to various downstream tasks, including perception tasks and manipulation policy learning. Benchmarking on an in-hand 3D pose estimation task shows that UniT outperforms existing visual and tactile representation learning methods. Additionally, UniT's effectiveness in policy learning is demonstrated across three real-world tasks involving diverse manipulated objects and complex robot-object-environment interactions. Through extensive experimentation, UniT is shown to be a simple-to-train, plug-and-play, yet broadly effective method for tactile representation learning. For more details, please refer to the open-source repository https://github.com/ZhengtongXu/UniT and the project website https://zhengtongxu.github.io/unifiedtactile.github.io/.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2408.06481
Accession Number: edsarx.2408.06481
Database: arXiv
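The description above centers on a VQVAE, whose defining step is quantizing encoder features against a learned codebook. The following is a minimal illustrative sketch of that quantization step only, not UniT's actual implementation (see the linked repository for that); the array sizes and names are toy assumptions.

```python
import numpy as np

def quantize(features, codebook):
    """Map each feature vector to its nearest codebook entry.

    features: (N, D) array of encoder outputs, e.g. the flattened
              spatial grid of a tactile image's feature map.
    codebook: (K, D) array of K learned embedding vectors.
    Returns (indices, quantized): discrete codes and their embeddings.
    """
    # Squared Euclidean distance from every feature to every code: (N, K).
    d = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    indices = d.argmin(axis=1)       # (N,) discrete latent codes
    quantized = codebook[indices]    # (N, D) nearest embeddings
    return indices, quantized

# Toy example: 8 codes of dimension 4, a 16-vector feature grid.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))
features = rng.normal(size=(16, 4))
idx, q = quantize(features, codebook)
print(idx.shape, q.shape)  # (16,) (16, 4)
```

In a full VQVAE the codebook is learned jointly with the encoder and decoder; the discrete `idx` grid (or the quantized embeddings `q`) is what a downstream perception or policy network would consume as the compact tactile representation.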