DDIO-Mapping: A Fast and Robust Visual-Inertial Odometry for Low-Texture Environment Challenge

Bibliographic Details
Title: DDIO-Mapping: A Fast and Robust Visual-Inertial Odometry for Low-Texture Environment Challenge
Authors: Jiang, Xinyu; Li, Heng; Chen, Chuangquan; Chen, Yongquan; Huang, Junlang; Zhou, Zuguang; Zhou, Yimin; Vong, Chi-Man
Source: IEEE Transactions on Industrial Informatics; 2024, Vol. 20, Issue 3, pp. 4418–4428, 11p
Abstract: Accurate localization and pose estimation remain challenging for autonomous robots in low-texture environments. This article proposes a tightly coupled direct depth-inertial odometry and mapping (DDIO-Mapping) framework to simultaneously tackle three crucial issues in such environments: 1) ineffective feature point extraction; 2) inefficient searching of feature points; and 3) imbalanced feature extraction under uneven illumination conditions. In DDIO-Mapping, a novel robust strategy is designed that combines grayscale and depth features for optimization, instead of relying only on the RGB features used in existing methods. To improve search efficiency, a new RGBD feature extraction method is applied to directly extract both the depth and grayscale features from RGBD images, which only requires searching for feature points in 2-D space rather than in the much larger 3-D space of a K-dimensional (KD) tree. To deal with imbalanced feature extraction, a feature filtering and selection strategy is proposed to adaptively adjust the depth and grayscale weighting. Finally, with the effectively extracted features from RGBD images, a new nonlinear tightly coupled inverse depth residual function is customized to accurately estimate the optimal pose in low-texture environments. The framework is highly robust, accurate, and efficient. Experiments demonstrate that DDIO-Mapping reduces the root-mean-square error by approximately 30% compared to other state-of-the-art algorithms while retaining comparable efficiency of approximately 20–35 ms.
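The abstract describes stacking a grayscale (photometric) residual with an inverse-depth residual under adaptive weights for pose optimization. The following is a minimal sketch of that general idea, not the authors' implementation: the warp model, the nearest-neighbour lookup, and the fixed weights `w_gray` and `w_idepth` are illustrative assumptions, and bounds checking is omitted for brevity.

```python
# Sketch: combined grayscale + inverse-depth residual for one candidate pose.
# All helper names and the exact weighting rule are assumptions, not the
# paper's actual formulation.
import numpy as np

def warp(pts, depth, K, R, t):
    """Back-project pixels (u, v) with depth, apply rigid motion (R, t),
    and re-project with intrinsics K. Returns warped pixels and new depth."""
    u, v = pts[:, 0], pts[:, 1]
    xyz = np.linalg.inv(K) @ np.vstack([u, v, np.ones_like(u)]) * depth
    xyz = R @ xyz + t[:, None]
    uvw = K @ xyz
    return (uvw[:2] / uvw[2]).T, xyz[2]

def stacked_residual(gray_ref, gray_cur, depth_ref, depth_cur,
                     pts, K, R, t, w_gray=1.0, w_idepth=1.0):
    """Stack photometric and inverse-depth residuals for pose (R, t).
    gray_* and depth_* are float images; pts is an (N, 2) integer array
    of (u, v) pixel coordinates in the reference frame."""
    d_ref = depth_ref[pts[:, 1], pts[:, 0]]
    warped, d_pred = warp(pts.astype(float), d_ref, K, R, t)
    uw, vw = np.round(warped).astype(int).T   # nearest-neighbour lookup for brevity
    r_photo = gray_cur[vw, uw].astype(float) - gray_ref[pts[:, 1], pts[:, 0]].astype(float)
    r_idep = 1.0 / depth_cur[vw, uw] - 1.0 / d_pred
    return np.concatenate([w_gray * r_photo, w_idepth * r_idep])
```

In a full pipeline, a residual vector like this would be minimized over the pose (and IMU states, in a tightly coupled setup) with a nonlinear least-squares solver; the per-term weights could be adapted per frame, echoing the paper's filtering and weighting strategy.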
Database: Supplemental Index
Description
ISSN: 1551-3203
DOI: 10.1109/TII.2023.3323680