VID-SLAM: Robust Pose Estimation with RGBD-Inertial Input for Indoor Robotic Localization


Abstract

This study proposes a tightly coupled multi-sensor Simultaneous Localization and Mapping (SLAM) framework that integrates RGB-D and inertial measurements to achieve highly accurate six-degree-of-freedom (6-DOF) metric localization in a variety of environments. By jointly considering geometric consistency, inertial measurement unit (IMU) constraints, and visual re-projection errors, we present visual-inertial-depth odometry (VIDO), an efficient state-estimation back-end that minimises the cascading losses of all factors. Existing visual-inertial odometry systems rely on visual feature-based constraints to eliminate the translational displacement and angular drift produced by IMU noise. To strengthen these constraints, we introduce the iterative closest point (ICP) error of adjacent frames and update the state vectors of observed frames by minimising the estimation errors of all sensors. Moreover, a loop-closure module further optimises the global pose graph to correct long-term drift. For the experiments, we collect an RGB-D-inertial dataset for a comprehensive evaluation of VID-SLAM. The dataset contains RGB-D image pairs, IMU measurements, and two types of ground-truth data. The experimental results show that VID-SLAM achieves state-of-the-art positioning accuracy and outperforms mainstream vSLAM solutions, including ElasticFusion, ORB-SLAM2, and VINS-Mono.
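The tight coupling described above amounts to minimising one joint cost over the state, combining visual re-projection, IMU, and ICP residuals. The sketch below is a minimal, hypothetical illustration of that idea (the function and variable names are illustrative, not from the paper): for scalar state and linear residuals r_i(x) = x - z_i, the weighted joint least-squares problem has a closed-form solution.

```python
# Hypothetical sketch of tightly coupled fusion (not the paper's implementation):
# the back-end minimises a joint cost over the state x,
#   E(x) = w_vis * r_vis(x)^2 + w_imu * r_imu(x)^2 + w_icp * r_icp(x)^2,
# where each residual comes from a different sensor modality.
# With linear residuals r_i(x) = x - z_i, the minimiser is a weighted mean.

def fuse_state(measurements, weights):
    """Minimise sum_i w_i * (x - z_i)^2 over a scalar state x."""
    num = sum(w * z for z, w in zip(measurements, weights))
    den = sum(weights)
    return num / den

# Toy pseudo-measurements of one state component from the three factors:
z = [1.0, 1.2, 0.9]   # visual, inertial, ICP estimates (illustrative values)
w = [4.0, 1.0, 2.0]   # information weights (e.g. inverse variances)
x_hat = fuse_state(z, w)  # closed-form minimiser of the joint cost
```

In the real system the state is a full 6-DOF pose (plus velocity and IMU biases), the residuals are non-linear, and the cost is minimised iteratively; the point of the sketch is only that all sensor errors enter one shared objective rather than being filtered separately.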

References

ORB-SLAM: A Versatile and Accurate Monocular SLAM System
ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras
KinectFusion: Real-time dense surface mapping and tracking


Citation (APA)

Shan, D., Su, J., Wang, X., Liu, Y., Zhou, T., & Wu, Z. (2024). VID-SLAM: Robust Pose Estimation with RGBD-Inertial Input for Indoor Robotic Localization. Electronics (Switzerland), 13(2). https://doi.org/10.3390/electronics13020318
