Article

Survey and Experimental Comparison of RGB-D Indoor Robot Navigation Methods Supported by ROS and Their Expansion via Fusion with Wheel Odometry and IMU Data

International Journal of Mechanical Engineering and Robotics Research (IJMERR), 9 (12): 1532-1540 (December 2020)

Abstract

This paper presents an experimental evaluation and comparison of selected Visual Odometry (VO) and Visual-SLAM (V-SLAM) algorithms for indoor mobile robot navigation supported by the Robot Operating System (ROS). The focus is on algorithms that use RGB-D cameras. Since RGB-D cameras integrate color and depth information, they output coherent measurement data and enable an efficient processing pipeline. The underlying methods of the vision-based algorithms are described and evaluated on two datasets covering different indoor situations as well as various lighting and movement conditions. In general, the V-SLAM algorithms yielded better results: they were superior at handling drift, in particular when loop closures were involved. However, the results confirmed that VO algorithms can outperform V-SLAM methods under certain circumstances, namely when an algorithm's design objectives closely match the situation at hand. While the experiments showed that there is no single best algorithm for every scenario, ORB-SLAM2 is recommended as a robust stand-alone RGB-D-based localization method available under ROS. Furthermore, we observed that the position estimation error could be reduced by around 67% on average when combining vision-based position estimates with sensor data from wheel odometry and an inertial measurement unit (IMU). This clearly demonstrates the potential of sensor fusion techniques. The best results with sensor fusion were obtained with RGBDSLAMv2.
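
The abstract reports that fusing vision-based pose estimates with wheel odometry and IMU data cut the position error by roughly 67% on average. The Python sketch below illustrates the general idea of such loosely coupled fusion: a simplified Kalman-style filter predicts a planar pose from odometry/IMU motion increments and corrects it with absolute pose estimates from an RGB-D VO or V-SLAM algorithm. It is an illustrative sketch under simplifying assumptions (planar [x, y, yaw] state, identity measurement model, covariance propagation without the motion-model Jacobian); the class name PoseFusionKF and the noise parameters q and r are hypothetical and this is not the exact fusion scheme evaluated in the paper.

    import numpy as np

    class PoseFusionKF:
        """Simplified Kalman-style fusion of odometry/IMU increments with
        absolute vision-based pose estimates (illustrative sketch only)."""

        def __init__(self, q=1e-3, r=1e-2):
            self.x = np.zeros(3)      # state: [x, y, yaw]
            self.P = np.eye(3)        # state covariance
            self.Q = q * np.eye(3)    # process noise (odometry/IMU drift)
            self.R = r * np.eye(3)    # measurement noise (vision estimate)

        def predict(self, dx, dy, dyaw):
            """Propagate the pose with a motion increment from wheel
            odometry / IMU, expressed in the robot frame. The covariance
            prediction is simplified (identity Jacobian, additive Q)."""
            c, s = np.cos(self.x[2]), np.sin(self.x[2])
            self.x += np.array([c * dx - s * dy, s * dx + c * dy, dyaw])
            self.x[2] = (self.x[2] + np.pi) % (2 * np.pi) - np.pi  # wrap yaw
            self.P += self.Q

        def update(self, z):
            """Correct the pose with an absolute [x, y, yaw] estimate from
            the vision-based algorithm (H = I, so the update is linear)."""
            y = z - self.x
            y[2] = (y[2] + np.pi) % (2 * np.pi) - np.pi  # wrap yaw residual
            K = self.P @ np.linalg.inv(self.P + self.R)  # Kalman gain
            self.x += K @ y
            self.P = (np.eye(3) - K) @ self.P

    if __name__ == "__main__":
        kf = PoseFusionKF()
        kf.predict(dx=0.10, dy=0.0, dyaw=0.01)    # wheel odometry increment
        kf.update(np.array([0.09, 0.01, 0.012]))  # RGB-D VO/V-SLAM pose
        print("fused pose:", kf.x)

In practice under ROS, this kind of loosely coupled multi-sensor fusion is typically done with the robot_localization package, whose EKF/UKF nodes (e.g., ekf_localization_node) handle full covariance propagation and arbitrary sensor configurations.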
