Journal of Applied Science and Engineering

Published by Tamkang University Press


Chang-Ching Huang, Chien-Hung Huang, and Jin-Siang Shaw

Institute of Mechatronic Engineering, National Taipei University of Technology, Taipei, Taiwan


Received: October 26, 2023
Accepted: February 1, 2024
Publication Date: March 8, 2024

Copyright © The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are cited.


DOI: https://doi.org/10.6180/jase.202501_28(1).0004


This study develops an autonomous mobile robot (AMR) for accurate positioning under the Robot Operating System (ROS) architecture. Range data from a laser rangefinder are combined with the Cartographer SLAM algorithm to build a map and localize the robot in the environment. In addition, an RGB-D camera provides visual odometry that assists Cartographer SLAM in improving positioning accuracy. For path planning, the A* algorithm is used for global planning and the timed elastic band (TEB) algorithm for local path planning. After repeating the accuracy test 15 times, the average absolute localization error was 4.48 cm, compared with 9.94 cm when using Cartographer SLAM alone; the positioning accuracy was thus improved by about 55%.
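Although the abstract gives no implementation details, one way to picture the sensor-fusion step is a small ROS node that blends Cartographer's laser-based pose with the ORB-SLAM2 visual-odometry estimate. The sketch below is a minimal illustration only: the topic names (/tracked_pose, /orb_slam2/odom, /fused_pose), the Python/rospy setting, and the fixed weighted-average rule are assumptions, not the fusion method actually used in the study.

```python
#!/usr/bin/env python
# Hypothetical sketch: blend a Cartographer pose estimate with an
# ORB-SLAM2 visual-odometry pose using a fixed weighted average.
# Topic names and the fusion rule are illustrative assumptions only.
import rospy
from geometry_msgs.msg import PoseStamped
from nav_msgs.msg import Odometry


class PoseFusionSketch(object):
    def __init__(self, weight_vo=0.5):
        self.weight_vo = weight_vo   # weight given to the visual-odometry estimate
        self.last_vo = None          # most recent visual-odometry position
        self.pub = rospy.Publisher("/fused_pose", PoseStamped, queue_size=10)
        rospy.Subscriber("/orb_slam2/odom", Odometry, self.vo_callback)
        rospy.Subscriber("/tracked_pose", PoseStamped, self.carto_callback)

    def vo_callback(self, msg):
        # Cache the latest visual-odometry position (position only, for brevity).
        self.last_vo = msg.pose.pose.position

    def carto_callback(self, msg):
        # Blend the Cartographer position with the cached visual-odometry position.
        fused = PoseStamped()
        fused.header = msg.header
        fused.pose.orientation = msg.pose.orientation
        if self.last_vo is None:
            fused.pose.position = msg.pose.position
        else:
            w = self.weight_vo
            fused.pose.position.x = (1.0 - w) * msg.pose.position.x + w * self.last_vo.x
            fused.pose.position.y = (1.0 - w) * msg.pose.position.y + w * self.last_vo.y
            fused.pose.position.z = msg.pose.position.z
        self.pub.publish(fused)


if __name__ == "__main__":
    rospy.init_node("pose_fusion_sketch")
    PoseFusionSketch()
    rospy.spin()
```

In practice the blending weight would be tuned, or replaced by a covariance-based filter, against ground-truth measurements such as the 15-trial accuracy test reported above.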


Keywords: ROS, Navigation, Cartographer, TEB, ORB-SLAM2

