Journal of Applied Science and Engineering

Published by Tamkang University Press


Xin Wang1, Bo Zhang1, Jianwei Zhao2, and Hezhen You3

1Information and Control Engineering Faculty, Shenyang Jianzhu University, Shenyang, 110168, P.R.China

2Department of Computer Information Engineering, Baoding Vocational and Technical College, Baoding, Hebei, 07051, P.R.China

3Computer Science and Technology, Tongji University, Shanghai, 200092, P.R.China


 

Received: August 19, 2023
Accepted: September 8, 2023
Publication Date: December 13, 2023

Copyright © The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are cited.


DOI: https://doi.org/10.6180/jase.202409_27(9).0012


Methods based on laser-inertial navigation are widely used for automatic guided vehicle (AGV) navigation in warehouse workshops. However, existing algorithms are prone to scale drift and large cumulative error, and LiDAR degradation can sharply reduce the number of sensed points. We therefore design an AGV navigation framework based on the fusion of laser, inertial, and quick response (QR) code technologies, named A-LIQ. First, an inertial measurement unit (IMU) pre-integration model with QR codes is proposed: the QR code constraints obtained between two keyframes are added to the pre-integration to form a new composite unit, reducing scale drift and improving positioning accuracy. Second, a local map optimization model is proposed: keyframes and QR codes are selectively introduced, local stratified bundle adjustment (BA) is performed over a sliding window, and keyframe poses and map point positions are updated. Finally, a tightly coupled LiDAR/IMU/QR code optimization method is proposed, in which the pre-integration factor, loop-closure factor, QR factor, and laser odometry factor are incorporated into a factor graph to achieve multi-level data fusion. The method is verified on the developed AGV navigation platform, and its performance is evaluated on measured data and compared with LeGO-LOAM, BALM, and LIO-SAM. The results show that the method does not significantly increase the computational load and effectively improves trajectory closure at loop closures, with lower positioning error: the positioning error is less than 0.02 m and the attitude error is less than 2°.
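To make the fusion step concrete, the sketch below shows how the four factor types named in the abstract (IMU pre-integration, laser odometry, loop closure, and QR code) could be assembled into a pose graph. This is a minimal illustration only, not the A-LIQ implementation: it uses GTSAM as a stand-in back end, models the IMU pre-integration constraint as a plain relative-pose factor (a full treatment would use gtsam.ImuFactor with velocity and bias states), and all poses and noise sigmas are made-up placeholders.

```python
# Illustrative pose-graph sketch of multi-factor fusion (not the paper's code).
# Assumptions: GTSAM back end, 4 keyframes, placeholder measurements and noise.
import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Keyframe pose variables x0..x3.
X = [gtsam.symbol('x', i) for i in range(4)]

# Placeholder noise models (sigmas: 3 rotation [rad], 3 translation [m]).
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(
    np.array([0.02, 0.02, 0.02, 0.05, 0.05, 0.05]))
imu_noise = gtsam.noiseModel.Diagonal.Sigmas(
    np.array([0.01, 0.01, 0.01, 0.03, 0.03, 0.03]))
loop_noise = gtsam.noiseModel.Diagonal.Sigmas(
    np.array([0.01, 0.01, 0.01, 0.02, 0.02, 0.02]))
qr_noise = gtsam.noiseModel.Diagonal.Sigmas(
    np.array([0.005, 0.005, 0.005, 0.01, 0.01, 0.01]))

# Prior on the first keyframe fixes the global reference.
graph.add(gtsam.PriorFactorPose3(X[0], gtsam.Pose3(), qr_noise))

# Laser-odometry factors and (simplified) IMU pre-integration factors
# between consecutive keyframes: each is a relative-pose constraint here.
step = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0))
for i in range(3):
    graph.add(gtsam.BetweenFactorPose3(X[i], X[i + 1], step, odom_noise))
    graph.add(gtsam.BetweenFactorPose3(X[i], X[i + 1], step, imu_noise))

# QR-code factor: a detected code with a known map pose gives an absolute
# constraint on the observing keyframe (placeholder pose).
qr_pose = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(2.0, 0.0, 0.0))
graph.add(gtsam.PriorFactorPose3(X[2], qr_pose, qr_noise))

# Loop-closure factor between non-consecutive keyframes (placeholder).
loop_rel = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(3.0, 0.0, 0.0))
graph.add(gtsam.BetweenFactorPose3(X[0], X[3], loop_rel, loop_noise))

# Deliberately perturbed initial guesses, then batch optimization.
for i, key in enumerate(X):
    initial.insert(key, gtsam.Pose3(gtsam.Rot3(),
                                    gtsam.Point3(i * 1.05, 0.1, 0.0)))
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(X[3]).translation())
```

In this toy graph the QR observation enters as an absolute (prior) factor while odometry and pre-integration enter as relative factors, which is how global constraints limit scale drift and cumulative error in the fused estimate.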


Keywords: Automatic Guided Vehicle (AGV) Navigation; IMU Preintegration; Laser Odometry; QR Code; Factor Graph Optimization


References

[1] P. Z. Sun, J. You, S. Qiu, E. Q. Wu, P. Xiong, A. Song, H. Zhang, and T. Lu, (2023) "AGV-Based Vehicle Transportation in Automated Container Terminals: A Survey" IEEE Transactions on Intelligent Transportation Systems 24(1): 341–356. DOI: 10.1109/TITS.2022.3215776.
[2] X. Yuwen, H. Zhang, F. Yan, and L. Chen, (2023) "Gaze Control for Active Visual SLAM via Panoramic Cost Map" IEEE Transactions on Intelligent Vehicles 8(2): 1813–1825. DOI: 10.1109/TIV.2022.3174040.
[3] W. M. Yao and S. Wei, (2022) "An RGB-D SLAM Algorithm Based on Adaptive Semantic Segmentation in Dynamic Environment" Robot 45(1): 16–27. DOI: 10.13973/j.cnki.robot.210368.
[4] Y. Zhu, C. Zheng, C. Yuan, X. Huang, and X. Hong, (2021) "CamVox: A Low-cost and Accurate Lidar-assisted Visual SLAM System": 5049–5055. DOI: 10.1109/ICRA48506.2021.9561149.
[5] Q. Zou, Q. Sun, L. Chen, B. Nie, and Q. Li, (2022) "A Comparative Analysis of LiDAR SLAM-Based Indoor Navigation for Autonomous Vehicles" IEEE Transactions on Intelligent Transportation Systems 23(7): 6907–6921. DOI: 10.1109/TITS.2021.3063477.
[6] X. Y. Wei, Y. W. Xu, and W. Wei, (2016) "Improvement of LIDAR SLAM Front-end Algorithm Based on Local Map in Similar Scenes" IEEE Access 5: 975–986. DOI: 10.13973/j.cnki.robot.200541.
[7] A. Bhattacharjee and C. Bhatt, (2023) "Human Arm Motion Capture Using IMU Sensors" Smart Energy and Advancement in Power Technologies: 805–817. DOI: 10.1007/978-981-19-4975-3_63.
[8] C. Yi, S. Rho, B. Wei, C. Yang, Z. Ding, Z. Chen, and F. Jiang, (2022) "Detecting and Correcting IMU Movements During Joint Angle Estimation" IEEE Transactions on Instrumentation and Measurement 71: 1–14. DOI: 10.1109/TIM.2022.3167771.
[9] W. L. Xuan and X. Z. Yu, (2023) "A Robust LiDAR-IMU Joint Calibration Method" Robot 45(3): 2665–2680. DOI: 10.13973/j.cnki.robot.220023.
[10] J. Liu, W. Gao, and Z. Hu, (2021) "Optimization-Based Visual-Inertial SLAM Tightly Coupled with Raw GNSS Measurements": 11612–11618. DOI: 10.1109/ICRA48506.2021.9562013.
[11] S. Golodetz, M. Vankadari, A. Everitt, S. Shin, A. Markham, and N. Trigoni, (2022) "Real-Time Hybrid Mapping of Populated Indoor Scenes using a Low-Cost Monocular UAV": 325–332. DOI: 10.1109/IROS47612.2022.9982054.
[12] J. Zhang and S. Singh, (2017) "Low-drift and real-time lidar odometry and mapping" Autonomous Robots 41(2). DOI: 10.1007/s10514-016-9548-2.
[13] T. Shan and B. Englot, (2018) "LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain": 4758–4765. DOI: 10.1109/IROS.2018.8594299.
[14] Y. Gao, S. Liu, M. M. Atia, and A. Noureldin, (2015) "INS/GPS/LiDAR Integrated Navigation System for Urban and Indoor Environments Using Hybrid Scan Matching Algorithm" Sensors (Basel, Switzerland) 15: 23286–23302. DOI: 10.3390/s150923286.
[15] S. Yang, X. Zhu, X. Nian, L. Feng, X. Qu, and T. Ma, (2018) "A robust pose graph approach for city scale LiDAR mapping": 1175–1182. DOI: 10.1109/IROS.2018.8593754.
[16] W. Wang, B. Gheneti, L. A. Mateos, F. Duarte, C. Ratti, and D. Rus, (2019) "Roboat: An Autonomous Surface Vehicle for Urban Waterways": 6340–6347. DOI: 10.1109/IROS40897.2019.8968131.
[17] G. P. C. Júnior, A. M. C. Rezende, V. R. F. Miranda, R. Fernandes, H. Azpúrua, A. A. Neto, G. Pessin, and G. M. Freitas, (2022) "EKF-LOAM: An Adaptive Fusion of LiDAR SLAM With Wheel Odometry and Inertial Data for Confined Spaces With Few Geometric Features" IEEE Transactions on Automation Science and Engineering 19(3): 1458–1471. DOI: 10.1109/TASE.2022.3169442.
[18] C. L. Gentil, T. Vidal-Calleja, and S. Huang, (2019) "IN2LAMA: INertial Lidar Localisation And MApping": 6388–6394. DOI: 10.1109/ICRA.2019.8794429.
[19] C. Qin, H. Ye, C. E. Pranata, J. Han, S. Zhang, and M. Liu, (2020) "LINS: A Lidar-Inertial State Estimator for Robust and Efficient Navigation": 8899–8906. DOI: 10.1109/ICRA40945.2020.9197567.
[20] H. Ye, Y. Chen, and M. Liu, (2019) "Tightly Coupled 3D Lidar Inertial Odometry and Mapping": 3144–3150. DOI: 10.1109/ICRA.2019.8793511.
[21] T. Shan, B. Englot, D. Meyers, W. Wang, C. Ratti, and D. Rus, (2020) "LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping": 5135–5142. DOI: 10.1109/IROS45743.2020.9341176.
[22] C. Forster, L. Carlone, F. Dellaert, and D. Scaramuzza, (2017) "On-Manifold Preintegration for Real-Time Visual-Inertial Odometry" IEEE Transactions on Robotics 33(1): 1–21. DOI: 10.1109/TRO.2016.2597321.
[23] T. Qin, P. Li, and S. Shen, (2018) "VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator" IEEE Transactions on Robotics 34(4): 1004–1020. DOI: 10.1109/TRO.2018.2853729.
[24] S. Bahnam, S. Pfeiffer, and G. C. de Croon, (2021) "Stereo Visual Inertial Odometry for Robots with Limited Computational Resources": 9154–9159. DOI: 10.1109/IROS51168.2021.9636807.
[25] W. Xu, Y. Cai, D. He, J. Lin, and F. Zhang, (2022) "FAST-LIO2: Fast Direct LiDAR-Inertial Odometry" IEEE Transactions on Robotics 38(4): 2053–2073. DOI: 10.1109/TRO.2022.3141876.
[26] Z. Liu and F. Zhang, (2021) "BALM: Bundle Adjustment for Lidar Mapping" IEEE Robotics and Automation Letters 6(2): 3184–3191. DOI: 10.1109/LRA.2021.3062815.
[27] Y. Cui, X. Chen, Y. Zhang, J. Dong, Q. Wu, and F. Zhu, (2023) "BoW3D: Bag of Words for Real-Time Loop Closing in 3D LiDAR SLAM" IEEE Robotics and Automation Letters 8(5): 2828–2835. DOI: 10.1109/LRA.2022.3221336.


    



 
