Journal of Applied Science and Engineering

Published by Tamkang University Press


Vo Thu Ha1, Than Thi Thuong1, and Vo Thanh Ha2

1Faculty of Electrical Engineering, University of Economics - Technology for Industries, Vietnam
2Faculty of Electrical and Electronic Engineering, University of Transport and Communications, No. 3 Cau Giay Street, Lang Thuong Ward, Dong Da District, Hanoi, Vietnam


 

Received: July 7, 2022
Accepted: November 5, 2022
Publication Date: December 19, 2022

 Copyright The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are cited.


DOI: https://doi.org/10.6180/jase.202309_26(9).0014


ABSTRACT


This paper presents the control and development of an intelligent garbage-sorting system built around a robot arm. The system consists of a machine-vision block, a 6-DOF robot manipulator, and a control unit that sorts garbage based on the analyzed images. Object identification is performed by the YOLOv4 neural network, which detects and recognizes waste of different sizes and types, such as paper-based, metal, and plastic garbage. Offline testing on a database of more than 600 untrained images shows that the trained model achieves an average accuracy of about 98.43% in classifying the different types of garbage.


Keywords: Waste Sorting, Robotic Arm, YOLO, Machine Vision, 6DOF
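The abstract reports an average accuracy of about 98.43% over the waste classes from offline testing. As a minimal sketch of how such a figure could be computed, assuming the metric is the mean of per-class accuracies (the paper does not spell out the exact formula), with hypothetical class names and toy labels:

```python
from collections import defaultdict

def mean_per_class_accuracy(y_true, y_pred):
    """Average of per-class accuracies over the classes present in y_true."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, pred in zip(y_true, y_pred):
        total[truth] += 1
        correct[truth] += int(truth == pred)
    # accuracy of each class, then an unweighted mean across classes
    return sum(correct[c] / total[c] for c in total) / len(total)

# toy example using the three waste categories named in the abstract
y_true = ["paper", "paper", "metal", "metal", "plastic", "plastic"]
y_pred = ["paper", "paper", "metal", "plastic", "plastic", "plastic"]
print(mean_per_class_accuracy(y_true, y_pred))  # paper 2/2, metal 1/2, plastic 2/2
```

Averaging per class (rather than over all images) keeps a rare class from being swamped by a common one when the test set is imbalanced.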

