Journal of Applied Science and Engineering

Published by Tamkang University Press

Impact Factor: 1.30 · CiteScore: 2.10

Wei Huang1,2,3, Zhen Zhang1,2,3, Wentao Li1,2 and Jiandong Tian1,2

1State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, P.R. China
2Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang 110016, P.R. China
3University of Chinese Academy of Sciences, Beijing 100049, P.R. China


 

Received: December 25, 2017
Accepted: May 2, 2018
Publication Date: December 1, 2018

DOI: https://doi.org/10.6180/jase.201812_21(4).0014

ABSTRACT


Research on moving object tracking has made significant progress toward practical application in recent years. However, single-sensor approaches have inherent limitations: vision sensors are sensitive to illumination variations, and millimeter-wave radar offers low directional resolution. In this paper, we propose a moving object tracking method that fuses radar and vision data. First, false radar objects are filtered out. Second, an adaptive background subtraction method detects candidate regions in the images. Finally, a moving object is confirmed when an effective radar object falls within a candidate region.
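The three-step method outlined above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the speed-gate used to filter false radar objects, the point-in-box containment test, and all names and thresholds are assumptions; it presumes radar objects have already been projected into image coordinates via extrinsic/intrinsic calibration (cf. Zhang's method, reference [17]).

```python
# Illustrative sketch of radar-vision fusion for moving object confirmation.
# Assumption: radar detections are pre-projected into image coordinates (u, v).
from dataclasses import dataclass
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # candidate region: (x, y, width, height)

@dataclass
class RadarObject:
    u: float      # projected image column (assumed from calibration)
    v: float      # projected image row
    speed: float  # radial speed, m/s

def filter_radar_objects(objs: List[RadarObject],
                         min_speed: float = 0.5) -> List[RadarObject]:
    """Step 1 (illustrative): discard false/static radar returns by a speed gate."""
    return [o for o in objs if abs(o.speed) >= min_speed]

def inside(box: Box, u: float, v: float) -> bool:
    """True if the projected radar point lies within the candidate region."""
    x, y, w, h = box
    return x <= u <= x + w and y <= v <= y + h

def fuse(radar_objs: List[RadarObject], candidates: List[Box]) -> List[Box]:
    """Step 3: keep only candidate regions (from step 2's adaptive background
    subtraction) that contain at least one effective radar object."""
    return [b for b in candidates if any(inside(b, o.u, o.v) for o in radar_objs)]

# Example: one moving and one near-static radar return, two candidate regions.
radar = [RadarObject(u=50, v=60, speed=3.0), RadarObject(u=200, v=90, speed=0.1)]
boxes = [(40, 40, 30, 40), (190, 80, 20, 20)]
confirmed = fuse(filter_radar_objects(radar), boxes)
print(confirmed)  # only the region containing the moving radar object survives
```

The gating direction (radar validates vision) mirrors the paper's final step; a real system would also track confirmed regions over time, e.g. with the Kalman-filter fusion of reference [3].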


Keywords: Moving Object Tracking, Sensor Fusion, Adaptive Background Subtraction, Millimeter-wave Radar, Vision Sensor


REFERENCES


  1. [1] Sugimoto, S., H. Tateda, H. Takahashi, and M. Okutomi (2004) Obstacle Detection Using Millimeter-wave Radar and Its Visualization on Image Sequence, International Conference on Pattern Recognition, Cambridge, England, Aug. 2326, pp. 342345. doi: 10. 1109/ICPR.2004.1334537
  2. [2] Sun, Z. H., G. Bebis, and R. Miller (2004) On-road Vehicle Detection Using Optical Sensors: a Review, 7th IEEE International Conference on Intelligent Transportation Systems, Washington, DC, U.S.A., 585590. doi: 10.1109/ITSC.2004.1398966
  3. [3] Kim, D. Y., and M. Jeon (2014) Data Fusion of Radar and Image Measurements for Multi-object Tracking via Kalman Filtering, Information Sciences 278, 641 652. doi: 10.1016/j.ins.2014.03.080
  4. [4] Piccardi, M. (2004) Background Subtraction Techniques: a Review, IEEE International Conference on Systems,ManandCybernetics,TheHague,Netherlands, Oct. 1013, pp. 30993104. doi: 10.1109/ICSMC.2004. 1400815
  5. [5] Cristani, M., M. Farenzena, D. Bloisi, and V. Murino (2010) Background Subtraction for Automated Multisensor Surveillance: a Comprehensive Review, EURASIP Journal on Advances in Signal Processing 2010(43). doi: 10.1155/2010/343057
  6. [6] Bouwmans, T., F. El Baf, and B. Vachon (2010) Statistical Background Modeling for Foreground Detection: a Survey, Handbook of Pattern Recognition and Computer Vision 4th ed., 181199.
  7. [7] Benezeth, Y., P. M. Jodoin, B. Emile, H. Laurent, and C. Rosenberger (2008) Review and Evaluation of Commonly-implementedBackground Subtraction Algorithms, 19th International Conference on Pattern Recognition, Tampa, Florida, USA, Dec. 811. doi: 10. 1109/ICPR.2008.4760998
  8. [8] Sobral, A., and A. Vacavant (2014) A Comprehensive Review of Background Subtraction Algorithms Evaluated with Synthetic and Real Videos, Computer Vision and Image Understanding 122, 421. doi: 10. 1016/j.cviu.2013.12.005
  9. [9] Obrvan, M., J. Cesic, and I. Petrovic (2016) Appearance Based Vehicle Detection by Radar-stereo Vision Integration, Learning Robotics forYoungsters-The RoboParty Experience 437449.
  10. [10] Ji, Z. P., and D. Prokhorov (2008) Radar-vision Fusion for Object Classification, World Automation Congress 2008, Hawaii, USA, Sept. 28Oct. 2, pp. 375 380.
  11. [11] Wang, T., N. N. Zheng, J. M.Xin, and Z. Ma(2011) Integrating Millimeter Wave Radar with a Monocular Vision Sensor for On-road Obstacle Detection Applications,” Sensors 11(9), 89929008. doi: 10.3390/ s110908992
  12. [12] Sole, A., O. Mano, G. P. Stein, H. Kumon, Y. Tamatsu, and A. Shashua (2004) Solid or Not Solid: Vision for Radar Target Validation, IEEE Intelligent Vehicles Symposium, Parma, Italy, Jun 1417, pp. 819824.
  13. [13] Wu, S. G., S. Decker, P. Chang, T. Camus, and J. Eledath (2009) Collision Sensing by Stereo Vision and Radar Sensor Fusion, IEEE Transactions on Intelligent Transportation Systems 10(4), 606614. doi: 10. 1109/TITS.2009.2032769
  14. [14] Alessandretti, G., A. Broggi, and P. Cerri (2007) Vehicle and Guard Rail Detection Using Radar and Vision Data Fusion, IEEE Transactions on Intelligent Transportation Systems 8(1), 95104. doi: 10.1109/TITS. 2006.888597
  15. [15] Bombini, L., P. Cerri, P. Medici, and G. Alessandretti (2006) Radar-vision Fusion for Vehicle Detection, Proceedings of International Workshop on Intelligent Transportation, 6570.
  16. [16] Chen, X. W. (2016) Study on Vehicle Detection Using Vision and Radar, Master thesis, Jilin University, Jilin, China.
  17. [17] Zhang, Z. Y. (2000) A Flexible New Technique for Camera Calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence 22(11), 13301334. doi: 10.1109/34.888718