
JACIII Vol.22 No.5, pp. 593-601, 2018
doi: 10.20965/jaciii.2018.p0593

Paper:

Research on Moving Target Tracking Algorithm Based on Lidar and Visual Fusion

Xiaoxiao Guo*, Yuansheng Liu**,†, Qixue Zhong**, and Mengna Chai**

*Beijing Key Laboratory of Information Service Engineering, Beijing Union University
No.97 Beisihuan East Road, Chao Yang District, Beijing 100101, China

**Beijing Engineering Research Center of Smart Mechanical Innovation Design Service, Beijing Union University
No.97 Beisihuan East Road, Chao Yang District, Beijing 100101, China

†Corresponding author

Received: February 16, 2018
Accepted: May 7, 2018
Published: September 20, 2018
Keywords: autonomous vehicles, target tracking, multi-sensor fusion, data association
Abstract

Multi-sensor fusion and target tracking are two key technologies in the environmental perception system of autonomous vehicles. In this paper, a moving target tracking method based on the fusion of Lidar and a binocular camera is proposed. First, the position information obtained by the two types of sensors is fused at the decision level using an adaptive weighting algorithm; the Joint Probabilistic Data Association (JPDA) algorithm is then applied to the fused results to achieve multi-target tracking. The method was tested on a curved road on campus and compared with the Extended Kalman Filter (EKF) algorithm; the experimental results show that it effectively overcomes the limitations of a single sensor and tracks targets more accurately.
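The adaptive weighting step lends itself to a short illustration. Below is a minimal Python sketch of variance-inverse adaptive weighted fusion at the decision level, in the spirit of the scheme the abstract describes: each sensor's position estimate is weighted inversely to its measurement variance, so the currently more reliable sensor dominates the fused result. The function name, the per-axis variance inputs, and the example numbers are illustrative assumptions, not values taken from the paper.

    import numpy as np

    def adaptive_weighted_fusion(z_lidar, z_camera, var_lidar, var_camera):
        """Decision-level fusion of two position estimates.

        Each sensor is weighted in inverse proportion to its measurement
        variance, so the more reliable sensor dominates per axis.
        All arguments are per-axis numpy arrays (e.g., [x, y]).
        """
        w_lidar = (1.0 / var_lidar) / (1.0 / var_lidar + 1.0 / var_camera)
        w_camera = 1.0 - w_lidar
        fused = w_lidar * z_lidar + w_camera * z_camera
        # Variance of the fused estimate (smaller than either input variance).
        fused_var = 1.0 / (1.0 / var_lidar + 1.0 / var_camera)
        return fused, fused_var

    # Example: Lidar assumed more precise in range, the stereo camera
    # more precise laterally (illustrative numbers only).
    z_l = np.array([12.3, 0.8])   # Lidar position measurement [x, y], metres
    z_c = np.array([12.7, 0.7])   # Stereo-camera position measurement [x, y]
    v_l = np.array([0.02, 0.10])  # Assumed per-axis measurement variances
    v_c = np.array([0.25, 0.04])
    pos, var = adaptive_weighted_fusion(z_l, z_c, v_l, v_c)
    print(pos, var)

In the paper's pipeline, the fused positions would then feed the JPDA association stage for multi-target tracking; that stage is omitted from this sketch.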

Cite this article as:
X. Guo, Y. Liu, Q. Zhong, and M. Chai, “Research on Moving Target Tracking Algorithm Based on Lidar and Visual Fusion,” J. Adv. Comput. Intell. Intell. Inform., Vol.22 No.5, pp. 593-601, 2018.
References
  [1] J. Leonard et al., “A perception-driven autonomous urban vehicle,” J. of Field Robotics, Vol.25, No.10, pp. 727-774, 2008.
  [2] G. Monteiro et al., “Tracking and Classification of Dynamic Obstacles using Laser Range Finder and Vision,” Workshop on Safe Navigation in Open and Dynamic Environments, IEEE/RSJ IROS 2006, 2006.
  [3] C. Premebida, O. Ludwig, and U. Nunes, “LIDAR and vision-based pedestrian detection system,” J. of Field Robotics, Vol.26, No.9, pp. 696-711, 2010.
  [4] R. O. Chavez-Garcia and O. Aycard, “Multiple Sensor Fusion and Classification for Moving Object Detection and Tracking,” IEEE Trans. on Intelligent Transportation Systems, Vol.17, No.2, pp. 525-534, 2016.
  [5] C. Premebida et al., “A Lidar and Vision-based Approach for Pedestrian and Vehicle Detection and Tracking,” 2007 IEEE Intelligent Transportation Systems Conf., 2007.
  [6] B. Habtemariam et al., “A Multiple-Detection Joint Probabilistic Data Association Filter,” IEEE J. of Selected Topics in Signal Processing, Vol.7, No.3, pp. 461-471, 2013.
  [7] R. Anitha, S. Renuka, and A. Abudhahir, “Multi sensor data fusion algorithms for target tracking using multiple measurements,” IEEE Int. Conf. on Computational Intelligence and Computing Research, pp. 1-4, 2014.
  [8] X. Chen et al., “A Novel Probabilistic Data Association for Target Tracking in a Cluttered Environment,” Sensors, Vol.16, No.12, E2180, 2016.
  [9] A. Haselhoff, A. Kummert, and G. Schneider, “Radar-Vision Fusion with an Application to Car-Following using an Improved AdaBoost Detection Algorithm,” 2007 IEEE Intelligent Transportation Systems Conf. (ITSC), pp. 854-858, 2007.
  [10] F. Liu, J. Sparbert, and C. Stiller, “IMMPDA vehicle tracking system using asynchronous sensor fusion of radar and vision,” IEEE Intelligent Vehicles Symp., pp. 168-173, 2008.
  [11] R. O. Chavez-Garcia, T.-D. Vu, and O. Aycard, “Fusion at detection level for frontal object perception,” 2014 IEEE Intelligent Vehicles Symp. Proc., pp. 1225-1230, 2014.
  [12] G. Lu and W. Xue, “Adaptive Weighted Fusion Algorithm for Monitoring System of Forest Fire Based on Wireless Sensor Networks,” 2nd Int. Conf. on Computer Modeling and Simulation, pp. 414-417, 2010.
  [13] Z.-M. Li, R.-Z. Chen, and B.-M. Zhang, “Study of adaptive weighted estimate algorithm of congeneric multi-sensor data fusion,” J. of Lanzhou University of Technology, Vol.32, No.4, pp. 78-82, 2006.
  [14] T. Koohi, A. Izadipour, and M. Fesharaki, “Improvement of Multi-Target Tracking in a Multi-Agent Architecture with Multi-Sensor Data Fusion,” Iranian Students Conf. on Electrical Engineering, Vol.61, No.3, 2012.
  [15] Y. C. Yao et al., “Design of the Multi-Sensor Target Tracking System Based on Data Fusion,” Advanced Materials Research, Vols.219-220, pp. 1407-1410, 2011.
  [16] S. Pellegrini, A. Ess, and L. Van Gool, “Improving Data Association by Joint Modeling of Pedestrian Trajectories and Groupings,” Springer, 2010.
  [17] H. Chen and R. Jian, “A multitarget tracking algorithm based on radar and infrared sensor data fusion,” IEEE Int. Conf. on Communication Software and Networks, pp. 367-371, 2011.
  [18] A. T. Kamal et al., “Distributed Multi-Target Tracking and Data Association in Vision Networks,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol.38, No.7, 2016.
  [19] A. G. Daronkolaei et al., “A Joint Probability Data Association Filter Algorithm for Multiple Robot Tracking Problems,” Tools in Artificial Intelligence, InTech, 2008.
  [20] Q. Gao, H. Zou, and C. Liu, “Joint probability data association algorithm with fusing multi-feature information,” Computer Engineering and Applications, Vol.48, No.3, pp. 111-113, 2012.
  [21] C. Liu, P. Shui, and S. Li, “Unscented extended Kalman filter for target tracking,” J. of Systems Engineering and Electronics, Vol.22, No.2, pp. 188-192, 2011.
  [22] Y. M. Cheng et al., “Multistation Passive Fusion Tracking Based on Extended Kalman Filter,” Acta Simulata Systematica Sinica, Vol.4, pp. 548-550, 2003.
  [23] A. E. Nordsjo, “A constrained extended Kalman filter for target tracking,” Proc. of the IEEE Radar Conf., pp. 123-127, 2004.
