
JACIII Vol.28 No.3, pp. 573-585, 2024
doi: 10.20965/jaciii.2024.p0573

Research Paper:

Anti-Occlusion Visual Tracking Algorithm for UAVs with Multi-Feature Adaptive Fusion

Xiaohong Qiu, Xin Wu, and Cong Xu

School of Software Engineering, Jiangxi University of Science and Technology
No.1180 Shuanggang E Avenue, Qingshanhu District, Nanchang, Jiangxi 330013, China

Corresponding author

Received: June 18, 2023
Accepted: January 17, 2024
Published: May 20, 2024
Keywords: correlation filtering, visual object tracking, multi-feature adaptive fusion, anti-occlusion, UAV
Abstract

Most existing trackers based on discriminative correlation filters use only a single feature, or a simple linear fusion of multiple features, for object tracking, and most lack a mechanism to handle occlusion. This leads to poor tracking performance in rapidly changing and easily occluded scenes, especially on unmanned aerial vehicle (UAV) platforms. To address this issue, this paper proposes the multi-feature adaptive fusion and anti-occlusion tracker (MAFAOT), an anti-occlusion visual tracking algorithm for UAVs with multi-feature adaptive fusion. It introduces a novel approach to adaptive multi-feature fusion: by designing a tracking quality evaluation index, the fusion problem is cast as a maximization problem, and the histogram of oriented gradients (HOG) and color histogram feature responses are fused adaptively. MAFAOT also introduces an anti-occlusion update pool strategy, enabling the tracker to adapt dynamically to complex scenarios such as occlusion and motion blur. Experimental results on the OTB100 and UAV123 datasets confirm the significant advantages of MAFAOT in precision and success rate over other correlation filter-based algorithms. The proposed methods further enhance the expressiveness of the features and effectively avoid tracker contamination caused by occlusion. Furthermore, this paper applies the proposed methods to the kernelized correlation filters (KCF) algorithm. On the OTB100 dataset, the improved KCF algorithm gains 10.94% in precision and 11.11% in success rate; on the UAV123 dataset, it gains 14.53% in precision and 16.62% in success rate, further verifying the effectiveness and versatility of the proposed methods.
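The abstract outlines, but does not formalize, the two core mechanisms: choosing the fusion weight between the HOG and color-histogram responses by maximizing a tracking quality evaluation index, and gating model updates with an update pool so that occluded frames do not contaminate the filter. The Python sketch below illustrates that general structure only; the peak-to-sidelobe ratio, the grid search over the fusion weight, and the threshold-based update gate are placeholder assumptions, not the paper's actual index, optimization, or update-pool rules.

```python
import numpy as np

def psr(response):
    """Peak-to-sidelobe ratio: a common stand-in for a tracking-quality index."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - 5):py + 6, max(0, px - 5):px + 6] = False  # exclude area around the peak
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)

def fuse_responses(resp_hog, resp_color, num_steps=21):
    """Search the fusion weight w in [0, 1] that maximizes the quality index
    of the fused response w * resp_hog + (1 - w) * resp_color."""
    best_w, best_q, best_resp = 0.0, -np.inf, None
    for w in np.linspace(0.0, 1.0, num_steps):
        fused = w * resp_hog + (1.0 - w) * resp_color
        q = psr(fused)
        if q > best_q:
            best_w, best_q, best_resp = w, q, fused
    return best_resp, best_w, best_q

def should_update(quality, pool, ratio=0.6, pool_size=30):
    """Occlusion-gated model update: skip the update when the current quality
    drops well below the recent average kept in a small history pool."""
    occluded = len(pool) > 0 and quality < ratio * np.mean(pool)
    if not occluded:
        pool.append(quality)
        if len(pool) > pool_size:
            pool.pop(0)
    return not occluded
```

In an actual tracker of this kind, the two response maps would come from the HOG correlation filter and the color-histogram likelihood map, and the update gate would decide whether the filter and histogram models are refreshed in the current frame.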


Cite this article as:
X. Qiu, X. Wu, and C. Xu, “Anti-Occlusion Visual Tracking Algorithm for UAVs with Multi-Feature Adaptive Fusion,” J. Adv. Comput. Intell. Intell. Inform., Vol.28 No.3, pp. 573-585, 2024.
