Wang Y, Liu Q, Qin L B, et al. Learning adaptive spatial regularization and aberrance repression correlation filters for visual tracking[J]. Opto-Electron Eng, 2021, 48(1): 200068. doi: 10.12086/oee.2021.200068

Learning adaptive spatial regularization and aberrance repression correlation filters for visual tracking

    Fund Project: National Natural Science Foundation of China (61871278), International Science and Technology Cooperation and Exchange Project of Sichuan Science and Technology Department (2018HH0143), and Chengdu Industrial Cluster Collaborative Innovation Project (2016-XT00-00015-GX)
  • This paper proposes a correlation filter tracking algorithm based on adaptive spatial regularization and aberrance repression. It addresses two problems: the spatial regularization weights of the background-aware correlation filter are fixed and cannot adapt to changes in the target, and enlarging the search area may introduce background noise that weakens the filter's discriminative ability. First, FHOG, CN, and gray features are extracted to strengthen the algorithm's representation of the target. Second, an aberrance repression term is added to the objective function to constrain the response map of the current frame, improving the filter's discriminative ability and alleviating model degradation. Finally, an adaptive spatial regularization term is added to the objective function so that the spatial regularization weights are updated as the target changes, allowing the filter to make full use of the target's appearance variations. The proposed algorithm is evaluated on the public OTB-2013, OTB-2015, and VOT2016 datasets. The experimental results show that the algorithm runs at 20 frames/s, outperforms the comparison algorithms on evaluation metrics such as distance precision and success rate, and remains robust in a variety of complex scenarios such as occlusion, background clutter, and rotation changes.
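The correlation operations summarized above can be illustrated with a minimal single-channel sketch (an illustrative simplification, not the authors' multi-feature implementation; the function names and the ridge parameter `lam` are assumptions, and the peak alignment used before comparing response maps is omitted):

```python
import numpy as np

def train_filter(f, g, lam=1e-2):
    """Closed-form ridge-regression filter in the Fourier domain:
    H* = G F* / (F F* + lam), the classic single-channel solution."""
    F, G = np.fft.fft2(f), np.fft.fft2(g)
    return G * np.conj(F) / (F * np.conj(F) + lam)

def respond(H_conj, z):
    """Correlation response of the learned filter on a new patch z."""
    return np.real(np.fft.ifft2(np.fft.fft2(z) * H_conj))

def aberrance(r_prev, r_curr):
    """Squared change between consecutive response maps -- the quantity
    an aberrance repression term penalizes during filter learning."""
    return float(np.sum((r_curr - r_prev) ** 2))
```

Penalizing a quantity like `aberrance` during filter learning discourages abrupt response-map changes, which typically accompany model drift under occlusion or background clutter.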
  • [1] Yilmaz A, Javed O, Shah M. Object tracking: a survey[J]. ACM Comput Surv, 2006, 38(4): 1–45.

    [2] Smeulders A W M, Chu D M, Cucchiara R, et al. Visual tracking: an experimental survey[J]. IEEE Trans Pattern Anal Mach Intell, 2014, 36(7): 1442–1468. doi: 10.1109/TPAMI.2013.230

    [3] Lu H C, Li P X, Wang D. Visual object tracking: a survey[J]. Pattern Recognit Artif Intell, 2018, 31(1): 61–76.

    [4] Bolme D S, Beveridge J R, Draper B A, et al. Visual object tracking using adaptive correlation filters[C]//Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2010: 2544–2550.

    [5] Henriques J F, Caseiro R, Martins P, et al. Exploiting the circulant structure of tracking-by-detection with kernels[M]//Fitzgibbon A, Lazebnik S, Perona P, et al. Computer Vision–ECCV 2012. ECCV 2012. Berlin, Heidelberg: Springer, 2012: 702–715.

    [6] Henriques J F, Caseiro R, Martins P, et al. High-speed tracking with kernelized correlation filters[J]. IEEE Trans Pattern Anal Mach Intell, 2015, 37(3): 583–596. doi: 10.1109/TPAMI.2014.2345390

    [7] Bertinetto L, Valmadre J, Golodetz S, et al. Staple: complementary learners for real-time tracking[C]//Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016: 1401–1409.

    [8] Li Y, Zhu J K. A scale adaptive kernel correlation filter tracker with feature integration[C]//Agapito L, Bronstein M, Rother C. Computer Vision–ECCV 2014 Workshops. ECCV 2014. Zurich, Switzerland: Springer, 2014: 254–265.

    [9] Danelljan M, Häger G, Khan F S, et al. Discriminative scale space tracking[J]. IEEE Trans Pattern Anal Mach Intell, 2017, 39(8): 1561–1575. doi: 10.1109/TPAMI.2016.2609928

    [10] Galoogahi H K, Fagg A, Lucey S. Learning background-aware correlation filters for visual tracking[C]//Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), 2017: 1135–1143.

    [11] Huang Z Y, Fu C H, Li Y M, et al. Learning aberrance repressed correlation filters for real-time UAV tracking[C]//Proceedings of the 2019 IEEE International Conference on Computer Vision (ICCV), 2019: 2891–2900.

    [12] Dai K A, Wang D, Lu H C, et al. Visual tracking via adaptive spatially-regularized correlation filters[C]//Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019: 4670–4679.

    [13] Tang X M, Chen Z G, Fu Y. Anti-occlusion and re-tracking of real-time moving target based on kernelized correlation filter[J]. Opto-Electron Eng, 2020, 47(1): 52–61. doi: 10.12086/oee.2020.190279

    [14] Wang H Y, Wang L, Yin W R, et al. Multi-scale correlation filtering visual tracking algorithm combined with target detection[J]. Acta Opt Sin, 2019, 39(1): 0115004.

    [15] Liu W J, Sun H, Jiang W T. Correlation filter tracking algorithm for adaptive feature selection[J]. Acta Opt Sin, 2019, 39(6): 0615004.

    [16] Wu Y, Lim J, Yang M H. Object tracking benchmark[J]. IEEE Trans Pattern Anal Mach Intell, 2015, 37(9): 1834–1848. doi: 10.1109/TPAMI.2014.2388226

    [17] Wu Y, Lim J, Yang M H. Online object tracking: a benchmark[C]//Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition, 2013: 2411–2418.

    [18] Kristan M, Matas J, Leonardis A, et al. The visual object tracking VOT2015 challenge results[C]//Proceedings of the 2015 IEEE International Conference on Computer Vision Workshop (ICCVW), 2015: 1–23.

    [19] Danelljan M, Bhat G, Khan F S, et al. ECO: efficient convolution operators for tracking[C]//Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017: 6638–6646.

    [20] Li Y, Zhu J K, Hoi S C H, et al. Robust estimation of similarity transformation for visual object tracking[C]//Proceedings of the 2019 AAAI Conference on Artificial Intelligence (AAAI), 2019, 33: 8666–8673.

    [21] Danelljan M, Häger G, Khan F S, et al. Learning spatially regularized correlation filters for visual tracking[C]//Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), 2015: 4310–4318.

    [22] Wang M M, Liu Y, Huang Z. Large margin object tracking with circulant feature maps[C]//Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017: 4800–4808.

    [23] Danelljan M, Häger G, Khan F S, et al. Accurate scale estimation for robust visual tracking[C]//Proceedings of the British Machine Vision Conference 2014, 2014: 1–11.

    [24] Nam H, Baek M, Han B. Modeling and propagating CNNs in a tree structure for visual tracking[EB/OL]. (2016-08-25)[2020-04-08]. https://arxiv.org/abs/1608.07242.

  • Overview: Object tracking underpins applications such as intelligent video surveillance, motion analysis, human-computer interaction, security monitoring, behavior analysis, and UAV tracking, and is one of the fundamental problems in computer vision. Although tracking technology has made great progress over the past few decades, accurate and robust tracking remains challenging under deformation, occlusion, scale change, background clutter, and similar conditions. Owing to its excellent performance, the kernelized correlation filter has recently become a popular research subject in the tracking community. However, traditional correlation filter trackers exploit the properties of circulant matrices to move computation from the spatial domain to the frequency domain. Although this improves speed, it also generates non-real samples, leading to undesired boundary effects that reduce the filter's discriminative ability and degrade tracking performance. These effects can be alleviated to some extent by imposing pre-defined spatial constraints on the filter coefficients or by expanding the search area. However, such constraints are usually fixed across different objects and do not change during tracking, while expanding the search area easily introduces background noise. To overcome these shortcomings, a tracking method based on adaptive spatial regularization and aberrance repression is proposed. First, FHOG, CN, and gray features are extracted to strengthen the algorithm's representation of the target. Second, an aberrance repression term is introduced into the objective function to constrain the rate of change of the current frame's response map, which suppresses drift of the tracking box.
Finally, an adaptive spatial regularization term is introduced into the objective function, which learns an effective spatial weight for a specific object and its appearance variations. The ADMM algorithm is used to solve the filter model and reduce computation time. Experiments are performed on the OTB-2013, OTB-2015, and VOT2016 public benchmarks, which are commonly used to evaluate tracking performance; the OTB benchmarks cover scale variation, illumination variation, occlusion, and background clutter challenges, so they can evaluate algorithm performance accurately and objectively. KCF, BACF, Staple, ECO-HC, LDES, SRDCF, LMCF, and DSST, among others, serve as comparison algorithms, with the method implemented in MATLAB R2017a. The experimental results indicate that the proposed method achieves excellent tracking accuracy and robustness in complex scenes such as occlusion, background clutter, and rotation changes.
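The adaptive spatial regularization described above can be sketched as the elementwise weight-update subproblem below. This is a hedged reconstruction assuming the usual quadratic form of such objectives, not the paper's exact formulation; the function name and the parameters `lam1`, `lam2` are assumptions, and the remaining ADMM subproblems for the filter itself are omitted:

```python
import numpy as np

def update_spatial_weight(w_ref, filters, lam1=1.0, lam2=10.0):
    """Elementwise minimizer of
        (lam1/2) * sum_d ||w * h_d||^2 + (lam2/2) * ||w - w_ref||^2,
    i.e. w = lam2 * w_ref / (lam1 * sum_d h_d^2 + lam2).
    Pixels where the learned filters h_d carry high energy (usually the
    target region) pull the weight below the fixed reference w_ref, so
    the spatial penalty adapts to the current target appearance."""
    energy = np.sum(np.stack(filters) ** 2, axis=0)
    return lam2 * w_ref / (lam1 * energy + lam2)
```

In a full ADMM loop this update would alternate with the filter subproblems; because each step has a closed form, the per-frame cost stays low.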

Figures(7)

Tables(3)
