Robust Visual Tracking via an Improved Background Aware Correlation Filter

Bibliographic Details
Main Authors: Xiaoxiao Sheng, Yungang Liu, Huijun Liang, Fengzhong Li, Yongchao Man
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8648403/
Description
Summary: Many excellent algorithms have recently emerged in the field of visual object tracking. In particular, the background-aware correlation filter (BACF) has received much attention owing to its ability to cope with the boundary effect. However, the related works suffer from two imperfections: 1) only histogram of oriented gradients (HOG) features are extracted, so the visual information of targets cannot be fully expressed; and 2) the scale estimation strategy is imperfect in terms of its scale parameters, which makes it impossible to accurately track targets undergoing large scale changes. To overcome these imperfections, an improved BACF method for robust visual object tracking is proposed, which locates targets with higher accuracy in complex scenarios involving scale variation, occlusion, rotation, illumination variation, and so on. Crucially, a feature fusion strategy based on HOG and color names is integrated to extract a powerful feature representation of targets, and a modified scale estimation strategy is designed to enhance the ability to track targets with large scale changes. The effectiveness and robustness of the proposed method are demonstrated through evaluations on the OTB2013 and OTB2015 benchmarks. In particular, compared with other state-of-the-art correlation filter-based and deep learning-based trackers, the proposed method is competitive in terms of accuracy and success rate.
ISSN: 2169-3536
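
The fusion of HOG with Color Names (CN) features mentioned in the abstract can be illustrated with a minimal Python sketch of channel-level fusion, assuming scikit-image's HOG extractor. The real CN descriptor maps RGB values to 11 linguistic color probabilities via a learned lookup table (the w2c table of van de Weijer et al.); a coarse RGB quantization stands in for it here. The cell size, channel counts, and lack of weighting are illustrative assumptions, not the paper's settings.

import numpy as np
from skimage.feature import hog

CELL = 4  # cell size shared by both feature channels (assumption)

def hog_features(gray):
    """Per-cell HOG descriptors, shape (H/CELL, W/CELL, 9)."""
    f = hog(gray, orientations=9, pixels_per_cell=(CELL, CELL),
            cells_per_block=(1, 1), feature_vector=False)
    return f.reshape(f.shape[0], f.shape[1], -1)

def cn_features(rgb):
    """Placeholder Color Names channel: average a coarse 2x2x2 RGB
    quantization (8 bins) over each cell. A real tracker would index
    the learned 11-color w2c table instead."""
    h, w, _ = rgb.shape
    bins = (rgb // 128).astype(int)              # 0 or 1 per channel
    idx = bins[..., 0] * 4 + bins[..., 1] * 2 + bins[..., 2]
    onehot = np.eye(8)[idx]                      # (h, w, 8)
    hc, wc = h // CELL, w // CELL
    cells = onehot[:hc * CELL, :wc * CELL].reshape(hc, CELL, wc, CELL, 8)
    return cells.mean(axis=(1, 3))               # (hc, wc, 8)

def fused_features(rgb):
    """Concatenate HOG and CN along the channel axis; the paper's exact
    normalization and weighting of the two channels are not specified here."""
    gray = rgb.mean(axis=2)
    fh = hog_features(gray)
    fc = cn_features(rgb)
    hc = min(fh.shape[0], fc.shape[0])
    wc = min(fh.shape[1], fc.shape[1])
    return np.concatenate([fh[:hc, :wc], fc[:hc, :wc]], axis=2)

patch = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)
print(fused_features(patch).shape)   # (16, 16, 17): 9 HOG + 8 placeholder CN channels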
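
The modified scale estimation strategy is not detailed in the abstract; the following is a generic scale-pyramid sketch rather than the paper's actual parameterization. Candidate patches are resampled over a small set of scale factors, scored by FFT-based correlation against the template, and the highest-scoring factor is taken as the scale estimate. The names best_scale, n_scales, and step are hypothetical placeholders.

import numpy as np

def correlation_response(template, patch):
    """Peak of the FFT-based cross-correlation between two same-size patches."""
    T = np.fft.fft2(template)
    P = np.fft.fft2(patch)
    return np.fft.ifft2(np.conj(T) * P).real.max()

def resample(patch, factor):
    """Nearest-neighbor rescale back to the template size (a stand-in for
    proper interpolation such as cv2.resize)."""
    h, w = patch.shape
    ys = np.clip((np.arange(h) * factor).astype(int), 0, h - 1)
    xs = np.clip((np.arange(w) * factor).astype(int), 0, w - 1)
    return patch[np.ix_(ys, xs)]

def best_scale(template, patch, n_scales=5, step=1.04):
    """Scan scale factors step**k for k in [-n//2, ..., n//2]; return the
    factor whose resampled patch correlates best with the template."""
    ks = np.arange(n_scales) - n_scales // 2
    scores = [correlation_response(template, resample(patch, step ** k))
              for k in ks]
    return step ** ks[int(np.argmax(scores))]

rng = np.random.default_rng(0)
tmpl = rng.standard_normal((32, 32))
print(best_scale(tmpl, tmpl))   # ~1.0 when the patch matches the template scale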