Object tracking using temporally matching filters
Bibliographic Details
Main Authors: Brendan Robeson, Mohammadreza Javanmardi, Xiaojun Qi
Format: Article
Language: English
Published: Wiley 2021-06-01
Series: IET Computer Vision
Online Access: https://doi.org/10.1049/cvi2.12040
Description
Summary: One of the primary challenges of visual tracking is the variable appearance of the target object. As tracking proceeds, the target object can change its appearance due to illumination changes, rotations, deformations, etc. Modern trackers incorporate online updating to learn how the target changes over time. However, they do not use the history of target appearance. To address this shortcoming, we uniquely use domain adaptation with the target appearance history to efficiently learn a temporally matching filter (TMF) during online updating. This TMF emphasizes the persistent features found in different appearances of the target object. It also improves the classification accuracy of the convolutional neural network by assisting the training of the classification layers without incurring the runtime overhead of updating the convolutional layers. Extensive experimental results demonstrate that the proposed TMF-based tracker, which incorporates domain adaptation with the target appearance history, improves tracking performance on three benchmark video databases (OTB-50, OTB-100 and VOT2016) over other online learning trackers. Specifically, it improves the overlap success of VITAL and MDNet by 0.44% and 1.03% on the OTB-100 dataset and improves the accuracy of VITAL and MDNet by 0.55% and 0.06% on the VOT2016 dataset, respectively.
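The core idea of the abstract — keep the convolutional feature layers frozen at runtime and retrain only the lightweight classification layers on an accumulated history of target appearances — can be sketched in minimal form. The code below is not the authors' TMF implementation; it is a hedged toy illustration in NumPy where a fixed projection stands in for the frozen convolutional layers, a logistic-regression head stands in for the classification layers, and the stored `history` list stands in for the target appearance history. All names (`frozen_features`, `OnlineClassifier`) are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def frozen_features(x):
    # Stand-in for the frozen convolutional layers: a fixed projection
    # that is never updated during online tracking.
    W = np.linspace(-1.0, 1.0, 8 * 4).reshape(8, 4)
    return np.tanh(x @ W)

class OnlineClassifier:
    """Classification head updated online; feature layers stay fixed."""

    def __init__(self, dim):
        self.w = np.zeros(dim)
        self.b = 0.0
        self.history = []  # accumulated target appearance history

    def update(self, feats, labels, lr=0.5, steps=50):
        # Append new samples to the appearance history and retrain the
        # head on the whole history -- a crude stand-in for the paper's
        # domain-adaptation-based temporally matching filter, which
        # emphasizes features persistent across appearances.
        self.history.extend(zip(feats, labels))
        X = np.array([f for f, _ in self.history])
        y = np.array([l for _, l in self.history])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))
            self.w -= lr * (X.T @ (p - y)) / len(y)
            self.b -= lr * float(np.mean(p - y))

    def score(self, feats):
        # Probability that a candidate patch is the target.
        return 1.0 / (1.0 + np.exp(-(feats @ self.w + self.b)))

# Toy frames: "target" patches cluster near +1, "background" near -1.
target = rng.normal(1.0, 0.2, size=(20, 8))
backgr = rng.normal(-1.0, 0.2, size=(20, 8))
X = np.vstack([target, backgr])
y = np.concatenate([np.ones(20), np.zeros(20)])

clf = OnlineClassifier(dim=4)
clf.update(frozen_features(X), y)  # only the head's w, b change

p_target = clf.score(frozen_features(rng.normal(1.0, 0.2, size=(1, 8))))[0]
p_backgr = clf.score(frozen_features(rng.normal(-1.0, 0.2, size=(1, 8))))[0]
print(p_target > p_backgr)  # a new target patch should score higher
```

Because only `w` and `b` are updated while `frozen_features` never changes, the online step avoids the runtime cost of backpropagating through the feature layers, which is the efficiency argument the abstract makes for training just the classification layers.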
ISSN: 1751-9632, 1751-9640