Summary: | Background subtraction based on change detection is the first step in many computer vision systems. Many background subtraction methods have been proposed to detect foreground objects through background modeling. However, most of these methods are pixel-based, relying only on pixel-by-pixel comparisons, and only a few are spatial-based, taking the neighborhood of each analyzed pixel into consideration. In this paper, inspired by an illumination-invariant feature based on locality-sensitive histograms proposed for object tracking, we first develop a novel texture descriptor named the Local Similarity Statistical Descriptor (LSSD), which measures the similarity between the current pixel and its neighbors. The LSSD performs well in scenes with illumination variation and dynamic backgrounds. We then model each background pixel with a combination of color features and LSSD features, embedded in a low-cost and highly efficient background modeling framework. Color and texture features each have their own merits and demerits, so they can compensate for each other's weaknesses, resulting in better performance. Both quantitative and qualitative evaluations on the change detection dataset demonstrate the effectiveness of our method.
|
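To make the idea of a per-pixel neighborhood similarity statistic concrete, the sketch below computes, for each pixel, the fraction of neighbors in a square window whose intensity lies within a tolerance of the center pixel. The window radius, the tolerance `tau`, and the simple counting aggregation are illustrative assumptions only; the exact LSSD formulation and its weighting are defined in the paper body, not reproduced here.

```python
# Minimal illustrative sketch (not the paper's exact LSSD): per-pixel fraction of
# similar neighbors within a (2*radius+1) x (2*radius+1) window.
import numpy as np


def local_similarity_map(gray: np.ndarray, radius: int = 2, tau: float = 10.0) -> np.ndarray:
    """Return, for each pixel, the fraction of neighbors whose intensity
    differs from the center pixel by less than `tau` (values in [0, 1])."""
    gray = gray.astype(np.float32)
    h, w = gray.shape
    padded = np.pad(gray, radius, mode="edge")
    similar = np.zeros((h, w), dtype=np.float32)
    count = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue  # skip the center pixel itself
            # Shifted view of the neighborhood at offset (dy, dx).
            shifted = padded[radius + dy: radius + dy + h, radius + dx: radius + dx + w]
            similar += (np.abs(shifted - gray) < tau).astype(np.float32)
            count += 1
    return similar / count


if __name__ == "__main__":
    # Toy usage: a random grayscale frame stands in for a video frame.
    frame = (np.random.rand(120, 160) * 255).astype(np.uint8)
    lss = local_similarity_map(frame, radius=2, tau=10.0)
    print(lss.shape, float(lss.min()), float(lss.max()))
```

In a background-subtraction pipeline, such a similarity map would be computed per frame and compared against the stored background samples alongside the color features, in whatever matching scheme the modeling framework defines.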