2010 IEEE International Conference on Multimedia and Expo (2010)
Singapore, Singapore
July 19, 2010 to July 23, 2010
ISBN: 978-1-4244-7491-2
pp: 766-771
Wei Zhou , MOE-Microsoft Key Laboratory of Multimedia Computing and Communication, University of Science and Technology of China
Liansheng Zhuang , MOE-Microsoft Key Laboratory of Multimedia Computing and Communication, University of Science and Technology of China
Nenghai Yu , MOE-Microsoft Key Laboratory of Multimedia Computing and Communication, University of Science and Technology of China
ABSTRACT
In this paper, we propose a new method for modeling appearance variations in the generic object tracking task. Although object tracking has been studied by many researchers for a long time, many challenging problems remain, mainly due to the complex variations of an object's appearance. While most traditional methods use a global or pixel-wise approach, we propose a part-based tracking framework. We divide an object region into several non-overlapping parts (note that they are not semantic parts such as the limbs or head of a human), and a local classifier is then updated on-line for each part. We obtain a global confidence map by applying these local classifiers to the next frame, and find the new location of the target object, i.e., the peak of the confidence map, using mean-shift. Our tracker runs in real time and is robust to several kinds of appearance variation (e.g., changes in illumination, occlusion, changes in pose, shape deformation, and object/camera movement). Experiments show that our method outperforms other state-of-the-art approaches, especially in handling occlusion.
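The pipeline the abstract describes (split the object region into a grid of non-overlapping parts, score each part with a local classifier, sum the scores into a confidence map, then climb to the map's peak with mean-shift) can be sketched as below. This is a minimal toy illustration, not the authors' implementation: the 2x2 grid, the template-distance "local classifier", and the background-subtracted mean-shift weighting are all assumptions chosen to keep the example self-contained.

```python
import numpy as np

def split_into_parts(patch, grid=(2, 2)):
    """Divide an object patch into non-overlapping parts (row-major order)."""
    rows = np.array_split(patch, grid[0], axis=0)
    return [p for r in rows for p in np.array_split(r, grid[1], axis=1)]

def local_confidence(part, template):
    """Toy stand-in for a per-part classifier: confidence decays with the
    distance between the part's mean intensity and its learned template."""
    return np.exp(-abs(part.mean() - template))

def confidence_map(frame, part_shape, templates, grid=(2, 2)):
    """Slide the part grid over the frame; each offset's confidence is the
    sum of the local classifiers' responses on its parts."""
    ph, pw = part_shape
    H, W = frame.shape
    cmap = np.zeros((H - ph + 1, W - pw + 1))
    for y in range(cmap.shape[0]):
        for x in range(cmap.shape[1]):
            parts = split_into_parts(frame[y:y + ph, x:x + pw], grid)
            cmap[y, x] = sum(local_confidence(p, t)
                             for p, t in zip(parts, templates))
    return cmap

def mean_shift_peak(cmap, start, radius=3, iters=20):
    """Climb to the confidence peak by iterating the weighted-mean shift
    over a local window (lowest map value treated as background)."""
    y, x = start
    base = cmap.min()
    for _ in range(iters):
        y0, y1 = max(0, y - radius), min(cmap.shape[0], y + radius + 1)
        x0, x1 = max(0, x - radius), min(cmap.shape[1], x + radius + 1)
        win = cmap[y0:y1, x0:x1] - base
        total = win.sum()
        if total <= 0:
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = int(round((ys * win).sum() / total))
        nx = int(round((xs * win).sum() / total))
        if (ny, nx) == (y, x):
            break
        y, x = ny, nx
    return y, x
```

In use, the templates would be learned from the object patch in the current frame and the confidence map computed on the next frame; the paper's on-line classifier update per part is omitted here for brevity.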
CITATION

W. Zhou, N. Yu and L. Zhuang, "A robust part-based tracker," 2010 IEEE International Conference on Multimedia and Expo (ICME), Singapore, Singapore, 2010, pp. 766-771.
doi:10.1109/ICME.2010.5583855