2007 IEEE Conference on Computer Vision and Pattern Recognition (2007)
Minneapolis, MN, USA
June 17, 2007 to June 22, 2007
Ioannis Patras, Department of Electronic Engineering, Queen Mary, University of London, UK. I.Patras@elec.qmul.ac.uk
Edwin R. Hancock, Department of Computer Science, The University of York, UK.
This paper addresses the problem of efficient visual 2D template tracking in image sequences. We adopt a discriminative approach in which the observations at each frame yield direct predictions of a parametrisation of the state (e.g. position/scale/rotation) of the tracked target. To this end, a Bayesian Mixture of Experts (BME) is trained on a dataset of image patches generated by applying artificial transformations to the template at the first frame. In contrast to other methods in the literature, we explicitly address the problem that prediction accuracy can deteriorate drastically for observations that are not similar to those in the training set; such observations are common in cases of partial occlusion or fast motion. To do so, we couple the BME with a probabilistic kernel-based classifier which, once trained, determines the probability that a new/unseen observation can accurately predict the state of the target (the `relevance' of the observation in question). In addition, within the particle filtering framework, we derive a recursive scheme for maintaining an approximation of the posterior probability of the target's state in which the probabilistic predictions of multiple observations are moderated by their corresponding relevance. We apply the algorithm to 2D template tracking and demonstrate that the proposed scheme outperforms classical methods for discriminative tracking in cases of large motions and partial occlusions.
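The relevance-moderated fusion described in the abstract can be illustrated with a toy 1-D sketch. This is not the paper's actual recursive scheme: the Gaussian densities, the single scalar state, and all names and parameter values here are illustrative assumptions. The idea shown is only that each observation's predictive density is mixed with a diffuse prior according to its relevance, so a low-relevance observation (e.g. an occluded patch) barely moves the particle weights.

```python
import numpy as np

def gaussian(x, mu, var):
    """Univariate Gaussian density, evaluated elementwise."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def weight_particles(particles, predictions, relevances, pred_var, prior_mu, prior_var):
    """Weight particles by a product of per-observation densities.

    Each observation contributes a mixture: with probability equal to its
    relevance, mass concentrates around its state prediction; otherwise it
    falls back to a diffuse prior, so unreliable observations are
    effectively ignored. Hypothetical 1-D simplification, not the paper's
    derivation.
    """
    w = np.ones_like(particles)
    for mu, r in zip(predictions, relevances):
        w *= r * gaussian(particles, mu, pred_var) \
             + (1.0 - r) * gaussian(particles, prior_mu, prior_var)
    return w / w.sum()
```

With two conflicting predictions, the high-relevance one dominates the resulting weighted state estimate, while the low-relevance one (an "irrelevant" observation in the paper's terminology) is largely discounted.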
E. R. Hancock and I. Patras, "Regression tracking with data relevance determination," 2007 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Minneapolis, MN, USA, 2007, pp. 1-8.