Issue No. 10, October 2003 (vol. 25)
David J. Fleet , IEEE Computer Society
Thomas F. El-Maraghi , IEEE Computer Society
Allan D. Jepson , IEEE Computer Society
<p><b>Abstract</b>—We propose a framework for learning robust, adaptive appearance models to be used for motion-based tracking of natural objects. The model adapts to slowly changing appearance, and it maintains a natural measure of the <it>stability</it> of the observed image structure during tracking. By identifying stable properties of appearance, we can weight them more heavily for motion estimation, while less stable properties can be proportionately downweighted. The appearance model involves a mixture of stable image structure, learned over long time courses, along with two-frame motion information and an outlier process. An online EM-algorithm is used to adapt the appearance model parameters over time. An implementation of this approach is developed for an appearance model based on the filter responses from a steerable pyramid. This model is used in a motion-based tracking algorithm to provide robustness in the face of image outliers, such as those caused by occlusions, while adapting to natural changes in appearance such as those due to facial expressions or variations in 3D pose.</p>
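The abstract's core mechanism, a per-observation mixture of a stable component, a two-frame (wandering) component, and an outlier process, adapted by online EM with exponential forgetting, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: all parameter names, constants, and the single-scalar setting (one filter response rather than a steerable-pyramid vector) are assumptions made here for clarity.

```python
import math

# Illustrative online-EM update for a three-component appearance mixture
# (stable Gaussian, wandering Gaussian centered at the previous frame's
# response, and a uniform "lost"/outlier component). All constants below
# are assumed values for illustration, not taken from the paper.
SIGMA_W = 2.0              # assumed std dev of the wandering component
OUTLIER_DENSITY = 1 / 256  # assumed uniform density of the lost component
ALPHA = 0.05               # forgetting factor (memory of roughly 1/ALPHA frames)

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_update(state, d_t, d_prev):
    """One online-EM step for a single scalar filter response d_t.

    state holds mixing weights (ws, ww, wl) and the stable component's
    mean/variance; d_prev is the previous frame's response, which serves
    as the mean of the wandering component.
    """
    mu_s, var_s = state["mu_s"], state["var_s"]
    # E-step: ownership (responsibility) of each component for d_t.
    ps = state["ws"] * gaussian(d_t, mu_s, math.sqrt(var_s))
    pw = state["ww"] * gaussian(d_t, d_prev, SIGMA_W)
    pl = state["wl"] * OUTLIER_DENSITY
    z = ps + pw + pl
    o_s, o_w, o_l = ps / z, pw / z, pl / z
    # M-step with exponential forgetting: the model adapts slowly,
    # so stable structure accumulates while outliers are downweighted.
    state["ws"] = (1 - ALPHA) * state["ws"] + ALPHA * o_s
    state["ww"] = (1 - ALPHA) * state["ww"] + ALPHA * o_w
    state["wl"] = (1 - ALPHA) * state["wl"] + ALPHA * o_l
    # Update the stable component's moments, weighted by its ownership.
    step = ALPHA * o_s
    state["mu_s"] += step * (d_t - mu_s)
    state["var_s"] += step * ((d_t - mu_s) ** 2 - var_s)
    return state
```

With a consistently observed response, the stable component's mixing weight and ownership grow, which is the "natural measure of stability" that the tracker can use to weight that observation more heavily during motion estimation.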
Motion, optical flow, tracking, occlusion, EM algorithm, adaptive appearance models.
David J. Fleet, Thomas F. El-Maraghi, Allan D. Jepson, "Robust Online Appearance Models for Visual Tracking", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 25, no. 10, pp. 1296-1311, October 2003, doi:10.1109/TPAMI.2003.1233903