Issue No. 05 - May (2013 vol. 35)
R. Sznitman , EPFL IC ISIM CVLAB, Lausanne, Switzerland
R. Richa , Johns Hopkins Univ., Baltimore, MD, USA
R. H. Taylor , Johns Hopkins Univ., Baltimore, MD, USA
B. Jedynak , Johns Hopkins Univ., Baltimore, MD, USA
G. D. Hager , Johns Hopkins Univ., Baltimore, MD, USA
Methods for tracking an object have generally fallen into two groups: tracking by detection and tracking through local optimization. The advantage of detection-based tracking is its ability to deal with target appearance and disappearance, but it does not naturally take advantage of target motion continuity during detection. The advantages of local optimization are efficiency and accuracy, but it requires additional algorithms to initialize tracking when the target is lost. To bridge these two approaches, we propose a framework that unifies detection and tracking as a time-series Bayesian estimation problem. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a target in each frame. To do this, we integrate the Active Testing (AT) paradigm with Bayesian filtering, resulting in a framework capable of robust detection and tracking in situations where the target object regularly enters and leaves the field of view. We demonstrate our approach on a retinal tool tracking problem and show through extensive experiments that our method provides an efficient and robust tracking solution.
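The core idea of the abstract, posing detection as sequential entropy minimization over a posterior that a Bayesian filter carries across frames, can be illustrated with a toy sketch. This is not the authors' implementation: the 1-D grid, the noisy threshold queries, and the error rate `eps` are illustrative assumptions standing in for image-based tests on tool pose parameters.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def expected_entropy(prior, lik):
    """Expected posterior entropy after a binary test with
    P(test = 1 | state x) = lik[x]."""
    p1 = float(np.sum(prior * lik))
    h = 0.0
    if p1 > 0:
        h += p1 * entropy(prior * lik / p1)
    if p1 < 1:
        h += (1 - p1) * entropy(prior * (1 - lik) / (1 - p1))
    return h

def select_test(prior, tests):
    """Active Testing step: greedily pick the query expected to
    shrink posterior entropy the most."""
    return min(range(len(tests)), key=lambda i: expected_entropy(prior, tests[i]))

def bayes_update(prior, lik, outcome):
    """Measurement update of the Bayesian filter."""
    post = prior * (lik if outcome else (1 - lik))
    return post / post.sum()

# Toy demo: localize a target on a 1-D grid of 20 cells.
n, true_pos, eps = 20, 13, 0.05
# Hypothetical query pool: "is the target at index >= t?", each answered
# with error rate eps.
tests = [np.where(np.arange(n) >= t, 1 - eps, eps) for t in range(1, n)]

posterior = np.full(n, 1.0 / n)
for _ in range(8):
    i = select_test(posterior, tests)
    outcome = int(tests[i][true_pos] > 0.5)  # simulate a noiseless answer
    posterior = bayes_update(posterior, tests[i], outcome)
# The posterior now concentrates on the true target location; in tracking,
# a motion model would diffuse this posterior before the next frame's queries,
# and a diffuse posterior after target disappearance reverts to detection.
```

With noiseless answers, the greedy entropy criterion behaves like a probabilistic binary search, so a handful of queries suffice to localize the target on this grid.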
Instruments, Target tracking, Surgery, Testing, Optimization
R. Sznitman, R. Richa, R. H. Taylor, B. Jedynak and G. D. Hager, "Unified Detection and Tracking of Instruments during Retinal Microsurgery," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 35, no. 5, pp. 1263-1273, 2013.