Mitigating the Effects of Variable Illumination for Tracking across Disjoint Camera Views
Sydney, NSW, Australia
Nov. 22, 2006 to Nov. 24, 2006
ISBN: 0-7695-2688-8
pp: 32
E.D. Cheng, University of Technology, Australia
C. Madden, University of Technology, Australia
M. Piccardi, Senior Member, IEEE
ABSTRACT
Tracking people by their appearance across disjoint camera views is challenging since appearance may vary significantly across such views. This problem has been tackled in the past by computing intensity transfer functions between each camera pair during an initial training stage. However, in real-life situations, intensity transfer functions depend not only on the camera pair, but also on the actual illumination at pixel-wise resolution, and may prove impractical to estimate to a satisfactory extent. For this reason, in this paper we propose an appearance representation for people tracking capable of coping with the typical illumination changes occurring in a surveillance scenario. Our appearance representation is based on an online K-means color clustering algorithm, a fixed, data-dependent intensity transformation, and the incremental use of frames. Moreover, a similarity measurement is proposed to match the appearance representations of any two given moving objects along sequences of frames. Experimental results presented in this paper show that the proposed method provides a viable and effective approach for tracking people across disjoint camera views in typical surveillance scenarios.
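To make the abstract's components concrete, the sketch below illustrates the general flavor of an online K-means color clustering of a person's pixels and a simple weighted nearest-centroid similarity between two such representations. This is an illustrative reconstruction, not the paper's actual algorithm: the function names, the running-mean update, the choice of Euclidean RGB distance, and the symmetric matching distance are all assumptions made here for the example.

```python
import numpy as np

def online_kmeans_colors(pixels, k=5, seed=0):
    """Online (sequential) K-means over pixel colors.

    pixels: (N, 3) float array of RGB values from a person's silhouette.
    Centroids are updated one sample at a time with a running-mean rule,
    so pixels from later frames can be folded in incrementally.
    Returns (centroids, weights): k color modes and their relative mass.
    NOTE: illustrative sketch only; not the paper's exact formulation.
    """
    rng = np.random.default_rng(seed)
    pixels = np.asarray(pixels, dtype=float)
    centroids = pixels[rng.choice(len(pixels), k, replace=False)].copy()
    counts = np.ones(k)
    for p in pixels:
        j = int(np.argmin(np.linalg.norm(centroids - p, axis=1)))
        counts[j] += 1
        centroids[j] += (p - centroids[j]) / counts[j]  # running mean
    return centroids, counts / counts.sum()

def appearance_distance(cA, wA, cB, wB):
    """Hypothetical symmetric distance between two appearance models:
    each cluster is matched to its nearest centroid in the other model,
    weighted by the cluster's mass. Smaller means more similar."""
    dAB = sum(w * np.linalg.norm(cA[i] - cB, axis=1).min()
              for i, w in enumerate(wA))
    dBA = sum(w * np.linalg.norm(cB[i] - cA, axis=1).min()
              for i, w in enumerate(wB))
    return 0.5 * (dAB + dBA)
```

A model of the same person seen under a second camera would, after the paper's intensity transformation, be expected to yield a small `appearance_distance`, while a different person would not.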
CITATION
E.D. Cheng, C. Madden, M. Piccardi, "Mitigating the Effects of Variable Illumination for Tracking across Disjoint Camera Views", Proc. IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS 2006), p. 32, doi:10.1109/AVSS.2006.76
