Issue No. 04 - April (2012 vol. 18)
ISSN: 1077-2626
pp: 573-580
Yanli Liu , Coll. of Comput. Sci., Sichuan Univ., Chengdu, China
X. Granier , LaBRI, INRIA Bordeaux Sud-Ouest, Bordeaux, France
In augmented reality, one of the key tasks in achieving convincing visual consistency between virtual objects and video scenes is maintaining coherent illumination along the whole sequence. Because outdoor illumination depends heavily on the weather, the lighting condition may change from frame to frame. In this paper, we propose a fully image-based approach for online tracking of outdoor illumination variations in videos captured with moving cameras. Our key idea is to estimate the relative intensities of sunlight and skylight from a sparse set of planar feature points extracted from each frame. To address inevitable feature misalignments, a set of constraints is introduced to select the most reliable points. Exploiting the spatial and temporal coherence of illumination, the relative intensities of sunlight and skylight are finally estimated through an optimization process. We validate our technique on a set of real-life videos and show that the results obtained with our estimates are visually coherent along the video sequences.
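The paper's actual shading model and optimization are not detailed in this abstract, but the core idea of recovering two relative light intensities from sparse feature-point observations can be sketched as a small least-squares problem. The linear model below, the per-point sun/sky contribution terms, and the soft temporal-smoothness penalty are all assumptions for illustration, not the authors' formulation:

```python
import numpy as np

def estimate_light_intensities(obs, sun_term, sky_term, prev=None, smooth=0.1):
    """Estimate relative sunlight/skylight intensities (a, b) from sparse
    feature-point observations, assuming a hypothetical linear model
        obs_i ~= a * sun_term_i + b * sky_term_i.

    obs      -- observed intensities at N planar feature points
    sun_term -- per-point sunlight contribution (e.g. cosine term times shadow mask)
    sky_term -- per-point skylight contribution (e.g. sky visibility)
    prev     -- (a, b) estimated for the previous frame, for temporal coherence
    smooth   -- weight of the temporal-smoothness penalty
    """
    A = np.column_stack([sun_term, sky_term])   # N x 2 design matrix
    b = np.asarray(obs, dtype=float)
    if prev is not None:
        # Soft constraint pulling the estimate toward the previous frame,
        # standing in for the temporal coherence exploited in the paper.
        A = np.vstack([A, smooth * np.eye(2)])
        b = np.concatenate([b, smooth * np.asarray(prev, dtype=float)])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(x, 0.0, None)                # intensities are nonnegative
```

In practice the estimate for each frame would be computed only from feature points that pass the reliability constraints mentioned above, with the previous frame's result passed in as `prev`.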
video signal processing, augmented reality, cameras, feature extraction, image sequences, lighting, object tracking, optimization, online tracking, outdoor lighting variation, moving cameras, visual appearance consistency, virtual objects, video scenes, lighting conditions, image-based approach, sunlight relative intensity, skylight relative intensity, planar feature-point extraction, spatial coherence, temporal coherence, illumination coherence, estimation, three-dimensional displays, buildings, geometry
Yanli Liu, X. Granier, "Online Tracking of Outdoor Lighting Variations for Augmented Reality with Moving Cameras", IEEE Transactions on Visualization & Computer Graphics, vol. 18, no. 4, pp. 573-580, April 2012, doi:10.1109/TVCG.2012.53