2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (2014)
September 10–12, 2014
Taiki Fukiage , The University of Tokyo
Takeshi Oishi , The University of Tokyo
Katsushi Ikeuchi , The University of Tokyo
There are many situations in which virtual objects are presented semi-transparently over a background in real-time applications. In such cases, we often want to display the object with constant visibility. With conventional alpha blending, however, the visibility of a blended object varies substantially with the colors, textures, and structures of the background scene. To overcome this problem, we present a framework for blending images based on a subjective metric of visibility. In our method, the blending parameter is optimized locally and adaptively so that the visibility at each location reaches the targeted level. To predict the visibility of an object blended with an arbitrary parameter, we utilize one of the error-visibility metrics developed for image quality assessment. In this study, we demonstrate that the chosen metric linearly predicts the visibility of a blended pattern on various texture images, and show that the proposed blending method works in practical augmented reality settings.
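As a rough illustration of the idea described in the abstract, the sketch below contrasts conventional per-pixel alpha blending with a per-pixel search for the smallest alpha that reaches a target visibility. The visibility proxy here (absolute blended-minus-background difference) is a hypothetical stand-in for the perceptual error-visibility metric used in the paper; function names and the candidate-alpha grid are our own assumptions, not the authors' implementation.

```python
import numpy as np

def alpha_blend(fg, bg, alpha):
    """Conventional alpha blending: out = alpha * fg + (1 - alpha) * bg."""
    return alpha * fg + (1.0 - alpha) * bg

def adaptive_alpha(fg, bg, target, alphas=np.linspace(0.05, 1.0, 20)):
    """Hypothetical sketch of visibility-based blending: at each pixel,
    pick the smallest alpha whose predicted visibility reaches `target`.

    Visibility is approximated here by |blended - background|, a crude
    stand-in for the subjective error-visibility metric in the paper.
    """
    best = np.full(bg.shape, alphas[-1])   # fall back to the largest alpha
    done = np.zeros(bg.shape, dtype=bool)  # pixels whose alpha is settled
    for a in alphas:                       # search from low to high alpha
        vis = np.abs(alpha_blend(fg, bg, a) - bg)  # visibility proxy
        hit = (vis >= target) & ~done      # newly satisfied pixels
        best[hit] = a
        done |= hit
    return best
```

With this proxy, visibility grows linearly in alpha (it equals `alpha * |fg - bg|`), so the search simply finds where that line crosses the target; the paper's actual metric is nonlinear and spatially pooled, which is why a local, adaptive optimization is needed there.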
Image color analysis, Computational modeling, Predictive models, Sensitivity, Visualization, Transforms
Taiki Fukiage, Takeshi Oishi, Katsushi Ikeuchi, "Visibility-based blending for real-time applications", 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), vol. 00, pp. 63-72, 2014, doi:10.1109/ISMAR.2014.6948410