Issue No. 08 - Aug. 2013 (vol. 19)
pp: 1415-1424
D. Iwai , Grad. Sch. of Eng. Sci., Osaka Univ., Toyonaka, Japan
T. Yabiki , Grad. Sch. of Eng. Sci., Osaka Univ., Toyonaka, Japan
K. Sato , Grad. Sch. of Eng. Sci., Osaka Univ., Toyonaka, Japan
ABSTRACT
This paper presents a new label layout technique for projection-based augmented reality (AR) that determines the placement of each label projected directly onto its associated physical object, whose surface is normally inappropriate for projection (i.e., nonplanar and textured). Central to our technique is a new legibility estimation method that evaluates how easily people can read projected characters from arbitrary viewpoints. The estimation method relies on the results of a psychophysical study that we conducted to investigate the legibility of characters projected onto various types of surfaces that deform character shapes, decrease their contrast, or cast shadows on them. Our technique computes a label layout by minimizing an energy function with a genetic algorithm (GA); the terms of the function quantitatively evaluate different aspects of layout quality. Conventional label layout solvers evaluate anchor regions and leader lines. In addition to these evaluations, we design our energy function to handle the following factors unique to projection-based AR applications: the estimated legibility value and the disconnection of the projected leader line. The results of our subjective experiment showed that the proposed technique significantly improves the projected label layout.
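The GA-based energy minimization described in the abstract can be illustrated with a toy sketch. This is not the authors' implementation: the paper's energy terms (viewpoint-dependent legibility, leader-line disconnection) are replaced here with simple stand-ins (a hypothetical "bad surface region" penalty and a label-overlap penalty), and all names, anchor coordinates, and GA parameters are invented for illustration.

```python
import random

# Hypothetical scene: each label may be placed at one of a few candidate
# offsets around its anchor point. A genome assigns one offset index per label.
ANCHORS = [(2, 2), (5, 5), (8, 3)]               # points to annotate
OFFSETS = [(-1, 1), (1, 1), (-1, -1), (1, -1)]   # candidate label offsets
LABEL_W, LABEL_H = 1.5, 0.8                      # label box size

def surface_penalty(x, y):
    # Toy stand-in for the paper's legibility estimate: pretend the
    # region x > 8 is textured/nonplanar and thus bad for projection.
    return 5.0 if x > 8 else 0.0

def boxes(genome):
    # Label box centers implied by a genome.
    return [(ax + OFFSETS[g][0], ay + OFFSETS[g][1])
            for g, (ax, ay) in zip(genome, ANCHORS)]

def overlap(b1, b2):
    (x1, y1), (x2, y2) = b1, b2
    return abs(x1 - x2) < LABEL_W and abs(y1 - y2) < LABEL_H

def energy(genome):
    bs = boxes(genome)
    e = sum(surface_penalty(x, y) for x, y in bs)
    for i in range(len(bs)):                     # penalize overlapping labels
        for j in range(i + 1, len(bs)):
            if overlap(bs[i], bs[j]):
                e += 10.0
    return e

def ga(generations=100, pop_size=30, seed=0):
    # Minimal GA: elitism, one-point crossover, point mutation.
    rng = random.Random(seed)
    n = len(ANCHORS)
    pop = [[rng.randrange(len(OFFSETS)) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)
        elite = pop[:pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # mutation
                child[rng.randrange(n)] = rng.randrange(len(OFFSETS))
            children.append(child)
        pop = elite + children
    return min(pop, key=energy)

best = ga()
print("best genome:", best, "energy:", energy(best))
```

In the actual system, the overlap and surface terms would be replaced by the paper's measured legibility values and leader-line evaluations, but the optimization loop has the same shape: encode candidate placements as genomes and evolve toward a minimum-energy layout.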
INDEX TERMS
Layout, surface texture, shape, estimation, computational modeling, image color analysis, Gaussian noise, projected characters' legibility, projection-based augmented reality, view management, label layout
CITATION
D. Iwai, T. Yabiki, and K. Sato, "View Management of Projected Labels on Nonplanar and Textured Surfaces," IEEE Transactions on Visualization & Computer Graphics, vol. 19, no. 8, pp. 1415-1424, Aug. 2013, doi:10.1109/TVCG.2012.321
REFERENCES
[1] B. Bell, S. Feiner, and T. Höllerer, "View Management for Virtual and Augmented Reality," Proc. ACM Symp. User Interface Software and Technology, pp. 101-110, 2001.
[2] O. Bimber and R. Raskar, Spatial Augmented Reality: Merging Real and Virtual Worlds. A.K. Peters Ltd., 2005.
[3] J. Christensen, J. Marks, and S. Shieber, "An Empirical Study of Algorithms for Point-Feature Label Placement," ACM Trans. Graphics, vol. 14, pp. 203-232, 1995.
[4] K. Ali, K. Hartmann, and T. Strothotte, "Label Layout for Interactive 3D Illustrations," The J. WSCG, vol. 13, pp. 1-8, 2005.
[5] I. Vollick, D. Vogel, M. Agrawala, and A. Hertzmann, "Specifying Label Layout Style by Example," Proc. ACM Symp. User Interface Software and Technology, pp. 221-230, 2007.
[6] R. Azuma and C. Furmanski, "Evaluating Label Placement for Augmented Reality View Management," Proc. IEEE/ACM Int'l Symp. Mixed and Augmented Reality, pp. 66-75, 2003.
[7] K. Makita, M. Kanbara, and N. Yokoya, "View Management of Annotations for Wearable Augmented Reality," Proc. IEEE Int'l Conf. Multimedia and Expo, pp. 982-985, 2009.
[8] F. Zhang and H. Sun, "Dynamic Labeling Management in Virtual and Augmented Environments," Proc. Int'l Conf. Computer Aided Design and Computer Graphics, pp. 397-402, 2005.
[9] J.L. Gabbard, J.E. Swan, and D. Hix, "The Effects of Text Drawing Styles, Background Textures, and Natural Lighting on Text Legibility in Outdoor Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 15, pp. 16-32, 2006.
[10] J.L. Gabbard, J.E. Swan, D. Hix, S.-J. Kim, and G. Fitch, "Active Text Drawing Styles for Outdoor Augmented Reality: A User-Based Study and Design Implications," Proc. IEEE Virtual Reality Conf., pp. 35-42, 2007.
[11] A. Leykin and M. Tuceryan, "Automatic Determination of Text Readability Over Textured Backgrounds for Augmented Reality Systems," Proc. IEEE/ACM Int'l Symp. Mixed and Augmented Reality, pp. 224-230, 2004.
[12] T. Siriborvornratanakul and M. Sugimoto, "Clutter-Aware Adaptive Projection Inside a Dynamic Environment," Proc. ACM Symp. Virtual Reality Software and Technology, pp. 241-242, 2008.
[13] K. Uemura, K. Tajimi, N. Sakata, and S. Nishida, "Annotation View Management for Wearable Projection," Proc. Int'l Conf. Artificial Reality and Telexistence, pp. 202-206, 2010.
[14] O. Bimber, D. Iwai, G. Wetzstein, and A. Grundhöfer, "The Visual Computing of Projector-Camera Systems," Computer Graphics Forum, vol. 27, no. 8, pp. 2219-2254, 2008.
[15] M. Nagase, D. Iwai, and K. Sato, "Dynamic Defocus and Occlusion Compensation of Projected Imagery by Model-Based Optimal Projector Selection in Multi-Projection Environment," Virtual Reality, vol. 15, pp. 119-132, 2011.
[16] J. Jankowski, K. Samp, I. Irzynska, M. Jozwowicz, and S. Decker, "Integrating Text with Video and 3D Graphics: The Effects of Text Drawing Styles on Text Readability," Proc. Int'l Conf. Human Factors in Computing Systems, pp. 1321-1330, 2010.
[17] O. Bimber, A. Emmerling, and T. Klemmer, "Embedded Entertainment with Smart Projectors," IEEE Computer, vol. 38, no. 1, pp. 56-63, Jan. 2005.
[18] G. Welch, H. Fuchs, R. Raskar, H. Towles, and M.S. Brown, "Projected Imagery in Your 'Office of the Future'," IEEE Computer Graphics and Applications, vol. 20, no. 4, pp. 62-67, July 2000.
[19] A. Grundhöfer, M. Seeger, F. Hantsch, and O. Bimber, "Dynamic Adaptation of Projected Imperceptible Codes," Proc. IEEE/ACM Int'l Symp. Mixed and Augmented Reality, pp. 161-168, 2003.
[20] H. Satoh, M. Yamamura, and S. Kobayashi, "Minimal Generation Gap Model for GAs Considering both Exploration and Exploitation," Proc. Int'l Conf. Soft Computing, pp. 494-497, 1996.