Multifocal Projection: A Multiprojector Technique for Increasing Focal Depth
July/August 2006 (vol. 12 no. 4)
pp. 658-667

Abstract—In this paper, we describe a novel multifocal projection concept that uses conventional video projectors and camera feedback. Multiple projectors with differently adjusted focal planes but overlapping image areas are used. They can either be positioned at different locations in the environment or be integrated into a single projection unit. The defocus created on an arbitrary surface is estimated automatically for each projector pixel. Once these estimates are known, a final image with minimal defocus can be composed in real time from the individual pixel contributions of all projectors. Our technique is independent of the surface's geometry, color, and texture, of the environmental light, and of the projectors' position, orientation, luminance, and chrominance.
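
To make the per-pixel compositing idea concrete, the following Python sketch illustrates one possible reading of it under simplified assumptions: given camera-estimated defocus maps for several projectors with overlapping image areas, each pixel of the final image is assigned exclusively to the projector that is sharpest there. The function name, array shapes, and the winner-takes-all weighting are illustrative choices, not the authors' actual implementation.

# Minimal sketch (illustrative only, not the paper's implementation) of
# per-pixel multifocal compositing from known defocus estimates.
import numpy as np

def compose_multifocal(defocus_maps):
    """Return per-projector weight maps of shape (n_projectors, h, w).

    defocus_maps holds one per-pixel defocus estimate per projector;
    lower values mean that projector is sharper at that pixel.
    """
    # Index of the least defocused projector at every pixel.
    sharpest = np.argmin(defocus_maps, axis=0)                    # (h, w)
    # Exclusive weights: 1 for the sharpest projector, 0 for the others.
    n, h, w = defocus_maps.shape
    weights = np.zeros_like(defocus_maps, dtype=float)
    weights[sharpest, np.arange(h)[:, None], np.arange(w)[None, :]] = 1.0
    return weights

# Toy example: two projectors focused on opposite sides of the surface.
h, w = 4, 6
d0 = np.tile(np.linspace(0.1, 2.0, w), (h, 1))   # projector 0 sharp on the left
d1 = d0[:, ::-1]                                 # projector 1 sharp on the right
weights = compose_multifocal(np.stack([d0, d1]))
print(weights[0])   # 1s on the left half of the image, 0s on the right

In the system described by the paper, the defocus estimates come from camera feedback on an arbitrary surface; a soft, normalized weighting that blends contributions from several projectors could replace the hard per-pixel selection shown here.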

[1] W. Biehling, C. Deter, S. Dube, B. Hill, S. Helling, K. Isakovic, S. Klose, and K. Schiewe, “LaserCave— Some Building Blocks for Immersive Screens,” Proc. Int'l Status Conf. Virtual and Augmented Reality, 2004.
[2] O. Bimber, A. Emmerling, and T. Klemmer, “Embedded Entertainment with Smart Projectors,” Computer, vol. 38, no. 1, pp. 56-63, 2005.
[3] O. Bimber, G. Wetzstein, A. Emmerling, and C. Nitschke, “Enabling View-Dependent Stereoscopic Projection in Real Environments,” Proc. IEEE/ACM Int'l Symp. Mixed and Augmented Reality, pp. 14-23, 2005.
[4] M. Brown, A. Majumder, and R. Yang, “Camera-Based Calibration Techniques for Seamless Multiprojector Displays,” IEEE Trans. Visualization and Computer Graphics, vol. 11, no. 2, pp. 193-206, Mar.-Apr. 2005.
[5] M.Ch. Chiang and T.E. Boult, “Local Blur Estimation and Super-Resolution,” Proc. Conf. Computer Vision and Pattern Recognition, pp. 821-826, 1997.
[6] H. Eltoukhy and S. Kavusi, “A Computationally Efficient Algorithm for Multi-Focus Image Reconstruction,” Proc. SPIE Electronic Imaging Conf., 2003.
[7] K. Fujii, M.D. Grossberg, and S.K. Nayar, “A Projector-Camera System with Real-Time Photometric Adaptation for Dynamic Environments,” Proc. Conf. Computer Vision and Pattern Recognition, vol. 2, pp. 20-25, 2005.
[8] P. Grossmann, “Depth from Focus,” Pattern Recognition Letters, vol. 5, pp. 63-69, 1987.
[9] M.D. Grossberg, H. Peri, S.K. Nayar, and P. Belhumeur, “Making One Object Look Like Another: Controlling Appearance Using a Projector-Camera System,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, vol. 1, pp. 452-459, 2004.
[10] P. Hill, N. Canagarajah, and D. Bull, “Image Fusion Using Complex Wavelets,” Proc. 13th British Machine Vision Conf., 2002.
[11] C. Jaynes, S. Webb, and R.M. Steele, “Camera-Based Detection and Removal of Shadows from Interactive Multiprojector Displays,” IEEE Trans. Visualization and Computer Graphics, vol. 10, no. 3, pp. 290-301, May-June 2004.
[12] S.K. Kim, S.R. Park, and J.K. Paik, “Simultaneous Out-of-Focus Blur Estimation and Restoration for Digital Auto-Focusing System,” IEEE Trans. Consumer Electronics, vol. 44, pp. 1071-1075, 1998.
[13] S.K. Kim, S. Hwang, J. Shin, J.K. Paik, B. Abidi, and M.A. Abidi, “Object-Based Image Restoration for Multilayer Auto-Focusing,” Proc. SPIE-IS&T Electronic Imaging Conf., vol. 5308, pp. 893-901, 2004.
[14] M. Levoy, B. Chen, V. Vaish, M. Horowitz, I. McDowall, and M. Bolas, “Synthetic Aperture Confocal Imaging,” Proc. ACM SIGGRAPH '04, pp. 825-834, 2004.
[15] K.-L. Low, G. Welch, A. Lastra, and H. Fuchs, “Life-Sized Projector-Based Dioramas,” Proc. Symp. Virtual Reality Software and Technology, pp. 93-101, 2001.
[16] A. Majumder and G. Welch, “Computer Graphics Optique: Optical Superposition of Projected Computer Graphics,” Proc. Eurographics Workshop Virtual Environment/Immersive Projection Technology, pp. 209-218, 2001.
[17] M. McGuire, W. Matusik, H. Pfister, J.F. Hughes, and F. Durand, “Defocus Video Matting,” Proc. ACM SIGGRAPH '05, pp. 567-576, 2005.
[18] S.K. Nayar, M. Watanabe, and M. Noguchi, “Real-Time Focus Range Sensor,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 18, pp. 1186-1198, 1996.
[19] S.K. Nayar, H. Peri, M.D. Grossberg, and P.N. Belhumeur, “A Projection System with Radiometric Compensation for Screen Imperfections,” Proc. Int'l Workshop Projector-Camera Systems, 2003.
[20] M. Noguchi and S.K. Nayar, “Microscopic Shape from Focus Using Active Illumination,” Proc. Int'l Conf. Pattern Recognition, pp. 147-152, 1994.
[21] R. Raskar, G. Welch, M. Cutts, A. Lake, L. Stesin, and H. Fuchs, “The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays,” Proc. ACM SIGGRAPH '98, pp. 179-188, 1998.
[22] R. Raskar, J. van Baar, P. Beardsley, T. Willwacher, S. Rao, and C. Forlines, “iLamps: Geometrically Aware and Self-Configuring Projectors,” Proc. ACM SIGGRAPH '03, pp. 809-818, 2003.
[23] D.M. Tsai and C.C. Chou, “A Fast Focus Measure for Video Display Inspection,” Machine Vision and Applications, vol. 14, no. 3, pp. 192-196, 2003.
[24] D. Wang, I. Sato, T. Okabe, and Y. Sato, “Radiometric Compensation in a Projector-Camera System Based on the Properties of Human Vision System,” Proc. IEEE Int'l Workshop Projector-Camera Systems, 2005.

Index Terms:
Computing methodologies, computer graphics, picture/image generation, digitizing and scanning, display algorithms, image processing and computer vision, digitization and image capture, radiometry, reflectance, sampling, scanning, enhancement, sharpening and deblurring, scene analysis, color, shape.
Citation:
Oliver Bimber, Andreas Emmerling, "Multifocal Projection: A Multiprojector Technique for Increasing Focal Depth," IEEE Transactions on Visualization and Computer Graphics, vol. 12, no. 4, pp. 658-667, July-Aug. 2006, doi:10.1109/TVCG.2006.75