IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 18, no. 12, December 1996
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.546256
<p><b>Abstract</b>—Structures of dynamic scenes can only be recovered using a real-time range sensor. Depth from defocus offers an effective solution to fast and dense range estimation. However, accurate depth estimation requires theoretical and practical solutions to a variety of problems including recovery of textureless surfaces, precise blur estimation, and magnification variations caused by defocusing. Both textured and textureless surfaces are recovered using an illumination pattern that is projected via the same optical path used to acquire images. The illumination pattern is optimized to maximize accuracy and spatial resolution in computed depth. The relative blurring in two images is computed using a narrow-band linear operator that is designed by considering all the optical, sensing, and computational elements of the depth from defocus system. Defocus invariant magnification is achieved by the use of an additional aperture in the imaging optics. A prototype focus range sensor has been developed that has a workspace of 1 cubic foot and produces up to 512 × 480 depth estimates at 30 Hz with an average RMS error of 0.2%. Several experimental results are included to demonstrate the performance of the sensor.</p>
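The abstract's central idea is that the *relative* blurring between two images taken with different focus settings determines depth. The paper's actual method uses a tuned narrow-band operator designed from the system's optical transfer function; the snippet below is only a toy one-dimensional sketch of the underlying geometry, assuming a simplified thin-lens-style blur model (`toy_blur`, with an arbitrary constant `k`) in which blur grows with the inverse-depth distance to the focused plane. All function names and the model itself are illustrative, not from the paper.

```python
def toy_blur(depth, focus_depth, k=1.0):
    """Toy defocus model: the blur measure grows with the inverse-depth
    difference between the scene point and the focused plane
    (thin-lens-like behaviour; k lumps aperture and sensor constants)."""
    return k * abs(1.0 / depth - 1.0 / focus_depth)

def depth_from_two_blurs(m1, m2, f1, f2):
    """Recover depth from blur measures m1, m2 of two images focused at
    depths f1 and f2, assuming f1 < depth < f2. The unknown constant k
    cancels in the ratio m1 / m2, which is why *relative* blur between
    the two images suffices for depth estimation."""
    r = m1 / m2
    # Solve r = (1/f1 - x) / (x - 1/f2) for the inverse depth x = 1/depth.
    inv_depth = (1.0 / f1 + r / f2) / (1.0 + r)
    return 1.0 / inv_depth

# Simulated scene point at depth 2.0; images focused at depths 1.0 and 4.0.
m_near = toy_blur(2.0, 1.0)
m_far = toy_blur(2.0, 4.0)
print(depth_from_two_blurs(m_near, m_far, 1.0, 4.0))  # → 2.0
```

In the real sensor this ratio is computed densely across the image with the tuned focus operator, and the additional aperture keeps magnification constant so that corresponding pixels in the two images see the same scene point.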
Keywords: Depth from defocus, constant magnification defocusing, active illumination pattern, optical transfer function, image sensing, tuned focus operator, depth estimation, real-time range sensor.
Shree K. Nayar, Masahiro Watanabe, and Minori Noguchi, "Real-Time Focus Range Sensor," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 18, no. 12, pp. 1186-1198, December 1996, doi:10.1109/34.546256.