Epipolar Geometry of Opti-Acoustic Stereo Imaging
October 2007 (vol. 29 no. 10)
pp. 1776-1788
Optical and acoustic cameras are suitable imaging systems for inspecting underwater structures in both regular maintenance and security operations. Despite their high resolution, optical systems have a limited visibility range when deployed in turbid waters. In contrast, the new generation of high-frequency (MHz) acoustic cameras can provide images with enhanced target details in highly turbid waters, though their range is reduced by one to two orders of magnitude compared to traditional low- and mid-frequency (tens to hundreds of kHz) sonar systems. A promising inspection strategy is therefore to deploy both optical and acoustic cameras on a submersible platform, enabling target imaging across a range of turbidity conditions. Under this scenario, and where visibility allows, registering the images from the two cameras, arranged in a binocular stereo configuration, provides valuable scene information that cannot be readily recovered from either sensor alone. We explore and derive the constraint equations for epipolar geometry and stereo triangulation when utilizing these two sensing modalities with different projection models. Theoretical results supported by computer simulations show that an opti-acoustic stereo imaging system outperforms a traditional binocular system with two optical cameras, particularly with increasing target distance and/or turbidity.
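To illustrate the triangulation idea behind mixing the two projection models, the sketch below intersects an optical back-projection ray with the range sphere of an acoustic measurement, then uses the sonar azimuth to pick the physically consistent intersection. All conventions here (a pinhole optical camera at the world origin, sonar azimuth measured in the sonar's local x-z plane) are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def triangulate_opti_acoustic(pixel, sonar_range, theta, f=1.0,
                              sonar_center=np.array([1.0, 0.0, 0.0]),
                              sonar_rot=np.eye(3)):
    """Triangulate a 3D point from one optical and one acoustic measurement.

    The optical camera (pinhole, focal length f, at the world origin)
    supplies a viewing ray; the sonar supplies range and azimuth theta,
    while elevation is unobserved.  A hypothetical setup for illustration.
    """
    # Unit direction of the optical back-projection ray.
    d = np.array([pixel[0] / f, pixel[1] / f, 1.0])
    d /= np.linalg.norm(d)

    # Range constraint: the point lies on a sphere of radius sonar_range
    # centered at the sonar: ||t d - c||^2 = R^2, a quadratic in t.
    c = sonar_center
    b = d @ c
    disc = b * b - (c @ c - sonar_range ** 2)
    if disc < 0:
        return None  # inconsistent measurements (e.g., noise)
    candidates = [t for t in (b - np.sqrt(disc), b + np.sqrt(disc)) if t > 0]
    if not candidates:
        return None  # both intersections lie behind the optical camera

    # Azimuth constraint: keep the candidate whose azimuth in the sonar
    # frame (angle in the local x-z plane, an assumed convention) best
    # matches the measured theta.
    def azimuth_error(t):
        p_s = sonar_rot.T @ (t * d - c)
        return abs(np.arctan2(p_s[0], p_s[2]) - theta)

    return min(candidates, key=azimuth_error) * d
```

In a noise-free simulation, projecting a known point into both sensors and feeding the measurements back through this routine recovers the point exactly; with noisy inputs the range/azimuth residuals of the chosen candidate indicate measurement consistency.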

Bibliographic References
[1] C. Barat and M.J. Rendas, “Exploiting Natural Contours for Automatic Sonar-to-Video Calibration,” Proc. MTS/IEEE Oceans Conf., vol. 1, June 2005.
[2] E.O. Belcher, B. Matsuyama, and G. Trimble, “Object Identification with Acoustic Lenses,” Proc. MTS/IEEE Oceans Conf., Nov. 2001.
[3] E.O. Belcher, D.G. Gallagher, J.R. Barone, and R.E. Honaker, “Acoustic Lens Camera and Underwater Display Combine to Provide Efficient and Effective Hull and Berth Inspections,” Proc. MTS/IEEE Oceans Conf., Sept. 2003.
[4] Panoramic Vision: Sensors, Theory and Applications, R. Benosman and S.B. Kang, eds. Springer, 2001.
[5] M. Brown, D. Burschka, and G. Hager, “Advances in Computational Stereo,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 25, no. 8, Aug. 2003.
[6] U. Castellani, A. Fusiello, V. Murino, L. Papaleo, E. Puppo, and M. Pittore, “A Complete System for Online 3D Modelling from Acoustic Images,” Signal Processing Image Comm., vol. 20, 2005.
[7] Q. Chen and G. Medioni, “A Volumetric Stereo Matching Method: Application to Image-Based Modeling,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, vol. 1, June 1999.
[8] O. Faugeras, Three Dimensional Computer Vision. MIT Press, 1996.
[9] A. Fusiello and V. Murino, “Augmented Scene Modeling and Visualization by Optical and Acoustic Sensor Integration,” IEEE Trans. Visualization and Computer Graphics, vol. 10, no. 5, Nov.-Dec. 2004.
[10] J.A. Gifford, “Mapping Shipwreck Sites by Digital Stereovideogrammetry,” Underwater Archaeology, 1997.
[11] N. Gracias and J. Santos-Victor, “Underwater Mosaicing and Trajectory Reconstruction Using Global Alignment,” Proc. MTS/IEEE Oceans Conf., 2001.
[12] N. Gracias and S. Negahdaripour, “Underwater Mosaic Creation Using Video Sequences from Different Altitudes,” Proc. MTS/IEEE Oceans Conf., 2005.
[13] D. Gueriot, “Bathymetric and Side-Scan Data Fusion for Sea-Bottom 3D Mosaicing,” Proc. MTS/IEEE Oceans Conf., 2000.
[14] R.I. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision. Cambridge Univ. Press, 2000.
[15] B. Kamgar-Parsi, L.J. Rosenblum, and E.O. Belcher, “Underwater Imaging with a Moving Acoustic Lens,” IEEE Trans. Image Processing, vol. 7, no. 1, Jan. 1998.
[16] M. Lhuillier and L. Quan, “Match Propagation for Image-Based Modeling and Rendering,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 8, Aug. 2002.
[17] S. Negahdaripour and P. Firoozfam, “An ROV Stereovision System for Ship Hull Inspection,” IEEE J. Oceanic Eng., vol. 31, no. 3, July 2006.
[18] S. Negahdaripour, “Calibration of DIDSON Forward-Scan Acoustic Video Camera,” Proc. MTS/IEEE Oceans Conf., Aug. 2005.
[19] T. Pajdla, “Epipolar Geometry of Some Non-Classical Cameras,” Proc. Computer Vision Winter Workshop, 2001.
[20] N. Pagoulatos, W.S. Edwards, D.R. Haynor, and Y. Kim, “Interactive 3D Registration of Ultrasound and Magnetic Resonance Images Based on a Magnetic Position Sensor,” IEEE Trans. Information Technology in Biomedicine, vol. 3, no. 4, Dec. 1999.
[21] C.A. Piron, P. Causer, R. Jong, R. Shumak, and D.B. Plewes, “A Hybrid Breast Biopsy System Combining Ultrasound and MRI,” IEEE Trans. Medical Imaging, vol. 20, no. 4, 2003.
[22] P. Rademacher and G. Bishop, “Multiple-Center-of-Projection Images,” Proc. ACM SIGGRAPH '98, July 1998.
[23] Y. Rzhanov, G.R. Cutter, and L. Huff, “Sensor-Assisted Video Mosaicing for Seafloor Mapping,” Proc. Int'l Conf. Image Processing, 2001.
[24] H. Sekkati and S. Negahdaripour, “Direct and Indirect 3D Reconstruction from Opti-Acoustic Stereo Imaging,” Proc. IEEE Int'l Conf. 3D Visualization and Transmission, June 2006.
[25] H. Sekkati and S. Negahdaripour, “Opti-Acoustic Stereo Imaging: On System Calibration and 3D Target Reconstruction,” IEEE Trans. Image Processing, in review.
[26] T. Svoboda, T. Pajdla, and V. Hlaváč, “Epipolar Geometry for Panoramic Cameras,” Proc. Fifth European Conf. Computer Vision, June 1998.
[27] S. Seitz and J. Kim, “The Space of All Stereo Images,” Int'l J. Computer Vision, vol. 48, no. 1, 2002.
[28] D. Scharstein and R. Szeliski, “A Taxonomy and Evaluation of Dense Two-Frame Stereo Correspondence Algorithms,” Int'l J. Computer Vision, vol. 47, no. 1, 2002.
[29] M. Sermesant, C. Forest, X. Pennec, H. Delingette, and N. Ayache, “Biomechanical Model Construction from Different Modalities: Application to Cardiac Images,” Proc. Fifth Int'l Conf. Medical Image Computing and Computer-Assisted Intervention, T. Dohi and R. Kikinis, eds., Sept. 2002.
[30] Y.Y. Schechner and N. Karpel, “Clear Underwater Vision,” Proc. IEEE Int'l Conf. Computer Vision and Pattern Recognition, vol. 1, 2004.
[31] Manual of Photogrammetry, C.C. Slama, ed. Am. Soc. Photogrammetry and Remote Sensing, 1983.
[32] R.Y. Tsai, “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses,” IEEE Trans. Robotics and Automation, vol. 3, no. 4, 1987.
[33] R. Vesetas and G. Manzie, “AMI: A 3D Imaging Sonar for Mine Identification in Turbid Waters,” Proc. MTS/IEEE Oceans Conf., Nov. 2001.
[34] C. von Alt, B. Allen, T. Austin, N. Forrester, L. Freitag, R. Goldsborough, M. Grund, M. Purcell, and R. Stokey, “Semi-Autonomous Mapping System,” Proc. MTS/IEEE Oceans Conf., Sept. 2003.
[35] P.R. Wolf and B.A. Dewitt, Elements of Photogrammetry with Applications in GIS, third ed. McGraw-Hill, 2000.
[36] J. Yu and L. McMillan, “General Linear Cameras,” Proc. Eighth European Conf. Computer Vision, 2004.
[37] , 2007.
[38] http:/, 2007.
[39] http:/, 2007.
[40] HF_Conf_Granger.pdf, 2006.
[41] A. Collignon, “Multi-Modality Medical Image Registration by Maximization of Mutual Information,” PhD thesis, Catholic Univ. of Leuven, Leuven, Belgium, 1998.
[42] Medical Image Registration, J.V. Hajnal, D.L.G. Hill, and D.J. Hawkes, eds. CRC Press, 2001.
[43] P.A. Viola, “Alignment by Maximization of Mutual Information,” PhD thesis, Massachusetts Inst. of Technology, Cambridge, Mass., 1995.

Index Terms:
Stereovision, Epipolar Geometry, Triangulation, Optical and Acoustic Imaging
Shahriar Negahdaripour, "Epipolar Geometry of Opti-Acoustic Stereo Imaging," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 10, pp. 1776-1788, Oct. 2007, doi:10.1109/TPAMI.2007.1092