Issue No. 08 - August (1992 vol. 14)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.149595
<p>The AIMS (automatic interpretation using multiple sensors) system, which uses registered laser radar and thermal imagers, is discussed. Its objective is to detect and recognize man-made objects at kilometer range in outdoor scenes. The multisensor fusion approach is applied to four sensing modalities (range, intensity, velocity, and thermal) to improve both image segmentation and interpretation. Low-level attributes of image segments (regions) are computed by the segmentation modules and then converted to the KEE format. The knowledge-based interpretation modules are constructed using KEE and Lisp. AIMS applies forward chaining in a bottom-up fashion to derive object-level interpretations from databases generated by the low-level processing modules. The efficiency of the interpretation process is enhanced by transferring nonsymbolic processing tasks to a concurrent service manager (program). A parallel implementation of the interpretation module is reported. Experimental results using real data are presented.</p>
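The bottom-up forward-chaining step described above can be illustrated with a minimal sketch. This is not the AIMS/KEE rule base (which is written in KEE and Lisp and is not reproduced in the abstract); the region attributes and rules below are hypothetical placeholders showing how symbolic facts derived from the four modalities could be chained into object-level interpretations:

```python
# Minimal sketch of bottom-up forward chaining, assuming hypothetical
# symbolic attributes ("hot", "moving", etc.) stand in for the low-level
# region properties AIMS derives from range, intensity, velocity, and
# thermal data. The real system's KEE frames and rules are not shown.

from dataclasses import dataclass, field


@dataclass
class Region:
    name: str
    facts: set = field(default_factory=set)  # symbolic attributes of one segment


# Each rule is (preconditions, conclusion): when every precondition holds
# for a region, the conclusion is asserted as a new fact.
RULES = [
    ({"hot", "moving"}, "vehicle-candidate"),
    ({"vehicle-candidate", "compact"}, "vehicle"),
    ({"flat-range", "cold"}, "ground"),
]


def forward_chain(region: Region) -> set:
    """Fire rules repeatedly until no new facts are derived (bottom-up)."""
    changed = True
    while changed:
        changed = False
        for preconditions, conclusion in RULES:
            if preconditions <= region.facts and conclusion not in region.facts:
                region.facts.add(conclusion)
                changed = True
    return region.facts


segment = Region("seg-17", {"hot", "moving", "compact"})
derived = forward_chain(segment)
# Chains "hot"+"moving" -> "vehicle-candidate", then
# "vehicle-candidate"+"compact" -> "vehicle".
```

In AIMS the equivalent symbolic inference runs over KEE databases produced by the segmentation modules, with nonsymbolic work offloaded to the concurrent service manager.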
image interpretation; multiple sensing modalities; AIMS; laser radar; thermal imagers; multisensor fusion; image segmentation; KEE format; knowledge-based interpretation modules; Lisp; forward chaining; concurrent service manager; computer vision; computerised pattern recognition; infrared imaging; knowledge based systems; optical radar; remote sensing by laser beam
J. Aggarwal and C. Chu, "Image Interpretation Using Multiple Sensing Modalities," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 8, pp. 840-847, Aug. 1992.