Abstract—We introduce a new surface representation for recognizing curved objects. Our approach begins by representing an object by a discrete mesh of points built from range data or from a geometric model of the object. The mesh is computed from the data by deforming a standard-shaped mesh, for example, an ellipsoid, until it fits the surface of the object. We define local regularity constraints that the mesh must satisfy. We then define a canonical mapping between the mesh describing the object and a standard spherical mesh. A surface curvature index that is pose-invariant is stored at every node of the mesh. We use this object representation for recognition by comparing the spherical model of a reference object with the model extracted from a new observed scene. We show how the similarity between reference model and observed data can be evaluated, and we show how the pose of the reference object in the observed scene can be easily computed using this representation.

We present results on real range images which show that this approach to modeling and recognizing 3D objects has three main advantages:

- First, it is applicable to complex curved surfaces that cannot be handled by conventional techniques.
- Second, it reduces the recognition problem to the computation of similarity between spherical distributions; in particular, the recognition algorithm does not require any combinatorial search.
- Finally, even though it is based on a spherical mapping, the approach can handle occlusions and partial views.
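The core matching step described above, comparing pose-invariant curvature indices stored at the nodes of a standard spherical mesh, can be illustrated with a minimal sketch. This is not the authors' implementation: the fixed node ordering, the `similarity` function, and the optional `valid` mask (standing in for occluded or unobserved nodes in a partial view) are all assumptions made for illustration.

```python
import numpy as np

def similarity(ref, obs, valid=None):
    """Normalized correlation between two curvature-index distributions.

    ref, obs: curvature index per node of the standard spherical mesh,
    in the same canonical node order. `valid` optionally masks nodes
    that are occluded in the observed scene.
    """
    ref = np.asarray(ref, dtype=float)
    obs = np.asarray(obs, dtype=float)
    if valid is None:
        valid = np.ones(ref.shape, dtype=bool)
    r = ref[valid] - ref[valid].mean()
    o = obs[valid] - obs[valid].mean()
    return float(r @ o / (np.linalg.norm(r) * np.linalg.norm(o)))

# Toy data: a reference model and a noisy observation of the same object.
rng = np.random.default_rng(0)
model = rng.normal(size=100)                 # curvature index at 100 nodes
scene = model + 0.05 * rng.normal(size=100)  # same object, measurement noise
score = similarity(model, scene)             # near 1 for a matching object
```

In the actual method, this score would be maximized over rotations of the spherical mesh, which simultaneously yields the similarity value and the relative pose, with no combinatorial correspondence search.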
Index Terms—Object recognition, deformable surfaces, range data, pose registration, 3D modeling, surface models, free-form surfaces.
Katsushi Ikeuchi, Martial Hebert, Hervé Delingette, "A Spherical Representation for Recognition of Free-Form Surfaces", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 17, pp. 681-690, July 1995, doi:10.1109/34.391410