Issue No. 2 - February 2012 (vol. 34)
ISSN: 0162-8828
pp: 346-358
Yong Jae Lee, Dept. of Electrical & Computer Engineering, University of Texas at Austin, Austin, TX, USA
K. Grauman, Dept. of Computer Science, University of Texas at Austin, Austin, TX, USA
How can knowing about some categories help us to discover new ones in unlabeled images? Unsupervised visual category discovery is useful for mining recurring objects without human supervision, but existing methods assume no prior information and thus tend to perform poorly on cluttered scenes with multiple objects. We propose to leverage knowledge about previously learned categories to enable more accurate discovery, and we address the challenges of estimating their familiarity in unsegmented, unlabeled images. We introduce two variants of a novel object-graph descriptor that encode the 2D and 3D spatial layout of object-level co-occurrence patterns relative to an unfamiliar region, and we show that by using them to model the interaction between an image's known and unknown objects, we can better detect new visual categories. Rather than mining for all categories from scratch, our method identifies new objects while drawing on useful cues from familiar ones. We evaluate our approach on several benchmark data sets and demonstrate clear improvements in discovery over conventional, purely appearance-based baselines.
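To make the descriptor idea concrete, here is a minimal sketch of a 2D object-graph computation. All names and the exact accumulation scheme are assumptions for illustration, not the authors' implementation: each familiar region is assumed to carry a class-posterior vector over known categories, and the descriptor concatenates, for several spatial scales, the accumulated posteriors of the nearest regions above and below the unknown region.

```python
import numpy as np

def object_graph_descriptor(unknown_center, regions, num_nearest=3):
    """Hypothetical sketch of a 2D object-graph descriptor.

    unknown_center : (x, y) center of the unfamiliar region.
    regions        : list of ((x, y), class_posterior) pairs for the
                     surrounding regions, where class_posterior is a
                     distribution over the familiar categories.
    num_nearest    : number of spatial scales (nearest regions used).

    For each scale r = 1..num_nearest, we accumulate the posteriors of
    the r closest regions above and below the unknown region's center,
    encoding object-level co-occurrence by spatial direction.
    """
    ux, uy = unknown_center
    # Split surrounding regions by vertical position relative to the
    # unknown region (a crude stand-in for the paper's above/below cue).
    above = [(c, p) for c, p in regions if c[1] < uy]
    below = [(c, p) for c, p in regions if c[1] >= uy]
    dist2 = lambda item: (item[0][0] - ux) ** 2 + (item[0][1] - uy) ** 2
    above.sort(key=dist2)
    below.sort(key=dist2)

    n_classes = len(regions[0][1])
    parts = []
    for group in (above, below):
        acc = np.zeros(n_classes)
        for r in range(num_nearest):
            if r < len(group):
                acc += np.asarray(group[r][1], dtype=float)
            total = acc.sum()
            # Store a normalized snapshot of the accumulated posteriors
            # at this scale (zeros if no regions lie in this direction).
            parts.append(acc / total if total > 0 else acc.copy())
    return np.concatenate(parts)
```

The resulting vector has length 2 * num_nearest * n_classes, so unknown regions with similar surrounding object layouts map to nearby descriptors and can be clustered together during discovery.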
Index terms: ubiquitous computing, graph theory, object detection, benchmark data sets, object graphs, context-aware visual category discovery, human supervision, cluttered scenes, object-graph descriptor, 3D spatial layout, 2D spatial layout, image segmentation, object recognition, visualization, unsupervised learning, layout, three-dimensional displays, context, category discovery

Yong Jae Lee and K. Grauman, "Object-Graphs for Context-Aware Visual Category Discovery," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 34, no. 2, pp. 346-358, 2012.