ISSN: 0162-8828
Xiaogang Wang , The Chinese University of Hong Kong, Hong Kong
Shi Qiu , The Chinese University of Hong Kong, Hong Kong
Ke Liu , The Chinese University of Hong Kong, Hong Kong
Xiaoou Tang , The Chinese University of Hong Kong, Hong Kong and Shenzhen Institutes of Advanced Technology, CAS
Image re-ranking, as an effective way to improve the results of web-based image search, has been adopted by current commercial search engines such as Bing and Google. Given a query keyword, a pool of images is first retrieved based on textual information. The user then selects a query image from the pool, and the remaining images are re-ranked based on their visual similarities with the query image. A major challenge is that the similarities of visual features do not correlate well with images' semantic meanings. Prior work proposed matching images in a semantic space that uses reference classes closely related to the images' semantic meanings as its basis. However, learning a universal visual semantic space to characterize highly diverse images from the web is difficult and inefficient. We propose a novel image re-ranking framework that automatically learns, offline, a different semantic space for each query keyword. The visual features of images are projected into their related semantic spaces to obtain semantic signatures. At the online stage, images are re-ranked by comparing their semantic signatures obtained from the semantic space specified by the query keyword. The proposed query-specific semantic signatures significantly improve both the accuracy and efficiency of image re-ranking.
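The online stage described above can be sketched as follows: each image's semantic signature is the vector of scores from the reference-class classifiers of the query keyword's semantic space, and candidates are ranked by signature distance to the query image. This is a minimal illustrative sketch, not the paper's implementation; the classifier objects and the L1 distance are assumptions made here for concreteness.

```python
import numpy as np

def semantic_signature(visual_feature, reference_classifiers):
    """Project a visual feature into a query-specific semantic space:
    each coordinate is one reference-class classifier's score.
    (reference_classifiers are hypothetical score functions.)"""
    return np.array([clf(visual_feature) for clf in reference_classifiers])

def rerank(query_signature, candidate_signatures):
    """Order candidate images by similarity of their semantic signatures
    to the query image's signature (L1 distance, an illustrative choice).
    Returns candidate indices, most similar first."""
    dists = [np.abs(query_signature - s).sum() for s in candidate_signatures]
    return np.argsort(dists)
```

Comparing short signatures (one score per reference class) instead of raw high-dimensional visual features is what makes the online stage both faster and more semantically meaningful.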
Semantics, Visualization, Training, Search engines, Accuracy, Indexes, Image/video retrieval, Internet search

X. Tang, K. Liu, X. Wang and S. Qiu, "Web Image Re-ranking Using Query-Specific Semantic Signatures," in IEEE Transactions on Pattern Analysis & Machine Intelligence.