Query-Adaptive Hash Code Ranking for Large-Scale Multi-View Visual Search

Publication Type:
Journal Article
IEEE Transactions on Image Processing, 2016, 25 (10), pp. 4514 - 4524
© 1992-2012 IEEE. Hash-based nearest neighbor search has become attractive in many applications. However, the quantization step in hashing usually degrades discriminative power when ranking by Hamming distance. Moreover, for large-scale visual search, existing hashing methods cannot directly support efficient search over data with multiple sources, even though the literature has shown that adaptively incorporating complementary information from diverse sources or views can significantly boost search performance. To address these problems, this paper proposes a novel and generic approach that builds multiple hash tables over multiple views and generates fine-grained ranking results at both the bitwise and tablewise levels. For each hash table, a query-adaptive bitwise weighting is introduced to alleviate the quantization loss by simultaneously exploiting the quality of the hash functions and their complementarity for nearest neighbor search. At the tablewise level, multiple hash tables built from different data views form a joint index, over which a query-specific rank fusion is proposed to rerank all results from the bitwise ranking by diffusion over a graph. Comprehensive experiments on image search over three well-known benchmarks show that the proposed method achieves up to 17.11% and 20.28% performance gains for single- and multiple-table search over state-of-the-art methods.
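The bitwise weighting idea described in the abstract can be illustrated with a weighted Hamming distance: each bit disagreement between the query code and a database code contributes a per-bit weight rather than a uniform count. The sketch below is a minimal, hypothetical illustration of that ranking step, not the paper's method; in the paper the weights are derived per query from hash-function quality, whereas here `weights` is simply supplied by the caller.

```python
import numpy as np

def weighted_hamming_rank(query, codes, weights):
    """Rank database hash codes by a query-adaptive weighted Hamming distance.

    query:   (B,) binary hash code of the query
    codes:   (N, B) binary hash codes of the database items
    weights: (B,) per-bit weights, assumed to be computed per query
             (e.g. from hash-function quality); uniform weights reduce
             this to ordinary Hamming-distance ranking.
    Returns the indices of the database items sorted by increasing
    weighted distance, along with the distances themselves.
    """
    disagreements = codes != query          # (N, B) bitwise XOR as booleans
    dist = disagreements @ weights          # weighted sum of disagreeing bits
    return np.argsort(dist), dist
```

With uniform weights the second and third codes in the usage below would tie at Hamming distance 1; a query-adaptive weight vector breaks the tie by trusting some bits more than others.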