Fast Supervised Discrete Hashing

Publication Type:
Journal Article
Citation:
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40 (2), pp. 490 - 496
Issue Date:
2018-02-01
© 2017 IEEE. Learning-based hashing algorithms are 'hot topics' because they can greatly increase the scale at which existing methods operate. In this paper, we propose a new learning-based hashing method called 'fast supervised discrete hashing' (FSDH), based on 'supervised discrete hashing' (SDH). Ordinary least squares regression is typically used to regress the training examples (or hash codes) to the corresponding class labels. Rather than adopting this approach, FSDH uses a very simple yet effective regression of the class labels of training examples to the corresponding hash codes, which accelerates the algorithm. To the best of our knowledge, this strategy has not previously been used for hashing. Traditional SDH decomposes the optimization into three sub-problems, with the most critical sub-problem - discrete optimization for binary hash codes - solved using iterative discrete cyclic coordinate descent (DCC), which is time-consuming. In contrast, FSDH has a closed-form solution and requires only a single rather than iterative hash-code-solving step, which is highly efficient. Furthermore, FSDH is usually faster than SDH at solving the projection matrix for least squares regression, making FSDH generally faster than SDH overall. For example, our results show that FSDH is about 12 times faster than SDH when the number of hashing bits is 128 on the CIFAR-10 dataset, and about 151 times faster than FastHash when the number of hashing bits is 64 on the MNIST dataset. Our experimental results show that FSDH is not only fast, but also outperforms competing methods.
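To illustrate the efficiency claim, the following is a minimal sketch of the single closed-form hash-code update that the abstract contrasts with SDH's iterative DCC loop. It assumes the common SDH/FSDH setup of a one-hot label matrix Y, nonlinear (e.g. kernelized) features Phi(X), two projection matrices (here named G and P), and a trade-off weight nu; all variable names and the exact scaling are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def fsdh_hash_code_step(Y, PhiX, G, P, nu):
    """Sketch of a single closed-form hash-code update in the FSDH style.

    Y    : (n, c) one-hot class-label matrix
    PhiX : (n, m) nonlinear/kernelized features of the training examples
    G    : (c, l) projection regressing class labels to hash codes (assumed name)
    P    : (m, l) projection from features to hash codes (assumed name)
    nu   : weight on the feature-regression term (assumed symbol)

    Returns B in {-1, +1}^(n, l). Unlike SDH's bit-wise DCC iterations,
    the whole code matrix is obtained with one matrix product and one sign.
    """
    B = np.sign(Y @ G + nu * (PhiX @ P))
    B[B == 0] = 1  # break ties toward +1 so codes stay strictly binary
    return B

# Toy usage with random data, just to show the shapes involved.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, c, m, l = 100, 10, 50, 64          # samples, classes, features, bits
    Y = np.eye(c)[rng.integers(0, c, n)]  # one-hot labels
    PhiX = rng.standard_normal((n, m))
    G = rng.standard_normal((c, l))
    P = rng.standard_normal((m, l))
    B = fsdh_hash_code_step(Y, PhiX, G, P, nu=1e-5)
    print(B.shape, np.unique(B))          # (100, 64) [-1.  1.]
```

Because the update is a single sign of a linear combination, its cost is one pass over the data rather than an inner loop per bit, which is consistent with the speed-ups reported above.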