Scalable clip-based near-duplicate video detection with ordinal measure

Publication Type:
Conference Proceeding
Citation:
CIVR 2010 - 2010 ACM International Conference on Image and Video Retrieval, 2010, pp. 121-128
Issue Date:
2010-08-27
Detection of duplicate or near-duplicate videos in large-scale databases plays an important role in video search. In this paper, we analyze the problem of near-duplicate detection and propose a practical and effective solution for real-time large-scale video retrieval. Unlike many existing approaches that rely on video frames or key-frames, our solution is based on a more discriminative signature of video clips. The feature used in this paper is an extension of ordinal measures, which have proven to be robust to changes in brightness, compression formats and compression ratios. For efficient retrieval, we propose to use Multi-Probe Locality Sensitive Hashing (MPLSH) to index the video clips for fast similarity search and high recall. MPLSH is able to filter out a large number of dissimilar clips from the video database. To refine the search, we apply a slightly more expensive clip-based signature matching between each pair of candidate videos. Experimental results on a data set of 12,790 videos [26] show that the proposed approach achieves at least a 6.5% improvement in average precision over the baseline color-histogram approach while satisfying real-time requirements. Furthermore, our approach is able to locate the frame offset of the copied segment in near-duplicate videos. Copyright © 2010 ACM.
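The ordinal measure at the core of the clip signature is simple to compute. The Python sketch below is a minimal illustration, not the paper's implementation: it assumes grayscale frames given as NumPy arrays, and the 3x3 block grid, function names and normalized L1 distance are illustrative assumptions. Each frame is partitioned into blocks, the block mean intensities are ranked, and the ranks form the signature; because only the rank order is kept, the feature is insensitive to monotonic brightness changes, which is the robustness property the abstract describes.

import numpy as np

def ordinal_signature(frame, grid=(3, 3)):
    """Ordinal-measure signature of one grayscale frame.

    The frame is partitioned into grid[0] x grid[1] blocks, the mean
    intensity of each block is computed, and the signature is the rank
    of each block mean (0 = darkest block).
    """
    h, w = frame.shape
    rows, cols = grid
    means = np.empty(rows * cols)
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            means[r * cols + c] = block.mean()
    # Double argsort turns the block means into their ranks.
    return np.argsort(np.argsort(means))

def clip_signature(frames, grid=(3, 3)):
    """Concatenate per-frame ordinal signatures into one clip-level signature."""
    return np.concatenate([ordinal_signature(f, grid) for f in frames])

def signature_distance(sig_a, sig_b):
    """Normalized L1 distance between two equal-length clip signatures."""
    return np.abs(sig_a - sig_b).sum() / sig_a.size

In the system outlined above, clip-level signatures of this kind would first be indexed with MPLSH so that most dissimilar clips are filtered out cheaply, and only the surviving candidates would undergo the more expensive pairwise signature matching.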