Interactive surveillance event detection through mid-level discriminative representation

Publisher:
ACM
Publication Type:
Conference Proceeding
Citation:
ICMR 2014 - Proceedings of the ACM International Conference on Multimedia Retrieval 2014, 2014, pp. 305 - 312
Issue Date:
2014-01-01
Abstract:
Event detection in real surveillance videos with complicated background environments has always been a very hard task. Unlike traditional retrospective and interactive systems designed for this task, which mainly operate on video fragments located within the event-occurrence time, in this paper we propose a new interactive system built on mid-level discriminative representations (patches/shots) that are closely related to the event (and may occur beyond the event-occurrence period) and are easier to detect than video fragments. By virtue of such easily distinguished mid-level patterns, our framework realizes an effective division of labor between computers and human participants. The computer's task is to train classifiers on a set of mid-level discriminative representations and to sort all candidate mid-level representations in the evaluation set by classifier score. The human participant's task is then to readily search for events using the clues offered by these sorted mid-level representations. For the computer, such mid-level representations, with their more concise and consistent patterns, can be detected more accurately than the video fragments used in the conventional framework; for the human participant, events of interest are much easier to find from these location-anchored mid-level representations than from conventional video fragments containing entire scenes. Both properties make our framework practical for real surveillance event detection applications. Copyright is held by the owner/author(s).
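The division of labor described in the abstract (computer trains a classifier on mid-level representations, then ranks all candidates by score for human review) can be sketched as follows. This is a minimal illustration with synthetic feature vectors and a hand-rolled logistic-regression classifier, not the paper's actual implementation; all names, dimensions, and data here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for mid-level patch/shot features: event-related
# patches cluster around one mean, background patches around another.
n, d = 200, 16
X_pos = rng.normal(1.0, 1.0, (n, d))   # patches related to the event
X_neg = rng.normal(-1.0, 1.0, (n, d))  # background patches
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Computer's task (part 1): train a simple linear classifier on the
# labelled mid-level representations (logistic regression via
# full-batch gradient descent).
w, b, lr = np.zeros(d), 0.0, 0.1
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= lr * (X.T @ (p - y)) / len(y)       # gradient step on weights
    b -= lr * np.mean(p - y)                 # gradient step on bias

# Computer's task (part 2): score every candidate mid-level
# representation in the (unlabelled) evaluation set and sort them.
X_eval = rng.normal(0.0, 1.5, (50, d))
scores = X_eval @ w + b
ranking = np.argsort(-scores)  # indices of candidates, highest score first

# Human participant's task: inspect the top-ranked candidates, which
# serve as location-anchored clues to where the event occurs.
top_k = ranking[:10]
```

The key point the sketch conveys is that the human never scans raw video fragments; they only examine the small number of top-ranked mid-level candidates, which is what makes the interactive loop tractable.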