Compact multiple-instance learning

Publication Type:
Conference Proceeding
Proceedings of the International Conference on Information and Knowledge Management (CIKM), 2017, pp. 2007–2010
Issue Date:
2017
© 2017 Association for Computing Machinery. The weakly supervised Multiple-Instance Learning (MIL) problem has been applied successfully in information retrieval tasks. Two related issues affect the performance of MIL algorithms: how to cope with label ambiguity, and how to deal with non-discriminative components. We propose COmpact MultiPle-Instance LEarning (COMPILE) to address both simultaneously. To resolve label ambiguity, COMPILE seeks the ground-truth positive instances within positive bags. By using the weakly supervised information to learn short binary representations of the data, COMPILE enhances discrimination, strengthening discriminative components while suppressing non-discriminative ones. We adapt block coordinate descent to optimize COMPILE efficiently. Experiments on text categorization show empirically that: 1) COMPILE successfully unifies disambiguation and data preprocessing; and 2) it efficiently generates short binary representations that enhance discrimination at significantly reduced storage cost.
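The two ingredients the abstract names — seeking the ground-truth positive instances inside positive bags, and optimizing by block coordinate descent — can be illustrated with a toy sketch. This is not COMPILE itself (the paper's objective and binary-code learning are not reproduced here); it only shows the common MIL pattern of alternating between (a) picking the highest-scoring "key" instance per positive bag as its presumed true positive and (b) refitting a scorer with those picks fixed. All data and the linear scorer are synthetic assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MIL data: each bag is a (4, 2) array of instance feature vectors.
# In each positive bag, one planted instance (offset by +5) carries the
# positive signal; negative bags contain only background noise.
# (Synthetic data, not the paper's text-categorization benchmarks.)
pos_bags = [np.vstack([rng.normal(0, 1, 2) + 5.0,
                       rng.normal(0, 1, (3, 2))]) for _ in range(10)]
neg_bags = [rng.normal(0, 1, (4, 2)) for _ in range(10)]

w = np.ones(2)  # hypothetical linear scorer; one coordinate block

for _ in range(10):
    # Block 1 (disambiguation): with w fixed, select the highest-scoring
    # instance in each positive bag as its presumed ground-truth positive.
    keys = np.array([bag[np.argmax(bag @ w)] for bag in pos_bags])

    # Block 2 (model update): with the key instances fixed, refit w by
    # least squares on key instances (target +1) and all instances from
    # negative bags (target -1).
    X = np.vstack([keys, np.vstack(neg_bags)])
    y = np.concatenate([np.ones(len(keys)), -np.ones(4 * len(neg_bags))])
    w, *_ = np.linalg.lstsq(X, y, rcond=None)

print("selected key-instance mean:", keys.mean(axis=0))
print("score of a prototypical positive:", w @ np.array([5.0, 5.0]))
```

Each block has a closed-form or trivial update while the other is held fixed, which is what makes block coordinate descent attractive here; COMPILE's actual updates operate on its binary-representation objective rather than this least-squares surrogate.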