Learning sparse confidence-weighted classifier on very high dimensional data

Publication Type: Conference Proceeding
Citation: 30th AAAI Conference on Artificial Intelligence, AAAI 2016, 2016, pp. 2080-2086
Issue Date: 2016-01-01
© Copyright 2016, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
Abstract:
Confidence-weighted (CW) learning is a successful online learning paradigm that maintains a Gaussian distribution over classifier weights and uses a covariance matrix to represent the uncertainty of the weight vector. However, existing full CW learning paradigms have two deficiencies: sensitivity to irrelevant features, and poor scalability to high-dimensional data due to the cost of maintaining the full covariance structure. In this paper, we begin by presenting an online-batch CW learning scheme, and then introduce a novel paradigm for learning sparse CW classifiers. The proposed paradigm identifies feature groups and naturally builds a block-diagonal covariance structure, making it well suited to CW learning over very high-dimensional data. Extensive experimental results demonstrate the superior performance of the proposed methods over state-of-the-art counterparts on classification and feature selection tasks.
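For readers unfamiliar with the confidence-weighted family, the sketch below shows an AROW-style online update in which a Gaussian N(mu, Sigma) over the weights is updated one example at a time. For brevity it restricts Sigma to a diagonal, which is a common simplification and not the block-diagonal, feature-group structure proposed in the paper; the function name and parameter r are likewise illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def arow_diagonal_update(mu, sigma, x, y, r=1.0):
    """One confidence-weighted (AROW-style) update for example (x, y), y in {-1, +1}.

    mu    : mean of the Gaussian over weights, shape (d,)
    sigma : per-feature variances (diagonal covariance), shape (d,)
    r     : regularization parameter controlling how quickly confidence grows
    """
    # Prediction uncertainty along x: x^T Sigma x (diagonal case).
    confidence = np.dot(sigma * x, x)
    margin = y * np.dot(mu, x)

    beta = 1.0 / (confidence + r)
    alpha = max(0.0, 1.0 - margin) * beta  # hinge-loss-driven step size

    # Mean update: correct the mistake, scaled per feature by its variance.
    mu = mu + alpha * y * sigma * x
    # Covariance update: shrink uncertainty on the features active in x.
    sigma = sigma - beta * (sigma * x) ** 2
    return mu, sigma

# Toy usage on random linearly separable data.
rng = np.random.default_rng(0)
d = 20
w_true = rng.normal(size=d)
mu, sigma = np.zeros(d), np.ones(d)
for _ in range(200):
    x = rng.normal(size=d)
    y = 1.0 if x @ w_true > 0 else -1.0
    mu, sigma = arow_diagonal_update(mu, sigma, x, y)
```

The diagonal restriction keeps each update O(d); the paper's contribution is to recover richer covariance information by learning a block-diagonal Sigma over identified feature groups, which stays tractable for very high-dimensional data.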