Co-regularized ensemble for feature selection

Publication Type:
Conference Proceeding
Citation:
IJCAI International Joint Conference on Artificial Intelligence, 2013, pp. 1380–1386
Issue Date:
2013-12-01
Abstract:
Supervised feature selection determines feature relevance by evaluating each feature's correlation with the classes. Jointly minimizing a classifier's loss function and an ℓ2,1-norm regularization has been shown to be effective for feature selection. However, the feature subsets learned from different classifiers' loss functions may differ, and little effort has been made to improve feature selection by ensembling different classifiers' criteria and exploiting their complementary strengths. Furthermore, when only a few labeled data per class are available, overfitting becomes a potential problem and the performance of each individual classifier is restrained. In this paper, we add a joint ℓ2,1-norm on multiple feature selection matrices to combine different classifiers' loss functions into a single joint optimization framework. This added co-regularization term plays a twofold role: it enhances the effect of regularization for each criterion and uncovers common irrelevant features. Overfitting is thereby alleviated, and the performance of feature selection is improved. Extensive experiments on different data types demonstrate the effectiveness of our algorithm.
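As a rough sketch (in notation assumed here, not taken from the record: X is the data matrix, Y the labels, W_c the feature selection matrix and ℓ_c the loss of the c-th of C classifiers, λ and γ regularization parameters), the joint objective described above plausibly takes a form such as

\[
\min_{W_1,\dots,W_C}\ \sum_{c=1}^{C}\Big(\ell_c\big(X^{\top}W_c,\,Y\big) + \lambda\,\lVert W_c\rVert_{2,1}\Big) \;+\; \gamma\,\big\lVert [\,W_1,\dots,W_C\,]\big\rVert_{2,1},
\]

where the last term is the co-regularizing joint ℓ2,1-norm on the row-wise concatenation of the feature selection matrices; its row sparsity pushes the same feature rows toward zero across all classifiers, which is what uncovers common irrelevant features. The exact formulation used in the paper may differ in its details.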