Learning sparse SVM for feature selection on very high dimensional datasets
- Publication Type:
- Conference Proceeding
- Citation:
- ICML 2010 - Proceedings, 27th International Conference on Machine Learning, 2010, pp. 1047 - 1054
- Issue Date:
- 2010-09-17
Closed Access
| Filename | Description | Size |
|---|---|---|
| 2013004291OK.pdf | | 450.24 kB |
This item is closed access and not available.
A sparse representation of Support Vector Machines (SVMs) with respect to input features is desirable for many applications. In this paper, by introducing a 0-1 control variable for each input feature, the zero-norm Sparse SVM (SSVM) is converted to a mixed integer programming (MIP) problem. Rather than directly solving this MIP, we propose an efficient cutting plane algorithm combined with multiple kernel learning to solve its convex relaxation. A global convergence proof for our method is also presented. Comprehensive experimental results on one synthetic and 10 real-world datasets show that our proposed method obtains better or competitive performance compared with existing SVM-based feature selection methods in terms of sparsity and generalization performance. Moreover, our proposed method can effectively handle large-scale and extremely high-dimensional problems. Copyright 2010 by the author(s)/owner(s).
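The core idea in the abstract can be sketched in miniature: a 0-1 control variable per feature scales that feature's column, so a mostly-zero control vector yields a sparse SVM. The toy below is only a simplified illustration of that formulation, assuming synthetic data and a plain subgradient linear SVM; it crudely selects the control variables by thresholding learned weights rather than running the paper's cutting plane / multiple kernel learning relaxation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: only the first 2 of 20 features are informative.
n, p = 200, 20
X = rng.normal(size=(n, p))
y = np.sign(X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=n))

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    """Plain subgradient descent on the regularized hinge-loss objective."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        margins = y * (X @ w)
        viol = margins < 1                       # margin-violating examples
        grad = lam * w - (y[viol] @ X[viol]) / len(y)
        w -= lr * grad
    return w

# 0-1 control variables d scale the features: X * d zeroes out the
# columns with d_j = 0. Here d is set by keeping the top-2 weights,
# a crude stand-in for solving the relaxed problem over d in [0, 1]^p.
w = train_linear_svm(X, y)
d = (np.abs(w) >= np.sort(np.abs(w))[-2]).astype(float)
w_sparse = train_linear_svm(X * d, y)

selected = np.flatnonzero(d)
print("selected features:", selected)
```

With the fixed seed above, the two informative features dominate the learned weights, so the binary mask keeps exactly two columns and the retrained SVM uses only those.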