Dimensionality reduction by soft-margin support vector machine
- Publication Type: Conference Proceeding
- Citation: ICA 2017 - 2017 IEEE International Conference on Agents, 2017, pp. 154-156
- Issue Date: 2017-08-23
Closed Access
Filename | Description | Size
---|---|---
08015324.pdf | Published version | 128.55 kB
© 2017 IEEE. Dimensionality reduction is one of the key issues in machine learning and data mining, especially for high-dimensional data sets. The literature offers various dimensionality reduction methods, such as PCA, LDA, and KLDA, which differ mainly in their optimization objectives. In this paper, we propose a new dimensionality reduction method whose optimization objective is to maximize the margin between different classes after projecting the original features into a specific lower-dimensional subspace. This subspace is constructed with the help of soft-margin support vector machines. Our experiments on several real-world datasets show that the method improves classification performance: it not only reduces redundant information in the features but is also robust to noise.
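The paper's exact construction is not reproduced here, but the general idea of using soft-margin SVMs to build a projection subspace can be sketched as follows. This hypothetical example trains one-vs-rest linear soft-margin SVMs and uses their weight vectors as projection directions; the dataset, classifier choices, and `C` value are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Illustrative data: iris has 4 features and 3 classes.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Soft-margin linear SVM; C trades margin width against violations.
svm = LinearSVC(C=1.0, max_iter=10000).fit(X_train, y_train)

# svm.coef_ has shape (n_classes, n_features); each row is a
# max-margin direction, so projecting onto the rows maps the
# 4-D features into a 3-D subspace.
Z_train = X_train @ svm.coef_.T
Z_test = X_test @ svm.coef_.T
print(Z_train.shape)  # (112, 3)

# Classify in the reduced space.
clf = LogisticRegression(max_iter=1000).fit(Z_train, y_train)
print("accuracy in reduced space:", clf.score(Z_test, y_test))
```

With more classes than the target dimensionality one would keep only a subset of directions (or orthogonalize them); this sketch simply uses one direction per class.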