Adaptive pruning algorithm for least squares support vector machine classifier

Springer Berlin / Heidelberg
Publication Type: Journal Article
Soft Computing - A Fusion of Foundations, Methodologies and Applications, 2010, 14 (7), pp. 667 - 680
As a variant of the support vector machine (SVM), the least squares SVM (LS-SVM) uses equality instead of inequality constraints and works with a least squares cost function. A well-known drawback of LS-SVM is that sparseness is lost: every training point contributes to the solution. In this paper, we develop an adaptive pruning algorithm based on a bottom-to-top strategy that addresses this drawback. The proposed algorithm alternates incremental and decremental learning procedures to adaptively form a small support vector set that covers most of the information in the training set; the final classifier is constructed from this set. In general, the support vector set contains far fewer elements than the training set, so a sparse solution is obtained. To test the efficiency of the proposed algorithm, we apply it to eight UCI datasets and one benchmark dataset. The experimental results show that the algorithm adaptively obtains sparse solutions with only a small loss of generalization performance on classification problems both with and without noise, and that its training is much faster than the sequential minimal optimization (SMO) algorithm on large-scale noise-free classification problems.
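To make the starting point of the abstract concrete, the following is a minimal sketch of the standard LS-SVM classifier that the paper's pruning algorithm builds on. It solves the well-known LS-SVM KKT linear system (this is the standard formulation, not the paper's pruning method); the RBF kernel, `gamma`, and `sigma` values are illustrative choices, not parameters taken from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X1 and rows of X2.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Train an LS-SVM classifier by solving its KKT linear system:

        [ 0        y^T          ] [ b ]   [ 0 ]
        [ y   Omega + I / gamma ] [ a ] = [ 1 ]

    where Omega_ij = y_i * y_j * K(x_i, x_j).  Because the constraints are
    equalities, essentially all alphas are nonzero -- this is the loss of
    sparseness the abstract refers to, and what pruning aims to fix.
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, bias b

def lssvm_predict(X_train, y_train, alpha, b, X_new, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x_i, x) + b )
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

A pruning scheme of the kind the paper proposes would then retain only a small subset of these training points as support vectors, rather than all of them.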