Improved 2-norm Based Fuzzy Least Squares Twin Support Vector Machine
- Publication Type: Conference Proceeding
- Proceedings of the 2018 IEEE Symposium Series on Computational Intelligence, SSCI 2018, 2019, pp. 412-419
© 2018 IEEE. In order to reduce the high training cost of the support vector machine (SVM) and its sensitivity to noise and outliers, two fuzzy-based approaches are proposed in this paper. The proposed approaches are based on the least squares twin support vector machine (LSTWSVM) and the fuzzy support vector machine (FSVM). The effects of noise and outliers are reduced by assigning lower membership values to data points that lie far from their class centers. Further, the 2-norm of the slack vectors in the LSTWSVM formulation is taken after multiplying them by their respective diagonal matrices of membership values, which effectively utilizes the fuzzy membership principle and makes the optimization problem strongly convex. Moreover, the proposed approaches solve systems of linear equations instead of quadratic programming problems, which results in faster training. The effectiveness of the proposed approaches is established by comparing their classification accuracies and training times with those of the support vector machine, fuzzy support vector machine, twin support vector machine and least squares twin support vector machine.
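The two ingredients described in the abstract, distance-based fuzzy memberships and a least-squares (linear-system) solution for one twin hyperplane, can be sketched as follows. This is a minimal illustration assuming a standard LSTSVM-style objective in which the class-2 slack vector is weighted by a diagonal membership matrix before taking the 2-norm; the function names, the membership formula `1 - d/(d_max + delta)`, and the small regularizer `eps` are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def fuzzy_memberships(X, delta=1e-4):
    # Assign lower membership to points far from the class center,
    # so likely noise/outliers contribute less to the objective.
    center = X.mean(axis=0)
    d = np.linalg.norm(X - center, axis=1)
    return 1.0 - d / (d.max() + delta)

def fuzzy_lstsvm_plane(A, B, c=1.0, eps=1e-8):
    """Solve one fuzzy 2-norm LSTSVM-style hyperplane (w, b) that fits
    class A while pushing class B to unit distance, with class-B slacks
    weighted by the diagonal membership matrix S (illustrative sketch)."""
    s = fuzzy_memberships(B)
    S2 = np.diag(s ** 2)                            # S^T S for the weighted 2-norm
    H = np.hstack([A, np.ones((A.shape[0], 1))])    # augmented [A  e1]
    G = np.hstack([B, np.ones((B.shape[0], 1))])    # augmented [B  e2]
    e2 = np.ones(B.shape[0])
    # The least-squares objective leads to a linear system, not a QP:
    #   (H^T H + c G^T S^2 G) z = -c G^T S^2 e2,  z = [w; b]
    M = H.T @ H + c * G.T @ S2 @ G + eps * np.eye(H.shape[1])
    z = np.linalg.solve(M, -c * G.T @ S2 @ e2)
    return z[:-1], z[-1]

rng = np.random.default_rng(0)
A = rng.normal(loc=-1.0, size=(40, 2))
B = rng.normal(loc=+1.0, size=(40, 2))
w, b = fuzzy_lstsvm_plane(A, B)
# Class-A points should lie closer to this hyperplane than class-B points.
print(np.abs(A @ w + b).mean() < np.abs(B @ w + b).mean())
```

In a full twin-SVM classifier the same system is solved a second time with the roles of the two classes swapped, and a test point is assigned to the class whose hyperplane it is nearer to.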