Sublinear dual coordinate ascent for regularized loss minimization

Publication Type:
Conference Proceeding
Citation:
Proceedings - IEEE International Conference on Data Mining, ICDM, 2017, pp. 1065-1070
Issue Date:
2017-01-31
Abstract:
© 2016 IEEE. We present a sublinear version of the dual coordinate ascent method for solving a class of regularized loss minimization problems in machine learning. The proposed method seamlessly integrates sampling techniques, the dual coordinate ascent method, and a multiplicative update algorithm. The sampling techniques choose the "expected" examples and estimate the corresponding inner products. The dual coordinate ascent method generates the iterate update, which improves on the learning step used in the previous sublinear perceptron algorithm. The multiplicative update algorithm maintains the weighting over the examples. The proposed method runs with a per-iteration cost of order O(log(n)), where n is the number of examples, and with high probability achieves better results than other methods. We present a theoretical analysis of the sublinear iterations to justify these benefits. We then apply the proposed optimization method to support vector machines and conduct experiments on three large-scale datasets. The experimental results validate our theoretical findings.
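The abstract gives no pseudocode, so the following is only a minimal Python sketch of how the three named ingredients might fit together for an L2-regularized hinge-loss (SVM) objective: multiplicative-weights sampling over examples, sampled estimation of inner products, and an SDCA-style dual coordinate ascent update. The function name sublinear_dca_sketch and the parameters lam, T, and eta are illustrative assumptions, not the paper's notation, and for clarity the dense update below still costs O(d) per iteration, which the paper's method avoids.

import numpy as np

def sublinear_dca_sketch(X, y, lam=0.1, T=1000, eta=0.1, seed=0):
    # Hypothetical sketch, not the authors' exact algorithm.
    # X: (n, d) data matrix; y: labels in {-1, +1}.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)        # dual variables, one per example
    w = np.zeros(d)            # primal iterate, w = (1/(lam*n)) * sum_i alpha_i y_i x_i
    p = np.ones(n) / n         # multiplicative-weights distribution over examples

    for _ in range(T):
        # (1) sample an "expected" (currently heavily weighted) example
        i = rng.choice(n, p=p)

        # (2) estimate <w, x_i> from O(log d) sampled coordinates (unbiased)
        k = max(1, int(np.ceil(np.log2(d))))
        idx = rng.choice(d, size=k)
        est = (d / k) * float(w[idx] @ X[i, idx])

        # (3) SDCA-style dual coordinate ascent step, with alpha_i kept in [0, 1]
        grad = 1.0 - y[i] * est
        step = grad * lam * n / (float(X[i] @ X[i]) + 1e-12)
        new_alpha = float(np.clip(alpha[i] + step, 0.0, 1.0))
        w += (new_alpha - alpha[i]) * y[i] * X[i] / (lam * n)
        alpha[i] = new_alpha

        # multiplicative update of the example weighting: upweight examples
        # whose estimated margin is still violated (clipped for stability)
        p[i] *= np.exp(eta * min(1.0, max(0.0, grad)))
        p /= p.sum()
    return w

# Example usage (synthetic data; labels must be +/-1):
#   rng0 = np.random.default_rng(0)
#   X = rng0.standard_normal((500, 40))
#   y = np.where(X[:, 0] > 0, 1.0, -1.0)
#   w = sublinear_dca_sketch(X, y)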