Constrained stochastic gradient descent for large-scale least squares problem

Publication Type:
Conference Proceeding
Citation:
Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD 2013), pp. 883-891
Issue Date:
2013-08-11
Copyright © 2013 ACM. The least squares problem is one of the most important regression problems in statistics, machine learning and data mining. In this paper, we present the Constrained Stochastic Gradient Descent (CSGD) algorithm to solve the large-scale least squares problem. CSGD improves Stochastic Gradient Descent (SGD) by imposing a provable constraint: the fitted linear regression line passes through the mean point of all the data points. This yields the optimal regret bound O(log T) and the fastest convergence rate among all first-order approaches. Empirical studies justify the effectiveness of CSGD by comparing it with SGD and other state-of-the-art approaches. An example is also given to show how to use CSGD to optimize SGD-based least squares problems and achieve better performance.
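The sketch below illustrates the general idea described in the abstract, not the paper's actual CSGD algorithm or its projection step: the mean-point constraint (the fitted line must pass through the mean of the data) is enforced here simply by centering the features and targets, running plain SGD on the centered data, and recovering the intercept from the constraint afterwards. All function and variable names are hypothetical.

    import numpy as np

    def mean_constrained_sgd_least_squares(X, y, lr=0.05, epochs=20, seed=0):
        # Enforce the mean-point constraint by centering: after centering,
        # any weight vector w automatically satisfies y_mean = w . x_mean + b
        # once the intercept b is recovered below. This is an illustrative
        # simplification, not the CSGD update from the paper.
        rng = np.random.default_rng(seed)
        x_mean, y_mean = X.mean(axis=0), y.mean()
        Xc, yc = X - x_mean, y - y_mean
        w = np.zeros(X.shape[1])
        n = len(y)
        for _ in range(epochs):
            for i in rng.permutation(n):
                # stochastic gradient of 0.5 * (x_i . w - y_i)^2 on centered data
                grad = (Xc[i] @ w - yc[i]) * Xc[i]
                w -= lr * grad
        b = y_mean - x_mean @ w  # intercept implied by the mean-point constraint
        return w, b

    # Hypothetical usage on synthetic data
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.5, -2.0, 0.5]) + 3.0 + 0.1 * rng.normal(size=200)
    w, b = mean_constrained_sgd_least_squares(X, y)
    print(w, b)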