Constrained stochastic gradient descent for large-scale least squares problem

Publisher:
ACM
Publication Type:
Conference Proceeding
Citation:
ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2013, pp. 883–891
Issue Date:
2013-01
The least squares problem is one of the most important regression problems in statistics, machine learning and data mining. In this paper, we present the Constrained Stochastic Gradient Descent (CSGD) algorithm to solve the large-scale least squares problem. CSGD improves Stochastic Gradient Descent (SGD) by imposing a provable constraint that the linear regression line passes through the mean point of all the data points. This yields the best regret bound, O(log T), and the fastest convergence speed among all first-order approaches. Empirical studies justify the effectiveness of CSGD by comparing it with SGD and other state-of-the-art approaches. An example is also given to show how CSGD can be used to optimize SGD-based least squares problems for better performance.
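To make the constraint concrete, the following is a minimal sketch (not the authors' algorithm; the step-size schedule and function names such as csgd_fit are illustrative assumptions): it runs ordinary SGD on the squared loss and, after every update, projects the parameters onto the hyperplane that forces the fitted regression line through the mean point of the data.

    # Hypothetical sketch: SGD for least squares with a projection that keeps
    # the fitted line through the data mean (the constraint described above).
    import numpy as np

    def csgd_fit(X, y, epochs=20, eta0=0.1, seed=0):
        """Fit y ~ X @ w + b by SGD, projecting z = (w, b) after each update
        onto the hyperplane  w @ x_mean + b = y_mean."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        x_mean, y_mean = X.mean(axis=0), y.mean()
        a = np.append(x_mean, 1.0)      # constraint normal for z = (w, b)
        a_norm2 = a @ a
        z = np.zeros(d + 1)             # z = (w, b)

        t = 0
        for _ in range(epochs):
            for i in rng.permutation(n):
                t += 1
                eta = eta0 / (1.0 + 0.01 * t)     # assumed decaying step size
                xi = np.append(X[i], 1.0)
                grad = (z @ xi - y[i]) * xi       # gradient of 0.5*(w@x + b - y)^2
                z -= eta * grad
                # project onto {z : a @ z = y_mean}
                z -= ((a @ z - y_mean) / a_norm2) * a
        return z[:-1], z[-1]                      # w, b

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 3))
        y = X @ np.array([2.0, -1.0, 0.5]) + 3.0 + 0.1 * rng.normal(size=500)
        w, b = csgd_fit(X, y)
        print("w =", np.round(w, 2), "b =", round(b, 2))
        print("constraint residual:", abs(w @ X.mean(axis=0) + b - y.mean()))

In this sketch the projection is a single affine correction per step, so the per-iteration cost stays comparable to plain SGD while the mean-point constraint is satisfied exactly after every update.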