Backward-Forward Least Angle Shrinkage for Sparse Quadratic Optimization

Publication Type:
Conference Proceeding
Proceedings, Part I of the 17th International Conference on Neural Information Processing: Theory and Algorithms (ICONIP 2010), 2010, pp. 388–396
In the compressed sensing and statistics communities, dozens of algorithms have been developed to solve ℓ1-penalized least squares regression, but constrained sparse quadratic optimization (SQO) remains an open problem. In this paper, we propose backward-forward least angle shrinkage (BF-LAS), a scheme for solving general SQO, including sparse eigenvalue minimization. BF-LAS starts from the dense solution, iteratively shrinks the magnitudes of unimportant variables to zero in the backward step to minimize the ℓ1 norm, decreases the gradients of important variables in the forward step to optimize the objective, and projects the solution onto the feasible set defined by the constraints. A variable's importance is measured by its correlation with the objective and is updated via least angle shrinkage (LAS). We show promising performance of BF-LAS on sparse dimension reduction.
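The backward-shrink / forward-descent / project loop described in the abstract can be sketched as follows. This is only an illustrative sketch for the sparse eigenvalue case (min xᵀAx subject to ‖x‖₂ = 1): the shrinkage amount, step size, and importance threshold below are assumptions for illustration, not the paper's exact BF-LAS update rules.

```python
import numpy as np

def bf_las_sketch(A, n_iter=100, shrink=0.05, step=0.1):
    """Illustrative backward-forward shrinkage loop for sparse
    eigenvalue minimization: min x^T A x  s.t.  ||x||_2 = 1.
    NOTE: shrink/step/threshold choices are assumptions, not the
    published BF-LAS updates."""
    # Start from the dense solution: the eigenvector of A with the
    # smallest eigenvalue (np.linalg.eigh sorts eigenvalues ascending).
    _, vecs = np.linalg.eigh(A)
    x = vecs[:, 0].copy()
    for _ in range(n_iter):
        grad = 2.0 * A @ x                       # gradient of x^T A x
        # Importance proxy: each variable's correlation with the objective.
        importance = np.abs(x * grad)
        unimportant = importance < shrink * importance.max()
        # Backward step: shrink unimportant variables toward zero
        # (descent on the l1 norm).
        x[unimportant] -= np.sign(x[unimportant]) * np.minimum(
            np.abs(x[unimportant]), shrink)
        # Forward step: gradient descent on the important variables.
        x[~unimportant] -= step * grad[~unimportant]
        # Project back onto the feasible set ||x||_2 = 1.
        x /= np.linalg.norm(x)
    return x

# Small usage example on a diagonal matrix: the sparse minimizer should
# concentrate on the coordinate with the smallest diagonal entry.
A = np.diag([3.0, 1.0, 2.0])
x = bf_las_sketch(A)
```

The projection step keeps every iterate feasible, so the shrinkage and gradient updates only have to trade off sparsity against the quadratic objective.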