Backward-forward least angle shrinkage for sparse quadratic optimization
- Publication Type: Conference Proceeding
- Citation: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2010, 6443 LNCS (Part 1), pp. 388-396
- Issue Date: 2010-12-21
Closed Access
Filename | Description | Size
---|---|---
2010001749OK.pdf | | 322.17 kB
This item is closed access and not available.
In the compressed sensing and statistics communities, dozens of algorithms have been developed to solve ℓ1-penalized least squares regression, but constrained sparse quadratic optimization (SQO) remains an open problem. In this paper, we propose backward-forward least angle shrinkage (BF-LAS), which provides a scheme for solving general SQO, including sparse eigenvalue minimization. BF-LAS starts from the dense solution and iteratively shrinks the magnitudes of unimportant variables to zero in the backward step to minimize the ℓ1 norm, decreases the gradients of important variables in the forward step to optimize the objective, and projects the solution onto the feasible set defined by the constraints. The importance of a variable is measured by its correlation w.r.t. the objective and is updated via least angle shrinkage (LAS). We show promising performance of BF-LAS on sparse dimension reduction. © 2010 Springer-Verlag.
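The abstract describes the backward/forward/projection structure of BF-LAS without giving the full update rules. The sketch below is only a minimal illustration of that structure for the sparse eigenvalue minimization case (minimize x'Ax subject to a unit-norm constraint plus an ℓ1 penalty); the function name `bf_las_sketch`, the median-based importance proxy, the fixed step size, and the stopping rule are assumptions for illustration, not details taken from the paper.

```python
# Hedged sketch of a backward-forward shrinkage loop for sparse quadratic
# optimization. All concrete choices (importance proxy, step size, number of
# iterations) are illustrative assumptions, not the paper's actual LAS update.
import numpy as np


def bf_las_sketch(A, lam=0.1, n_iter=200, step=0.01):
    """Sparse eigenvalue-style problem: min x'Ax + lam*||x||_1, s.t. ||x||_2 = 1."""
    # Start from the dense solution: the eigenvector of the smallest eigenvalue (assumption).
    _, vecs = np.linalg.eigh(A)
    x = vecs[:, 0].copy()

    for _ in range(n_iter):
        grad = 2.0 * A @ x                      # gradient of the quadratic objective
        # Importance proxy: how strongly each variable interacts with the objective.
        importance = np.abs(grad * x)
        unimportant = importance < np.median(importance)

        # Backward step: shrink unimportant variables' magnitudes toward zero (l1 part).
        x[unimportant] = np.sign(x[unimportant]) * np.maximum(
            np.abs(x[unimportant]) - lam * step, 0.0)

        # Forward step: move important variables along the objective's descent direction.
        x[~unimportant] -= step * grad[~unimportant]

        # Project back onto the feasible set (the unit sphere in this example).
        norm = np.linalg.norm(x)
        if norm > 0:
            x /= norm
    return x
```

As a usage example under the same assumptions, `bf_las_sketch(np.cov(X, rowvar=False))` would search for a sparse, unit-norm direction of small variance; the paper's actual importance update via least angle shrinkage is more refined than the median-threshold proxy used here.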