On Convergence Analysis of Gradient Based Primal-Dual Method of Multipliers

Publication Type:
Conference Proceeding
Citation:
2018 IEEE Statistical Signal Processing Workshop, SSP 2018, 2018, pp. 353–357
Issue Date:
2018-08-29
© 2018 IEEE. Recently, the primal-dual method of multipliers (PDMM) has been proposed and successfully applied to solve a number of decomposable convex optimization problems in a distributed, iterative manner. In this work, we study the gradient-based PDMM (GPDMM), in which the objective functions are approximated using gradient information at each iteration. It is shown that for a certain class of decomposable convex optimizations, synchronous GPDMM has a sublinear convergence rate of O(1/K), where K denotes the iteration index. Experiments on a distributed ridge-regularized logistic regression problem demonstrate the efficiency of synchronous GPDMM.
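The experiments mentioned in the abstract use ridge-regularized logistic regression. As a point of reference, the sketch below shows that objective and its gradient; it is a minimal illustration of the problem class only, not the authors' GPDMM algorithm, and all names, step sizes, and data here are assumptions. The gradient function stands in for the per-iteration gradient information that GPDMM uses in place of exact inner minimizations.

```python
import numpy as np

def ridge_logistic_loss(w, X, y, lam):
    """f(w) = mean_i log(1 + exp(-y_i * x_i^T w)) + (lam/2) * ||w||^2, with y in {-1, +1}."""
    margins = y * (X @ w)
    return np.mean(np.log1p(np.exp(-margins))) + 0.5 * lam * (w @ w)

def ridge_logistic_grad(w, X, y, lam):
    """Gradient of the objective above: X^T s / n + lam * w, with s_i = -y_i / (1 + exp(margin_i))."""
    margins = y * (X @ w)
    s = -y / (1.0 + np.exp(margins))
    return X.T @ s / len(y) + lam * w

# Plain centralized gradient descent on synthetic data (hypothetical setup,
# used only to exercise the objective; it is not PDMM/GPDMM itself).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = np.sign(X @ w_true)

w = np.zeros(5)
for _ in range(200):
    w -= 0.5 * ridge_logistic_grad(w, X, y, lam=0.1)
```

In the distributed setting of the paper, terms of this separable objective would be held by different nodes, each contributing local gradient evaluations per iteration.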