On Convergence Analysis of Gradient Based Primal-Dual Method of Multipliers
- Publication Type: Conference Proceeding
- 2018 IEEE Statistical Signal Processing Workshop, SSP 2018, 2018, pp. 353-357
- Issue Date:
This item is currently unavailable due to the publisher's embargo.
The embargo period expires on 30 Aug 2020
© 2018 IEEE. Recently, the primal-dual method of multipliers (PDMM) was proposed and successfully applied to solve a number of decomposable convex optimization problems in a distributed, iterative manner. In this work, we study gradient-based PDMM (GPDMM), in which the objective functions are approximated using gradient information at each iteration. We show that for a certain class of decomposable convex optimization problems, synchronous GPDMM has a sublinear convergence rate of O(1/K), where K denotes the iteration index. Experiments on distributed ridge-regularized logistic regression demonstrate the efficiency of synchronous GPDMM.
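For context, the experimental problem named in the abstract, ridge-regularized logistic regression, minimizes the sum of logistic losses plus an L2 penalty. The sketch below solves a centralized instance with plain gradient descent; it is only an illustration of the objective, not the paper's (G)PDMM algorithm, and the synthetic data, regularization weight `rho`, and step size are assumptions:

```python
import numpy as np

# Ridge-regularized logistic regression:
#   minimize_x  sum_i log(1 + exp(-y_i * a_i^T x)) + (rho/2) * ||x||^2
# Plain gradient descent on a synthetic instance -- an illustration of
# the objective only, NOT the paper's (G)PDMM method. rho, the step
# size, and the data-generation scheme are assumptions.

def loss(A, y, x, rho):
    margins = y * (A @ x)
    # logaddexp(0, -m) = log(1 + exp(-m)), computed stably
    return np.sum(np.logaddexp(0.0, -margins)) + 0.5 * rho * (x @ x)

def gradient(A, y, x, rho):
    margins = y * (A @ x)
    # d/dx log(1 + exp(-y a^T x)) = -y * a / (1 + exp(y a^T x))
    coef = -y / (1.0 + np.exp(margins))
    return A.T @ coef + rho * x

rng = np.random.default_rng(0)
n, d, rho, step = 200, 5, 0.1, 0.01
A = rng.normal(size=(n, d))
y = np.sign(A @ rng.normal(size=d) + 0.1 * rng.normal(size=n))

x = np.zeros(d)
losses = [loss(A, y, x, rho)]
for _ in range(100):
    x -= step * gradient(A, y, x, rho)
    losses.append(loss(A, y, x, rho))
```

In a distributed setting such as the one PDMM targets, the sum over data points would be split across network nodes, with each node holding a local slice of `A` and `y`.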