Improved iteration complexity bounds of cyclic block coordinate descent for convex problems

Ruoyu Sun, Mingyi Hong

Research output: Contribution to journal › Conference article › peer-review


Abstract

The iteration complexity of block-coordinate descent (BCD) type algorithms has been under extensive investigation. It was recently shown that for convex problems the classical cyclic BCGD (block coordinate gradient descent) achieves an O(1/r) complexity, where r is the number of passes over all blocks. However, such bounds depend at least linearly on K (the number of variable blocks), and are therefore at least K times worse than those of the gradient descent (GD) and proximal gradient (PG) methods. In this paper, we close this theoretical performance gap between cyclic BCD and GD/PG. First, we show that for a family of quadratic nonsmooth problems, the complexity bounds for cyclic Block Coordinate Proximal Gradient (BCPG), a popular variant of BCD, can match those of GD/PG in terms of dependence on K (up to a log^2(K) factor). Second, we establish an improved complexity bound for Coordinate Gradient Descent (CGD) for general convex problems, which can match that of GD in certain scenarios. Our bounds are sharper than the known bounds, which are always at least K times worse than those of GD. Our analysis does not depend on the update order of the block variables within each cycle, so our results also apply to BCD methods with random permutation (random sampling without replacement, another popular variant).
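
The sketch below is not taken from the paper; it is a minimal illustration of the algorithmic setting the abstract analyzes: cyclic BCGD on a convex quadratic, where one "pass" (cycle) updates each of the K blocks once with a per-block stepsize 1/L_k, and the optional random-permutation variant reshuffles the block order each cycle. The block partition, stepsizes, and test problem are illustrative assumptions.

import numpy as np

def bcgd(A, b, K, num_passes, permute=False, seed=0):
    """Cyclic (or randomly permuted) block coordinate gradient descent
    on the convex quadratic f(x) = 0.5*x'Ax - b'x."""
    rng = np.random.default_rng(seed)
    n = b.size
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), K)          # K variable blocks
    # Per-block stepsize 1/L_k, with L_k the Lipschitz constant of the
    # block gradient (largest eigenvalue of the diagonal block A[k, k]).
    steps = [1.0 / np.linalg.eigvalsh(A[np.ix_(blk, blk)])[-1] for blk in blocks]
    for _ in range(num_passes):                       # r = number of passes
        order = rng.permutation(K) if permute else range(K)
        for k in order:
            blk = blocks[k]
            grad_k = A[blk, :] @ x - b[blk]           # block gradient at current x
            x[blk] -= steps[k] * grad_k               # block gradient step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    M = rng.standard_normal((40, 20))
    A = M.T @ M + 0.1 * np.eye(20)                    # convex quadratic objective
    b = rng.standard_normal(20)
    x = bcgd(A, b, K=4, num_passes=200)
    print("distance to minimizer:", np.linalg.norm(x - np.linalg.solve(A, b)))

The O(1/r) rates discussed in the abstract are stated in terms of num_passes above; the analysis in the paper is order-independent within each cycle, which is why the permute=True variant is covered as well.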

Original language: English (US)
Pages (from-to): 1306-1314
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Volume: 2015-January
State: Published - Jan 1 2015
