On the linear convergence of descent methods for convex essentially smooth minimization

Zhi Quan Luo, Paul Tseng

Research output: Contribution to journal › Article › peer-review


Abstract

Consider the problem of minimizing, over a polyhedral set, the composition of an affine mapping with a strictly convex essentially smooth function. A general result on the linear convergence of descent methods for solving this problem is presented. By applying this result, the linear convergence of both the gradient projection algorithm of Goldstein and of Levitin and Polyak, and a matrix splitting algorithm using regular splitting, is established. The results do not require that the cost function be strongly convex or that the optimal solution set be bounded. The key to the analysis lies in a new error bound for estimating the distance from a feasible point to the optimal solution set.
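For orientation, a minimal sketch of the problem class in illustrative notation (the symbols g, f, E, X, x^k, alpha_k, and kappa below are not taken from the abstract): the problem is

\[
  \min_{x \in X} \; g(x) := f(Ex), \qquad X \text{ polyhedral},\ E \text{ affine},\ f \text{ strictly convex and essentially smooth},
\]

and the gradient projection iteration of Goldstein and of Levitin and Polyak, whose linear convergence is among the results established, takes the standard form

\[
  x^{k+1} = \big[\, x^k - \alpha_k \nabla g(x^k) \,\big]^{+}, \qquad [\cdot]^{+} \text{ the orthogonal projection onto } X,\ \alpha_k > 0 \text{ a stepsize}.
\]

The error bound mentioned in the abstract is, roughly, of the shape \( \operatorname{dist}(x, X^*) \le \kappa \,\| x - [x - \nabla g(x)]^{+} \| \) for feasible x satisfying suitable conditions; the precise hypotheses and constants are those given in the paper.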

Original language: English (US)
Pages (from-to): 408-425
Number of pages: 18
Journal: SIAM Journal on Control and Optimization
Volume: 30
Issue number: 2
State: Published - 1992

