Alternating Gradient Descent Ascent for Nonconvex Min-Max Problems in Robust Learning and GANs

Songtao Lu, Rahul Singh, Xiangyi Chen, Yongxin Chen, Mingyi Hong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We study a class of nonconvex-strongly-concave min-max optimization problems. Among the most commonly used algorithms for such problems in machine learning applications are first-order methods in which gradient descent and ascent steps are performed simultaneously or alternately in each iteration. Despite their great success in practice, their theoretical properties are far from understood. In fact, little has been said about their convergence once the convex-concave assumption is absent. This is considerably different from minimization problems, where many techniques are available to analyze nonconvex problems; it is not clear whether these techniques can be applied to min-max optimization. Despite the simplicity of this type of first-order method, its properties are extremely difficult to analyze due to the nonlinear and nonconvex coupling between the maximization and minimization steps.

In this paper, we take a step in this direction by examining a special class of nonconvex-strongly-concave min-max problems. We show that, with a proper stepsize choice, a simple alternating gradient descent/ascent (AGDA) algorithm in fact converges to a stationary solution at a sublinear rate $\mathcal{O}(1/t)$, where $t$ is the iteration number. We hope our analysis sheds light on future studies of the theoretical properties of relevant machine learning problems.
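The alternating scheme described in the abstract can be sketched on a toy problem. This is a minimal illustration only, not the paper's setting: the objective $f(x, y) = (x^2 - 1)^2 + 2xy - y^2$ (nonconvex in $x$, strongly concave in $y$) and the stepsizes are assumptions chosen for the example.

```python
import numpy as np

def f_grad(x, y):
    # f(x, y) = (x^2 - 1)^2 + 2*x*y - y^2:
    # nonconvex in x, strongly concave in y (illustrative choice).
    gx = 4 * x * (x**2 - 1) + 2 * y
    gy = 2 * x - 2 * y
    return gx, gy

def agda(x, y, eta_x=0.02, eta_y=0.1, iters=2000):
    # Alternating gradient descent/ascent: the ascent step uses the
    # gradient at the freshly updated x, which is what distinguishes
    # the alternating scheme from simultaneous updates.
    for _ in range(iters):
        gx, _ = f_grad(x, y)
        x = x - eta_x * gx      # descent step on the minimization variable
        _, gy = f_grad(x, y)    # re-evaluate gradient at the updated x
        y = y + eta_y * gy      # ascent step on the maximization variable
    return x, y

x, y = agda(x=1.5, y=0.0)
gx, gy = f_grad(x, y)
print(x, y, abs(gx) + abs(gy))  # gradient norm shrinks toward a stationary point
```

With these (assumed) stepsizes the iterates settle at a stationary point of the min-max problem, here near $x = y = 1/\sqrt{2}$; the paper's contribution is the $\mathcal{O}(1/t)$ rate guarantee for such stationarity measures under a proper stepsize choice.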

Original language: English (US)
Title of host publication: Conference Record - 53rd Asilomar Conference on Circuits, Systems and Computers, ACSSC 2019
Editors: Michael B. Matthews
Publisher: IEEE Computer Society
Pages: 680-684
Number of pages: 5
ISBN (Electronic): 9781728143002
DOIs
State: Published - Nov 2019
Event: 53rd Asilomar Conference on Circuits, Systems and Computers, ACSSC 2019 - Pacific Grove, United States
Duration: Nov 3 2019 - Nov 6 2019

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
Volume: 2019-November
ISSN (Print): 1058-6393

Conference

Conference: 53rd Asilomar Conference on Circuits, Systems and Computers, ACSSC 2019
Country: United States
City: Pacific Grove
Period: 11/3/19 - 11/6/19

Bibliographical note

Publisher Copyright:
© 2019 IEEE.

Keywords

  • Min-max saddle points
  • generative adversarial networks (GANs)
  • non-convex
