A Provably Communication-Efficient Asynchronous Distributed Inference Method for Convex and Nonconvex Problems

Jineng Ren, Jarvis Haupt

Research output: Contribution to journal › Article › peer-review


Abstract

This paper proposes and analyzes a communication-efficient distributed optimization framework for general nonconvex nonsmooth signal processing and machine learning problems under an asynchronous protocol. At each iteration, worker machines compute gradients of a known empirical loss function using their own local data, and a master machine solves a related minimization problem to update the current estimate. We prove that for nonconvex nonsmooth problems, the proposed algorithm converges to a stationary point at a sublinear rate in the number of communication rounds, matching the best theoretical rate achievable for this class of problems. Linear convergence to a global minimum is established, without any statistical assumptions on the local data, for problems with composite loss functions whose smooth part is strongly convex. Extensive numerical experiments verify that the proposed approach indeed improves, sometimes significantly, over other state-of-the-art algorithms in terms of total communication efficiency.
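As a rough illustration of the master-worker pattern the abstract describes, the following is a minimal Python sketch, not the paper's algorithm: it simulates asynchrony by letting only a random subset of workers refresh their gradients each round while the master reuses stale gradients, and the master updates the estimate by solving a small proximal subproblem in closed form. The least-squares local losses, the l1 regularizer, the step parameter rho, and all names here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n_workers, d, n_local = 8, 20, 50
rho, lam, rounds = 10.0, 0.1, 200

# Synthetic local data: worker i holds (A_i, b_i) for a least-squares loss.
A = [rng.standard_normal((n_local, d)) for _ in range(n_workers)]
x_true = np.zeros(d)
x_true[:3] = [1.0, -2.0, 0.5]
b = [Ai @ x_true + 0.01 * rng.standard_normal(n_local) for Ai in A]

def local_grad(i, x):
    """Gradient of worker i's local empirical loss (1/2n)||A_i x - b_i||^2."""
    return A[i].T @ (A[i] @ x - b[i]) / n_local

def prox_l1(v, t):
    """Proximal operator of t * lam * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

x = np.zeros(d)
grads = [local_grad(i, x) for i in range(n_workers)]  # master's gradient table

for k in range(rounds):
    # Simulated asynchrony: only some workers report this round; the master
    # reuses the last (possibly stale) gradient from the rest.
    active = rng.random(n_workers) < 0.5
    for i in np.flatnonzero(active):
        grads[i] = local_grad(i, x)
    g = np.mean(grads, axis=0)
    # Master step: argmin_y <g, y> + (rho/2)||y - x||^2 + lam*||y||_1,
    # which has the closed-form soft-thresholding solution below.
    x = prox_l1(x - g / rho, 1.0 / rho)

print("final estimate (first 5 coords):", np.round(x[:5], 3))

In this toy setting the smooth part of the composite loss is strongly convex, the regime in which the paper establishes linear convergence for its method; the sketch above makes no such guarantee and is only meant to show the communication pattern of stale worker gradients and a master-side proximal update.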

Original language: English (US)
Article number: 9098031
Pages (from-to): 3325-3340
Number of pages: 16
Journal: IEEE Transactions on Signal Processing
Volume: 68
State: Published - 2020

Bibliographical note

Publisher Copyright:
© 1991-2012 IEEE.

Keywords

  • communication-efficiency
  • asynchronous
  • convergence
  • distributed algorithm
  • nonconvex
  • strongly convex
