On the Fundamental Limits of Coded Data Shuffling for Distributed Machine Learning

Adel Elmahdy, Soheil Mohajer

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the data shuffling problem in a distributed learning system, in which a master node is connected to a set of worker nodes via a shared link in order to communicate a set of files to them. The master node has access to a database of files. In every shuffling iteration, each worker node processes a new subset of files and has excess storage to partially cache the remaining files, where the cached files are assumed to be uncoded. The caches of the worker nodes are updated every iteration, and they must be designed to satisfy any possible unknown permutation of the files in subsequent iterations. For this problem, we characterize the exact load-memory trade-off for worst-case shuffling by deriving the minimum communication load for a given storage capacity per worker node. As a byproduct, the exact load-memory trade-off for any shuffling is characterized when the number of files equals the number of worker nodes. We propose a novel deterministic coded shuffling scheme that improves on the state of the art by exploiting the cache memories to create coded functions that can be decoded by several worker nodes. We then prove the optimality of the proposed scheme by deriving a matching lower bound and showing that the placement phase of the proposed coded shuffling scheme is optimal over all shuffles.
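To give intuition for the coded-multicast idea the abstract describes, the following is a minimal toy sketch, not the paper's actual placement or delivery scheme: with two workers that each cache the file they processed in the previous iteration, a shuffle that swaps their files can be served with a single XOR broadcast instead of two separate uncoded transmissions, halving the communication load.

```python
# Toy illustration of coded shuffling (hypothetical two-worker example,
# not the scheme from the paper): each worker cancels its cached file
# from one coded broadcast to decode the file it needs next.

def xor_bytes(x: bytes, y: bytes) -> bytes:
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(x, y))

file_A = b"aaaaaaaa"  # worker 1 processed and cached A last iteration
file_B = b"bbbbbbbb"  # worker 2 processed and cached B last iteration

# New shuffle permutation: worker 1 must process B, worker 2 must process A.
# The master broadcasts one coded message over the shared link.
coded = xor_bytes(file_A, file_B)

# Each worker XORs out its cached file to recover the new one.
decoded_at_w1 = xor_bytes(coded, file_A)  # worker 1 recovers B
decoded_at_w2 = xor_bytes(coded, file_B)  # worker 2 recovers A

assert decoded_at_w1 == file_B
assert decoded_at_w2 == file_A
```

An uncoded master would have to send both files (two file-lengths of traffic); the coded broadcast sends one, which is the kind of cache-enabled saving the paper's deterministic scheme generalizes and proves optimal.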

Original language: English (US)
Article number: 8955811
Pages (from-to): 3098-3131
Number of pages: 34
Journal: IEEE Transactions on Information Theory
Volume: 66
Issue number: 5
DOI: 10.1109/TIT.2020.2964547
State: Published - May 2020

Bibliographical note

Funding Information:
Manuscript received July 9, 2018; revised July 29, 2019; accepted November 12, 2019. Date of publication January 10, 2020; date of current version April 21, 2020. This work was supported in part by the National Science Foundation under Grant CCF-1749981. This article was presented in part at the 2018 IEEE International Symposium on Information Theory. The authors are with the Department of Electrical and Computer Engineering, University of Minnesota, Minneapolis, MN 55455 USA (e-mail: adel@umn.edu; soheil@umn.edu). Communicated by E. Abbe, Associate Editor for Machine Learning. Color versions of one or more of the figures in this article are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TIT.2020.2964547

Keywords

  • Data shuffling
  • coded caching
  • distributed computing
  • distributed machine learning

