A Scalable Parallel Formulation of the Backpropagation Algorithm for Hypercubes and Related Architectures

Vipin Kumar, Shashi Shekhar, Minesh B. Amin

Research output: Contribution to journal › Article › peer-review

38 Scopus citations


In this paper, we present a new technique for mapping the backpropagation algorithm onto hypercubes and related architectures. A key component of this technique is a network partitioning scheme called checkerboarding. Checkerboarding allows us to replace the all-to-all broadcast operation performed by the commonly used vertical network partitioning scheme with operations that are much faster on hypercubes and related architectures. Checkerboarding can be combined with the pattern partitioning technique to form a hybrid scheme that performs better than either scheme alone. Theoretical analysis and experimental results on the nCUBE® and CM5® show that our scheme outperforms the other schemes for both uniform and nonuniform networks.
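To make the abstract's central idea concrete, here is a minimal sketch (not the authors' implementation; all names and sizes are illustrative) of checkerboard, i.e. 2-D block, partitioning of a layer's weight matrix. On a q × q processor grid, processor (i, j) holds only block W[i][j] and needs only the slice x_j of the input activations, so the all-to-all broadcast required by vertical (column-wise) partitioning can be replaced by a sum-reduction along each processor row:

```python
def matvec(W, x):
    """Serial reference: y = W x."""
    return [sum(w * v for w, v in zip(row, x)) for row in W]

def checkerboard_matvec(W, x, q):
    """Simulate y = W x on a q x q processor grid (assumes q divides n)."""
    n = len(W)
    b = n // q                       # block size per processor
    y = [0.0] * n
    for i in range(q):               # processor row index
        for j in range(q):           # processor column index
            # Processor (i, j) forms a partial product from its local
            # block W[i*b:(i+1)*b][j*b:(j+1)*b] and the slice x_j alone.
            for r in range(b):
                y[i * b + r] += sum(
                    W[i * b + r][j * b + c] * x[j * b + c]
                    for c in range(b)
                )                    # accumulation across j plays the role
                                     # of the row-wise reduction
    return y

W = [[float(i + j) for j in range(4)] for i in range(4)]
x = [1.0, 2.0, 3.0, 4.0]
assert checkerboard_matvec(W, x, 2) == matvec(W, x)
```

In an actual message-passing implementation the inner accumulation would be a reduction among the q processors in each grid row, a collective that is far cheaper on a hypercube than an all-to-all broadcast among all p processors.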

Original language: English (US)
Pages (from-to): 1073-1090
Number of pages: 18
Journal: IEEE Transactions on Parallel and Distributed Systems
Issue number: 10
State: Published - Oct 1994

Bibliographical note

Funding Information:
Manuscript received October 13, 1992; revised June 1, 1993. This work was supported by the Army Research Office under Grant 28408-MA-SDI to the University of Minnesota, by a grant from the Graduate School of the University of Minnesota, and by the Army High-Performance Computing Research Center at the University of Minnesota. The authors are with the Department of Computer Science, University of Minnesota, Minneapolis, MN 55455 USA; e-mail: kumar@cs.umn.edu, shekhar@cs.umn.edu, amin@cs.umn.edu. IEEE Log Number 9403102.


Keywords

  • Backpropagation algorithm
  • hypercube
  • network partitioning
  • neural networks
  • pattern partitioning
  • scalability

