Measuring ensemble consistency without measuring tuning curves

Jadin C. Jackson, A. David Redish

Research output: Contribution to journal › Article › peer-review

Abstract

An important question in information processing is the extent to which neural firing patterns remain consistent while processing representations. Transient changes in representational consistency can provide clues to the dynamics of neural processing. We present a generalized framework for measuring the consistency of a neuronal representation that does not require explicit knowledge of the parameters encoded by the ensemble. It requires only neuronal ensembles and a training set of neuronal activity that samples behavioral parameters equally. This will be useful in structures where the behavioral parameters signaled by the neural activity are controversial or unknown.
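The abstract's idea of scoring consistency against a training set without tuning curves can be illustrated with a kernel density estimate (one of the article's keywords) over population firing-rate vectors: fit a density to training-set ensemble vectors, then score how likely new ensemble vectors are under it. This is a minimal sketch under that assumption, not the authors' actual implementation; the function name, the Gaussian kernel, and the bandwidth are all illustrative.

```python
import numpy as np

def kde_log_likelihood(train, query, bandwidth=1.0):
    """Log-likelihood of query population vectors under a Gaussian KDE
    fit to training population vectors.

    train : (n_train, n_cells) firing-rate vectors from the training set
    query : (n_query, n_cells) firing-rate vectors to score
    """
    # Squared Euclidean distance from each query vector to each training vector
    d2 = ((query[:, None, :] - train[None, :, :]) ** 2).sum(axis=-1)
    n_cells = train.shape[1]
    # Log of each Gaussian kernel, split into exponent and normalization
    log_kernels = -d2 / (2.0 * bandwidth ** 2)
    log_norm = -0.5 * n_cells * np.log(2.0 * np.pi * bandwidth ** 2)
    # Average kernels over the training set, return per-query log-density
    return log_norm + np.log(np.exp(log_kernels).mean(axis=1))

rng = np.random.default_rng(0)
train = rng.normal(size=(200, 5))            # training-set ensemble vectors
consistent = rng.normal(size=(10, 5))        # same distribution as training
inconsistent = rng.normal(loc=5.0, size=(10, 5))  # shifted away from training

# Vectors drawn from the training distribution score higher on average,
# so the mean log-likelihood can serve as a consistency index.
print(kde_log_likelihood(train, consistent).mean())
print(kde_log_likelihood(train, inconsistent).mean())
```

In this reading, a transient drop in the log-likelihood of the ensemble's activity would flag a momentary departure from the representational regime captured by the training set, without ever estimating a tuning curve for any behavioral variable.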

Original language: English (US)
Pages (from-to): 91-99
Number of pages: 9
Journal: Neurocomputing
Volume: 58-60
DOIs
State: Published - Jun 2004

Bibliographical note

Funding Information:
We thank N.C. Schmitzer-Torbert for generally helpful dialogues. This work was supported by NIH Grant MH68029-01. JCJ was also partly supported by NSF-IGERT training Grant #9870633.

Keywords

  • Distributed representation
  • Kernel density estimation (KDE)
  • Neural ensemble
  • Reconstruction

