Autonomous patch-clamp robot for functional characterization of neurons in vivo: Development and application to mouse visual cortex

Gregory L. Holst, William Stoy, Bo Yang, Ilya Kolb, Suhasa B. Kodandaramaiah, Lu Li, Ulf Knoblich, Hongkui Zeng, Bilal Haider, Edward S. Boyden, Craig R. Forest

Research output: Contribution to journal › Article › peer-review

Abstract

Patch clamping is the gold standard measurement technique for cell-type characterization in vivo, but it has low throughput, is difficult to scale, and requires highly skilled operation. We developed an autonomous robot that can acquire multiple consecutive patch-clamp recordings in vivo. In practice, 40 pipettes loaded into a carousel are sequentially filled, inserted into the brain, localized to a cell, used for patch clamping, and disposed of. Automated visual stimulation and electrophysiology software enables functional cell-type classification of whole-cell patched cells, as we show for 37 cells in layer 5 of the anesthetized mouse visual cortex (V1). We achieved 9% yield, with 5.3 min per attempt over hundreds of trials. In vivo patch-clamp recording is highly variable and low yield, and it will benefit from such a standardized, automated, quantitative approach, which allows development of optimal algorithms and enables the scaling required for large-scale studies and integration with complementary techniques.

NEW & NOTEWORTHY In vivo patch clamp is the gold standard for intracellular recordings, but it is a manual technique requiring a highly skilled operator. The robot in this work demonstrates the most automated in vivo patch-clamp experiment to date, enabling multiple serial intracellular recordings without human intervention. The robot automates pipette filling, wire threading, pipette positioning, neuron hunting, break-in, sensory stimulus delivery, and recording quality control, enabling in vivo cell-type characterization.
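
The abstract describes a serial, state-machine-like workflow (fill pipette, position, hunt for a neuron, seal, break in, stimulate, check recording quality, dispose of the pipette). The sketch below is a minimal, simulated illustration of that control flow, not the authors' implementation: the hardware interfaces are stubbed out, and all function names, thresholds, and timings are illustrative assumptions.

```python
# Minimal sketch of a serial automated patch-clamp loop (simulated hardware).
# All names, thresholds, and timings are illustrative assumptions, not values
# from the paper.

import random
import time


def measure_resistance_mohm():
    """Stand-in for an amplifier test-pulse resistance measurement (simulated)."""
    return random.uniform(4.0, 9.0)


def attempt_patch(max_hunt_steps=100):
    """Run one pipette through a full attempt; return True on a usable recording."""
    baseline = measure_resistance_mohm()          # open-tip resistance in the bath

    # Neuron hunting: advance in small steps and watch for a sustained resistance
    # rise, which indicates the tip is pressed against a cell membrane.
    for _ in range(max_hunt_steps):
        r = measure_resistance_mohm()
        if r > baseline * 1.3:                    # illustrative detection criterion
            break
        time.sleep(0.01)                          # placeholder for manipulator motion
    else:
        return False                              # no cell found; dispose of pipette

    # Gigaseal: apply gentle suction until seal resistance exceeds ~1 GOhm (simulated).
    seal_gohm = random.uniform(0.2, 2.0)
    if seal_gohm < 1.0:
        return False

    # Break-in: brief suction pulse; accept only if access resistance is low enough.
    access_mohm = random.uniform(10, 80)
    if access_mohm > 40:                          # illustrative quality-control cutoff
        return False

    # Whole-cell recording during automated visual stimulation would run here.
    return True


if __name__ == "__main__":
    # One carousel of 40 pipettes, attempted back to back without intervention.
    successes = sum(attempt_patch() for _ in range(40))
    print(f"whole-cell recordings: {successes}/40")
```

In the real system each stage would gate on measured signals (pressure, resistance, holding current) rather than random draws; the point of the sketch is only the sequential, unattended structure of the attempts.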

Original language: English (US)
Pages (from-to): 2341-2357
Number of pages: 17
Journal: Journal of Neurophysiology
Volume: 121
Issue number: 6
DOIs
State: Published - Jun 2019

Bibliographical note

Funding Information:
This article received the following grant support: NSF Integrative Graduate Education Research Traineeship (0965945), Georgia Institute of Technology Presidential Fellowship, NSF Graduate Research Fellowship, NIH Computational Neuroscience Training grant (DA032466-02), Georgia Tech Neural Engineering Center Seed Grant, NIH grants (NIH R01NS102727, 1-U01-MH106027-01, 1-R01-EY023173, 5-R44-NS083108-03), Georgia Tech Fund for Innovation in Research and Education (GT-FIRE), Georgia Tech Institute for Bioengineering and Biosciences Junior Faculty Award, Georgia Tech Technology Fee Fund, Georgia Tech Invention Studio, George W. Woodruff School of Mechanical Engineering, and Paul G. Allen and Jody Patton, founders of the Allen Institute for Brain Science.

Publisher Copyright:
© 2019 the American Physiological Society.

Keywords

  • Automated
  • In vivo
  • Layer 5
  • Patch clamp
  • Robotic
  • Visual cortex
