Contrast-enhanced serial optical coherence scanner with deep learning network reveals vasculature and white matter organization of mouse brain

Tianqi Li, Chao J. Liu, Taner Akkin

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Optical coherence tomography provides volumetric reconstruction of brain structure with micrometer resolution. Gray matter and white matter can be highlighted using conventional and polarization-based contrasts; however, vasculature in ex-vivo fixed brain has not been investigated at large scale due to a lack of intrinsic contrast. We present a contrast enhancement to visualize the vasculature by perfusing titanium dioxide particles transcardially into the mouse vascular system. The brain, after dissection and fixation, is imaged by a serial optical coherence scanner. Accumulation of particles in blood vessels generates distinguishable optical signals. Among these, the cross-polarization images reveal the vascular organization remarkably well. The conventional and polarization-based contrasts remain available for probing the gray matter and white matter structures. Segmentation and reconstruction of the vasculature are performed using a deep learning algorithm. Axonal fiber pathways in the mouse brain are delineated by utilizing the retardance and optic axis orientation contrasts. This is a low-cost method that can be further developed to study neurovascular diseases and brain injury in animal models.

Original language: English (US)
Article number: 035004
Journal: Neurophotonics
Volume: 6
Issue number: 3
State: Published - Jul 1 2019

Keywords

  • Brain vasculature
  • Contrast enhancement
  • Deep learning
  • Polarization-sensitive optical coherence tomography

PubMed: MeSH publication types

  • Journal Article
