Training-free compressed sensing for wireless neural recording

Biao Sun, Yuming Ni, Wenfeng Zhao

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution


Abstract

Signal compression is crucial for resource-constrained wireless neural recording applications with limited data bandwidth, and Compressed Sensing (CS) has successfully demonstrated its potential in this field. However, conventional CS approaches rely on data-dependent and computationally intensive dictionary learning to find a sparse representation of neural signals, and dictionary re-training is inevitable during real experiments. This paper proposes a training-free CS approach for wireless neural recording. By adopting the analysis model to enforce signal sparsity and constructing a multi-order difference matrix as the analysis operator, it avoids the dictionary learning procedure, reducing both the need for previously acquired data and the computational complexity. In addition, a group weighted analysis ℓ1-minimization method is developed to recover the neural signals. Experimental results reveal that the proposed approach outperforms state-of-the-art CS methods for wireless neural recording.
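To make the analysis-model idea concrete, here is a minimal numpy sketch of the training-free analysis operator the abstract describes: a multi-order difference matrix built by stacking first- and second-order finite-difference operators. A toy piecewise-constant signal is dense in time yet has only a handful of nonzero analysis coefficients, which is the sparsity the recovery step exploits. The helper names (`diff_matrix`, `multi_order_diff`) and the toy signal are illustrative assumptions, not taken from the paper, and the paper's actual group weighted ℓ1 recovery is not shown.

```python
import numpy as np

def diff_matrix(n, order):
    """k-th order finite-difference operator, shape (n - order, n)."""
    D = np.eye(n)
    for _ in range(order):
        D = D[1:] - D[:-1]   # each pass applies one first-order difference
    return D

def multi_order_diff(n, max_order):
    """Stack difference operators of orders 1..max_order into one analysis operator."""
    return np.vstack([diff_matrix(n, k) for k in range(1, max_order + 1)])

n = 64
Omega = multi_order_diff(n, max_order=2)   # analysis operator, shape (125, 64)

# Toy piecewise-constant signal: dense in time, sparse under Omega.
x = np.concatenate([np.zeros(n // 2), np.ones(n // 2)])
coeffs = Omega @ x                          # analysis coefficients
n_significant = int(np.sum(np.abs(coeffs) > 1e-8))
```

Because Omega is fixed in advance (no dictionary learning), the encoder needs no previously acquired data; only the decoder solves the sparsity-regularized recovery problem.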

Original language: English (US)
Title of host publication: Proceedings - 2016 IEEE Biomedical Circuits and Systems Conference, BioCAS 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 18-21
Number of pages: 4
ISBN (Electronic): 9781509029594
State: Published - Jan 1 2016
Event: 12th IEEE Biomedical Circuits and Systems Conference, BioCAS 2016 - Shanghai, China
Duration: Oct 17 2016 - Oct 19 2016

Publication series

Name: Proceedings - 2016 IEEE Biomedical Circuits and Systems Conference, BioCAS 2016

Other

Other: 12th IEEE Biomedical Circuits and Systems Conference, BioCAS 2016
Country: China
City: Shanghai
Period: 10/17/16 - 10/19/16
