Compressing Deep Networks Using Fisher Score of Feature Maps

Mohammadreza Soltani, Suya Wu, Yuerong Li, Robert Ravier, Jie Ding, Vahid Tarokh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we propose a new structural technique for pruning deep neural networks with skip-connections. Our approach measures the importance of feature maps in predicting the output of the model using their Fisher scores. These scores are subsequently used to remove the less informative layers from the graph of the network. Extensive experiments on the classification of the CIFAR-10, CIFAR-100, and SVHN data sets demonstrate the efficacy of our compression method in reducing both the number of parameters and the number of operations.
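The sketch below illustrates one common way such Fisher-based importance scores can be approximated in practice: accumulating the squared product of a block's feature map and the gradient of the loss with respect to that feature map over a few mini-batches. This is a hedged, minimal illustration of the general idea; the function name `fisher_scores`, the choice of blocks, and the exact scoring formula are assumptions and may differ from the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch: approximate a Fisher score for each block's feature map
# by accumulating (activation * d loss / d activation)^2 over mini-batches,
# a common Fisher-information proxy for pruning; the paper's exact scoring may differ.
def fisher_scores(model, blocks, loader, device="cpu", num_batches=10):
    acts = {}

    def make_hook(name):
        def hook(module, inputs, output):
            output.retain_grad()  # keep the gradient w.r.t. this feature map
            acts[name] = output
        return hook

    handles = [m.register_forward_hook(make_hook(n)) for n, m in blocks.items()]
    scores = {n: 0.0 for n in blocks}

    model.train()
    for i, (x, y) in enumerate(loader):
        if i >= num_batches:
            break
        x, y = x.to(device), y.to(device)
        model.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        for name, a in acts.items():
            if a.grad is not None:
                # Fisher proxy: per-sample squared sum of activation times gradient,
                # averaged over the batch and accumulated across batches.
                scores[name] += float((a * a.grad).sum(dim=(1, 2, 3)).pow(2).mean())

    for h in handles:
        h.remove()
    return scores

# Usage sketch on a ResNet-style network with skip-connections (names are illustrative):
# model = torchvision.models.resnet18(num_classes=10)
# blocks = {f"layer3.{i}": b for i, b in enumerate(model.layer3)}
# scores = fisher_scores(model, blocks, cifar10_loader)
# Blocks with the lowest scores would be candidates for removal from the network graph.
```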

Original language: English (US)
Title of host publication: Proceedings - DCC 2021
Subtitle of host publication: 2021 Data Compression Conference
Editors: Ali Bilgin, Michael W. Marcellin, Joan Serra-Sagrista, James A. Storer
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 371
Number of pages: 1
ISBN (Electronic): 9780738112275
DOIs
State: Published - Mar 2021
Event: 2021 Data Compression Conference, DCC 2021 - Snowbird, United States
Duration: Mar 23, 2021 - Mar 26, 2021

Publication series

Name: Data Compression Conference Proceedings
Volume: 2021-March
ISSN (Print): 1068-0314

Conference

Conference: 2021 Data Compression Conference, DCC 2021
Country/Territory: United States
City: Snowbird
Period: 3/23/21 - 3/26/21

Bibliographical note

Publisher Copyright:
© 2021 IEEE.
