In this paper, we propose a new structural technique for pruning deep neural networks with skip-connections. Our approach is based on measuring the importance of feature maps in predicting the output of the model using their Fisher scores. These scores are subsequently used to remove the less informative layers from the graph of the network. Extensive experiments on the classification of the CIFAR-10, CIFAR-100, and SVHN data sets demonstrate the efficacy of our compression method in reducing both the number of parameters and the number of operations.
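The abstract describes ranking feature maps by Fisher importance and removing the least informative ones. A minimal sketch of that idea is below; the `0.5 * E[(a * g)^2]` score is the standard Fisher pruning approximation of the loss increase from zeroing a feature map, and aggregating per-layer by the mean over feature maps is an assumption for illustration, not necessarily the paper's exact criterion.

```python
import numpy as np

def fisher_scores(activations, gradients):
    """Per-feature-map Fisher importance.

    activations, gradients: arrays of shape (batch, channels) holding a
    feature map's activations and the loss gradient w.r.t. them.
    0.5 * E[(a * g)^2] approximates the loss increase from zeroing the
    map (a second-order / Fisher approximation).
    """
    return 0.5 * np.mean((activations * gradients) ** 2, axis=0)

def least_informative(named_stats, k=1):
    """Return the k blocks with the lowest aggregate Fisher score.

    named_stats: dict mapping block name -> (activations, gradients).
    Mean aggregation over a block's feature maps is an assumption here.
    """
    agg = {name: float(fisher_scores(a, g).mean())
           for name, (a, g) in named_stats.items()}
    return sorted(agg, key=agg.get)[:k]

rng = np.random.default_rng(0)
stats = {
    "block1": (rng.normal(size=(8, 4)), rng.normal(size=(8, 4))),
    "block2": (np.ones((8, 4)), np.zeros((8, 4))),  # no gradient signal
}
print(least_informative(stats))  # block2 carries no signal -> pruned first
```

In a skip-connected network, pruning a whole block this way is structurally safe because the identity shortcut preserves the information path, which is presumably what lets the method remove entire layers rather than individual channels.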
|Original language||English (US)|
|Title of host publication||Proceedings - DCC 2021|
|Subtitle of host publication||2021 Data Compression Conference|
|Editors||Ali Bilgin, Michael W. Marcellin, Joan Serra-Sagrista, James A. Storer|
|Publisher||Institute of Electrical and Electronics Engineers Inc.|
|Number of pages||1|
|State||Published - Mar 2021|
|Event||2021 Data Compression Conference, DCC 2021 - Snowbird, United States|
|Duration||Mar 23 2021 → Mar 26 2021|
|Publication series||Data Compression Conference Proceedings|
|Conference||2021 Data Compression Conference, DCC 2021|
|Period||3/23/21 → 3/26/21|
|Bibliographical note||Publisher Copyright: © 2021 IEEE.|