
ICIP2021 – Comprehensive Online Network Pruning via Learnable Scaling Factors

Muhammad Umair Haider and Murtaza Taj

Abstract:

One of the major challenges in deploying deep neural network architectures is their size, which adversely affects their inference time and memory requirements. Deep CNNs can be pruned either width-wise, by removing filters, or depth-wise, by removing layers and blocks. Width-wise pruning (filter pruning) is commonly performed via learnable gates or switches and sparsity regularizers, whereas pruning of layers has so far been performed arbitrarily by manually designing a smaller network, usually referred to as a student network. We propose a comprehensive pruning strategy that can perform both width-wise and depth-wise pruning. This is achieved by introducing gates at different granularities (neuron, filter, layer, block), which are controlled via an objective function that simultaneously performs pruning at each granularity during every forward pass. Our approach is applicable to a wide variety of architectures without any constraints on spatial dimensions or connection type (sequential, residual, parallel or inception). Our method results in a compression ratio of 70% to 90% without noticeable loss in accuracy when evaluated on benchmark datasets.
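The sketch below illustrates the general idea from the abstract: learnable scaling factors (gates) attached at filter and block granularity, pushed toward zero by an L1 sparsity penalty added to the task loss. It is a minimal PyTorch-style example under those assumptions; the names FilterGate, GatedResidualBlock and sparsity_penalty, as well as the penalty weight, are illustrative and not taken from the paper or its released code.

```python
# Minimal sketch of learnable scaling-factor gates (hypothetical names, not the
# authors' implementation). Per-filter gates scale the output channels of a conv
# layer and a single block gate scales a residual branch; an L1 penalty on all
# gate parameters drives them toward zero so the corresponding filters/blocks
# can be pruned away.
import torch
import torch.nn as nn


class FilterGate(nn.Module):
    """One learnable scaling factor per output channel (width-wise gate)."""
    def __init__(self, num_channels):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(num_channels))

    def forward(self, x):  # x: (N, C, H, W)
        return x * self.scale.view(1, -1, 1, 1)


class GatedResidualBlock(nn.Module):
    """Residual block whose branch is scaled by a learnable block gate (depth-wise gate)."""
    def __init__(self, channels):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.filter_gate = FilterGate(channels)        # width-wise pruning
        self.block_gate = nn.Parameter(torch.ones(1))  # depth-wise pruning

    def forward(self, x):
        out = self.filter_gate(self.branch(x))
        # If block_gate -> 0, the block reduces to an identity mapping and can be removed.
        return torch.relu(x + self.block_gate * out)


def sparsity_penalty(model, weight=1e-4):
    """L1 regularizer over all gate parameters, added to the task loss."""
    reg = 0.0
    for name, p in model.named_parameters():
        if "gate" in name or name.endswith("scale"):
            reg = reg + p.abs().sum()
    return weight * reg
```

In such a setup, the training objective would be the task loss plus sparsity_penalty(model); after training, filters and blocks whose gate values fall below a chosen threshold are removed, yielding width-wise and depth-wise pruning from the same objective.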

Resources
PDF: Paper

Text Reference:

M. U. Haider and M. Taj, 
"Comprehensive Online Network Pruning Via Learnable Scaling Factors," 
IEEE International Conference on Image Processing (ICIP), 2021, pp. 3557-3561, 
doi: 10.1109/ICIP42928.2021.9506252.

Bibtex Reference:

@INPROCEEDINGS{TajICIP2021_1,  
author={Haider, Muhammad Umair and Taj, Murtaza},  
booktitle={2021 IEEE International Conference on Image Processing (ICIP)},   
title={Comprehensive Online Network Pruning Via Learnable Scaling Factors},   
year={2021},  
pages={3557-3561},  
doi={10.1109/ICIP42928.2021.9506252}}

