LOss-Based SensiTivity rEgulaRization: Towards deep sparse neural networks
- PMID: 34906759
- DOI: 10.1016/j.neunet.2021.11.029
Abstract
LOBSTER (LOss-Based SensiTivity rEgulaRization) is a method for training neural networks with a sparse topology. Let the sensitivity of a network parameter be the variation of the loss function with respect to a variation of that parameter. Parameters with low sensitivity, i.e. those whose perturbation has little impact on the loss, are shrunk and then pruned to sparsify the network. Our method makes it possible to train a network from scratch, i.e. without preliminary learning or rewinding. Experiments on multiple architectures and datasets show competitive compression ratios with minimal computational overhead.
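To make the abstract concrete, below is a minimal PyTorch sketch of the general idea: sensitivity is taken as the magnitude of the loss gradient with respect to each weight, low-sensitivity weights receive an extra shrinkage term, and weights whose magnitude falls below a threshold are zeroed. The function name lobster_style_step, the max-based normalization, and the fixed pruning threshold are illustrative assumptions and not the paper's exact update rule.

    # Illustrative sketch (not the authors' reference code): one regularized
    # update in which low-sensitivity weights are shrunk and then pruned.
    import torch
    import torch.nn as nn

    def lobster_style_step(model, loss, lr=0.1, reg=1e-4, prune_threshold=1e-3):
        """Hypothetical single step of loss-based sensitivity regularization
        followed by magnitude pruning (assumed normalization and threshold)."""
        loss.backward()
        with torch.no_grad():
            for p in model.parameters():
                if p.grad is None:
                    continue
                g = p.grad
                # Sensitivity: magnitude of the loss variation w.r.t. the parameter.
                sens = g.abs()
                # Insensitivity factor in [0, 1]: large when the loss barely changes.
                insens = torch.clamp(1.0 - sens / (sens.max() + 1e-12), min=0.0)
                # Gradient descent step plus shrinkage proportional to insensitivity.
                p -= lr * g
                p -= lr * reg * insens * p
                # Prune: zero out weights whose magnitude fell below the threshold.
                p[p.abs() < prune_threshold] = 0.0
                p.grad = None

    # Example usage on a toy model and batch.
    model = nn.Linear(10, 2)
    x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
    loss = nn.functional.cross_entropy(model(x), y)
    lobster_style_step(model, loss)
    zeros = sum((p == 0).sum().item() for p in model.parameters())
    total = sum(p.numel() for p in model.parameters())
    print(f"fraction of zero parameters: {zeros / total:.2%}")

In practice the shrinkage and pruning would be applied over many training iterations, so that only parameters that remain both small and insensitive to the loss end up removed.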
Keywords: Deep learning; Pruning; Regularization; Sparsity.
Copyright © 2021 Elsevier Ltd. All rights reserved.
Conflict of interest statement
Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
