NewinML: Dynamic Sparse Neural Networks

Mon, Dec 09, 2019 8:30 AM — 5:00 PM

Poster Presentation Abstract:

Neural networks that are pruned after training have been shown to achieve similar or higher accuracy compared to the original network, but with a vastly reduced number of parameters. Achieving such highly sparse networks as a natural consequence of training, without a post-processing step, is an open problem. Neuroscience findings suggest that, in the brain, connections are constantly added and pruned as a function of neural activity. In this paper, we explore a biologically inspired approach where sparse connections are dynamically discovered during training using a local unsupervised Hebbian-style learning rule. Each layer maintains co-activation statistics that reflect correlated input-output activity. A sparse connectivity mask over the weights is continuously updated based on these statistics while backpropagation provides updates to the weight values themselves. We show through simulations that this hybrid approach can lead to networks whose sparsity is comparable to that of pruned networks, without any post-processing step.
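
The sketch below illustrates the general idea described in the abstract: a layer keeps running co-activation statistics between its inputs and outputs, periodically recomputes a binary connectivity mask from those statistics, and lets gradient updates adjust only the surviving weight values. The class name, the exponential-decay statistic, the top-k thresholding rule, and the update schedule are all illustrative assumptions, not the exact rule used in the paper.

```python
import numpy as np

class DynamicSparseLayer:
    """Toy dense layer with a Hebbian-style dynamic sparse connectivity mask.

    Illustrative sketch only: the statistic, decay rate, and sparsity
    schedule are placeholders, not the paper's actual learning rule.
    """

    def __init__(self, n_in, n_out, sparsity=0.9, decay=0.99, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.mask = np.ones((n_in, n_out))      # start fully connected
        self.coact = np.zeros((n_in, n_out))    # running co-activation statistics
        self.sparsity = sparsity                # fraction of connections to prune
        self.decay = decay

    def forward(self, x):
        self.x = x
        self.y = np.maximum(x @ (self.w * self.mask), 0.0)  # masked ReLU layer
        # Hebbian-style statistic: correlated pre/post activity per connection.
        self.coact = self.decay * self.coact + (1 - self.decay) * (x.T @ self.y)
        return self.y

    def update_mask(self):
        # Keep the connections with the strongest co-activation; prune the rest.
        k = int(self.sparsity * self.coact.size)
        thresh = np.partition(self.coact.ravel(), k)[k]
        self.mask = (self.coact >= thresh).astype(float)

    def backward_step(self, grad_y, lr=1e-2):
        # Backpropagation updates only the weight values; the mask gates them.
        grad_y = grad_y * (self.y > 0)
        self.w -= lr * self.mask * (self.x.T @ grad_y)
        return grad_y @ (self.w * self.mask).T


# Minimal usage: alternate gradient steps with periodic mask updates.
layer = DynamicSparseLayer(n_in=64, n_out=32)
x = np.random.default_rng(1).normal(size=(16, 64))
for step in range(100):
    y = layer.forward(x)
    layer.backward_step(grad_y=y - 0.5)  # dummy gradient signal for illustration
    if step % 10 == 0:
        layer.update_mask()
```

In this sketch the mask update is purely local and unsupervised (it depends only on the layer's own input-output statistics), while the weight values are driven by whatever gradient signal backpropagation supplies, mirroring the hybrid scheme the abstract describes.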

Authors

Michaelangelo Caporale • Senior Research Engineer
