Guided Regularizers for Structured Reduction of Neural Networks

Bibliographic Details
Main Authors: Ali Haisam Muhammad Rafid, Adrian Sandu
Format: Article
Language: English
Published: Taylor & Francis Group, 2025-12-01
Series: Data Science in Science
Subjects:
Online Access: https://www.tandfonline.com/doi/10.1080/26941899.2025.2524558
Description
Summary: Traditional regularization techniques based on the ℓ1 and ℓ2 norms of the weight vectors are widely used for sparsifying neural networks. However, the resulting sparsity patterns are scattered, as weights are pruned based solely on their magnitude, without consideration of the network architecture. To simplify a network, one must prune entire neurons or channels by zeroing out all weight elements related to those neurons or channels. This paper proposes a simple new approach named “Guided Regularization” that prioritizes the weights of certain neural network units over others, and trains a model while pruning out entire (less important) units. The approach leads to a structured sparsity pattern. In a Bayesian framework, the proposed approach is a model selection technique, where the guidance coefficients encapsulate the prior probability distribution over the space of models. Following Occam’s razor, simpler models are assigned higher prior probabilities; what constitutes a “simpler” model is user defined. The guided techniques can be applied in conjunction with other existing structured regularizers to improve their performance. We empirically demonstrate that the proposed guided regularization method is effective in pruning neural networks while maintaining performance. The GitHub repository is available at https://github.com/ComputationalScienceLaboratory/Guided-Regularization.
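The summary describes a regularizer that weights whole units (neurons or channels) by per-unit guidance coefficients, so that less important units are driven entirely to zero. A minimal sketch of that idea is a group-norm penalty where each column of a weight matrix corresponds to one unit; the function names, the choice of column-wise grouping, and the thresholding rule below are illustrative assumptions, not the paper’s exact formulation:

```python
import numpy as np


def guided_group_penalty(W, guidance):
    """Group-lasso-style penalty with per-unit guidance coefficients.

    W        : (out, in) weight matrix; column j holds all weights of unit j.
    guidance : nonnegative coefficients g_j; a larger g_j penalizes unit j
               more heavily, encoding a prior preference for pruning it.

    Returns sum_j g_j * ||W[:, j]||_2, which pushes entire columns
    (i.e. whole units) toward zero rather than individual weights.
    """
    col_norms = np.linalg.norm(W, axis=0)  # one 2-norm per unit
    return float(np.sum(np.asarray(guidance) * col_norms))


def prune_units(W, threshold):
    """Zero out every column whose 2-norm falls below `threshold`.

    Zeroing whole columns removes the corresponding units, producing
    the structured (rather than scattered) sparsity the summary describes.
    """
    W = W.copy()
    W[:, np.linalg.norm(W, axis=0) < threshold] = 0.0
    return W
```

For example, with identity weights and guidance `[1, 2]` the penalty is `1*1 + 2*1 = 3`, and a unit whose entire weight column is near zero is removed as a block by `prune_units`, not weight by weight.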
ISSN:2694-1899