Spectral Adaptive Dropout: Frequency-Based Regularization for Improved Generalization
Main Authors: | , , |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2025-06-01 |
Series: | Information |
Subjects: | |
Online Access: | https://www.mdpi.com/2078-2489/16/6/475 |
Summary: | Deep neural networks are often susceptible to overfitting, necessitating effective regularization techniques. This paper introduces Spectral Adaptive Dropout, a novel frequency-based regularization technique that dynamically adjusts dropout rates based on the spectral characteristics of network gradients. The proposed approach addresses the limitations of traditional dropout methods by adaptively targeting the high-frequency components that typically contribute to overfitting while preserving essential low-frequency information. Through extensive experimentation on character-level language modeling tasks, the study demonstrates that the method achieves a 1.10% improvement in validation loss while maintaining competitive inference speeds. This research explores several implementations, including FFT-based analysis, wavelet decomposition, and per-attention-head adaptation, culminating in an optimized approach that balances computational efficiency with regularization effectiveness. Our results highlight the significant potential of incorporating frequency-domain information into regularization strategies for deep neural networks. |
ISSN: | 2078-2489 |
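The abstract describes adjusting dropout rates from the frequency content of network gradients via an FFT. Below is a minimal, hypothetical PyTorch sketch of that idea; the class name, hyperparameters (`base_p`, `max_p`, `cutoff`), and the high-frequency energy heuristic are illustrative assumptions, not the authors' implementation, which also covers wavelet decomposition and per-attention-head variants.

```python
# Hypothetical sketch of frequency-adaptive dropout (not the paper's code).
# Assumption: the dropout probability for a layer is raised when the layer's
# gradient spectrum is dominated by high-frequency energy, and lowered otherwise.
import torch
import torch.nn as nn


class SpectralAdaptiveDropout(nn.Module):
    """Dropout whose rate tracks the high-frequency share of a gradient's spectrum."""

    def __init__(self, base_p: float = 0.1, max_p: float = 0.5, cutoff: float = 0.5):
        super().__init__()
        self.base_p = base_p   # rate used when gradients are mostly low-frequency
        self.max_p = max_p     # ceiling on the adapted rate
        self.cutoff = cutoff   # fraction of the spectrum treated as "high frequency"
        self.p = base_p

    @torch.no_grad()
    def update_from_gradient(self, grad: torch.Tensor) -> None:
        """Re-estimate the dropout rate from the spectrum of a flattened gradient."""
        spectrum = torch.fft.rfft(grad.flatten().float()).abs()
        k = int(self.cutoff * spectrum.numel())
        total = spectrum.sum().clamp_min(1e-12)
        high_ratio = spectrum[k:].sum() / total  # share of high-frequency energy
        self.p = float(self.base_p + (self.max_p - self.base_p) * high_ratio)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Standard dropout, but with the adaptively chosen probability.
        return nn.functional.dropout(x, p=self.p, training=self.training)
```

In a training loop, `update_from_gradient` would typically be called after `loss.backward()` with the gradient of the tensor the dropout module regularizes, so that the next forward pass uses the adapted rate.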