Attention-Based Batch Normalization for Binary Neural Networks
Batch normalization (BN) is crucial for achieving state-of-the-art binary neural networks (BNNs). Unlike full-precision neural networks, BNNs restrict activations to the discrete values $\{-1, +1\}$. …
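The abstract refers to the standard BNN pipeline in which batch normalization precedes sign binarization. As a point of reference only (the paper's attention-based BN variant is not reproduced here), below is a minimal PyTorch sketch of that conventional BN + sign pattern, using a straight-through estimator for gradients; all class and parameter names are illustrative:

```python
import torch
import torch.nn as nn


class BinaryActivation(torch.autograd.Function):
    """Sign binarization with a straight-through estimator (STE) gradient."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)  # activations restricted to {-1, +1}

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # STE: pass gradients through only where the input lies in [-1, 1]
        return grad_output * (x.abs() <= 1).float()


class BNBinaryConv(nn.Module):
    """Conventional BNN block: BatchNorm, then sign binarization, then conv.

    This is the generic baseline the paper builds on, not its proposed
    attention-based normalization.
    """

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_ch)
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False)

    def forward(self, x):
        x = self.bn(x)                 # re-center/re-scale pre-activations
        x = BinaryActivation.apply(x)  # quantize to {-1, +1}
        return self.conv(x)


# Example usage (shapes are arbitrary):
# layer = BNBinaryConv(64, 64)
# y = layer(torch.randn(1, 64, 8, 8))
```

Placing BN before binarization keeps the pre-activation distribution centered so that the sign function splits it meaningfully; this ordering is the common convention in BNN literature.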
| Main Authors: | Shan Gu, Guoyin Zhang, Chengwei Jia, Yanxia Wu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-06-01 |
| Series: | Entropy |
| Online Access: | https://www.mdpi.com/1099-4300/27/6/645 |
Similar Items
- Optimization of deep neural networks for multiclassification of dental X-rays using transfer learning
  by: G. Divya Deepak, et al.
  Published: (2024-12-01)
- Dual-Mode Method for Generating Adversarial Examples to Attack Deep Neural Networks
  by: Hyun Kwon, et al.
  Published: (2025-01-01)
- BTCP: Binary Temporal Convolutional Network-Based Data Prefetcher for Low Inference Latency and Storage Overhead
  by: Chang Ho Ryu, et al.
  Published: (2025-01-01)
- Training multi-layer binary neural networks with random local binary error signals
  by: Luca Colombo, et al.
  Published: (2025-01-01)
- SpeakerNet for Cross-lingual Text-Independent Speaker Verification
  by: Hafsa Habib, et al.
  Published: (2020-11-01)