Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection

Bibliographic Details
Main Authors: Danyang Wang, Bingyan Wang
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11062580/
Description
Summary: Unsupervised anomaly detection (AD) remains a notable challenge in computer vision, owing to the inherent absence of annotated anomalous data and the unpredictable manifestations of anomalies. To address these challenges, a novel Transformer-based knowledge distillation framework with a hierarchical architecture is proposed for improved anomaly recognition. The three-stage architecture consists of a fixed pretrained teacher network serving as the upstream feature extractor, a dedicated multi-feature aggregation and filtering module integrated with Vision Transformer components as the intermediate processor, and a trainable student network functioning as the downstream reconstruction module. Extensive evaluations on benchmark datasets demonstrate that the proposed method achieves high performance in both AD accuracy and anomaly localization precision, while maintaining strong generalization across diverse anomaly types. Notably, it is particularly effective in industrial inspection scenarios, where anomalous patterns are subtle and training data are strictly limited to normal samples.
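The three-stage teacher/aggregation/student pipeline described in the summary can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the feature dimensions, the single self-attention layer standing in for the ViT-based intermediate module, and the linear student are all illustrative assumptions. The key ideas it preserves are that the teacher is frozen, the student is trained only on normal samples to reconstruct teacher features, and the anomaly score is the teacher-student discrepancy, which stays small on normal data and grows on unseen patterns.

```python
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_FEAT = 64, 32  # toy patch-descriptor and feature sizes (assumptions)

# Stage 1: frozen "teacher" -- a stand-in for the fixed pretrained backbone.
W_teacher = rng.standard_normal((D_IN, D_FEAT))

def teacher(x):
    """Extract features from patch descriptors x of shape (N, D_IN)."""
    return np.tanh(x @ W_teacher)  # weights are never updated

# Stage 2: aggregation/filtering via one self-attention mixing step,
# a minimal stand-in for the ViT-based intermediate module.
Wq, Wk, Wv = (0.1 * rng.standard_normal((D_FEAT, D_FEAT)) for _ in range(3))

def aggregate(f):
    """Mix per-patch features with scaled dot-product self-attention."""
    q, k, v = f @ Wq, f @ Wk, f @ Wv
    att = np.exp((q @ k.T) / np.sqrt(D_FEAT))
    att /= att.sum(axis=1, keepdims=True)  # softmax over patches
    return att @ v

# Stage 3: trainable "student" that reconstructs the teacher's features.
W_student = 0.1 * rng.standard_normal((D_FEAT, D_FEAT))

# Distillation training on NORMAL samples only: minimise ||student - teacher||^2.
x_normal = rng.standard_normal((128, D_IN))
t = teacher(x_normal)
f = aggregate(t)
loss_before = np.mean((f @ W_student - t) ** 2)
for _ in range(300):
    s = f @ W_student
    W_student -= 0.05 * (f.T @ (s - t)) / len(f)  # plain gradient step
loss_after = np.mean((f @ W_student - t) ** 2)

def anomaly_score(x):
    """Per-patch teacher/student discrepancy used as the anomaly score."""
    t = teacher(x)
    s = aggregate(t) @ W_student
    return np.linalg.norm(t - s, axis=1)

scores = anomaly_score(rng.standard_normal((8, D_IN)))
```

Because the student never sees anomalous data, it only learns to reproduce teacher features on the normal distribution; at test time, regions whose features the student cannot reconstruct receive high scores, which is what enables pixel-level anomaly localization.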
ISSN: 2169-3536