Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection

Unsupervised anomaly detection (AD) remains a notable challenge in computer vision, owing to the inherent absence of annotated anomalous data and the unpredictable nature of anomaly manifestations. To address these challenges, a novel Transformer-based knowledge distillation framework is proposed, built on a hierarchical architecture designed for improved anomaly recognition.

Bibliographic Details
Main Authors: Danyang Wang, Bingyan Wang
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/11062580/
_version_ 1839635769003606016
author Danyang Wang
Bingyan Wang
author_facet Danyang Wang
Bingyan Wang
author_sort Danyang Wang
collection DOAJ
description Unsupervised anomaly detection (AD) remains a notable challenge in computer vision, owing to the inherent absence of annotated anomalous data and the unpredictable nature of anomaly manifestations. To address these challenges, a novel Transformer-based knowledge distillation framework is proposed, built on a hierarchical architecture designed for improved anomaly recognition. This three-stage architecture consists of a fixed pretrained teacher network serving as the upstream feature extractor, a dedicated multi-feature aggregation and filtering module integrated with Vision Transformer components as the intermediate processor, and a trainable student network functioning as the downstream reconstruction module. Extensive evaluations on benchmark datasets demonstrate that the proposed method achieves high accuracy in both anomaly detection and anomaly localization, while maintaining strong generalization across diverse anomaly types. It is particularly effective in industrial inspection scenarios, where anomalous patterns are subtle and training data are strictly limited to normal samples.
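The teacher-student distillation idea described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the network sizes are arbitrary, plain convolutions stand in for the paper's Vision Transformer-based aggregation module, and the cosine-distance anomaly score is an assumption. It shows only the general mechanism — a frozen pretrained teacher extracts features, a trainable student learns to reconstruct them on normal data, and at test time the per-pixel feature discrepancy serves as the anomaly map.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyEncoder(nn.Module):
    """Stand-in feature extractor (illustrative sizes, not the paper's)."""
    def __init__(self, out_ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

teacher = TinyEncoder()
for p in teacher.parameters():   # teacher is fixed: no gradient updates
    p.requires_grad_(False)
teacher.eval()

student = TinyEncoder()          # trainable downstream reconstruction network

def anomaly_map(x):
    """Per-pixel cosine distance between teacher and student features."""
    with torch.no_grad():
        t = teacher(x)
    s = student(x)
    # cosine_similarity over the channel dim -> (B, H, W) distance map
    return 1.0 - F.cosine_similarity(t, s, dim=1)

def distill_loss(x):
    """Training objective on normal samples: make the student match the teacher."""
    return anomaly_map(x).mean()

x = torch.randn(2, 3, 32, 32)    # a dummy batch of "normal" images
amap = anomaly_map(x)
print(amap.shape)                # torch.Size([2, 32, 32])
```

Because the student is trained only on normal samples, it fails to imitate the teacher on anomalous regions, so large values in `amap` flag likely defects — this is the standard reasoning behind such teacher-student AD schemes.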
format Article
id doaj-art-4e9acbbbc8d34e5bb12ee55af0f17ef2
institution Matheson Library
issn 2169-3536
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj-art-4e9acbbbc8d34e5bb12ee55af0f17ef2 | 2025-07-08T23:00:21Z | eng | IEEE | IEEE Access | 2169-3536 | 2025-01-01 | vol. 13, pp. 114208-114215 | doi: 10.1109/ACCESS.2025.3584892 | 11062580 | Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection | Danyang Wang (https://orcid.org/0009-0008-4247-5430), School of Electrical and Control Engineering, Henan University of Urban Construction, Pingdingshan, China | Bingyan Wang (https://orcid.org/0009-0005-0497-7841), School of Electrical and Control Engineering, Henan University of Urban Construction, Pingdingshan, China | Unsupervised anomaly detection (AD) remains a notable challenge in computer vision research, due to the inherent absence of annotated anomalous data and the unpredictable nature of anomaly manifestations. To address these challenges, a novel Transformer-based knowledge distillation framework is proposed through a hierarchical architecture designed for improved anomaly recognition. This three-stage architecture consists of a fixed pretrained teacher network serving as the upstream feature extractor, a dedicated multi-feature aggregation and filtering module integrated with Vision Transformer components as the intermediate processor, and a trainable student network functioning as the downstream reconstruction module. Extensive evaluations on benchmark datasets demonstrate that the proposed method achieves high performance in both AD accuracy and anomaly localization precision, while also maintaining strong generalization across diverse anomaly types. Notably, it shows particular effectiveness in industrial inspection scenarios, where anomalous patterns are subtle and training data are strictly limited to normal samples. | https://ieeexplore.ieee.org/document/11062580/ | Teacher-student model; autoencoder; transformer; knowledge distillation
spellingShingle Danyang Wang
Bingyan Wang
Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection
IEEE Access
Teacher-student model
autoencoder
transformer
knowledge distillation
title Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection
title_full Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection
title_fullStr Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection
title_full_unstemmed Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection
title_short Transformer-Guided Serial Knowledge Distillation for High-Precision Anomaly Detection
title_sort transformer guided serial knowledge distillation for high precision anomaly detection
topic Teacher-student model
autoencoder
transformer
knowledge distillation
url https://ieeexplore.ieee.org/document/11062580/
work_keys_str_mv AT danyangwang transformerguidedserialknowledgedistillationforhighprecisionanomalydetection
AT bingyanwang transformerguidedserialknowledgedistillationforhighprecisionanomalydetection