The Detection Optimization of Low-Quality Fake Face Images: Feature Enhancement and Noise Suppression Strategies

Bibliographic Details
Main Authors: Ge Wang, Yue Han, Fangqian Xu, Yuteng Gao, Wenjie Sang
Format: Article
Language: English
Published: MDPI AG 2025-06-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/13/7325
Summary: With the rapid advancement of deepfake technology, the detection of low-quality synthetic facial images has become increasingly challenging, particularly in cases involving low resolution, blurriness, or noise. Traditional detection methods often exhibit limited performance under such conditions. To address these limitations, this paper proposes a novel algorithm, YOLOv9-ARC, designed to enhance the accuracy of detecting low-quality fake facial images. The proposed algorithm introduces an innovative convolution module, Adaptive Kernel Convolution (AKConv), which dynamically adjusts kernel sizes to extract image features effectively, thereby mitigating the challenges posed by low resolution, blurriness, and noise. Furthermore, a hybrid attention mechanism, the Convolutional Block Attention Module (CBAM), is integrated to amplify salient features while suppressing irrelevant information. Extensive experiments demonstrate that YOLOv9-ARC achieves a mean average precision (mAP) of 75.1% on the DFDC (DeepFake Detection Challenge) dataset, a 3.5% improvement over the baseline model. YOLOv9-ARC thus addresses the difficulties of low-quality deepfake detection while delivering a concrete gain in detection accuracy in this domain.
ISSN: 2076-3417
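
For readers unfamiliar with the attention mechanism named in the abstract, the following is a minimal PyTorch sketch of the standard CBAM design (Woo et al., 2018): channel attention followed by spatial attention, each re-weighting the feature map to amplify salient responses and suppress irrelevant ones. The tensor shapes, reduction ratio, and placement below are illustrative assumptions, not the configuration used in YOLOv9-ARC, where the module would sit inside the YOLOv9 backbone.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Re-weights channels using avg- and max-pooled descriptors and a shared MLP."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling over H, W
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling over H, W
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale                     # amplify informative channels


class SpatialAttention(nn.Module):
    """Re-weights spatial locations using channel-pooled maps and a 7x7 conv."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)    # per-pixel mean across channels
        mx = x.amax(dim=1, keepdim=True)     # per-pixel max across channels
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale                     # suppress irrelevant locations


class CBAM(nn.Module):
    """Channel attention followed by spatial attention, as in the original paper."""

    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.sa(self.ca(x))


# Illustrative usage: refine a hypothetical backbone feature map before a
# detection head. The 256x40x40 shape is an assumption for demonstration.
feat = torch.randn(1, 256, 40, 40)
refined = CBAM(256)(feat)  # same shape, attention-reweighted features
```

Because CBAM multiplies the input by sigmoid-gated masks rather than replacing it, it adds little overhead and can be dropped into an existing backbone stage, which is presumably why the paper pairs it with AKConv for low-quality inputs.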