MCAF-Net: Multi-Channel Temporal Cross-Attention Network with Dynamic Gating for Sleep Stage Classification

Bibliographic Details
Main Authors: Xuegang Xu, Quan Wang, Changyuan Wang, Yaxin Zhang
Format: Article
Language: English
Published: MDPI AG, 2025-07-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/25/14/4251
Description
Summary: Automated sleep stage classification is essential for objective sleep evaluation and clinical diagnosis. Although numerous algorithms have been developed, most existing methods rely on single-channel electroencephalogram (EEG) signals, neglecting the complementary physiological information available in other channels. Standard polysomnography (PSG) recordings capture multiple concurrent biosignals, and effective integration of these multi-channel data is a critical factor for improved classification accuracy. Conventional multi-channel fusion techniques typically rely on simple concatenation, which insufficiently models the intricate cross-channel correlations and consequently limits classification performance. To overcome these shortcomings, we present MCAF-Net, a novel network architecture that employs temporal convolution modules to extract channel-specific features from each input signal and introduces a dynamic gated multi-head cross-channel attention mechanism (MCAF) to model the interdependencies between physiological channels. Experimental results show that the proposed method successfully integrates information from multiple channels, achieving significant improvements in sleep stage classification accuracy over most existing methods.
ISSN: 1424-8220
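
The abstract outlines a two-stage design: per-channel temporal convolution encoders followed by a dynamic gated multi-head cross-channel attention fusion (MCAF). Since this record gives only that high-level description, the PyTorch sketch below is one plausible reading of it; the module names (TemporalConvEncoder, GatedCrossChannelAttention, MCAFNetSketch), layer sizes, channel count, and the sigmoid-gate formulation are illustrative assumptions, not the authors' implementation.

# Illustrative sketch of the architecture described in the abstract.
# NOTE: layer sizes, module names, and the gating formula are assumptions;
# they are NOT taken from the MCAF-Net paper.
import torch
import torch.nn as nn


class TemporalConvEncoder(nn.Module):
    """Per-channel temporal convolutions that reduce a raw 30 s epoch of one
    PSG channel to a single feature vector (hypothetical configuration)."""
    def __init__(self, d_model: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(64, d_model, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, time) -> (batch, d_model)
        return self.net(x).squeeze(-1)


class GatedCrossChannelAttention(nn.Module):
    """Multi-head attention across channels with a learned sigmoid gate that
    dynamically weights how much cross-channel context each channel admits."""
    def __init__(self, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.gate = nn.Sequential(nn.Linear(2 * d_model, d_model), nn.Sigmoid())
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, d_model), one feature vector per PSG channel
        attended, _ = self.attn(x, x, x)                 # cross-channel attention
        g = self.gate(torch.cat([x, attended], dim=-1))  # dynamic gate in (0, 1)
        return self.norm(x + g * attended)               # gated residual fusion


class MCAFNetSketch(nn.Module):
    """End-to-end sketch: per-channel encoders -> gated cross-channel fusion
    -> classifier over the 5 sleep stages (W, N1, N2, N3, REM)."""
    def __init__(self, n_channels: int = 4, d_model: int = 128, n_stages: int = 5):
        super().__init__()
        self.encoders = nn.ModuleList(
            TemporalConvEncoder(d_model) for _ in range(n_channels)
        )
        self.fusion = GatedCrossChannelAttention(d_model)
        self.head = nn.Linear(n_channels * d_model, n_stages)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, time) -- e.g. EEG, EOG, EMG, ECG at 100 Hz
        feats = torch.stack(
            [enc(x[:, c : c + 1]) for c, enc in enumerate(self.encoders)], dim=1
        )                                   # (batch, n_channels, d_model)
        fused = self.fusion(feats)          # (batch, n_channels, d_model)
        return self.head(fused.flatten(1))  # (batch, n_stages) logits


if __name__ == "__main__":
    model = MCAFNetSketch()
    epoch = torch.randn(8, 4, 3000)  # batch of 30 s epochs at 100 Hz
    print(model(epoch).shape)        # torch.Size([8, 5])

The sigmoid gate here lets each channel decide how much cross-channel context to blend into its own features, which is one natural way to realize the "dynamic gating" named in the title; the paper itself may gate per head or per time step instead.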