U-Net Inspired Transformer Architecture for Multivariate Time Series Synthesis
Main Author:
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Sensors
Subjects:
Online Access: https://www.mdpi.com/1424-8220/25/13/4073
Summary: This study introduces a Multiscale Dual-Attention U-Net (TS-MSDA U-Net) model for long-term time series synthesis. By integrating multiscale temporal feature extraction and dual-attention mechanisms into the U-Net backbone, the model captures complex temporal dependencies more effectively. The model was evaluated in two distinct applications. In the first, using multivariate datasets from 70 real-world electric vehicle (EV) trips, TS-MSDA U-Net achieved a mean absolute error below 1% across key parameters, including battery state of charge, voltage, acceleration, and torque, a two-fold improvement over the baseline TS-p2pGAN. While dual-attention modules provided only modest gains over the basic U-Net, the multiscale design enhanced overall performance. In the second application, the model was used to reconstruct high-resolution signals from low-speed analog-to-digital converter data in a prototype resonant CLLC half-bridge converter. TS-MSDA U-Net successfully learned nonlinear mappings and improved signal resolution by a factor of 36, outperforming the basic U-Net, which failed to recover essential waveform details. These results underscore the effectiveness of transformer-inspired U-Net architectures for high-fidelity multivariate time series modeling in both EV analytics and power electronics.
ISSN: 1424-8220
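The summary names two architectural ingredients: multiscale temporal feature extraction and dual attention (over time steps and over channels/variables). The following toy NumPy sketch illustrates those two ideas in isolation; it is an assumption-laden illustration of the general concepts, not the authors' TS-MSDA U-Net implementation, and all function names here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Scaled dot-product self-attention over the first axis of x.
    # x: (positions, features); weights and projections omitted for brevity.
    d = x.shape[-1]
    scores = softmax(x @ x.T / np.sqrt(d), axis=-1)
    return scores @ x

def dual_attention(x):
    # "Dual" attention in the generic sense: attend across time steps,
    # then across channels (by transposing), and combine the two views.
    temporal = self_attention(x)           # (T, C)
    channel = self_attention(x.T).T        # attend over C, map back to (T, C)
    return temporal + channel

def multiscale_features(x, scales=(1, 2, 4)):
    # Generic multiscale temporal pooling: average-pool the sequence at
    # several window sizes, upsample back to full length, and average.
    T, C = x.shape
    out = np.zeros_like(x)
    for s in scales:
        pad = (-T) % s                     # pad so T is divisible by s
        xp = np.pad(x, ((0, pad), (0, 0)), mode="edge")
        pooled = xp.reshape(-1, s, C).mean(axis=1)
        out += np.repeat(pooled, s, axis=0)[:T]
    return out / len(scales)

# A 16-step, 4-channel toy multivariate series.
x = np.random.default_rng(0).standard_normal((16, 4))
y = dual_attention(multiscale_features(x))
```

In a real U-Net-style model these operations would sit inside the encoder/decoder blocks with learned projections; the sketch only shows the data flow each mechanism implies.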