Variational autoencoders for at-source data reduction and anomaly detection in high energy particle detectors

Bibliographic Details
Main Authors: Alexander Yue, Haoyi Jia, Julia Gonski
Format: Article
Language: English
Published: IOP Publishing 2025-01-01
Series: Machine Learning: Science and Technology
Subjects:
Online Access: https://doi.org/10.1088/2632-2153/adf0c0
Description
Summary: Detectors in next-generation high-energy physics experiments face several daunting requirements, such as high data rates, damaging radiation exposure, and stringent constraints on power, space, and latency. To address these challenges, machine learning in readout electronics can be leveraged for smart detector designs, enabling intelligent inference and data reduction at the source. Variational autoencoders (VAEs) offer a variety of benefits for front-end readout; an on-sensor encoder can perform efficient lossy data compression while simultaneously providing a latent space representation that can be used for anomaly detection. Results are presented from low-latency and resource-efficient VAEs for front-end data processing in a futuristic silicon pixel detector. Encoder-based data compression is found to preserve good performance of off-detector analysis while significantly reducing the off-detector data rate compared to a similarly sized data filtering approach. Furthermore, the latent space information is found to be a useful discriminator in the context of real-time sensor defect monitoring. Together, these results highlight the multifaceted utility of autoencoder-based front-end readout schemes and motivate their consideration in future detector designs.
ISSN: 2632-2153
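
To illustrate the readout scheme described in the summary, the following is a minimal sketch of a VAE whose encoder compresses a flattened pixel-cluster input into a small latent vector, with a KL-based latent-space score usable for anomaly detection. This is an assumption-laden illustration, not the authors' implementation: the use of PyTorch, all layer sizes, and the names PixelVAE, anomaly_score, IN_DIM, and LATENT_DIM are hypothetical.

# Minimal sketch (PyTorch assumed; not the authors' implementation) of a VAE
# for front-end readout: the encoder compresses a pixel-cluster input to a
# small latent vector that can be transmitted off-detector in place of the
# raw data and scored for anomalies via its distance from the nominal prior.
import torch
import torch.nn as nn

IN_DIM = 64      # assumed flattened pixel-cluster size
LATENT_DIM = 4   # assumed latent size; sets the compression factor

class PixelVAE(nn.Module):
    def __init__(self, in_dim=IN_DIM, latent_dim=LATENT_DIM):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU())
        self.mu = nn.Linear(32, latent_dim)       # latent mean
        self.logvar = nn.Linear(32, latent_dim)   # latent log-variance
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, in_dim)
        )

    def encode(self, x):
        h = self.encoder(x)
        return self.mu(h), self.logvar(h)

    def forward(self, x):
        mu, logvar = self.encode(x)
        # Reparameterization trick: sample z from the encoded Gaussian.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.decoder(z), mu, logvar

def anomaly_score(mu, logvar):
    # KL divergence of the encoded posterior from the unit-Gaussian prior;
    # large values flag inputs unlike the nominal training data.
    return 0.5 * torch.sum(logvar.exp() + mu.pow(2) - 1.0 - logvar, dim=-1)

# Usage sketch: only the encoder would run on-sensor. The latent mean is the
# compressed record sent off-detector, and the anomaly score can be
# thresholded in real time for sensor-defect monitoring.
model = PixelVAE()
x = torch.rand(8, IN_DIM)                 # stand-in for 8 pixel clusters
recon, mu, logvar = model(x)
scores = anomaly_score(mu, logvar)

In such a scheme the decoder and reconstruction loss are needed only during training; at run time the on-sensor logic reduces to the encoder plus a simple latent-space score.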