A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity
With the exponential growth of cyberbullying cases on social media, there is a growing need to develop effective mechanisms for its detection and prediction, which can create a safer and more comfortable digital environment. One of the areas with such potential is the application of natural language...
Main Authors: | Aleksandr Chechkin, Ekaterina Pleshakova, Sergey Gataullin |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2025-05-01 |
Series: | Technologies |
Subjects: | artificial intelligence; KAN; BiLSTM; cybersecurity; hybrid models; RNN |
Online Access: | https://www.mdpi.com/2227-7080/13/6/223 |
_version_ | 1839652541028106240 |
---|---|
author | Aleksandr Chechkin; Ekaterina Pleshakova; Sergey Gataullin |
author_facet | Aleksandr Chechkin; Ekaterina Pleshakova; Sergey Gataullin |
author_sort | Aleksandr Chechkin |
collection | DOAJ |
description | With the exponential growth of cyberbullying cases on social media, there is a growing need to develop effective mechanisms for its detection and prediction, which can create a safer and more comfortable digital environment. One of the areas with such potential is the application of natural language processing (NLP) and artificial intelligence (AI). This study applies a novel hybrid-structure Hybrid Transformer–Enriched Attention with Multi-Domain Dynamic Attention Network (Hyb-KAN), which combines a transformer-based architecture, an attention mechanism, and BiLSTM recurrent neural networks. A multi-class classification approach is used to identify comments containing cyberbullying features. For verification, the proposed method was compared with baseline methods. The Hyb-KAN model achieved strong results on the multi-class classification dataset, with an accuracy of 95.25%. The synergy of the BiLSTM, Transformer, MD-DAN, and KAN components provides flexible and accurate text analysis. The study used explainable visualization techniques, including SHAP and LIME, to analyze the interpretability of the Hyb-KAN model, providing a deeper understanding of its decision-making mechanisms. In the final stage of the study, the results were compared with current research data to confirm their relevance to current trends. |
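The description above outlines the Hyb-KAN pipeline: a BiLSTM encodes the token sequence, an attention mechanism pools it, and a KAN-style head produces class scores. As a rough, forward-pass-only illustration of that flow — not the authors' implementation — the chain can be sketched in NumPy; all dimensions, the toy mean-based attention scoring, and the Gaussian-basis KAN edge functions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    # One LSTM cell step; gate pre-activations packed as [i, f, g, o].
    z = x @ W + h @ U + b
    H = h.shape[-1]
    i = 1.0 / (1.0 + np.exp(-z[:, :H]))          # input gate
    f = 1.0 / (1.0 + np.exp(-z[:, H:2*H]))       # forget gate
    g = np.tanh(z[:, 2*H:3*H])                   # candidate state
    o = 1.0 / (1.0 + np.exp(-z[:, 3*H:]))        # output gate
    c = f * c + i * g
    return o * np.tanh(c), c

def bilstm(X, params_f, params_b):
    # Run one LSTM forward and one backward over time, concat per step.
    B, T, _ = X.shape
    H = params_f[1].shape[0]
    h, c = np.zeros((B, H)), np.zeros((B, H))
    fwd = []
    for t in range(T):
        h, c = lstm_step(X[:, t], h, c, *params_f)
        fwd.append(h)
    h, c = np.zeros((B, H)), np.zeros((B, H))
    bwd = [None] * T
    for t in reversed(range(T)):
        h, c = lstm_step(X[:, t], h, c, *params_b)
        bwd[t] = h
    return np.stack([np.concatenate([f, b], -1) for f, b in zip(fwd, bwd)], 1)

def attention_pool(H):
    # Attention pooling over time: toy scores -> softmax -> weighted sum.
    scores = H.mean(-1)
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)
    return (w[..., None] * H).sum(1)

def kan_layer(x, coeffs, grid):
    # KAN-style layer: each edge applies a learned 1-D function (here a
    # sum of Gaussian bumps on a fixed grid), then sums into each output.
    basis = np.exp(-((x[..., None] - grid) ** 2))    # (B, d_in, k)
    return np.einsum('bik,iok->bo', basis, coeffs)   # (B, d_out)

# Toy demo: 2 sequences of 5 embedded tokens (dim 8), 4 output classes.
B, T, d, H, K, C = 2, 5, 8, 6, 7, 4
X = rng.normal(size=(B, T, d))
pf = (rng.normal(size=(d, 4*H)) * 0.1, rng.normal(size=(H, 4*H)) * 0.1, np.zeros(4*H))
pb = (rng.normal(size=(d, 4*H)) * 0.1, rng.normal(size=(H, 4*H)) * 0.1, np.zeros(4*H))
seq = bilstm(X, pf, pb)          # (2, 5, 12): per-token bidirectional states
pooled = attention_pool(seq)     # (2, 12): attention-weighted summary
grid = np.linspace(-1.0, 1.0, K)
logits = kan_layer(pooled, rng.normal(size=(2*H, C, K)) * 0.1, grid)  # (2, 4)
```

In the paper's actual model a transformer block and the MD-DAN attention sit in this chain as well; the sketch only shows how the sequence encoder, pooled attention, and KAN classification head compose.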
format | Article |
id | doaj-art-b75b51e9bbd64a649dcdd14fcd9c8474 |
institution | Matheson Library |
issn | 2227-7080 |
language | English |
publishDate | 2025-05-01 |
publisher | MDPI AG |
record_format | Article |
series | Technologies |
spelling | doaj-art-b75b51e9bbd64a649dcdd14fcd9c8474 2025-06-25T14:28:09Z; eng; MDPI AG; Technologies (2227-7080); 2025-05-01; vol. 13, iss. 6, art. 223; doi:10.3390/technologies13060223; A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity; Aleksandr Chechkin (Department of Mathematics and Data Analysis, Financial University under the Government of the Russian Federation, 49 Leningradsky Prospect, Moscow 125993, Russia); Ekaterina Pleshakova (MIREA—Russian Technological University, 78 Vernadsky Avenue, Moscow 119454, Russia); Sergey Gataullin (MIREA—Russian Technological University, 78 Vernadsky Avenue, Moscow 119454, Russia); abstract as given in the description field; https://www.mdpi.com/2227-7080/13/6/223; keywords: artificial intelligence, KAN, BiLSTM, cybersecurity, hybrid models, RNN |
spellingShingle | Aleksandr Chechkin; Ekaterina Pleshakova; Sergey Gataullin; A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity; Technologies; artificial intelligence; KAN; BiLSTM; cybersecurity; hybrid models; RNN |
title | A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity |
title_full | A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity |
title_fullStr | A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity |
title_full_unstemmed | A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity |
title_short | A Hybrid KAN-BiLSTM Transformer with Multi-Domain Dynamic Attention Model for Cybersecurity |
title_sort | hybrid kan bilstm transformer with multi domain dynamic attention model for cybersecurity |
topic | artificial intelligence; KAN; BiLSTM; cybersecurity; hybrid models; RNN |
url | https://www.mdpi.com/2227-7080/13/6/223 |
work_keys_str_mv | AT aleksandrchechkin ahybridkanbilstmtransformerwithmultidomaindynamicattentionmodelforcybersecurity AT ekaterinapleshakova ahybridkanbilstmtransformerwithmultidomaindynamicattentionmodelforcybersecurity AT sergeygataullin ahybridkanbilstmtransformerwithmultidomaindynamicattentionmodelforcybersecurity AT aleksandrchechkin hybridkanbilstmtransformerwithmultidomaindynamicattentionmodelforcybersecurity AT ekaterinapleshakova hybridkanbilstmtransformerwithmultidomaindynamicattentionmodelforcybersecurity AT sergeygataullin hybridkanbilstmtransformerwithmultidomaindynamicattentionmodelforcybersecurity |