Consensus Guided Multi-View Unsupervised Feature Selection with Hybrid Regularization
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/12/6884
Summary: Multi-source heterogeneous data have been widely adopted in developing artificial intelligence systems in recent years. In real-world scenarios, raw multi-source data are generally unlabeled and inherently contain multi-view noise and feature redundancy, which has led to extensive research on unsupervised multi-view feature selection. However, existing approaches mainly rely on local adjacency relationships and the L2,1-norm to guide the feature selection process, which may lead to unstable performance. To address these problems, this paper proposes Consensus Guided Multi-view Unsupervised Feature Selection with Hybrid Regularization (CGMvFS). Specifically, CGMvFS integrates multiple view-specific basic partitions into a unified consensus matrix, which guides the feature selection process by preserving comprehensive pairwise constraints across diverse views. A hybrid regularization strategy combining the L2,1-norm and the Frobenius norm is introduced into the feature selection objective, which not only promotes feature sparsity but also helps prevent overfitting, thereby improving the stability of the model. Extensive empirical evaluations demonstrate that CGMvFS outperforms state-of-the-art approaches across diverse multi-view datasets.
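The record does not include the paper's objective function, so the following is only a rough, illustrative sketch of the hybrid regularization idea described in the abstract: an L2,1-norm term that encourages row sparsity of a feature-selection weight matrix plus a squared Frobenius-norm term that limits overfitting. The names W, alpha, and beta are hypothetical and are not taken from the paper.

```python
import numpy as np

def hybrid_regularization(W, alpha=1.0, beta=1.0):
    """Illustrative hybrid penalty: L2,1-norm (row sparsity) plus
    squared Frobenius norm (overfitting control).

    W     : (d, k) feature-selection weight matrix (hypothetical)
    alpha : weight on the L2,1-norm term (hypothetical)
    beta  : weight on the Frobenius-norm term (hypothetical)
    """
    # L2,1-norm: sum of the Euclidean norms of the rows of W;
    # rows with small norms correspond to features that can be dropped.
    l21 = np.sum(np.linalg.norm(W, axis=1))
    # Squared Frobenius norm: discourages large weights overall.
    fro = np.linalg.norm(W, 'fro') ** 2
    return alpha * l21 + beta * fro

# Example: rank features by the row norms of a (random, stand-in) W.
rng = np.random.default_rng(0)
W = rng.standard_normal((10, 3))
scores = np.linalg.norm(W, axis=1)        # per-feature importance
selected = np.argsort(scores)[::-1][:5]   # keep the top-5 features
print(hybrid_regularization(W), selected)
```

In this sketch the L2,1 term drives whole rows of W toward zero (feature sparsity), while the Frobenius term keeps the remaining weights bounded; the actual formulation and consensus-guided construction used by CGMvFS is given in the paper itself.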
ISSN: 2076-3417