Machine Learning Ship Classifiers for Signals from Passive Sonars

Bibliographic Details
Main Authors: Allyson A. da Silva, Lisandro Lovisolo, Tadeu N. Ferreira
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/13/6952
Description
Summary: The accurate automatic classification of underwater acoustic signals from passive sonar is vital for naval operational readiness, enabling timely vessel identification and real-time maritime surveillance. This study evaluated seven supervised machine learning algorithms for ship identification using passive sonar recordings collected by the Brazilian Navy. The dataset encompassed 12 distinct ship classes and was processed in two ways, as full-resolution and as downsampled inputs, to assess the impact of preprocessing on model accuracy and computational efficiency. The classifiers included standard Support Vector Machines, K-Nearest Neighbors, Random Forests, and Neural Networks, as well as two approaches less conventional in this context: Linear Discriminant Analysis (LDA) and the XGBoost ensemble method. Experimental results indicate that data decimation significantly affects classification accuracy. LDA and XGBoost delivered the strongest overall performance, with XGBoost offering a particularly robust combination of accuracy and computational efficiency suited to real-time naval applications. These findings highlight the promise of advanced machine learning techniques for complex multiclass ship classification tasks, enhancing acoustic signal intelligence for military maritime surveillance and contributing to improved naval situational awareness.
ISSN: 2076-3417
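
Illustrative note: the abstract describes comparing classifiers such as LDA and XGBoost on full-resolution versus decimated passive sonar inputs. The Python sketch below is a hypothetical, minimal illustration of that kind of comparison only; it substitutes synthetic tonal signals and Welch power-spectral-density features for the Brazilian Navy recordings, and the sampling rate, decimation factor of 4, feature choice, and model hyperparameters are assumptions for illustration, not the authors' actual pipeline.

    import numpy as np
    from scipy.signal import decimate, welch
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier

    rng = np.random.default_rng(0)
    fs = 22050          # assumed sampling rate (Hz); the paper's rate is not given here
    n_classes = 12      # matches the 12 ship classes reported in the abstract
    n_per_class = 40    # arbitrary number of synthetic clips per class

    def make_signal(cls):
        # Synthetic stand-in for a 1 s passive sonar clip: one tonal per class plus noise.
        t = np.arange(fs) / fs
        return np.sin(2 * np.pi * (100 + 35 * cls) * t) + 0.5 * rng.standard_normal(fs)

    def psd_features(x, rate):
        # Welch power spectral density in dB as a fixed-length feature vector.
        _, pxx = welch(x, fs=rate, nperseg=256)
        return 10 * np.log10(pxx + 1e-12)

    for factor in (1, 4):   # 1 = full-resolution input, 4 = decimated (downsampled) input
        X, y = [], []
        for cls in range(n_classes):
            for _ in range(n_per_class):
                sig = make_signal(cls)
                rate = fs
                if factor > 1:
                    sig = decimate(sig, factor)   # low-pass filter + downsample
                    rate = fs // factor
                X.append(psd_features(sig, rate))
                y.append(cls)
        X, y = np.asarray(X), np.asarray(y)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  stratify=y, random_state=0)
        models = [("LDA", LinearDiscriminantAnalysis()),
                  ("XGBoost", XGBClassifier(n_estimators=200, max_depth=4,
                                            eval_metric="mlogloss"))]
        for name, clf in models:
            clf.fit(X_tr, y_tr)
            acc = accuracy_score(y_te, clf.predict(X_te))
            print(f"decimation x{factor} | {name}: test accuracy = {acc:.3f}")

Because both models consume the same fixed-length spectral features, the outer loop isolates the effect of decimation that the abstract highlights; adapting the sketch to real recordings would mainly mean replacing make_signal with a loader for labeled audio clips.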