Introducing an Evolutionary Method to Create the Bounds of Artificial Neural Networks

Bibliographic Details
Main Authors: Ioannis G. Tsoulos, Vasileios Charilogis, Dimitrios Tsalikakis
Format: Article
Language: English
Published: MDPI AG 2025-03-01
Series: Foundations
Subjects: neural networks; evolutionary algorithms; stochastic methods; differential evolution
Online Access: https://www.mdpi.com/2673-9321/5/2/11
author Ioannis G. Tsoulos
Vasileios Charilogis
Dimitrios Tsalikakis
collection DOAJ
description Artificial neural networks are widely used across many scientific fields and in a multitude of practical applications. In recent years, numerous publications have addressed the effective training of their parameters, but overfitting problems often appear, where the artificial neural network performs poorly on data that were not present during training. This paper proposes a three-stage evolutionary technique, rooted in differential evolution, for effectively training the parameters of artificial neural networks while avoiding overfitting. The new method constructs the range of parameter values of an artificial neural network with one processing level and sigmoid outputs, both reducing the training error and preventing the network from overfitting. The technique was applied to a wide range of problems from the relevant literature, with very promising results: in the conducted experiments, the proposed method reduced the average classification error by 30% and the average regression error by 45%, compared to a genetic algorithm.
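The description names the ingredients of the method: a neural network with one processing level and sigmoid units whose parameters are searched by differential evolution inside explicit value bounds. The following is a minimal Python sketch of those ingredients only, not the authors' three-stage bound-construction algorithm; it assumes numpy and scipy are available, and the toy data, hidden-layer size, and fixed (-10, 10) bounds are hypothetical choices for illustration.

# Minimal sketch (not the paper's method): fit a one-hidden-layer sigmoid network
# by minimizing training error with differential evolution inside fixed weight bounds.
import numpy as np
from scipy.optimize import differential_evolution

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def network_output(params, X, n_hidden):
    # Unpack a flat parameter vector into hidden weights, hidden biases
    # and output weights for a network with a single processing level.
    n_in = X.shape[1]
    w_h = params[: n_in * n_hidden].reshape(n_in, n_hidden)
    b_h = params[n_in * n_hidden : n_in * n_hidden + n_hidden]
    w_o = params[n_in * n_hidden + n_hidden :]
    hidden = sigmoid(X @ w_h + b_h)   # sigmoid processing level
    return hidden @ w_o               # linear combination of hidden outputs

def training_error(params, X, y, n_hidden):
    # Mean squared training error, the quantity the evolutionary search minimizes.
    return float(np.mean((network_output(params, X, n_hidden) - y) ** 2))

# Hypothetical toy regression data; in practice X, y come from the dataset at hand.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

n_hidden = 5
n_params = X.shape[1] * n_hidden + n_hidden + n_hidden
bounds = [(-10.0, 10.0)] * n_params   # fixed, hand-chosen box; the paper instead constructs such bounds

result = differential_evolution(training_error, bounds, args=(X, y, n_hidden),
                                maxiter=200, seed=1, tol=1e-6)
print("training MSE:", result.fun)

The sketch only shows where parameter bounds enter a differential-evolution search; choosing those bounds automatically, so that training error drops without overfitting, is the contribution described in this record.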
format Article
id doaj-art-d22e5a79a6bc4e2b8f7ebfa98a5956f5
institution Matheson Library
issn 2673-9321
language English
publishDate 2025-03-01
publisher MDPI AG
record_format Article
series Foundations
doi 10.3390/foundations5020011
citation Foundations, vol. 5, no. 2, article 11 (2025-03-01)
author_affiliation Ioannis G. Tsoulos: Department of Informatics and Telecommunications, University of Ioannina, 45110 Ioannina, Greece
author_affiliation Vasileios Charilogis: Department of Informatics and Telecommunications, University of Ioannina, 45110 Ioannina, Greece
author_affiliation Dimitrios Tsalikakis: Department of Engineering Informatics and Telecommunications, University of Western Macedonia, 50100 Kozani, Greece
title Introducing an Evolutionary Method to Create the Bounds of Artificial Neural Networks
topic neural networks
evolutionary algorithms
stochastic methods
differential evolution
url https://www.mdpi.com/2673-9321/5/2/11