Ensembles of Gradient Boosting Recurrent Neural Network for Time Series Data Prediction


Bibliographic Details
Main Authors: Shiqing Sang, Fangfang Qu, Pengcheng Nie
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9438681/
Description
Summary: Ensemble deep learning combines the strengths of neural networks and ensemble learning, and has gradually become an emerging research direction. However, existing methods either lack theoretical support or require large integrated models. In this paper, Ensembles of Gradient Boosting Recurrent Neural Network (EGB-RNN) is proposed, which combines the gradient boosting ensemble framework with three types of recurrent neural network models, namely Minimal Gated Unit (MGU), Gated Recurrent Unit (GRU), and Long Short-Term Memory (LSTM). An RNN model serves as the base learner, and base learners are integrated into an ensemble learner via gradient boosting. Meanwhile, to ensure that the ensemble model fits the data well, a Step Iteration Algorithm is designed to find an appropriate learning rate before the models are integrated. The proposed method is tested on four time-series datasets. Experimental results demonstrate that as the number of integrated base learners increases, the performance of the three types of EGB-RNN models tends to converge, while the best EGB-RNN model and the best degree of ensembling vary across datasets. Statistical results also show that the designed EGB-RNN models outperform six baselines.
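The boosting scheme summarized above can be sketched as follows. This is a minimal illustration under squared-error loss, with a simple linear autoregressive model standing in for the RNN base learner (the paper uses MGU/GRU/LSTM networks) and a fixed learning rate in place of the paper's Step Iteration Algorithm; all names and parameter values here are illustrative, not taken from the paper.

```python
import numpy as np

def make_windows(series, lag):
    # Build (lagged inputs, next value) pairs from a 1-D time series.
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

class LinearBase:
    # Stand-in for the RNN base learner: least squares on lagged inputs.
    def fit(self, X, y):
        A = np.hstack([X, np.ones((len(X), 1))])
        self.w, *_ = np.linalg.lstsq(A, y, rcond=None)
        return self

    def predict(self, X):
        A = np.hstack([X, np.ones((len(X), 1))])
        return A @ self.w

def egb_fit(X, y, n_rounds=5, lr=0.5):
    # Gradient boosting under squared-error loss: each new base learner
    # is fitted to the current residuals (the negative gradient), then
    # added to the ensemble with a shrinkage factor lr.
    ensemble, pred = [], np.zeros_like(y)
    for _ in range(n_rounds):
        residual = y - pred
        base = LinearBase().fit(X, residual)
        ensemble.append(base)
        pred = pred + lr * base.predict(X)
    return ensemble

def egb_predict(ensemble, X, lr=0.5):
    # The ensemble prediction is the shrunken sum of all base learners.
    return sum(lr * base.predict(X) for base in ensemble)

# Usage: fit on a noisy sine wave and measure the training error.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 300)) + 0.1 * rng.standard_normal(300)
X, y = make_windows(series, lag=8)
ens = egb_fit(X, y, n_rounds=5, lr=0.5)
mse = np.mean((egb_predict(ens, X) - y) ** 2)
```

Each boosting round shrinks the remaining residual, so the error decreases as more base learners are integrated, which mirrors the convergence behavior the abstract reports as the degree of ensembling grows.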
ISSN:2169-3536