A New Conjugate Gradient for Efficient Unconstrained Optimization with Robust Descent Guarantees
Format: Article
Language: English
Published: College of Computer and Information Technology – University of Wasit, Iraq, 2025-06-01
Series: Wasit Journal of Computer and Mathematics Science
Online Access: https://wjcm.uowasit.edu.iq/index.php/wjcm/article/view/358
Summary: The Conjugate Gradient method is a powerful iterative algorithm that seeks the minimum of a function by repeatedly searching along mutually conjugate directions. This work presents a nonlinear conjugate gradient approach for unconstrained optimization, derived from the solution of a novel optimization problem. The theoretical framework of the proposed method is discussed, with particular attention to establishing the descent condition. To evaluate its performance, numerical experiments were conducted comparing the proposed method against the established () and (Liu and Storey) methods. The results demonstrate that the new method is not only more efficient but also significantly outperforms the () and (Liu and Storey) methods in terms of optimization effectiveness. These findings suggest that the proposed approach offers a competitive and promising alternative for solving unconstrained optimization problems.
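For readers unfamiliar with the family of methods the abstract describes, the sketch below shows a generic nonlinear conjugate gradient iteration in Python. It is not the paper's algorithm: the paper's new beta formula is not reproduced here, so a standard Fletcher-Reeves beta and a simple Armijo backtracking line search are assumed as stand-ins, together with a steepest-descent restart whenever the descent condition fails.

```python
# A minimal sketch of a *generic* nonlinear conjugate gradient method.
# The beta formula (Fletcher-Reeves) and the backtracking line search
# are stand-ins, NOT the new formula proposed in the article.
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves beta (a placeholder for the paper's new beta)
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        # Restart with steepest descent if d is not a descent direction
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function, a common unconstrained test problem
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = nonlinear_cg(f, grad, [-1.2, 1.0])
```

Because the Armijo condition enforces sufficient decrease at every accepted step, the final objective value is guaranteed to be no worse than the starting one, even when the iteration budget is too small for full convergence.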
ISSN: 2788-5879, 2788-5887