Showing 3 results for Global Convergence
N. Hoseini Monjezi, Volume 5, Issue 1 (5-2014)
Abstract
Here, a quasi-Newton algorithm for constrained multiobjective optimization is proposed. Under suitable assumptions, global convergence of the algorithm is established.
Dr Zohreh Akbari, Dr Zeinab Saeidian, Volume 12, Issue 2 (11-2021)
Abstract
In this paper, a nonmonotone line search strategy is presented for the minimization of locally Lipschitz continuous functions. First, the Armijo condition is generalized along a descent direction at the current point. Then, a step length satisfying the generalized Armijo condition is selected along a descent direction, and it is shown that at least one such step length exists. Next, the nonmonotone line search algorithm is proposed and its global convergence is proved. Finally, the proposed algorithm is implemented in the MATLAB environment and compared with some methods from the literature. The results show that the proposed method not only computes the global optimum but also requires fewer function evaluations than the monotone line search method.
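For illustration, a minimal Python sketch of a nonmonotone backtracking line search in this spirit is given below. The helper name nonmonotone_armijo_step, the default parameter values, and the plain (sub)gradient-based decrease test are assumptions made for the sketch; they are not taken from the paper and do not reproduce its generalized Armijo condition for nonsmooth functions.

```python
import numpy as np

def nonmonotone_armijo_step(f, x, d, g, f_history,
                            c=1e-4, beta=0.5, t0=1.0, max_backtracks=50):
    """Backtracking step-length selection with a nonmonotone Armijo-type test.

    Illustrative sketch only: the sufficient-decrease test compares
    f(x + t d) against the maximum of the recent function values stored in
    f_history rather than f(x) alone, and g stands in for a (sub)gradient
    used to estimate the predicted decrease along the descent direction d.
    """
    f_ref = max(f_history)          # nonmonotone reference value
    slope = float(np.dot(g, d))     # predicted decrease rate; assumed negative for a descent direction
    t = t0
    for _ in range(max_backtracks):
        if f(x + t * d) <= f_ref + c * t * slope:
            return t                # nonmonotone Armijo-type condition satisfied
        t *= beta                   # shrink the step and try again
    return t                        # fall back to the last trial step


if __name__ == "__main__":
    # Tiny usage example on a smooth convex quadratic, purely for illustration
    f = lambda x: float(np.dot(x, x))
    x = np.array([1.0, -2.0])
    g = 2.0 * x                     # gradient of ||x||^2
    d = -g                          # steepest-descent direction
    history = [f(x)]
    print("accepted step length:", nonmonotone_armijo_step(f, x, d, g, history))
```

The key difference from a monotone Armijo search is that the acceptance test uses the maximum of the stored recent function values rather than the current value alone, which permits occasional increases in the objective while global convergence can still be established under suitable assumptions.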
Ali Ashrafi, Meysam Ranjbar, Volume 16, Issue 1 (3-2025)
Abstract
In this paper, a modified hybrid three-term conjugate gradient (CG) method is proposed for solving unconstrained optimization problems. The search direction is a three-term hybrid form of the Hestenes–Stiefel (HS) and Liu–Storey (LS) CG parameters. It is established that the method ensures the sufficient descent property independent of line search techniques. The convergence analysis of the proposed method is carried out under standard assumptions for general functions. Numerical experiments on CUTEr problems and image denoising tasks demonstrate that our method outperforms existing approaches in terms of efficiency, accuracy, and robustness, particularly under high levels of salt-and-pepper noise.
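A generic three-term CG direction update combining HS- and LS-type quotients can be sketched in Python as follows. The function name, the clipped nonnegative hybrid rule for the CG parameter, and the coefficient of the third term are illustrative assumptions; they do not reproduce the paper's specific hybridization or its sufficient-descent guarantee.

```python
import numpy as np

def three_term_cg_direction(g_new, g_old, d_old, eps=1e-12):
    """One generic three-term CG direction update (illustrative sketch only).

    y is the gradient difference g_new - g_old. The hybrid CG parameter
    below simply clips HS- and LS-type quotients; the paper's rule differs.
    """
    y = g_new - g_old
    denom_hs = float(np.dot(d_old, y))      # HS denominator: d_k^T y_k
    denom_ls = -float(np.dot(g_old, d_old)) # LS denominator: -g_k^T d_k
    gty = float(np.dot(g_new, y))
    beta_hs = gty / denom_hs if abs(denom_hs) > eps else 0.0  # Hestenes-Stiefel quotient
    beta_ls = gty / denom_ls if abs(denom_ls) > eps else 0.0  # Liu-Storey quotient
    beta = max(0.0, min(beta_hs, beta_ls))  # simple nonnegative hybrid (assumed for illustration)
    # Third-term coefficient as in some three-term HS variants (not the paper's choice)
    theta = float(np.dot(g_new, d_old)) / denom_hs if abs(denom_hs) > eps else 0.0
    # Three-term form: steepest descent + CG memory + correction along y
    return -g_new + beta * d_old - theta * y


if __name__ == "__main__":
    # One direction update on f(x) = ||x||^2 (gradient 2x), purely for illustration
    x_old, x_new = np.array([1.0, 1.0]), np.array([0.6, 0.8])
    g_old, g_new = 2 * x_old, 2 * x_new
    d_old = -g_old
    print(three_term_cg_direction(g_new, g_old, d_old))
```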