Showing 2 results for Optimization Problem
S. Ahmadi, N. Movahedian, Volume 5, Issue 1 (5-2014)
Abstract
Sequential optimality conditions provide adequate theoretical tools for justifying the stopping criteria of nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems regardless of whether constraint qualifications hold. It is proved that the nonsmooth complementary approximate Karush-Kuhn-Tucker condition is stronger than the nonsmooth approximate gradient projection condition. Sufficiency of these conditions for differentiable generalized convex programming is also established.
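For orientation, the smooth complementary approximate KKT (CAKKT) condition for a problem of the form min f(x) subject to g_i(x) <= 0 can be sketched as follows; the nonsmooth variants studied in the paper replace gradients with suitable subdifferentials, so the display below is an illustrative assumption rather than the paper's exact definition:

\[
\exists\, x^k \to x^*,\ \lambda^k \ge 0 \ \text{such that}\quad
\nabla f(x^k) + \sum_{i} \lambda_i^k \nabla g_i(x^k) \to 0
\quad\text{and}\quad
\lambda_i^k\, g_i(x^k) \to 0 \ \text{for every } i.
\]

Roughly, the approximate gradient projection (AGP) condition instead asks that the projection of \(-\nabla f(x^k)\) onto an approximated linearized feasible set at \(x^k\) tends to zero; the abstract's main result is that the CAKKT-type condition implies the AGP-type one in the nonsmooth setting.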
Dr. Elham Basiri, Dr. S.M.T.K. Mirmostafaee, Volume 14, Issue 2 (12-2023)
Abstract
This paper considers progressive Type-II censoring and determines the optimal sample size using a Bayesian prediction approach. To this end, two criteria are considered: the Bayes risk of the point predictor of a future progressively censored order statistic, and the design cost of the experiment. The Bayesian prediction is carried out under the general entropy loss function. We find the optimal sample size such that the Bayes risk and the cost of the experiment do not exceed two pre-fixed values. Some numerical computations are presented to show the usefulness of the results.
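As a reference point, the general entropy (GE) loss commonly used in this setting, and the predictor minimizing the posterior predictive expected loss under it, have the following standard form; the shape parameter q and this particular parametrization are assumptions for illustration, not quoted from the paper:

\[
L\!\left(\hat{Y}, Y\right) = \Bigl(\tfrac{\hat{Y}}{Y}\Bigr)^{q} - q \ln\!\Bigl(\tfrac{\hat{Y}}{Y}\Bigr) - 1,
\qquad
\hat{Y}_{\mathrm{GE}} = \Bigl( E\!\bigl[\,Y^{-q}\mid \text{data}\,\bigr] \Bigr)^{-1/q},
\]

where Y is the future progressively censored order statistic and the expectation is taken with respect to its Bayesian predictive distribution, provided it is finite. The optimal sample size is then a value of n for which both the resulting Bayes risk and the experiment's cost remain below their pre-fixed bounds.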