Superlinear Convergence of a Modified Newton's Method for Convex Optimization Problems With Constraints

  • Bouchta RHANIZAR


We consider the constrained optimization problem defined by:
$$f (x^*) = \min_{x \in X} f(x)\eqno (1)$$

where the function $f : \mathbb{R}^{n} \to \mathbb{R}$ is convex on a closed bounded convex set $X$.
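As a concrete illustration of problem (1) — not an example taken from the paper — consider the convex function $f(x) = (x_1-2)^2 + (x_2-2)^2$ minimized over the closed bounded convex box $X = [0,1]^2$; the minimizer is the point of $X$ closest to $(2,2)$:

```python
# Hypothetical instance of problem (1): minimize a convex f over a
# closed bounded convex set X. Here X = [0, 1]^2 and the minimizer
# is the Euclidean projection of the unconstrained minimizer (2, 2)
# onto X, namely (1, 1), with minimum value f(1, 1) = 2.
import numpy as np

def f(x):
    return float((x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2)

def project_box(x, lo=0.0, hi=1.0):
    # Euclidean projection onto the box X = [lo, hi]^2
    return np.clip(x, lo, hi)

x_star = project_box(np.array([2.0, 2.0]))
print(x_star, f(x_star))   # -> [1. 1.] 2.0
```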
To solve problem (1), most methods transform it into an unconstrained problem, either by introducing Lagrange multipliers or by using a projection method.
The purpose of this paper is to present a new method for solving certain constrained optimization problems, based on the construction of a descent direction and a step size that keep the iterates in the convex domain X. A convergence theorem is proved, and the paper ends with numerical examples.
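To fix ideas, the general shape of a feasible Newton-type iteration — compute a descent direction from second-order information, take a step, and keep the iterate inside X — can be sketched as follows. The objective, the box X = [0, 1]^2, and the projection step rule are illustrative assumptions; this is not the author's actual algorithm.

```python
# Sketch of a projected Newton-type iteration on the convex quadratic
# f(x) = (x1-2)^2 + (x2-2)^2 over X = [0, 1]^2 (all choices hypothetical).
import numpy as np

def grad(x):
    # gradient of f(x) = (x1-2)^2 + (x2-2)^2
    return 2.0 * (x - np.array([2.0, 2.0]))

def hess(x):
    # Hessian of f (constant for this quadratic)
    return 2.0 * np.eye(2)

def projected_newton(x0, iters=20):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Newton descent direction: solve H d = -grad f(x)
        d = -np.linalg.solve(hess(x), grad(x))
        # take the step, then project back into X = [0, 1]^2
        x = np.clip(x + d, 0.0, 1.0)
    return x

x = projected_newton([0.5, 0.0])
print(x)   # -> [1. 1.], the constrained minimizer
```

For this quadratic the very first Newton step lands on the unconstrained minimizer $(2,2)$, and the projection returns the constrained solution $(1,1)$; on non-quadratic problems a step-size rule would be needed to guarantee descent.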

This work is licensed under a Creative Commons Attribution 4.0 License.
  • ISSN(Print): 1916-9795
  • ISSN(Online): 1916-9809
