Superlinear Convergence of a Modified Newton's Method for Convex Optimization Problems With Constraints
- Bouchta RHANIZAR
Abstract
We consider the constrained optimization problem defined by:
$$f(x^*) = \min_{x \in X} f(x) \eqno(1)$$
where the function $f : \mathbb{R}^{n} \to \mathbb{R}$ is convex on a closed, bounded convex set $X \subset \mathbb{R}^{n}$.
To solve problem (1), most methods transform it into an unconstrained problem, either by introducing Lagrange multipliers or by using a projection method.
The purpose of this paper is to give a new method for solving some constrained optimization problems, based on the definition of a descent direction and a step size that keep the iterates in the convex domain X. A convergence theorem is proved. The paper ends with some numerical examples.
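The abstract does not give the paper's specific descent direction or step rule, so the following is only an illustrative sketch of the general idea: a Newton-type descent direction combined with a step that keeps the iterate inside the convex set X. Here X is taken to be a box, the objective is a convex quadratic, and all names (`projected_newton`, `project`, the backtracking constants) are hypothetical, not from the paper.

```python
import numpy as np

# Example convex objective f(x) = 1/2 x^T A x - b^T x (A symmetric positive definite).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

def grad(x):
    return A @ x - b

def hess(x):
    return A

def project(x, lo, hi):
    # Projection onto the closed, bounded convex set X (here a box [lo, hi]^n).
    return np.clip(x, lo, hi)

def projected_newton(x0, lo, hi, tol=1e-10, max_iter=50):
    """Generic projected-Newton sketch: Newton direction, then a
    backtracked step whose projected endpoint stays in X and decreases f."""
    x = project(x0, lo, hi)
    for _ in range(max_iter):
        d = np.linalg.solve(hess(x), -grad(x))  # Newton descent direction
        t = 1.0
        # Shrink the step until the projected trial point decreases f.
        while f(project(x + t * d, lo, hi)) > f(x) - 1e-12 and t > 1e-12:
            t *= 0.5
        x_new = project(x + t * d, lo, hi)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

lo, hi = np.zeros(2), np.ones(2)
x_star = projected_newton(np.array([0.9, 0.9]), lo, hi)
# For this A, b the unconstrained minimizer A^{-1} b = (0.2, 0.4) lies in X,
# so the constrained solution coincides with it.
```

Because the objective is quadratic, one full Newton step already reaches the minimizer here; the projection and backtracking only matter when the unconstrained step would leave X.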
- Full Text:
PDF
- DOI:10.5539/jmr.v13n2p90
Journal Metrics
- h-index (December 2020): 21
- i10-index (December 2020): 64
- h5-index (December 2020): N/A
- h5-median (December 2020): N/A
(The data was calculated based on Google Scholar Citations.)
Index
- Academic Journals Database
- Aerospace Database
- BASE (Bielefeld Academic Search Engine)
- Civil Engineering Abstracts
- COPAC
- EBSCOhost
- EconPapers
- Elektronische Zeitschriftenbibliothek (EZB)
- Google Scholar
- Harvard Library
- IDEAS
- Infotrieve
- JournalTOCs
- LOCKSS
- MathGuide
- MathSciNet
- MIAR
- NewJour
- Open J-Gate
- PKP Open Archives Harvester
- Publons
- RePEc
- SHERPA/RoMEO
- SocioRePEc
- Standard Periodical Directory
- Technische Informationsbibliothek (TIB)
- The Keepers Registry
- UCR Library
- Ulrich's
- Universe Digital Library
- WorldCat
Contact
- Sophia Wang, Editorial Assistant
- jmr@ccsenet.org