A Regularized Newton Method with Correction for Unconstrained Nonconvex Optimization


  •  Heng Wang    
  •  Mei Qin    

Abstract

In this paper, we present a modified regularized Newton method for minimizing a nonconvex function whose Hessian matrix may be singular. We show that if the gradient and Hessian of the objective function are Lipschitz continuous, then the method is globally convergent. Under a local error bound condition, which is weaker than nonsingularity of the Hessian at a solution, the method converges cubically.
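The abstract does not state the iteration itself. A common form of a regularized Newton method with a correction step regularizes the Hessian by a multiple of the gradient norm, solves for a trial step, and then reuses the same regularized matrix with the gradient at the trial point to obtain a correction. The sketch below is an illustrative assumption in that spirit, not the authors' algorithm: the parameter `c`, the choice mu = c * ||g||, and the quartic test problem are all hypothetical.

```python
import numpy as np

def regularized_newton_step(grad, hess, x, c=1.0):
    """One step of a regularized Newton method with correction (illustrative sketch).

    Hypothetical instantiation: regularize the Hessian H by mu*I with
    mu = c * ||g||, so the linear system stays solvable even when H is
    singular (the setting the paper targets).
    """
    g = grad(x)
    H = hess(x)
    mu = c * np.linalg.norm(g)            # regularization shrinks as the gradient vanishes
    A = H + mu * np.eye(x.size)           # regularized Hessian
    d = np.linalg.solve(A, -g)            # regularized Newton step
    dhat = np.linalg.solve(A, -grad(x + d))  # correction: same matrix, gradient at trial point
    return x + d + dhat

# Illustrative use on f(x) = sum(x^4), whose Hessian is singular at the
# minimizer x* = 0, so a plain Newton system would degenerate there.
grad = lambda x: 4.0 * x**3
hess = lambda x: np.diag(12.0 * x**2)
x = np.array([1.0, -1.0])
for _ in range(60):
    x = regularized_newton_step(grad, hess, x)
```

The correction step costs only one extra gradient evaluation and one extra back-substitution against the already-formed matrix, which is the usual motivation for this family of methods; a globally convergent variant would additionally safeguard the step, e.g. with a line search or trust region.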


This work is licensed under a Creative Commons Attribution 4.0 License.