Keywords:
Trust region method, adaptive radius, cubic regularization, global convergence.
Abstract:
This paper studies the solution of unconstrained optimization problems by trust region methods with a simple subproblem, in which approximations of the gradient and Hessian are computed by subsampling. In this framework, we propose a new adaptive rule for updating the trust-region radius. To improve the efficiency of the algorithm, we also exploit the available function and gradient information as early as possible: we introduce a scalar approximation of the Hessian at the current point based on a modified quasi-Newton equation. In particular, we focus on a variant of trust region methods known as cubic regularization. Under a suitable sampling scheme, we establish local and global convergence properties.
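The abstract does not spell out the algorithm, so the following is only an illustrative Python/NumPy sketch of the ingredients it names: subsampled gradients, a scalar Hessian approximation from a secant condition (a standard Barzilai-Borwein scalar is used here as a stand-in for the paper's modified quasi-Newton scalar), and a cubic-regularized step minimized in closed form along the negative gradient. The test problem, sampling scheme, and constants (`sample_frac`, `M`) are assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic consistent least-squares problem: f(x) = ||Ax - b||^2 / (2n).
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def full_loss(x):
    r = A @ x - b
    return 0.5 * (r @ r) / n

def subsampled_grad(x, sample_frac=0.5):
    # Gradient estimated from a random subsample of the rows (assumption:
    # uniform sampling without replacement; the paper's scheme may differ).
    idx = rng.choice(n, size=int(sample_frac * n), replace=False)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

def cubic_step(g, sigma, M):
    # Closed-form minimizer of the scalar cubic model
    #   g.T s + (sigma/2)||s||^2 + (M/3)||s||^3   over s = -alpha * g:
    # setting the derivative in alpha to zero gives a quadratic in alpha.
    gn = np.linalg.norm(g)
    if gn == 0.0:
        return np.zeros_like(g)
    alpha = (-sigma + np.sqrt(sigma**2 + 4.0 * M * gn)) / (2.0 * M * gn)
    return -alpha * g

x = np.zeros(d)
sigma, M = 1.0, 10.0  # M: cubic regularization weight (assumed fixed here)
x_prev = g_prev = None
for _ in range(100):
    g = subsampled_grad(x)
    if g_prev is not None:
        s, y = x - x_prev, g - g_prev
        if s @ s > 0 and s @ y > 0:
            # Scalar Hessian approximation from the secant pair (s, y);
            # stands in for the paper's modified quasi-Newton scalar.
            sigma = (s @ y) / (s @ s)
    x_prev, g_prev = x, g
    x = x + cubic_step(g, sigma, M)

print(full_loss(x))
```

Because the synthetic system is consistent (`b = A @ x_true` exactly), every subsampled gradient vanishes at the solution, so the iteration can converge despite the sampling noise; an adaptive update of `M` (the analogue of the adaptive radius rule) is omitted for brevity.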