New Class of Rank 1 Update for Solving Unconstrained Optimization Problem

The focus of this article is to introduce a new class of rank 1 modified quasi-Newton techniques for solving the unconstrained optimization problem by updating the inverse Hessian matrix with a rank 1 update in which a diagonal matrix is the first component of the next inverse Hessian approximation. The inverse Hessian approximation generated by the proposed method is symmetric and satisfies the modified quasi-Newton condition, so global convergence is retained. In addition, it is positive definite, which guarantees the existence of a minimizer of the objective function at every iteration. We implement the algorithm in MATLAB to demonstrate the feasibility of the proposed procedure, and various numerical examples are given.


Introduction:
There have been many efforts to achieve a better approximation of the Hessian matrix. The modified quasi-Newton (secant) condition was proposed by Zhang J. and Xu C. [1]. They use both gradient and function value information to achieve higher-order accuracy in approximating the second-order curvature of the objective function.

In [2], the authors present a revised Broyden family containing BFGS (Broyden, Fletcher, Goldfarb, and Shanno), analogous to the modification suggested by the authors in [1, 3]; the BFGS update is modified on the basis of the new quasi-Newton condition. In [4], the authors updated the symmetric rank one update on the basis of the extended quasi-Newton condition; they proved that the method retains symmetry and positive definiteness, and they also established the proposed method's global and superlinear convergence. In [5], the authors suggested a Broyden update that guarantees the positive definiteness of the Hessian matrix and gives the global convergence of the proposed process. In [6], the authors used Taylor's theorem to develop a modified BFGS approach for solving systems of nonlinear equations. In [7], the authors presented a modified BFGS approach for solving nonlinear systems; an updated quasi-Newton equation was suggested there to obtain a more accurate approximation of the second-order curvature of the objective function. In [8], the authors introduced a modified DFP (Davidon-Fletcher-Powell) update based on the modified quasi-Newton condition and proved the global and superlinear convergence of the proposed method. A new family of modified BFGS updates was proposed in [9] to solve the unconstrained optimization problem for non-convex functions, based on a new modified weak Wolfe-Powell line search technique. In [10], the authors proposed a modified BFGS update (H-version) by updating the vector $s_k$ (next solution minus current solution), and they proved that the proposed method preserves the strong positive definiteness property and is globally convergent.

Newton's method, $x_{k+1} = x_k - [\nabla^2 f(x_k)]^{-1} \nabla f(x_k)$, is efficient because it uses the Hessian matrix, which provides useful curvature information. However, computing the Hessian is very costly, and the Hessian may be difficult to evaluate or not analytically available. This motivates a class of techniques, closely related to Newton's method, that use only the values and the gradients of the objective function. Quasi-Newton methods are such a class: they do not need to calculate the Hessian matrix, but instead generate a sequence of Hessian approximations while preserving a high rate of convergence. In this work we construct an approximation of the inverse Hessian instead of evaluating the Hessian matrix itself. We require that the sequence of approximations $\{H_k\}$ be positive definite, that it produce a descent direction $d_k = -H_k g_k$, and that the resulting iteration behave like Newton's method; moreover, the update should be easy to compute.

Consider the unconstrained optimization problem
$$\min_{x \in \mathbb{R}^n} f(x), \qquad (1)$$
where the objective function $f$ is assumed to be twice continuously differentiable on an open convex set $D$; moreover, $f$ is uniformly convex. The most efficient quasi-Newton method is the BFGS method, which was introduced independently by Broyden, Fletcher, Goldfarb, and Shanno in [10].
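To make the contrast above concrete, the following is a minimal NumPy sketch of Newton's method (the function names and the quadratic test problem are ours, not from the paper); it shows the exact-Hessian solve that quasi-Newton methods avoid.

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton's method: x_{k+1} = x_k - [hess(x_k)]^{-1} grad(x_k).
    Every step needs the exact Hessian and an O(n^3) linear solve,
    which is precisely what quasi-Newton methods avoid."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        x = x - np.linalg.solve(hess(x), g)  # Newton step
    return x

# Quadratic example: f(x) = (x1 - 1)^2 + 10 x2^2 is minimized in one step.
x_star = newton(lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * x[1]]),
                lambda x: np.diag([2.0, 20.0]),
                np.array([5.0, 5.0]))
assert np.allclose(x_star, [1.0, 0.0])
```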
Note that in the BFGS approach the Hessian approximation $B_{k+1}$ and the inverse Hessian approximation $H_{k+1}$ are updated by
$$B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k},$$
$$H_{k+1} = \left(I - \frac{s_k y_k^T}{y_k^T s_k}\right) H_k \left(I - \frac{y_k s_k^T}{y_k^T s_k}\right) + \frac{s_k s_k^T}{y_k^T s_k},$$
where $B_{k+1}$ is the next Hessian matrix approximation, $H_{k+1}$ is the next inverse Hessian matrix approximation, $s_k = x_{k+1} - x_k$, $x_k$ is the current solution, $x_{k+1}$ is the next solution, $y_k = g_{k+1} - g_k$, and $g_k = \nabla f(x_k)$ is the gradient of the objective function $f$. The BFGS update is considered a common update for solving the unconstrained optimization problem, but it preserves the positive definiteness property only if $y_k^T s_k > 0$ at each iteration. The problem is to solve problem (1) by generating a sequence of symmetric and positive definite inverse Hessian matrices that satisfies the quasi-Newton condition $H_{k+1} y_k = s_k$ at every iteration.
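As a reference point for the update proposed below, here is a minimal NumPy sketch of the H-version BFGS update written above, together with a check of the quasi-Newton (secant) condition $H_{k+1} y_k = s_k$; the variable names are ours.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Standard H-version BFGS update:
    H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T, rho = 1/(y^T s).
    H+ stays symmetric positive definite only when y^T s > 0."""
    rho = 1.0 / np.dot(y, s)
    I = np.eye(s.size)
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

s = np.array([1.0, 0.5])
y = np.array([2.0, 1.5])            # y^T s = 2.75 > 0: curvature condition holds
H_next = bfgs_inverse_update(np.eye(2), s, y)
assert np.allclose(H_next, H_next.T)   # symmetric
assert np.allclose(H_next @ y, s)      # quasi-Newton (secant) condition
```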

New Class of Rank 1 Update
Consider problem (1) with the next inverse Hessian matrix approximation $H_{k+1}$. The modified quasi-Newton equation is given as follows:
$$H_{k+1}\,\bar{y}_k = s_k, \qquad (2)$$
where $\bar{y}_k = y_k + \frac{\theta_k}{s_k^T u}\,u$ is the modified gradient difference of [1], $u$ is any vector with $s_k^T u \neq 0$, and $\theta_k = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})^T s_k$. The solution of Eq. (2) for $H_{k+1}$ such that $H_{k+1}$ is a diagonal matrix is given in [10] as
$$H_{k+1} = I + \operatorname{diag}(v_k), \qquad (3)$$
where $I$ is the identity matrix and the best choice of the vector $v_k$ that satisfies Eq. (2) is $(v_k)_i = \frac{(s_k)_i - (\bar{y}_k)_i}{(\bar{y}_k)_i}$. Eq. (3) satisfies the modified quasi-Newton condition, is symmetric, and is positive definite if $(s_k)_i/(\bar{y}_k)_i > 0$ for all $i$. However, at every iteration it is only in diagonal form, so the computation is very cheap, which makes it useful for solving optimization problems with large dimensions.
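The following sketch renders this diagonal update in NumPy and verifies the stated properties; the componentwise formula for $v_k$ follows our reconstruction of Eq. (3) above, so it should be read as an illustration rather than a verbatim copy of the paper's equation.

```python
import numpy as np

def diagonal_update(s, y_bar):
    """Diagonal solution of the modified quasi-Newton equation
    H @ y_bar = s (our reading of Eq. (3)): H = I + diag(v) with
    v_i = (s_i - ybar_i) / ybar_i, so the equation holds componentwise.
    H is positive definite exactly when s_i / ybar_i > 0 for every i."""
    v = (s - y_bar) / y_bar
    return np.eye(s.size) + np.diag(v)

s = np.array([1.0, 0.5, 2.0])
y_bar = np.array([0.5, 0.25, 4.0])
H = diagonal_update(s, y_bar)
assert np.allclose(H @ y_bar, s)   # modified quasi-Newton condition (2)
assert np.all(np.diag(H) > 0)      # positive definite, since s_i / ybar_i > 0
```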
Since the Hessian matrix is not always in diagonal form, Eq. (3) can be modified to obtain an update that has the same properties as Eq. (3) but is not diagonal; it is called the new class of rank 1 update for unconstrained optimization. Let us consider the rank 1 update formula (4), whose first component is a diagonal matrix and whose second component is a rank 1 matrix, where $n$ represents the dimension of the problem. If the diagonal component in Eq. (4) is taken to be the diagonal matrix of Eq. (3), the new class of rank 1 update can be written as in Eq. (5). To determine the rank 1 term, we impose the modified quasi-Newton condition $H_{k+1}\bar{y}_k = s_k$; this yields Eq. (6), which leads to the solution of Eq. (2) given in Eq. (7). From Eq. (7), it is clear that $H_{k+1}\bar{y}_k = s_k$, so Eq. (7) is called the new rank 1 update formula.
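As a concrete illustration of this construction, the sketch below combines a diagonal first component with a symmetric rank 1 correction chosen so that the modified quasi-Newton condition (2) holds exactly. The scaled-identity diagonal, the choice $u = y_k$ in $\bar{y}_k$, and the SR1-style correction are our assumptions, not necessarily the paper's exact Eq. (7).

```python
import numpy as np

def modified_y(s, y, f_k, f_next, g_k, g_next):
    """Modified gradient difference of the condition cited as [1]:
    y_bar = y + (theta / s^T u) u with u = y (one common choice) and
    theta = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})^T s.
    For quadratic objectives theta = 0 and y_bar reduces to y."""
    theta = 6.0 * (f_k - f_next) + 3.0 * np.dot(g_k + g_next, s)
    return y + (theta / np.dot(s, y)) * y

def diag_plus_rank1(s, y_bar):
    """Diagonal-plus-rank-one inverse Hessian approximation H with
    H @ y_bar = s. The diagonal component D = (s^T s / s^T y_bar) I and
    the SR1-style correction are illustrative choices; positive
    definiteness needs extra sign conditions on the denominator."""
    n = s.size
    D = (np.dot(s, s) / np.dot(s, y_bar)) * np.eye(n)  # diagonal first component
    r = s - D @ y_bar                                  # secant residual
    denom = np.dot(r, y_bar)
    if abs(denom) < 1e-12 * max(1.0, np.linalg.norm(r) * np.linalg.norm(y_bar)):
        return D                   # D already satisfies the condition; skip the correction
    return D + np.outer(r, r) / denom                  # symmetric rank 1 correction

s = np.array([1.0, 0.0])
y_bar = np.array([1.0, 1.0])
H = diag_plus_rank1(s, y_bar)
assert np.allclose(H, H.T)             # symmetric
assert np.allclose(H @ y_bar, s)       # modified quasi-Newton condition (2)
```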

Algorithm of the method
Step 1. Choose a starting point $x_0$, an initial symmetric positive definite matrix $H_0$, and an error tolerance $\epsilon > 0$; set $k = 0$.
Step 2. Compute $g_k = \nabla f(x_k)$. If $\|g_k\| \le \epsilon$, then the solution is $x_k$ and stop; otherwise compute the search direction $d_k = -H_k g_k$.
Step 3. Perform an exact or inexact line search to identify a step size $\alpha_k$ such that $f(x_k + \alpha_k d_k) < f(x_k)$.

Step 4. Set $x_{k+1} = x_k + \alpha_k d_k$.
Step 5. Set $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, and form the modified difference $\bar{y}_k$.
Step 6. Compute $H_{k+1}$ from Eq. (7).
Step 7. Set $k = k + 1$ and go to Step 2.
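The following is a minimal sketch of Steps 1-7, assuming a backtracking (inexact) line search and reusing the hypothetical `modified_y` and `diag_plus_rank1` helpers from the earlier sketch as a stand-in for the Eq. (7) update.

```python
import numpy as np

def minimize_rank1(f, grad, x0, tol=1e-6, max_iter=200):
    """Sketch of Steps 1-7 with a backtracking (Armijo) line search.
    Safeguards (e.g. rejecting updates that destroy positive
    definiteness) are omitted for brevity."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    H = np.eye(x.size)                          # Step 1: H_0 = I (symmetric PD)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:            # Step 2: stopping test
            break
        d = -H @ g                              # Step 2: search direction
        fx, alpha = f(x), 1.0                   # Step 3: inexact line search
        while f(x + alpha * d) > fx + 1e-4 * alpha * np.dot(g, d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d                   # Step 4
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g             # Step 5
        y_bar = modified_y(s, y, fx, f(x_new), g, g_new)
        H = diag_plus_rank1(s, y_bar)           # Step 6: Eq. (7) stand-in
        x, g = x_new, g_new                     # Step 7: k <- k + 1
    return x

# On the strictly convex quadratic f(x) = x^T x the iteration reaches
# the minimizer at the origin.
x_star = minimize_rank1(lambda x: np.dot(x, x), lambda x: 2.0 * x,
                        np.array([3.0, -2.0]))
assert np.linalg.norm(x_star) < 1e-5
```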

Theorem 1: The new rank 1 update given in Eq. (7) is symmetric, positive definite, and satisfies the modified quasi-Newton equation.

Proof:
Since the first component of Eq. (7) is in diagonal form and the rank 1 component is symmetric, Eq. (7) is symmetric. By Eq. (3) and Eq. (6), it is clear that Eq. (2) holds, and hence the modified quasi-Newton condition is satisfied. Finally, we need only prove that $H_{k+1}$ is positive definite, that is, $z^T H_{k+1} z > 0$ for every $z \neq 0$; this follows from the exact line search and the positive definiteness of the current inverse Hessian approximation. Hence $H_{k+1}$ generated by the new rank 1 update is positive definite. Powell developed the global convergence of quasi-Newton methods in [11]; these observations were applied to the restricted Broyden class by Byrd, Nocedal, and Yuan in [11].

Lemma 2: [10]
Let $f$ satisfy the assumptions of problem (1).

From the previous examples, it is clear that the proposed method can solve all of the test problems with different dimensions and different starting points. The iteration and function-evaluation counts show that the proposed method terminates at the minimizer of the objective function for all starting points.

Conclusions
A new rank one update was proposed to solve the unconstrained optimization problem. This update satisfies the modified quasi-Newton condition, is symmetric, and preserves the positive definiteness of the inverse Hessian approximation. Four examples demonstrate the effectiveness of the proposed update.