1 Practical task 1: Gradient descent and Newton methods

1.1 Experiment: Gradient descent trajectory on a quadratic function

Analyze the gradient descent trajectory for several quadratic functions: come up with two or three two-dimensional quadratic functions on which the method behaves differently, and draw plots showing the level lines of each function together with the trajectory of the method.

Try to answer the following question: how does the behavior of the method depend on the condition number of the function, the choice of the starting point, and the step-size selection strategy (constant step, Armijo, Wolfe)?

To draw level lines you can use the plot_levels function, and to draw trajectories the plot_trajectory function, both from the file plot_trajectory_2d.py attached to the task. Also note that the oracle of the quadratic function, QuadraticOracle, is already implemented in the oracles module. It implements the function f(x) = (1/2)⟨Ax, x⟩ − ⟨b, x⟩, where A ∈ S^n_{++} (symmetric positive definite) and b ∈ R^n.

1.2 Experiment: Dependence of the number of gradient descent iterations on the condition number and the dimension of the space

Investigate how the number of iterations required for gradient descent to converge depends on the following two parameters: 1) the condition number κ ≥ 1 of the optimized function, and 2) the dimension n of the space of optimized variables.

To do this, for given parameters n and κ, generate a random quadratic problem of size n with condition number κ and run gradient descent on it with some fixed required accuracy. Measure the number of iterations T(n, κ) that the method needs before convergence (a successful exit by the stopping criterion).
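The trajectory experiment of Section 1.1 can be sketched as follows. This is a minimal self-contained NumPy version, not the assignment's own code: the provided QuadraticOracle, plot_levels, and plot_trajectory are not reproduced here, the constant-step variant is shown instead of Armijo/Wolfe, and the absolute gradient-norm tolerance is an assumed stopping rule.

```python
import numpy as np

def grad_descent(A, b, x0, step=0.1, tol=1e-6, max_iter=1000):
    """Minimize f(x) = (1/2)<Ax, x> - <b, x> with a constant step size.

    Returns the final point and the full trajectory, which can then be
    drawn over the level lines of f (e.g. with matplotlib's contour).
    """
    x = np.asarray(x0, dtype=float)
    trajectory = [x.copy()]
    for _ in range(max_iter):
        grad = A @ x - b                  # gradient of the quadratic
        if np.linalg.norm(grad) < tol:    # assumed stopping criterion
            break
        x = x - step * grad
        trajectory.append(x.copy())
    return x, np.array(trajectory)

# Two 2x2 examples with different condition numbers (hypothetical choices):
A_good = np.array([[1.0, 0.0], [0.0, 1.0]])    # kappa = 1
A_bad  = np.array([[10.0, 0.0], [0.0, 1.0]])   # kappa = 10
b = np.array([1.0, 1.0])

# On the ill-conditioned problem the trajectory zigzags along the
# narrow valley; the step must stay below 2/L = 0.2 to converge.
x_opt, traj = grad_descent(A_bad, b, x0=[5.0, 5.0], step=0.09)
```

With A_good the method jumps almost straight to the minimum, while with A_bad the recorded trajectory makes many short steps along the flat direction, which is exactly the effect the level-line plots are meant to show.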