bv_cvxbook_extra_exercises



…that the iterates are

$$x_1^{(k)} = \gamma \left( \frac{\gamma-1}{\gamma+1} \right)^k, \qquad x_2^{(k)} = -\left( \frac{\gamma-1}{\gamma+1} \right)^k.$$

Therefore $x^{(k)}$ converges to $(0,0)$. However, this is not the optimum, since $f$ is unbounded below.

**8.2** *A characterization of the Newton decrement.* Let $f : \mathbf{R}^n \to \mathbf{R}$ be convex and twice differentiable, and let $A$ be a $p \times n$ matrix with rank $p$. Suppose $\hat{x}$ is feasible for the equality constrained problem

$$\text{minimize } f(x) \quad \text{subject to } Ax = b.$$

Recall that the Newton step $\Delta x$ at $\hat{x}$ can be computed from the linear equations

$$\begin{bmatrix} \nabla^2 f(\hat{x}) & A^T \\ A & 0 \end{bmatrix} \begin{bmatrix} \Delta x \\ u \end{bmatrix} = \begin{bmatrix} -\nabla f(\hat{x}) \\ 0 \end{bmatrix},$$

and that the Newton decrement $\lambda(\hat{x})$ is defined as

$$\lambda(\hat{x}) = \left( -\nabla f(\hat{x})^T \Delta x \right)^{1/2} = \left( \Delta x^T \nabla^2 f(\hat{x}) \, \Delta x \right)^{1/2}.$$

Assume the coefficient matrix in the linear equations above is nonsingular and that $\lambda(\hat{x})$ is positive. Express the solution $y$ of the optimization problem

$$\begin{array}{ll} \text{minimize} & \nabla f(\hat{x})^T y \\ \text{subject to} & Ay = 0, \quad y^T \nabla^2 f(\hat{x}) \, y \le 1 \end{array}$$

in terms of the Newton step $\Delta x$ and the Newton decrement $\lambda(\hat{x})$.

**8.3** *Suggestions for exercise 9.30 in Convex Optimization.* We recommend the following to generate a problem instance: n = 100...
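The KKT system and decrement definitions above can be checked numerically. The following is a minimal sketch, not part of the exercise: it uses a hypothetical quadratic test function $f(x) = \tfrac{1}{2}x^T P x + q^T x$ (so $\nabla^2 f = P$) and randomly generated $P$, $q$, $A$, solves the block linear system for $\Delta x$, and verifies that the two expressions for $\lambda(\hat{x})$ agree and that $\Delta x$ lies in the null space of $A$.

```python
import numpy as np

# Assumed test problem (not from the exercise): a strictly convex quadratic
#   f(x) = (1/2) x^T P x + q^T x,  so  grad f(x) = P x + q,  Hess f(x) = P.
rng = np.random.default_rng(0)
n, p = 6, 2
M = rng.standard_normal((n, n))
P = M @ M.T + n * np.eye(n)            # positive definite Hessian
q = rng.standard_normal(n)
A = rng.standard_normal((p, n))        # has rank p with probability 1
xhat = rng.standard_normal(n)          # feasible by construction: b := A @ xhat

grad = P @ xhat + q                    # gradient at xhat

# KKT system from the exercise: [H A^T; A 0] [dx; u] = [-grad; 0]
K = np.block([[P, A.T], [A, np.zeros((p, p))]])
rhs = np.concatenate([-grad, np.zeros(p)])
sol = np.linalg.solve(K, rhs)
dx, u = sol[:n], sol[n:]

# Newton decrement, computed from both equivalent definitions
lam1 = np.sqrt(-grad @ dx)
lam2 = np.sqrt(dx @ P @ dx)

print("decrement:", lam1)
print("definitions agree:", np.isclose(lam1, lam2))
print("A @ dx is zero:", np.allclose(A @ dx, 0))
```

The second block row of the KKT system forces $A\,\Delta x = 0$, which is why the feasibility check passes; the agreement of `lam1` and `lam2` follows from the first block row, $\nabla^2 f(\hat{x})\,\Delta x + A^T u = -\nabla f(\hat{x})$, after left-multiplying by $\Delta x^T$.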

## This note was uploaded on 09/10/2013 for the course C 231, taught by Professor F. Borrelli during the Fall '13 term at Berkeley.
