A point that maximizes $f(x, y)$ subject to $g(x, y) = 0$ must satisfy $g(x, y) = 0$. Now consider the constraint. Intuitively, what we are saying is: we obtain a max for $f$ as we travel along the path dictated by the constraint, i.e., along the set of points $(x, y)$ that conforms to $g(x, y) = 0$; we maximize $f(x, y)$ subject to the fact that our choice of $(x, y)$ must satisfy $g(x, y) = 0$.

Now, with our Lagrangian method, we had, as first-order conditions:
1. $f_x(x^*, y^*) - \lambda^* g_x(x^*, y^*) = 0$
2. $f_y(x^*, y^*) - \lambda^* g_y(x^*, y^*) = 0$
3. $g(x^*, y^*) = 0$

Assuming $f_y(x^*, y^*) \neq 0$ and $g_y(x^*, y^*) \neq 0$, we have:

$$\frac{f_x(x^*, y^*)}{f_y(x^*, y^*)} = \frac{g_x(x^*, y^*)}{g_y(x^*, y^*)}.$$

That is, the slopes of the level set and the constraint must be the same – and, since we are on the constraint (Condition 3 holds), we know $g(x^*, y^*) = 0$.

[Figure: a level set of $f$ tangent to the constraint $g(x, y) = 0$; axes labeled Good 1 and Good 2.]
Example: Find the critical points of $f(x, y) = x^2 + y^2$ subject to $x + y - 1 = 0$. Here, $g(x, y) = x + y - 1$, and the Lagrangian is

$$L(x, y, \lambda) = x^2 + y^2 - \lambda (x + y - 1).$$

First-order conditions:

a) $2x - \lambda = 0$
b) $2y - \lambda = 0$
c) $x + y - 1 = 0$

a) and b) together imply $x - y = 0$. Using the constraint, we can find $x = y = 1/2$, and then $\lambda = 1$.

2nd order conditions and constrained optimization.
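As a quick numerical sanity check of the first-order conditions, here is a minimal sketch, assuming the example is $f(x, y) = x^2 + y^2$ with constraint $x + y - 1 = 0$ (the hand-coded derivatives below follow from that assumption):

```python
# Check the Lagrange first-order conditions at the candidate critical point
# for the assumed example f(x, y) = x**2 + y**2, g(x, y) = x + y - 1.
def g(x, y):
    return x + y - 1

# Solving 2x - lam = 0, 2y - lam = 0, x + y - 1 = 0 gives x = y = 1/2, lam = 1.
x_s, y_s, lam_s = 0.5, 0.5, 1.0

L_x = 2 * x_s - lam_s        # dL/dx  = f_x - lam * g_x
L_y = 2 * y_s - lam_s        # dL/dy  = f_y - lam * g_y
L_lam = -g(x_s, y_s)         # dL/dlam = -g(x, y)

print(L_x, L_y, L_lam)  # all three are zero: the first-order conditions hold
```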
So far, we have been talking about the first‐order conditions that must prevail for an optimum.
We want to turn to a discussion of what we need, along the lines of second-order conditions, for our problem. To motivate this discussion, we'll focus on the example we have been working with previously, namely, $f : \mathbb{R}^2 \to \mathbb{R}$; we want to choose $(x, y) \in \mathbb{R}^2$ to maximize $f(x, y)$, subject to the constraint $g(x, y) = 0$.
We formed the Lagrangian for this problem,

$$L(x, y, \lambda) \equiv f(x, y) - \lambda\, g(x, y),$$

and we noted it was easy to deal with the problem in the sense that it 'acts' like an unconstrained optimization problem in 3 variables: $x$, $y$, and $\lambda$.
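The "acts like an unconstrained problem" point can be illustrated numerically: at the critical point, the gradient of $L$ with respect to all three variables vanishes. A sketch, again assuming the running example is $f(x, y) = x^2 + y^2$ with $g(x, y) = x + y - 1$:

```python
# The Lagrangian L(x, y, lam) = f - lam * g, treated as an unconstrained
# function of three variables, has a (numerically) vanishing gradient at the
# critical point (1/2, 1/2, 1) of the assumed example.
def L(x, y, lam):
    return x**2 + y**2 - lam * (x + y - 1)

def grad_L(x, y, lam, h=1e-6):
    """Central-difference gradient of L in all three variables."""
    return (
        (L(x + h, y, lam) - L(x - h, y, lam)) / (2 * h),
        (L(x, y + h, lam) - L(x, y - h, lam)) / (2 * h),
        (L(x, y, lam + h) - L(x, y, lam - h)) / (2 * h),
    )

gx, gy, gl = grad_L(0.5, 0.5, 1.0)
print(gx, gy, gl)  # all three components are ~0 at (1/2, 1/2, 1)
```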
Now, we know a little bit about these sorts of problems based on the properties of our Hessian
matrix.
That Hessian, with every entry evaluated at the critical point $(x^*, y^*, \lambda^*)$, is

$$
H(x^*, y^*, \lambda^*) =
\begin{pmatrix}
L_{\lambda\lambda}(x^*, y^*, \lambda^*) & L_{\lambda x}(x^*, y^*, \lambda^*) & L_{\lambda y}(x^*, y^*, \lambda^*) \\
L_{x\lambda}(x^*, y^*, \lambda^*) & L_{xx}(x^*, y^*, \lambda^*) & L_{xy}(x^*, y^*, \lambda^*) \\
L_{y\lambda}(x^*, y^*, \lambda^*) & L_{yx}(x^*, y^*, \lambda^*) & L_{yy}(x^*, y^*, \lambda^*)
\end{pmatrix}.
$$

In particular, we stated that if $H(x^*, y^*, \lambda^*)$ is negative semidefinite, our critical point $(x^*, y^*, \lambda^*)$ will be a local maximum of $L$ (and, as a consequence, of $f$ subject to the constraint $g(x, y) = 0$). But what are these elements of $H(x^*, y^*, \lambda^*)$?
$L_{\lambda\lambda}(x^*, y^*, \lambda^*) = 0$
$L_{\lambda x}(x^*, y^*, \lambda^*) = L_{x\lambda}(x^*, y^*, \lambda^*) = -g_x(x^*, y^*)$
$L_{\lambda y}(x^*, y^*, \lambda^*) = L_{y\lambda}(x^*, y^*, \lambda^*) = -g_y(x^*, y^*)$
$L_{xx}(x^*, y^*, \lambda^*) = f_{xx}(x^*, y^*) - \lambda^* g_{xx}(x^*, y^*)$ ...
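To make these entries concrete, here is a sketch that fills in the Hessian for an assumed running example, $f(x, y) = x^2 + y^2$ and $g(x, y) = x + y - 1$ with $(x^*, y^*, \lambda^*) = (1/2, 1/2, 1)$, and with $\lambda$ placed first in the variable ordering (also an assumption):

```python
# Entries of the Hessian of the Lagrangian at the critical point, using the
# formulas L_ll = 0, L_lx = -g_x, L_xx = f_xx - lam * g_xx, and so on, for
# the assumed example f(x, y) = x**2 + y**2, g(x, y) = x + y - 1.
lam = 1.0
g_x, g_y = 1.0, 1.0                 # g is linear,
g_xx = g_xy = g_yy = 0.0            # so its second partials vanish
f_xx, f_xy, f_yy = 2.0, 0.0, 2.0    # second partials of x**2 + y**2

H = [
    [0.0,  -g_x,               -g_y              ],
    [-g_x, f_xx - lam * g_xx,  f_xy - lam * g_xy ],
    [-g_y, f_xy - lam * g_xy,  f_yy - lam * g_yy ],
]
for row in H:
    print(row)
```

Note that this particular $H$ has a positive diagonal entry ($L_{xx} = 2$), so it is not negative semidefinite; that is consistent with $(1/2, 1/2)$ being a constrained minimum of this $f$, not a maximum.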
