Unformatted text preview: ilarly: Therefore the general form of the tangent plane is:
20 and is the plane that and . We can rewrite this as: Linear Approximation
Definition (Linearization): We define the linearization as:

L(x, y) = f(a, b) + f_x(a, b)(x - a) + f_y(a, b)(y - b)

We call this the linearization of f at (a, b).
We can use this result as a linear approximation:

f(x, y) ≈ f(a, b) + f_x(a, b)(x - a) + f_y(a, b)(y - b)

Recall, for a single variable: x changes from a to a + Δx. Increment of y:

Δy = f(a + Δx) - f(a)

We say that f'(a) Δx is the linear approximation and Δy - f'(a) Δx is the error. f is differentiable at a implies:

Δy = f'(a) Δx + ε Δx,  where ε → 0 as Δx → 0

For multivariable calculus: x changes from a to a + Δx and y changes from b to b + Δy. Increment of z:

Δz = f(a + Δx, b + Δy) - f(a, b)
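As a concrete check, the linearization can be compared against the true function value near the base point. The function f(x, y) = x²y and the point (1, 2) below are assumed for illustration, not taken from the notes:

```python
def f(x, y):
    # Assumed example function (not from the notes): f(x, y) = x^2 * y
    return x**2 * y

# Base point (a, b) = (1, 2); for f = x^2 y the partials are f_x = 2xy, f_y = x^2
a, b = 1.0, 2.0
fx, fy = 2*a*b, a**2   # f_x(1, 2) = 4, f_y(1, 2) = 1

def L(x, y):
    # Linearization: L(x, y) = f(a, b) + f_x(a, b)(x - a) + f_y(a, b)(y - b)
    return f(a, b) + fx*(x - a) + fy*(y - b)

# Near (1, 2) the approximation error is small compared to the step size:
exact, approx = f(1.05, 1.95), L(1.05, 1.95)
print(exact, approx, abs(exact - approx))
```

Here the error is on the order of the squared step size, which is what "a good approximation" means in the differentiability definition.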
Definition (Differentiability): f is differentiable at (a, b) if:

Δz = f_x(a, b) Δx + f_y(a, b) Δy + ε₁ Δx + ε₂ Δy,  where ε₁, ε₂ → 0 as (Δx, Δy) → (0, 0)

Put simply, f is differentiable at (a, b) if its linear approximation is a good approximation near (a, b).

Theorem: If the partial derivatives f_x and f_y exist near (a, b) and are continuous at (a, b), then f is differentiable at (a, b).

Theorem: If f is differentiable at (a, b), then f is continuous at (a, b).

Differentials

Recall: In single variable calculus, dy = f'(x) dx. In multivariable calculus, this is very similar:

dz = (∂z/∂x) dx + (∂z/∂y) dy = f_x(x, y) dx + f_y(x, y) dy

14.5 The Chain Rule
Recall: For a single variable, if y = f(u) and u = g(x), the chain rule is:

dy/dx = (dy/du)(du/dx)

For multiple variables, if z = f(x, y) where x = g(t) and y = h(t), the chain rule is:

dz/dt = (∂z/∂x)(dx/dt) + (∂z/∂y)(dy/dt)

We can generalize it even further: if z = f(x, y) where x = g(s, t) and y = h(s, t), then:

∂z/∂s = (∂z/∂x)(∂x/∂s) + (∂z/∂y)(∂y/∂s)
∂z/∂t = (∂z/∂x)(∂x/∂t) + (∂z/∂y)(∂y/∂t)

14.6 Directional Derivatives and Gradient Vector
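Since the chain rule is used below to compute directional derivatives, here is a quick numerical check of the single-parameter form dz/dt = (∂z/∂x)(dx/dt) + (∂z/∂y)(dy/dt). The functions z = x²y, x = cos t, y = sin t are assumed examples, not from the notes:

```python
import math

def z_of_t(t):
    # z = f(x, y) = x^2 * y with x = cos t, y = sin t (assumed example)
    x, y = math.cos(t), math.sin(t)
    return x**2 * y

def dz_dt_chain(t):
    # Chain rule: dz/dt = (dz/dx)(dx/dt) + (dz/dy)(dy/dt)
    x, y = math.cos(t), math.sin(t)
    dz_dx, dz_dy = 2*x*y, x**2               # partials of f
    dx_dt, dy_dt = -math.sin(t), math.cos(t)
    return dz_dx*dx_dt + dz_dy*dy_dt

# Compare against a central-difference estimate of dz/dt:
t, h = 0.7, 1e-6
numeric = (z_of_t(t + h) - z_of_t(t - h)) / (2*h)
print(dz_dt_chain(t), numeric)
```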
Recall:

f_x(x_0, y_0) = lim_{h→0} [f(x_0 + h, y_0) - f(x_0, y_0)] / h

We call this the rate of change of z in the direction of the x-axis at (x_0, y_0). Similarly, we call

f_y(x_0, y_0) = lim_{h→0} [f(x_0, y_0 + h) - f(x_0, y_0)] / h

the rate of change of z in the direction of the y-axis at (x_0, y_0). Now we can generalize this statement by letting (x, y) approach (x_0, y_0) along some unit vector u = ⟨a, b⟩. We can define the line through (x_0, y_0) in the direction of u, and we call the instantaneous rate of change of z at (x_0, y_0) in the direction of u the directional derivative.

Definition (Directional Derivative): The directional derivative of f at (x_0, y_0) in the direction of the unit vector u = ⟨a, b⟩ is:

D_u f(x_0, y_0) = lim_{h→0} [f(x_0 + ha, y_0 + hb) - f(x_0, y_0)] / h

In some particular cases: if u = ⟨1, 0⟩ then D_u f = f_x, and if u = ⟨0, 1⟩ then D_u f = f_y.

Calculation of the Directional Derivative
We let g(h) = f(x_0 + ha, y_0 + hb). If we think of the chain rule, we can see that:

g'(h) = f_x(x_0 + ha, y_0 + hb) a + f_y(x_0 + ha, y_0 + hb) b

We can rearrange the definition of the directional derivative as:

D_u f(x_0, y_0) = g'(0) = f_x(x_0, y_0) a + f_y(x_0, y_0) b

If we want to generalize, then:

D_u f(x, y) = f_x(x, y) a + f_y(x, y) b

As we can see, this is simply the chain rule and also a linear combination of the two partial derivatives. In the special case when u = ⟨cos θ, sin θ⟩, then:

D_u f = f_x cos θ + f_y sin θ

Definition (Gradient Vector): If we look at the definition of the directional derivative, we can see it is simply a dot product when it is rearranged in the form:

D_u f(x, y) = ⟨f_x(x, y), f_y(x, y)⟩ · ⟨a, b⟩

The vector ⟨f_x, f_y⟩ is the gradient vector and is denoted by:

∇f(x, y) = ⟨f_x(x, y), f_y(x, y)⟩

In essence, the directional derivative is merely the dot product of the gradient vector and the unit vector:

D_u f(x, y) = ∇f(x, y) · u
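The two routes to D_u f, the limit definition and the gradient dot product, can be compared numerically. The function f(x, y) = x²y, the point (1, 2), and the unit vector ⟨3/5, 4/5⟩ are assumed for illustration:

```python
def f(x, y):
    return x**2 * y          # assumed example function

def grad_f(x, y):
    return (2*x*y, x**2)     # ∇f = ⟨f_x, f_y⟩

x0, y0 = 1.0, 2.0
u = (0.6, 0.8)               # unit vector ⟨3/5, 4/5⟩

# Dot-product form: D_u f = ∇f · u = 4(0.6) + 1(0.8) = 3.2
gx, gy = grad_f(x0, y0)
dot_form = gx*u[0] + gy*u[1]

# Limit definition: D_u f = lim_{h→0} [f(x0 + ha, y0 + hb) - f(x0, y0)] / h
h = 1e-6
limit_form = (f(x0 + h*u[0], y0 + h*u[1]) - f(x0, y0)) / h

print(dot_form, limit_form)
```

The difference quotient converges to the dot-product value as h shrinks, matching the chain-rule derivation above.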
We can further expand this to functions of three variables:

D_u f(x_0, y_0, z_0) = lim_{h→0} [f(x_0 + ha, y_0 + hb, z_0 + hc) - f(x_0, y_0, z_0)] / h

And similar to before:

D_u f = ∇f · u,  where ∇f = ⟨f_x, f_y, f_z⟩

Maximizing the Directional Derivative
Question: In which direction does the function change the fastest, and what is the maximum rate of change?

D_u f = ∇f · u = |∇f| |u| cos θ = |∇f| cos θ

This quantity is maximized when cos θ is at its maximum, 1, which implies that the angle θ subtended by the gradient vector and the unit vector is 0. Therefore, the rate of change of the function is maximized in the direction of the gradient vector, and the maximum rate of change is |∇f(x_0, y_0)| at the point (x_0, y_0). Of note, changing the angle θ also allows us to find half of the maximum (cos θ = 1/2), the minimum (θ = π), and the directions making the directional derivative zero (θ = π/2).

14.7 Maximum and Minimum Values
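Before turning to extrema, the claim that D_u f is largest in the gradient direction, with maximum value |∇f|, can be verified by sampling many unit vectors. The gradient ⟨4, 1⟩ below (from the assumed example f = x²y at (1, 2)) is illustrative only:

```python
import math

gx, gy = 4.0, 1.0                 # assumed gradient ∇f(1, 2) for f = x^2 y
grad_norm = math.hypot(gx, gy)    # |∇f| = sqrt(17)

# Sample D_u f = ∇f · ⟨cos θ, sin θ⟩ over many directions θ:
samples = [gx*math.cos(t) + gy*math.sin(t)
           for t in (k * 2*math.pi / 10000 for k in range(10000))]
best = max(samples)

# The sampled maximum approaches |∇f|, attained along the gradient direction:
print(best, grad_norm)
```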
Definition (Local Extrema): f has a local maximum at (a, b) if f(x, y) ≤ f(a, b) when (x, y) is near (a, b), and a local minimum at (a, b) if f(x, y) ≥ f(a, b) when (x, y) is near (a, b).

If f has a local maximum or minimum at (a, b) and the first-order partial derivatives exist there, then:

f_x(a, b) = 0 and f_y(a, b) = 0
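The definition can be illustrated with a function whose local minimum is known in advance; the paraboloid below is an assumed example, and it also shows the standard fact that both first-order partials vanish at a local extremum:

```python
def f(x, y):
    # Assumed example: a paraboloid with a local (and global) minimum at (1, -2)
    return (x - 1)**2 + (y + 2)**2 + 3

def grad_f(x, y):
    return (2*(x - 1), 2*(y + 2))   # ⟨f_x, f_y⟩

# Both first-order partials vanish at the minimum:
print(grad_f(1, -2))                # (0, 0)

# And no nearby sample value dips below f(1, -2) = 3:
nearby_min = min(f(1 + dx/10, -2 + dy/10)
                 for dx in range(-5, 6) for dy in range(-5, 6))
print(f(1, -2), nearby_min)
```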
Spring '08, Dr. Anthony Dixon
