Before we prove this, note the following observation. If ||f(t)||^2 = < f(t), f(t) > = 1 for all t, then

< f'(t), f(t) > + < f(t), f'(t) > = 0.
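The observation can be checked numerically. The sketch below (not part of the original note; the curve and vectors are chosen only for illustration) takes a differentiable curve on the unit sphere in C^2 and confirms that the two inner-product terms cancel.

```python
import numpy as np

# Illustrative curve on the unit sphere: f(t) = (cos t) y + (sin t) v
# with y, v orthonormal. Since <f(t), f(t)> = 1 for all t, the observation
# says <f'(t), f(t)> + <f(t), f'(t)> = 0.
y = np.array([1.0, 0.0], dtype=complex)
v = np.array([0.0, 1.0], dtype=complex)

def f(t):
    return np.cos(t) * y + np.sin(t) * v

def fprime(t):
    return -np.sin(t) * y + np.cos(t) * v

# Inner product <a, b>, taken conjugate-linear in the second argument
# (np.vdot(b, a) conjugates its first argument).
def inner(a, b):
    return np.vdot(b, a)

t0 = 0.7  # arbitrary test point
lhs = inner(fprime(t0), f(t0)) + inner(f(t0), fprime(t0))
print(abs(lhs))  # vanishes, as the observation predicts
```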
Proof. ⇒: Assume Ay = λy and let f : R → S be differentiable with f(0) = y. Then

d/dt < Af(t), f(t) > = < Af'(t), f(t) > + < Af(t), f'(t) >
                     = < f'(t), Af(t) > + < Af(t), f'(t) >,

where the second equality uses that A is Hermitian. When t = 0 this equation becomes

< f'(0), Af(0) > + < Af(0), f'(0) > = λ[ < f'(0), y > + < y, f'(0) > ] = 0

by the observation, using that λ is real since A is Hermitian.
⇐: Assume that d/dt|_{t=0} q_A(f(t)) = 0 for every differentiable f with f(0) = y. The claim is that y is an eigenvector of A. Let v ∈ C^n be a unit vector with v ⊥ y, and define f_v(t) = (cos t)y + (sin t)v. Note that ||f_v(t)|| = 1 for all t. Now by assumption, we have that

0 = d/dt|_{t=0} q_A(f_v(t)) = [ < Af_v'(t), f_v(t) > + < Af_v(t), f_v'(t) > ]|_{t=0}
  = < Av, y > + < Ay, v > ...
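The forward direction of the proof can also be verified numerically. The sketch below (not part of the note; the matrix A and the vectors y, v are made-up test data) builds a Hermitian matrix, takes a unit eigenvector y and a unit vector v ⊥ y, and checks that the derivative < Av, y > + < Ay, v > computed above vanishes.

```python
import numpy as np

# Made-up Hermitian test matrix (illustrative, not from the note).
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (B + B.conj().T) / 2          # A is Hermitian by construction

# eigh returns orthonormal eigenvectors of a Hermitian matrix.
w, U = np.linalg.eigh(A)
y = U[:, 0]                        # unit eigenvector: A y = w[0] y
v = U[:, 1]                        # unit vector with v ⊥ y

# Inner product <a, b>, conjugate-linear in the second argument.
def inner(a, b):
    return np.vdot(b, a)

# d/dt|_{t=0} q_A(f_v(t)) = <Av, y> + <Ay, v>, which should vanish
# when y is an eigenvector of the Hermitian matrix A and v ⊥ y.
deriv = inner(A @ v, y) + inner(A @ y, v)
print(abs(deriv))  # close to 0, as the ⇒ direction asserts
```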
This note was uploaded on 01/16/2014 for the course MATH 6304 taught by Professor Bernhardbodmann during the Fall '12 term at University of Houston.