Math 215 HW #6 Solutions

1. Problem 3.1.14. Show that x − y is orthogonal to x + y if and only if ‖x‖ = ‖y‖.

Proof. First, suppose x − y is orthogonal to x + y. Then

0 = ⟨x − y, x + y⟩ = (x − y)^T (x + y) = x^T x + x^T y − y^T x − y^T y = ⟨x, x⟩ + ⟨x, y⟩ − ⟨y, x⟩ − ⟨y, y⟩ = ⟨x, x⟩ − ⟨y, y⟩,

since ⟨x, y⟩ = ⟨y, x⟩. In other words, 0 = ‖x‖² − ‖y‖², so ‖x‖² = ‖y‖². Since the norm of a vector can never be negative, this implies that ‖x‖ = ‖y‖. Thus, we see that if x − y is orthogonal to x + y, then ‖x‖ = ‖y‖.

On the other hand, suppose ‖x‖ = ‖y‖. Then

⟨x − y, x + y⟩ = (x − y)^T (x + y) = x^T x + x^T y − y^T x − y^T y = ⟨x, x⟩ + ⟨x, y⟩ − ⟨y, x⟩ − ⟨y, y⟩ = ⟨x, x⟩ − ⟨y, y⟩ = ‖x‖² − ‖y‖² = 0,

so we see that x − y is orthogonal to x + y.

We've seen that the implication goes both ways, so we conclude that x − y is orthogonal to x + y if and only if ‖x‖ = ‖y‖, as desired.

2. Problem 3.1.20. Let S be a subspace of R^n. Explain what (S⊥)⊥ = S means and why it is true.

Answer: First, (S⊥)⊥ is the orthogonal complement of S⊥, which is itself the orthogonal complement of S, so (S⊥)⊥ = S means that S is the orthogonal complement of its own orthogonal complement.

To show that this is true, we want to show that S is contained in (S⊥)⊥ and, conversely, that (S⊥)⊥ is contained in S; if we can show both containments, then the only possible conclusion is that (S⊥)⊥ = S.

To show the first containment, suppose v ∈ S and w ∈ S⊥. Then ⟨v, w⟩ = 0 by the definition of S⊥. Since w was an arbitrary element of S⊥, this shows that v is orthogonal to all of S⊥, so v ∈ (S⊥)⊥ (which consists of all vectors in R^n that are orthogonal to S⊥). Thus, S is certainly contained in (S⊥)⊥.

To show the other containment, suppose v ∈ (S⊥)⊥ (meaning that v is orthogonal to all vectors in S⊥); then we want to show that v ∈ S. I'm sure there must be a better way to see this, but here's one that works. Let {u_1, …, u_p} be a basis for S and let {w_1, …
, w_q} be a basis for S⊥. If v ∉ S, then {u_1, …, u_p, v} is a linearly independent set. Since each vector in that set is orthogonal to all of S⊥, the set {u_1, …, u_p, v, w_1, …, w_q} is linearly independent. Since there are p + q + 1 vectors in this set, and a linearly independent set in R^n contains at most n vectors, this means that p + q + 1 ≤ n or, equivalently, p + q ≤ n − 1. On the other hand, if A is the matrix whose ith row is u_i^T, then the row space of A is S and the nullspace of A is S⊥. Since S is p-dimensional, the rank of A is p, meaning that the dimension of nul(A) = S⊥ is q = n − p. Therefore,

p + q = p + (n − p) = n,

contradicting the fact that p + q ≤ n − 1. From this contradiction, then, we see that, if v ∈ (S⊥)⊥, then v ∈ S. Hence (S⊥)⊥ is contained in S, and combining the two containments gives (S⊥)⊥ = S.
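The computation in Problem 1 can be sanity-checked numerically. Here is a small sketch (not part of the original solution) that picks two vectors with equal norms and confirms that x − y and x + y have zero dot product; the vectors (3, 4) and (5, 0) are arbitrary choices, both of norm 5.

```python
import math

def dot(u, v):
    """Standard dot product <u, v> on R^n."""
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    """Euclidean norm ||u|| = sqrt(<u, u>)."""
    return math.sqrt(dot(u, u))

# Two vectors with equal norms: (3, 4) and (5, 0) both have norm 5.
x, y = [3.0, 4.0], [5.0, 0.0]
diff = [a - b for a, b in zip(x, y)]  # x - y = (-2, 4)
s = [a + b for a, b in zip(x, y)]     # x + y = (8, 4)

# <x - y, x + y> = ||x||^2 - ||y||^2 = 0 when the norms agree
print(dot(diff, s))  # 0.0
```

Changing y to a vector of different norm, say (1, 0), makes the dot product nonzero, matching the "only if" direction of the proof.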
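The two ingredients of the Problem 2 argument can likewise be illustrated on a concrete example (my own choice, not from the solution): in R^3, take S spanned by u_1 = (1, 0, 0) and u_2 = (0, 1, 0), so that S⊥ is spanned by w_1 = (0, 0, 1). The first check confirms the containment step (basis vectors of S are orthogonal to S⊥), and the second confirms the dimension count p + q = n from the rank/nullity argument.

```python
def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

# S = span{u1, u2} in R^3, so p = 2
u1, u2 = [1, 0, 0], [0, 1, 0]
# S-perp = span{w1}, so q = 1
w1 = [0, 0, 1]

# Each basis vector of S is orthogonal to the basis of S-perp,
# which is the containment S ⊆ (S-perp)-perp from the proof.
assert dot(u1, w1) == 0 and dot(u2, w1) == 0

# Dimension count from the rank/nullity step: p + q = n.
p, q, n = 2, 1, 3
assert p + q == n
print("checks pass")
```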
This document was uploaded on 08/20/2011. (Spring '09, Math.)