4. A convenient mnemonic for the definition is given by the formal determinant expression,
$$x \times y = \det \begin{pmatrix} \hat\imath & \hat\jmath & \hat k \\ x_1 & x_2 & x_3 \\ y_1 & y_2 & y_3 \end{pmatrix}.$$
In this expression the entries in the first row are the standard unit coordinate vectors, and the "determinant" is to be calculated by expansion by minors along the first row.

5. We are left with one true ambiguity in the definition, and that is which sign to take. In our development we chose $z_3 = x_1 y_2 - x_2 y_1$, but we could of course have chosen $z_3 = x_2 y_1 - x_1 y_2$ instead. In this case, the entire mathematical community agrees with the choice we have made.

6. Nice special cases:
$$\hat\imath \times \hat\jmath = \hat k, \qquad \hat\jmath \times \hat k = \hat\imath, \qquad \hat k \times \hat\imath = \hat\jmath.$$

7. By the way, two vectors in $\mathbb{R}^3$ have a dot product (a scalar) and a cross product (a vector). The words "dot" and "cross" are somehow weaker than "scalar" and "vector," but they have stuck.

ALGEBRAIC PROPERTIES. The cross product is linear in each factor, so we have, for example, for vectors $x$, $y$, $u$, $v$,
$$(ax + by) \times (cu + dv) = ac\, x \times u + ad\, x \times v + bc\, y \times u + bd\, y \times v.$$

It is anticommutative:
$$y \times x = -x \times y.$$

It is not associative: for instance,
$$\hat\imath \times (\hat\imath \times \hat\jmath) = \hat\imath \times \hat k = -\hat\jmath; \qquad (\hat\imath \times \hat\imath) \times \hat\jmath = 0 \times \hat\jmath = 0.$$

PROBLEM 7–1. Let $x \in \mathbb{R}^3$ be thought of as fixed. Then $x \times y$ is a linear function from $\mathbb{R}^3$ to $\mathbb{R}^3$, and thus can be represented in a unique way as a matrix times the column vector $y$. Show that in fact
$$x \times y = \begin{pmatrix} 0 & -x_3 & x_2 \\ x_3 & 0 & -x_1 \\ -x_2 & x_1 & 0 \end{pmatrix} y.$$
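The special cases and algebraic properties above are easy to check numerically. The following is a minimal Python sketch, assuming NumPy is available; the helper name `cross_product` and the sample vectors are purely illustrative, and the first two components are written in the usual cyclic pattern consistent with $z_3 = x_1 y_2 - x_2 y_1$ and the special cases in item 6.

```python
import numpy as np

def cross_product(x, y):
    """Cross product in R^3 from the component definition:
    z1 = x2*y3 - x3*y2,  z2 = x3*y1 - x1*y3,  z3 = x1*y2 - x2*y1."""
    x1, x2, x3 = x
    y1, y2, y3 = y
    return np.array([x2 * y3 - x3 * y2,
                     x3 * y1 - x1 * y3,
                     x1 * y2 - x2 * y1])

i_hat = np.array([1.0, 0.0, 0.0])
j_hat = np.array([0.0, 1.0, 0.0])
k_hat = np.array([0.0, 0.0, 1.0])

# Special cases: i x j = k,  j x k = i,  k x i = j.
assert np.allclose(cross_product(i_hat, j_hat), k_hat)
assert np.allclose(cross_product(j_hat, k_hat), i_hat)
assert np.allclose(cross_product(k_hat, i_hat), j_hat)

# Anticommutativity: y x x = -(x x y).
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -5.0, 6.0])
assert np.allclose(cross_product(y, x), -cross_product(x, y))

# Non-associativity: i x (i x j) = -j, while (i x i) x j = 0.
assert np.allclose(cross_product(i_hat, cross_product(i_hat, j_hat)), -j_hat)
assert np.allclose(cross_product(cross_product(i_hat, i_hat), j_hat), np.zeros(3))
```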
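The claim in Problem 7–1 can likewise be spot-checked numerically before proving it; this sketch (again assuming NumPy, with arbitrary illustrative vectors) simply transcribes the matrix displayed in the problem and compares its action on $y$ with the component formula for $x \times y$. It is a sanity check, not the requested proof.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -5.0, 6.0])

# Skew-symmetric matrix built from x, as displayed in Problem 7-1.
A = np.array([[  0.0, -x[2],  x[1]],
              [ x[2],   0.0, -x[0]],
              [-x[1],  x[0],   0.0]])

# Its product with the column vector y reproduces x x y,
# computed here from the component definition.
xy = np.array([x[1] * y[2] - x[2] * y[1],
               x[2] * y[0] - x[0] * y[2],
               x[0] * y[1] - x[1] * y[0]])
assert np.allclose(A @ y, xy)
```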
PROBLEM 7–2. Assuming $x \neq 0$ in the preceding problem, find the characteristic polynomial of the $3 \times 3$ matrix given there. What are its eigenvalues?

B. The norm of the cross product

The approach I want to take here goes back to the Schwarz inequality on p. 1–15, for which we are now going to give an entirely different proof. Suppose then that $x$, $y \in \mathbb{R}^n$. We are going to prove
$$|x \cdot y| \le \|x\|\,\|y\|$$
by calculating the difference of the squares of the two sides, as follows:
$$
\begin{aligned}
\|x\|^2 \|y\|^2 - (x \cdot y)^2
&= \sum_{i=1}^n x_i^2 \sum_{i=1}^n y_i^2 - \left( \sum_{i=1}^n x_i y_i \right)^2 \\
&= \sum_{i=1}^n x_i^2 \sum_{j=1}^n y_j^2 - \sum_{i=1}^n x_i y_i \sum_{j=1}^n x_j y_j \\
&= \sum_{i,j=1}^n x_i^2 y_j^2 - \sum_{i,j=1}^n x_i y_i x_j y_j \\
&= \sum_{i<j} x_i^2 y_j^2 + \sum_{i=1}^n x_i^2 y_i^2 + \sum_{i>j} x_i^2 y_j^2
   - \sum_{i<j} x_i y_i x_j y_j - \sum_{i=1}^n x_i^2 y_i^2 - \sum_{i>j} x_i y_i x_j y_j \\
&= \sum_{i<j} x_i^2 y_j^2 + \sum_{j>i} x_j^2 y_i^2 - 2 \sum_{i<j} x_i y_i x_j y_j \\
&= \sum_{i<j} \left( x_i^2 y_j^2 - 2 x_i y_i x_j y_j + x_j^2 y_i^2 \right) \\
&= \sum_{i<j} (x_i y_j - x_j y_i)^2 .
\end{aligned}
$$
This then proves that $\|x\|^2 \|y\|^2 - (x \cdot y)^2 \ge 0$. If we specialize to $\mathbb{R}^3$, we have
$$\|x\|^2 \|y\|^2 - (x \cdot y)^2 = (x_1 y_2 - x_2 y_1)^2 + (x_2 y_3 - x_3 y_2)^2 + (x_1 y_3 - x_3 y_1)^2.$$
But the right side is precisely $z_3^2 + z_1^2 + z_2^2$ in the above notation. Thus we have proved the
wonderful Lagrange's identity
$$\|x\|^2 \|y\|^2 = (x \cdot y)^2 + \|x \times y\|^2.$$
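Both the Schwarz inequality and Lagrange's identity are easy to spot-check numerically as well. Here is a minimal sketch, assuming NumPy, using randomly chosen vectors in $\mathbb{R}^3$; the tolerance constant is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

for _ in range(1000):
    x = rng.normal(size=3)
    y = rng.normal(size=3)

    dot = x @ y
    cross = np.array([x[1] * y[2] - x[2] * y[1],
                      x[2] * y[0] - x[0] * y[2],
                      x[0] * y[1] - x[1] * y[0]])

    # Schwarz inequality: |x . y| <= ||x|| ||y||.
    assert abs(dot) <= np.linalg.norm(x) * np.linalg.norm(y) + 1e-12

    # Lagrange's identity: ||x||^2 ||y||^2 = (x . y)^2 + ||x x y||^2.
    lhs = (x @ x) * (y @ y)
    rhs = dot ** 2 + cross @ cross
    assert np.isclose(lhs, rhs)
```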