AMS 510.01 Fall 2005, Midterm #1 Solutions

Name: ________  Student ID: ________  Score: ____/100 ( + ____/15 bonus)

1. Indicate whether each of the following statements is true or false. $m$ and $n$ should be taken as any constant positive integer. (2 points each)

(a) $\vec{u} + \vec{v} = \vec{v} + \vec{u}$ for all $\vec{u}, \vec{v} \in V$, where $V$ is any vector space.
True: addition is commutative in every vector space.

(b) The set of all square matrices forms a field.
False: the multiplicative inverse ($A^{-1}A = AA^{-1} = I$) does not exist for every square matrix, and matrix multiplication is not generally commutative ($AB \neq BA$).

(c) $\vec{u} \cdot \vec{v} = \vec{v} \cdot \vec{u}$ for all $\vec{u}, \vec{v} \in \mathbb{C}^n$.
False: for complex vectors, $\vec{u} \cdot \vec{v} = (\vec{v} \cdot \vec{u})^*$, where $^*$ denotes the complex conjugate.

(d) The set of all vectors $\vec{x} \in \mathbb{R}^n$ forms an algebra.
False: an algebra requires closure under addition, scalar multiplication, and multiplication; there is no general product of two vectors in $\mathbb{R}^n$ that yields another vector in $\mathbb{R}^n$.

(e) The expression $f(X) = 2X^2 + X + 5$ is defined when $X$ is any $n \times n$ (square) matrix.
True: the set of $n$-square matrices forms an algebra, meaning any polynomial is defined for $n$-square matrices (with scalars replaced by scalar matrices, e.g. $5 \to 5I$).

(f) The set of all $m \times n$ matrices forms a vector space.
True: addition and scalar multiplication are closed and are appropriately commutative, associative, and distributive; the additive identity and inverse (and the scalar identity) also exist.

(g) If $f(a\vec{u} + b\vec{v}) = af(\vec{u}) + bf(\vec{v})$ for all $a, b \in K$ and all $\vec{u}, \vec{v} \in V$, then $f$ is a linear map acting on $V$.
True: the definition of a linear map is one that preserves linear combinations of vectors.

(h) If $AB = I$, then $A$ and $B$ must be square matrices.
False: $AB$ may equal the $n \times n$ identity matrix when $A$ is $n \times m$ and $B$ is $m \times n$, even if $m \neq n$.

(i) $\{(1,1),\ (1,0),\ (1,-1)\}$ is a basis for $\mathbb{R}^2$.
False: $\mathbb{R}^2$ is a 2-dimensional vector space, and thus any basis must contain exactly two vectors; moreover $(1,0) = \frac{1}{2}\left[(1,1) + (1,-1)\right]$, and thus $(1,0)$ is linearly dependent on $\{(1,1),\ (1,-1)\}$.

(j) $(\vec{u} \times \vec{v}) \cdot \vec{u} = (\vec{u} \times \vec{v}) \cdot \vec{v} = 1$ for all $\vec{u}, \vec{v} \in \mathbb{R}^3$.
False: $(\vec{u} \times \vec{v}) \cdot \vec{u} = (\vec{u} \times \vec{v}) \cdot \vec{v} = 0 \neq 1$, since $\vec{u} \times \vec{v}$ is orthogonal to both $\vec{u}$ and $\vec{v}$.

2. Let $\vec{u} = (1, 1, 1)$ and $\vec{v} = (0, 1, 0)$.

(a) Find a nonzero vector $\vec{w}$ in the plane of $\vec{u}$ and $\vec{v}$ such that $\vec{w}$ is orthogonal to $\vec{u}$. (4 points)

If $\vec{w}$ is in the plane of $\vec{u}$ and $\vec{v}$, then $\vec{w} = a\vec{u} + b\vec{v}$ (i.e. $\vec{w}$ is a linear combination of $\vec{u}$ and $\vec{v}$). Thus:

$\vec{w} = a\vec{u} + b\vec{v} = a(1,1,1) + b(0,1,0) = (a,\ a+b,\ a)$

If $\vec{w}$ is orthogonal to $\vec{u}$, then $\vec{w} \cdot \vec{u} = 0$:

$\vec{w} \cdot \vec{u} = 0 \;\Rightarrow\; (a,\ a+b,\ a) \cdot (1,1,1) = 0 \;\Rightarrow\; a + (a+b) + a = 0 \;\Rightarrow\; 3a + b = 0$

Let $a$ be any nonzero number (say $a = 1$); then $b = -3a = -3$, and:

$\vec{w} = \vec{u} - 3\vec{v} = (1,1,1) - 3(0,1,0) = (1,\ -2,\ 1)$

(b) Find a vector $\vec{x}$ that is orthogonal to both $\vec{u}$ and $\vec{v}$. (4 points)

The cross product of two vectors in $\mathbb{R}^3$ is orthogonal to both vectors. Thus, let $\vec{x} = \vec{u} \times \vec{v}$: ...
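The counterexample in (h), the conjugate-symmetry claim in (c), and the computations in problem 2 can all be spot-checked numerically. The sketch below uses plain Python (no external libraries); the specific complex vectors and the $1 \times 2$ matrices are illustrative choices for the check, not taken from the exam.

```python
# Numerical spot-checks for the solutions above (a minimal sketch).

def cdot(p, q):
    """Complex dot product p . q = sum_i p_i * conj(q_i)."""
    return sum(pi * qi.conjugate() for pi, qi in zip(p, q))

def cross(p, q):
    """Cross product of two 3-vectors."""
    return [p[1]*q[2] - p[2]*q[1],
            p[2]*q[0] - p[0]*q[2],
            p[0]*q[1] - p[1]*q[0]]

# (c) For complex vectors, u . v equals (v . u)*, not v . u.
uc = [1 + 1j, 2 - 1j]          # illustrative vectors, not from the exam
vc = [2j, 1 + 0j]
assert cdot(uc, vc) == cdot(vc, uc).conjugate()

# (h) AB = I with non-square factors: A is 1x2, B is 2x1, AB is the 1x1 identity.
A = [[1, 0]]                   # 1 x 2
B = [[1], [0]]                 # 2 x 1
AB = [[sum(A[i][k] * B[k][j] for k in range(2))
       for j in range(1)] for i in range(1)]
assert AB == [[1]]             # I_1, even though neither factor is square

# 2(a): w = u - 3v lies in the plane of u and v and is orthogonal to u.
u = [1, 1, 1]
v = [0, 1, 0]
w = [ui - 3 * vi for ui, vi in zip(u, v)]
print(w)                                        # [1, -2, 1]
print(sum(wi * ui for wi, ui in zip(w, u)))     # 0  (w . u)

# 2(b): x = u x v is orthogonal to both u and v.
x = cross(u, v)
print(x)                                        # [-1, 0, 1]
assert sum(xi * ui for xi, ui in zip(x, u)) == 0
assert sum(xi * vi for xi, vi in zip(x, v)) == 0
```

The cross product gives $\vec{x} = (-1, 0, 1)$; any nonzero scalar multiple of it would serve equally well for 2(b).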
This note was uploaded on 02/09/2010 for the course AMS 510, taught by Professor Feinberg, E., during the Fall '08 term at SUNY Stony Brook.