M341, Fall 2010, Homework 10.

Section 4.4

Problem 10. Since det A = det A^T, we may consider the determinant of the matrix whose columns are u, v, w. These 3 vectors are linearly independent if and only if the three columns of this matrix are pivot columns, which happens if and only if the matrix has 3 pivots, if and only if the matrix is invertible, if and only if its determinant is nonzero. (A numerical sketch of this criterion appears after these solutions.)

Problem 17. The premise means that f(x) contains two monomials, say a_k x^k and a_ℓ x^ℓ, with k ≠ ℓ, a_k ≠ 0 and a_ℓ ≠ 0. Suppose there were a dependence relation between f(x) and x f'(x), say u f(x) + v x f'(x) = 0 with u, v not both zero. Comparing the coefficients of x^k and of x^ℓ on both sides gives u(a_k x^k) + v(x · k a_k x^{k−1}) = 0 and u(a_ℓ x^ℓ) + v(x · ℓ a_ℓ x^{ℓ−1}) = 0, so the homogeneous system

    u a_k + v k a_k = 0
    u a_ℓ + v ℓ a_ℓ = 0

would have a nontrivial solution in u and v. Since a_k ≠ 0 and a_ℓ ≠ 0, this system is equivalent to

    u + v k = 0
    u + v ℓ = 0,

whose determinant is ℓ − k ≠ 0, so the system has only the trivial solution u = v = 0. This contradiction shows that f(x) and x f'(x) are linearly independent. (A numerical check of this independence is sketched below.)

Problem 19. a) Claim: T = {A v_1, ..., A v_k} is linearly independent ⇒ S = {v_1, ..., v_k} is linearly independent. (In other words: Premise: T = {A v_1, ..., A v_k} is linearly independent. Conclusion: S = {v_1, ..., v_k} is linearly independent.)

Assume T is linearly independent and argue by contradiction. If S were not linearly independent, there would be real numbers a_1, ..., a_k, not all zero, such that a_1 v_1 + ... + a_k v_k = 0. Applying A to both sides of this equality, by the properties of matrix multiplication (distributivity, A(c v) = c A v, A 0 = 0), we get a_1 A v_1 + ... + a_k A v_k = 0, and this would be a dependence relation in T. But that is impossible, since T is assumed to be linearly independent.

b) Converse: S is linearly independent ⇒ T is linearly independent. This is false. For a counterexample, don't try to be clever; just keep it simple and take A = O, the zero matrix. (In fact any singular matrix yields a counterexample, provided S is chosen so that some nontrivial combination of its vectors lies in the null space of A; taking S to be a basis of R^n always works.) A numerical contrast between this part and part c) is sketched below.

c) If A is square and nonsingular (invertible), the converse is true. Indeed, A^{-1} exists, and if we write T = {w_1, ..., w_k}, then S = {A^{-1} w_1, ..., A^{-1} w_k}, and we may apply part a) with A^{-1} in place of A.

Section 4.5

Preliminary. In an n-dimensional vector space, for a set S of vectors, any 2 of the 3 following properties yield the third one: #S = n (S contains n vectors); S is linearly independent; S spans the space. (A rank-based illustration of this principle closes these notes.)
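The following is a minimal numerical sketch of the Problem 10 criterion, not part of the original homework; it assumes numpy, and the vectors u, v, w are arbitrary example choices.

```python
# Problem 10 sketch: three vectors in R^3 are linearly independent
# exactly when the matrix having them as columns has nonzero determinant.
import numpy as np

u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, 1.0])
w = np.array([1.0, 1.0, 0.0])

M = np.column_stack([u, v, w])   # columns are u, v, w
print(np.linalg.det(M))          # -3.0: nonzero, so u, v, w are independent

# A dependent triple for contrast: w2 = u + v forces det = 0.
w2 = u + v
M2 = np.column_stack([u, v, w2])
print(np.linalg.det(M2))         # 0 (up to floating-point roundoff)
```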
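The Problem 17 argument can likewise be sanity-checked numerically. This sketch (again an illustration, not part of the original solution) assumes the example polynomial f(x) = 3x + 5x^4, so k = 1 and ℓ = 4; polynomials are encoded as numpy coefficient vectors, lowest degree first.

```python
# Problem 17 sketch: if f has at least two monomials, then f(x) and
# x f'(x) are linearly independent; independence of two coefficient
# vectors is equivalent to the stacked matrix having rank 2.
import numpy as np

f = np.array([0.0, 3.0, 0.0, 0.0, 5.0])        # f(x) = 3x + 5x^4
fprime = np.polynomial.polynomial.polyder(f)   # f'(x) = 3 + 20x^3
xfprime = np.concatenate([[0.0], fprime])      # x f'(x) = 3x + 20x^4

# u f + v (x f') = 0 has only the trivial solution iff rank is 2.
print(np.linalg.matrix_rank(np.vstack([f, xfprime])))  # 2 -> independent
```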
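For Problem 19, a small numpy sketch contrasts parts b) and c). The matrices below are arbitrary illustrative choices, and S = {e_1, e_2} is the standard basis of R^2.

```python
# Problem 19 sketch: an invertible A preserves linear independence (part c),
# while a singular A can destroy it (part b).
import numpy as np

V = np.array([[1.0, 0.0],
              [0.0, 1.0]])            # columns v1 = e1, v2 = e2: independent

A_invertible = np.array([[2.0, 1.0],
                         [1.0, 1.0]])  # det = 1, invertible
A_singular = np.array([[1.0, 1.0],
                       [1.0, 1.0]])    # det = 0, singular

# rank of [A v1 | A v2] is 2 exactly when {A v1, A v2} is independent.
print(np.linalg.matrix_rank(A_invertible @ V))  # 2: independence preserved
print(np.linalg.matrix_rank(A_singular @ V))    # 1: counterexample to converse
```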
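Finally, the "two out of three" principle from the Section 4.5 preliminary can be observed in R^3. In this sketch (an illustration under the same numpy assumption, with arbitrarily chosen vectors), a set of exactly n = 3 vectors is independent if and only if it spans, and one rank computation detects both.

```python
# Section 4.5 sketch: for exactly n vectors in an n-dimensional space,
# rank n certifies both independence and spanning, hence a basis.
import numpy as np

M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 0.0]])   # columns: the 3 vectors of S

print(np.linalg.matrix_rank(M) == 3)  # True -> S is a basis of R^3
```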