Sensitivity of Linear Least Squares

Assume that $A \in \mathbb{R}^{m \times n}$ has full column rank. We know that the problem

$$\min_x \|Ax - b\|_2 \tag{1}$$

has a unique solution, say $x_{LS}$, which satisfies the (nonsingular) normal equations

$$A^T A x = A^T b. \tag{2}$$

Here we will exploit the equivalence of (1) and (2) to investigate the sensitivity of (1). Note that how we compute $x_{LS}$ is not at issue; we may or may not use the normal-equations approach, but the normal equations give us an explicit functional relationship from which we can analyze the conditioning. Let $\|\cdot\| \equiv \|\cdot\|_2$ throughout.

Consider the perturbed problem

$$\min_x \|(A + tE)\,x - (b + te)\|, \tag{3}$$

where $t$ is a real parameter, $E$ and $e$ are fixed, and $x(t)$ denotes the solution of (3). The normal equations here are

$$(A + tE)^T (A + tE)\, x(t) = (A + tE)^T (b + te). \tag{4}$$

If $t$ is small enough, then $A + tE$ has full column rank, $(A + tE)^T (A + tE)$ is symmetric positive definite, $x(t)$ is continuously differentiable, and $x_{LS} = x(0)$. From Taylor's theorem we have (as with our analysis of $Ax = b$)

$$\frac{\|x_{LS} - x(t)\|}{\|x_{LS}\|} = |t|\,\frac{\|\dot{x}(0)\|}{\|x_{LS}\|} + O(t^2), \tag{5}$$

so we differentiate (4):

$$(A + tE)^T (A + tE)\,\dot{x}(t) + \left[(A + tE)^T E + E^T (A + tE)\right] x(t) = (A + tE)^T e + E^T (b + te),$$

and solve for $\dot{x}(0)$:

$$\dot{x}(0) = (A^T A)^{-1}\left[A^T (e - E x_{LS}) + E^T (b - A x_{LS})\right].$$

Now define

$$\kappa_2(A) = \|A\|\,\|(A^T A)^{-1} A^T\|$$

(generalizing the condition number for $Ax = b$ to rectangular matrices). Then with $r \equiv b - A x_{LS}$,

$$\frac{\|x_{LS} - x(t)\|}{\|x_{LS}\|} \le |t| \left\{ \frac{\|(A^T A)^{-1} A^T (e - E x_{LS})\|}{\|x_{LS}\|} + \frac{\|(A^T A)^{-1} E^T r\|}{\|x_{LS}\|} \right\} + O(t^2)$$

$$\le \kappa_2(A) \left( \frac{\|te\|}{\|A\|\,\|x_{LS}\|} + \frac{\|tE\|}{\|A\|} \right) + \|(A^T A)^{-1}\|\,\|A\|\,\|tE\|\,\frac{\|r\|}{\|A\|\,\|x_{LS}\|} + O(t^2).$$

Note that $\|(A^T A)^{-1}\|\,\|A\|^2 = \kappa_2(A)^2$, so the last term carries a factor $\kappa_2(A)^2$. Let $\theta$ be the angle between $b$ and $\mathrm{ColSp}(A)$. Since $r \perp A x_{LS}$,

$$\|A x_{LS}\| = \sqrt{\|b\|^2 - \|r\|^2} = \|b\| \cos\theta \le \|A\|\,\|x_{LS}\|.$$

Thus, the condition number of (1) is roughly

$$\kappa_2(A) \left( 1 + \frac{1}{|\cos\theta|}\,\kappa_2(A)\,\frac{\|r\|}{\|b\|} \right).$$

This signals trouble when $\kappa_2(A)$ is large, and becomes quadratic in $\kappa_2(A)$ if $b$ points away from $\mathrm{ColSp}(A)$.
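As a quick numerical sanity check on the definition of $\kappa_2(A)$ (this sketch is an illustration added here, not part of the notes), note that for full column rank $(A^T A)^{-1} A^T$ is the pseudoinverse $A^+$, so $\kappa_2(A)$ should agree with the usual singular-value ratio $\sigma_{\max}/\sigma_{\min}$:

```python
# Verify kappa_2(A) = ||A||_2 * ||(A^T A)^{-1} A^T||_2 = sigma_max / sigma_min
# for a random full-column-rank A. Names here are illustrative.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))        # full column rank with probability 1

pinv = np.linalg.inv(A.T @ A) @ A.T    # (A^T A)^{-1} A^T, equals A^+ here
kappa = np.linalg.norm(A, 2) * np.linalg.norm(pinv, 2)

s = np.linalg.svd(A, compute_uv=False) # singular values, descending
print(f"kappa_2(A) = {kappa:.6f}, sigma ratio = {s[0] / s[-1]:.6f}")
```

The two printed quantities should match to rounding error, confirming that the definition reduces to the familiar condition number when $A$ is square and nonsingular.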
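The final bound invites a numerical experiment. The sketch below (an illustration under my own synthetic setup, not from the notes; the construction of $A$ and all variable names are assumptions) builds an $A$ with $\kappa_2(A) = 10^4$, perturbs $A$ and $b$ with $\|E\| = \|A\|$ and $\|e\| = \|b\|$, and compares the observed relative change in $x_{LS}$ against the first-order estimate $t\,\kappa_2(A)\bigl(1 + \kappa_2(A)\,\|r\|/(\|b\| \cos\theta)\bigr)$:

```python
# Numerical check of the least-squares sensitivity bound on a synthetic problem.
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 5

# Construct A = U diag(sigma) V^T with condition number kappa_2(A) = 1e4.
U, _ = np.linalg.qr(rng.standard_normal((m, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
sigma = np.logspace(0, -4, n)              # singular values 1 down to 1e-4
A = U @ np.diag(sigma) @ V.T
kappa = sigma[0] / sigma[-1]               # kappa_2(A) = 1e4

b = rng.standard_normal(m)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
r = b - A @ x_ls                           # residual, orthogonal to ColSp(A)
cos_theta = np.linalg.norm(A @ x_ls) / np.linalg.norm(b)

# Perturbations scaled so that ||E|| = ||A|| and ||e|| = ||b||.
E = rng.standard_normal((m, n))
E *= np.linalg.norm(A, 2) / np.linalg.norm(E, 2)
e = rng.standard_normal(m)
e *= np.linalg.norm(b) / np.linalg.norm(e)

t = 1e-10
x_t, *_ = np.linalg.lstsq(A + t * E, b + t * e, rcond=None)

observed = np.linalg.norm(x_ls - x_t) / np.linalg.norm(x_ls)
estimate = t * kappa * (1 + kappa * np.linalg.norm(r)
                        / (np.linalg.norm(b) * cos_theta))
print(f"observed relative change: {observed:.3e}")
print(f"first-order estimate:     {estimate:.3e}")
```

On a synthetic example like this the observed change typically sits well below the estimate, which is pessimistic because the derivation replaces $\|A\|\,\|x_{LS}\|$ by its lower bound $\|b\| \cos\theta$; the quadratic $\kappa_2(A)^2$ term becomes dominant as $b$ is rotated away from $\mathrm{ColSp}(A)$.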