JaGSSRNotes

Solving Ax = b by Iteration (Jacobi (JA), Gauss-Seidel (GS), and Successive Relaxation (SR))

a. Iterative methods converge if A is diagonally dominant: each diagonal element is larger in absolute value than the sum of the absolute values of the off-diagonal elements in its row. Some matrix problems may converge even when A is not diagonally dominant.

b. GS is generally faster than JA, and SR is usually the fastest. GS uses new x_i values as soon as they are computed, while JA does not update any value until a complete iteration cycle is finished.

c. Let A = L + D + U for Ax = b, where L is strictly lower triangular, D is diagonal, and U is strictly upper triangular. Then the basic iteration step is

(*)  x_new = T x_old + d,  with

     T_JA = -D^{-1}(L + U),                    d_JA = D^{-1} b
     T_GS = -(D + L)^{-1} U,                   d_GS = (D + L)^{-1} b
     T_SR = (D + αL)^{-1}[(1 - α)D - αU],      d_SR = α(D + αL)^{-1} b

where values in the range 0.4 to 1.8 work well for α. Some references use ω in place of α. With α = 1, SR is the same as GS.

d. A necessary and sufficient condition for (*) above to converge for any starting x is that the spectral radius (i.e., the magnitude of the largest eigenvalue) of the T matrix is less than 1 (see the sketch below).

Source: http://www.cs.wright.edu/~rtaylor/ceg416/other/content/LinearEqsEigNotes/JaGSSRNotes.pdf
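The following sketch is not part of the original notes; it shows one way to assemble the T and d pairs in (*) with NumPy and to check the spectral-radius condition in item d. The test matrix A, the vector b, the function names, and the tolerance are illustrative assumptions, not values from the notes.

import numpy as np

def split(A):
    # A = L + D + U with L strictly lower triangular, D diagonal, U strictly upper triangular
    D = np.diag(np.diag(A))
    L = np.tril(A, k=-1)
    U = np.triu(A, k=1)
    return L, D, U

def iteration_matrices(A, b, method="JA", alpha=1.0):
    # Build T and d so that x_new = T @ x_old + d, per (*) above
    L, D, U = split(A)
    if method == "JA":          # T_JA = -D^{-1}(L + U),  d_JA = D^{-1} b
        Dinv = np.linalg.inv(D)
        return -Dinv @ (L + U), Dinv @ b
    if method == "GS":          # T_GS = -(D + L)^{-1} U, d_GS = (D + L)^{-1} b
        M = np.linalg.inv(D + L)
        return -M @ U, M @ b
    if method == "SR":          # T_SR = (D + alpha*L)^{-1}[(1 - alpha)D - alpha*U]
        M = np.linalg.inv(D + alpha * L)
        return M @ ((1.0 - alpha) * D - alpha * U), alpha * (M @ b)
    raise ValueError("unknown method: " + method)

def spectral_radius(T):
    # Item d: the iteration converges for any starting x iff this value is < 1
    return max(abs(np.linalg.eigvals(T)))

def iterate(A, b, method="JA", alpha=1.0, tol=1e-10, max_iter=1000):
    # Repeat x_new = T x_old + d until successive iterates stop changing
    T, d = iteration_matrices(A, b, method, alpha)
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = T @ x + d
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative diagonally dominant system (|4| > |-1| + |-1| in every row).
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
b = np.array([2.0, 4.0, 10.0])

for method, alpha in [("JA", 1.0), ("GS", 1.0), ("SR", 1.2)]:
    T, _ = iteration_matrices(A, b, method, alpha)
    x = iterate(A, b, method, alpha)
    print(method, "rho(T) =", round(float(spectral_radius(T)), 4),
          " x =", np.round(x, 6), " residual =", float(np.linalg.norm(A @ x - b)))

With α = 1 the SR branch reproduces GS, and for this diagonally dominant example all three spectral radii come out below 1, so each iteration converges to the solution of Ax = b.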


