# Introduction to Simulation, Lecture 7: Krylov-Subspace Matrix Solution Methods, Part II


## Slide 1

Introduction to Simulation, Lecture 7: Krylov-Subspace Matrix Solution Methods, Part II. Jacob White. Thanks to Deepak Ramaswamy, Michal Rewienski, and Karen Veroy.

## Slide 2: Outline

- Reminder about GCR
  - Residual-minimizing solution
  - Krylov subspace
  - Polynomial connection
- Review of eigenvalues and norms
  - Induced norms
  - Spectral mapping theorem
- Estimating the convergence rate
  - Chebyshev polynomials
- Preconditioners
  - Diagonal preconditioners
  - Approximate LU preconditioners
## Slide 3: Generalized Conjugate Residual Algorithm with Normalization

SMA-HPC ©2003 MIT

To solve $M x = b$, start from $r^0 = b - M x^0$ and, for $j = 0$ to $k-1$:

1. $p^j = r^j$ (the residual is the next search direction).
2. For $i = 0$ to $j-1$: $p^j \leftarrow p^j - \left[(M p^i)^T M p^j\right] p^i$ (orthogonalize the search direction).
3. $p^j \leftarrow \dfrac{p^j}{\left\|M p^j\right\|}$ (normalize, so that $\|M p^j\| = 1$).
4. $x^{j+1} = x^j + \left[(r^j)^T M p^j\right] p^j$ (update the solution).
5. $r^{j+1} = r^j - \left[(r^j)^T M p^j\right] M p^j$ (update the residual).
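The loop above can be sketched in Python with NumPy. This is a minimal illustration of the slide's algorithm, not a production solver; the default tolerance and iteration cap are assumptions.

```python
import numpy as np

def gcr(M, b, x0=None, tol=1e-10, max_iter=None):
    """Normalized GCR sketch for M x = b, following the slide's loop:
    each p^j starts as the residual r^j, is orthogonalized so that
    (M p^i)^T (M p^j) = 0 for i < j, and is scaled so ||M p^j|| = 1."""
    n = len(b)
    max_iter = n if max_iter is None else max_iter
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    r = b - M @ x                        # r^0 = b - M x^0
    P, MP = [], []                       # stored p^j and M p^j
    for j in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        p, Mp = r.copy(), M @ r          # residual is the next search direction
        for i in range(j):               # orthogonalize against earlier directions
            beta = MP[i] @ Mp
            p, Mp = p - beta * P[i], Mp - beta * MP[i]
        nrm = np.linalg.norm(Mp)
        p, Mp = p / nrm, Mp / nrm        # normalize so that ||M p^j|| = 1
        alpha = r @ Mp                   # optimal step: (r^j)^T M p^j
        x, r = x + alpha * p, r - alpha * Mp
        P.append(p); MP.append(Mp)
    return x
```

Storing both `p^j` and `M p^j` avoids recomputing matrix-vector products during orthogonalization, at the cost of keeping all previous directions in memory, which is the usual trade-off in GCR.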

## Slide 4: Algorithm Steps by Picture

[Figure: the normalized GCR steps sketched geometrically. 1) Orthogonalize $M r^k$ against the previous $M p^i$'s; 2) compute the residual-minimizing solution update along $p^k$. The picture shows search directions $p^0, p^1, p^2, p^3$ and the new residual $r^{k+1}$.]
## Slide 5: Generalized Conjugate Residual Algorithm, First Few Steps

Solve $M x = b$, starting from $r^0 = b - M x^0$.

- First search direction: $p^0 = \dfrac{r^0}{\left\|M r^0\right\|}$.
- Residual-minimizing solution: $r^1 = b - M x^1 = r^0 - \gamma_{1,0}\, M r^0$.
- Second search direction, chosen so that $(M p^1)^T (M p^0) = 0$:

$$p^1 = \frac{1}{\left\|M\left(r^1 - \beta_{1,0}\, p^0\right)\right\|}\left(r^1 - \beta_{1,0}\, p^0\right)$$
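The first step can be checked numerically. A minimal sketch on a hypothetical 2×2 system (the values of `M` and `b` are invented for illustration): because the step length is chosen to minimize $\|r^1\|$, the new residual must come out orthogonal to $M p^0$.

```python
import numpy as np

# Hypothetical 2x2 system M x = b (illustrative values, not from the notes).
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])
b = np.array([1.0, 2.0])

x0 = np.zeros(2)
r0 = b - M @ x0                       # r^0 = b - M x^0
p0 = r0 / np.linalg.norm(M @ r0)      # first search direction, ||M p^0|| = 1
Mp0 = M @ p0

alpha0 = r0 @ Mp0                     # residual-minimizing step length
x1 = x0 + alpha0 * p0
r1 = r0 - alpha0 * Mp0                # r^1 = r^0 - gamma_{1,0} M r^0

# Minimizing ||r^1|| over the step length forces r^1 orthogonal to M p^0:
print(abs(Mp0 @ r1))                  # ~0 (machine precision)
```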

## Slide 6: Generalized Conjugate Residual Algorithm, First Few Steps, Continued

- Residual-minimizing solution: $x^2 = x^1 + \left[(r^1)^T M p^1\right] p^1$, so that

$$r^2 = b - M x^2 = r^0 - \gamma_{2,0}\, M r^0 - \gamma_{2,1}\, M^2 r^0$$

- Third search direction, orthogonalized against the first two:

$$p^2 = \frac{1}{\left\|M\left(r^2 - \beta_{2,0}\, p^0 - \beta_{2,1}\, p^1\right)\right\|}\left(r^2 - \beta_{2,0}\, p^0 - \beta_{2,1}\, p^1\right)$$
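The polynomial form of $r^2$ above can be verified numerically: after two normalized GCR steps, $r^0 - r^2$ should lie exactly in $\mathrm{span}\{M r^0,\, M^2 r^0\}$. A sketch on a hypothetical 3×3 system (values invented for illustration):

```python
import numpy as np

# Hypothetical 3x3 system (illustrative, not from the notes).
M = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x, r = np.zeros(3), b.copy()
r0 = r.copy()
P, MP = [], []
for j in range(2):                     # two normalized GCR steps
    p, Mp = r.copy(), M @ r
    for i in range(j):
        beta = MP[i] @ Mp
        p, Mp = p - beta * P[i], Mp - beta * MP[i]
    nrm = np.linalg.norm(Mp)
    p, Mp = p / nrm, Mp / nrm
    alpha = r @ Mp
    x, r = x + alpha * p, r - alpha * Mp
    P.append(p); MP.append(Mp)
r2 = r

# Fit r^0 - r^2 = gamma_{2,0} M r^0 + gamma_{2,1} M^2 r^0 by least squares;
# a near-zero misfit confirms the Krylov-polynomial form of r^2.
K = np.column_stack([M @ r0, M @ (M @ r0)])
gamma, *_ = np.linalg.lstsq(K, r0 - r2, rcond=None)
misfit = np.linalg.norm(K @ gamma - (r0 - r2))
print(misfit)                          # ~0 (machine precision)
```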
## Slide 7: The kth Step of GCR

Orthogonalize and normalize the search direction:

$$\tilde p^{\,k} = r^k - \sum_{j=0}^{k-1} \left[(M p^j)^T M r^k\right] p^j, \qquad p^k = \frac{\tilde p^{\,k}}{\left\|M \tilde p^{\,k}\right\|}$$

Determine the optimal step size in the kth search direction:

$$\alpha_k = (r^k)^T M p^k$$

Update the solution and the residual:

$$x^{k+1} = x^k + \alpha_k\, p^k, \qquad r^{k+1} = r^k - \alpha_k\, M p^k$$
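One way to sanity-check the kth step is to verify the invariant it maintains: after orthogonalization and normalization, the vectors $M p^0, \dots, M p^{k-1}$ are orthonormal, and for an $n \times n$ system the residual is zero (in exact arithmetic) after at most $n$ steps. A sketch on a hypothetical 4×4 nonsymmetric system (values invented for illustration):

```python
import numpy as np

# Hypothetical 4x4 nonsymmetric system (illustrative, not from the notes).
M = np.array([[4.0, 1.0, 0.0, 0.0],
              [1.0, 3.0, 1.0, 0.0],
              [0.0, 1.0, 5.0, 1.0],
              [2.0, 0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])

x, r = np.zeros(4), b.copy()
P, MP = [], []
for k in range(4):
    if np.linalg.norm(r) < 1e-12:
        break
    p, Mp = r.copy(), M @ r            # p~^k starts as r^k
    for j in range(k):                 # subtract [(M p^j)^T M r^k] p^j
        beta = MP[j] @ Mp
        p, Mp = p - beta * P[j], Mp - beta * MP[j]
    nrm = np.linalg.norm(Mp)
    p, Mp = p / nrm, Mp / nrm          # normalize: ||M p^k|| = 1
    alpha = r @ Mp                     # alpha_k = (r^k)^T M p^k
    x, r = x + alpha * p, r - alpha * Mp
    P.append(p); MP.append(Mp)

Q = np.column_stack(MP)
print(np.allclose(Q.T @ Q, np.eye(len(MP))))  # the M p's are orthonormal
print(np.allclose(M @ x, b))                  # solved within n steps
```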
