CgConvergence



Administrivia: November 9, 2009

Final project proposals are due this Wednesday – please come discuss with me.

Reading in Saad (either edition):
- Sections 10.1–10.5: Preconditioning
- Recommended: Sections 6.5.1–6.5.4, 6.5.6: GMRES
- Section 6.7.1: Conjugate gradient
- Section 7.4.2: BiCGSTAB

Chapters 6 and 7 of Saad are excellent references on other Krylov subspace methods, but we won't cover most of them in class.
Conjugate gradient iteration

- One matrix-vector multiplication per iteration
- Two vector dot products per iteration
- Four n-vectors of working storage

x_0 = 0,  r_0 = b,  d_0 = r_0
for k = 1, 2, 3, ...
    α_k = (r_{k-1}^T r_{k-1}) / (d_{k-1}^T A d_{k-1})    step length
    x_k = x_{k-1} + α_k d_{k-1}                          approx solution
    r_k = r_{k-1} − α_k A d_{k-1}                        residual
    β_k = (r_k^T r_k) / (r_{k-1}^T r_{k-1})              improvement
    d_k = r_k + β_k d_{k-1}                              search direction
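The iteration above translates almost line for line into code. Here is a minimal sketch in NumPy (the function name `conjugate_gradient` and the convergence tolerance are my own choices, not from the slides); note it uses exactly one product with A and keeps only the four working vectors x, r, d, and Ad:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """CG for symmetric positive definite A, following the slide's recurrences."""
    n = len(b)
    if max_iter is None:
        max_iter = n                        # exact arithmetic converges in <= n steps
    x = np.zeros(n)                         # x_0 = 0
    r = b.copy()                            # r_0 = b
    d = r.copy()                            # d_0 = r_0
    for _ in range(max_iter):
        Ad = A @ d                          # the one matrix-vector product per iteration
        alpha = (r @ r) / (d @ Ad)          # step length
        x = x + alpha * d                   # approximate solution
        r_new = r - alpha * Ad              # residual
        beta = (r_new @ r_new) / (r @ r)    # improvement
        d = r_new + beta * d                # search direction
        r = r_new
        if np.sqrt(r @ r) < tol:
            break
    return x
```

Only two dot products are genuinely needed per pass: r @ r can be saved from the previous iteration's numerator rather than recomputed.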
Conjugate gradient: Orthogonal sequences
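The body of this slide is not legible in the preview, but the title refers to the standard orthogonality properties of CG: the residuals r_0, r_1, ... are mutually orthogonal, and the search directions d_0, d_1, ... are A-conjugate (d_i^T A d_j = 0 for i ≠ j). A small sketch that records both sequences so the property can be checked numerically (the helper `cg_sequences` is hypothetical, not from the slides):

```python
import numpy as np

def cg_sequences(A, b, k):
    """Run k CG steps, returning the lists of residuals r_i and directions d_i."""
    x = np.zeros(len(b))
    r = b.copy()
    d = r.copy()
    residuals, directions = [r.copy()], [d.copy()]
    for _ in range(k):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
        residuals.append(r.copy())
        directions.append(d.copy())
    return residuals, directions
```

In floating point the orthogonality holds only approximately and degrades as the iteration proceeds, which is one reason CG's finite-termination property is not observed exactly in practice.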

This note was uploaded on 12/27/2011 for the course CMPSC 290h taught by Professor Chong during the Fall '09 term at UCSB.

