CS 290H, 26 October
Sparse approximate inverses, support graphs

Homework 2 due Mon 7 Nov.
- Sparse approximate inverse preconditioners
- Introduction to support theory

Preconditioned conjugate gradient iteration

    x_0 = 0,  r_0 = b,  y_0 = B^{-1} r_0,  d_0 = y_0
    for k = 1, 2, 3, ...
        alpha_k = (y_{k-1}^T r_{k-1}) / (d_{k-1}^T A d_{k-1})    [step length]
        x_k = x_{k-1} + alpha_k d_{k-1}                          [approximate solution]
        r_k = r_{k-1} - alpha_k A d_{k-1}                        [residual]
        y_k = B^{-1} r_k                                         [preconditioning solve]
        beta_k = (y_k^T r_k) / (y_{k-1}^T r_{k-1})               [improvement]
        d_k = y_k + beta_k d_{k-1}                               [search direction]

- Several vector inner products per iteration (easy to parallelize)
- One matrix-vector multiplication per iteration (medium to parallelize)
- One solve with the preconditioner per iteration (hard to parallelize)

Sparse approximate inverses

- Compute B^{-1} ~ A^{-1} explicitly
- Minimize ||A B^{-1} - I||_F (in parallel, by columns)
- Variants: factored form of B^{-1}, more fill, ...
- Good: very parallel, seldom breaks down
- Bad: effectiveness varies widely

[Figure: sparsity patterns of A and B^{-1}]

Support graph preconditioning

- +: New analytic tools, some new preconditioners
- +: Can use existing direct-methods software
- -: Current theory and techniques limited
- CFIM: complete factorization of an incomplete matrix ...
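The PCG iteration above can be sketched directly in code. This is a minimal NumPy sketch, not the course's reference implementation; the `apply_Binv` callback stands in for the preconditioning solve y = B^{-1} r, and the dense 1-D Laplacian with a Jacobi (diagonal) preconditioner is an illustrative choice, not from the notes.

```python
import numpy as np

def pcg(A, b, apply_Binv, tol=1e-8, maxiter=200):
    """Preconditioned conjugate gradients for SPD A.
    apply_Binv(r) returns B^{-1} r, the preconditioning solve."""
    x = np.zeros_like(b)
    r = b.copy()
    y = apply_Binv(r)
    d = y.copy()
    rho = y @ r                          # y_{k-1}^T r_{k-1}
    for _ in range(maxiter):
        Ad = A @ d
        alpha = rho / (d @ Ad)           # step length
        x = x + alpha * d                # approximate solution
        r = r - alpha * Ad               # residual
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        y = apply_Binv(r)                # preconditioning solve
        rho_new = y @ r
        beta = rho_new / rho             # improvement
        d = y + beta * d                 # search direction
        rho = rho_new
    return x

# Illustrative example (assumed, not from the notes): 1-D Laplacian
# with a Jacobi preconditioner B = diag(A).
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
Dinv = 1.0 / np.diag(A)
x = pcg(A, b, lambda r: Dinv * r)
```

Note how the loop body matches the slide line for line: the inner products and the matrix-vector product parallelize well, while the quality and cost of `apply_Binv` is exactly the trade-off the rest of the lecture is about.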
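The column-by-column Frobenius-norm minimization for a sparse approximate inverse can also be sketched briefly. This is a simplified dense illustration under an assumed fixed sparsity pattern (the `pattern` argument and the tridiagonal test matrix are hypothetical choices, not from the notes); each column is an independent least-squares problem, which is what makes the construction so parallel.

```python
import numpy as np

def spai_columns(A, pattern):
    """Sparse approximate inverse M ~ A^{-1}: minimize ||A M - I||_F
    one column at a time. pattern[j] lists the row indices allowed to
    be nonzero in column j; each column is an independent (and hence
    embarrassingly parallel) small least-squares problem."""
    n = A.shape[0]
    M = np.zeros((n, n))
    for j in range(n):
        J = pattern[j]                   # allowed nonzeros in column j
        e_j = np.zeros(n)
        e_j[j] = 1.0
        # minimize ||A[:, J] m - e_j||_2 over the free entries m
        m, *_ = np.linalg.lstsq(A[:, J], e_j, rcond=None)
        M[J, j] = m
    return M

# Illustrative example: tridiagonal sparsity pattern on a small SPD matrix.
n = 6
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
pattern = [[i for i in (j - 1, j, j + 1) if 0 <= i < n] for j in range(n)]
M = spai_columns(A, pattern)
```

Richer patterns (more fill) shrink ||A M - I||_F further at higher construction cost, which is the "effectiveness varies widely" trade-off noted above.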
This note was uploaded on 12/27/2011 for the course CMPSC 290h taught by Professor Chong during the Fall '09 term at UCSB.