Numerical Linear Algebra Software
(based on slides written by Michael Grant)

• BLAS, ATLAS
• LAPACK
• sparse matrices

Prof. S. Boyd, EE364b, Stanford University

Numerical linear algebra in optimization

most memory usage and computation time in optimization methods is spent on numerical linear algebra, e.g.,
• constructing sets of linear equations (e.g., Newton or KKT systems)
  – matrix-matrix products, matrix-vector products, ...
• and solving them
  – factoring, forward and backward substitution, ...

... so knowing about numerical linear algebra is a good thing

Why not just use Matlab?

• Matlab (Octave, ...) is OK for prototyping an algorithm
• but you'll need to use a real language (e.g., C, C++, Python) when
  – your problem is very large, or has special structure
  – speed is critical (e.g., real-time)
  – your algorithm is embedded in a larger system or tool
  – you want to avoid proprietary software
• in any case, the numerical linear algebra in Matlab is done using standard free libraries

How to write numerical linear algebra software

DON'T! whenever possible, rely on existing, mature software libraries
• you can focus on the higher-level algorithm
• your code will be more portable, less buggy, and will run faster (sometimes much faster)

Netlib

the grandfather of all numerical linear algebra web sites: http://www.netlib.org
• maintained by the University of Tennessee, Oak Ridge National Laboratory, and colleagues worldwide
• most of the code is public domain or freely licensed
• much written in FORTRAN 77 (gasp!)

Basic Linear Algebra Subroutines (BLAS)

written by people who had the foresight to understand the future benefits of a standard suite of "kernel" routines for linear algebra.
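As an aside, the "factoring, forward and backward substitution" pattern mentioned earlier can be sketched in plain Python. This is a toy LU factorization without pivoting, for illustration only; production code calls LAPACK (e.g., dgetrf to factor, dgetrs to solve), which adds pivoting for numerical stability.

```python
def lu_factor(A):
    """Doolittle LU factorization: returns (L, U) with A = L @ U.
    Toy version: no pivoting, assumes nonzero pivots."""
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [row[:] for row in A]
    for k in range(n):
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]          # elimination multiplier
            L[i][k] = m
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return L, U

def forward_sub(L, b):
    """Solve L y = b for lower-triangular L."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][j] * y[j] for j in range(i))) / L[i][i]
    return y

def backward_sub(U, y):
    """Solve U x = y for upper-triangular U."""
    n = len(y)
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

# Factor once (O(n^3)), then each solve costs only O(n^2):
A = [[4.0, 2.0], [2.0, 3.0]]
b = [10.0, 9.0]
L, U = lu_factor(A)
x = backward_sub(U, forward_sub(L, b))     # x = [1.5, 2.0]
```

The split into a one-time factorization plus cheap triangular solves is exactly why optimization codes factor a Newton or KKT matrix once and reuse the factors across right-hand sides.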
created and organized in three levels:
• Level 1, 1973–1977: O(n) vector operations: addition, scaling, dot products, norms
• Level 2, 1984–1986: O(n²) matrix-vector operations: matrix-vector products, triangular matrix-vector solves, rank-1 and symmetric rank-2 updates
• Level 3, 1987–1990: O(n³) matrix-matrix operations: matrix-matrix products, triangular matrix solves, low-rank updates

BLAS operations

Level 1   addition/scaling          αx, αx + y
          dot products, norms       xᵀy, ‖x‖₂, ‖x‖₁
Level 2   matrix/vector products    αAx + βy, αAᵀx + βy
          rank-1 updates            A + αxyᵀ, A + αxxᵀ
          rank-2 updates            A + αxyᵀ + αyxᵀ
          triangular solves         αT⁻¹x, αT⁻ᵀx
Level 3   matrix/matrix products    αAB + βC, αABᵀ + βC, αAᵀB + βC, αAᵀBᵀ + βC
          rank-k updates            αAAᵀ + βC, αAᵀA + βC
          rank-2k updates           αAᵀB + αBᵀA + βC
          triangular solves         αT⁻¹C, αT⁻ᵀC

Level 1 BLAS naming convention

BLAS routines have a Fortran-inspired naming convention: ...
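To make the three levels concrete, here are pure-Python reference versions of one representative routine per level. These are only to show what each level computes and why the work grows as O(n), O(n²), O(n³); real code calls an optimized BLAS (e.g., daxpy, dgemv, dgemm, reachable from Python via scipy.linalg.blas) rather than writing these loops.

```python
def axpy(alpha, x, y):
    """Level 1: y <- alpha*x + y.  O(n) work on vectors."""
    return [alpha * xi + yi for xi, yi in zip(x, y)]

def gemv(alpha, A, x, beta, y):
    """Level 2: y <- alpha*A*x + beta*y.  O(n^2) work, matrix-vector."""
    return [alpha * sum(aij * xj for aij, xj in zip(row, x)) + beta * yi
            for row, yi in zip(A, y)]

def gemm(alpha, A, B, beta, C):
    """Level 3: C <- alpha*A*B + beta*C.  O(n^3) work, matrix-matrix."""
    return [[alpha * sum(A[i][k] * B[k][j] for k in range(len(B)))
             + beta * C[i][j]
             for j in range(len(B[0]))]
            for i in range(len(A))]

# Level 1 example: 2*[1, 2] + [3, 4]
print(axpy(2.0, [1.0, 2.0], [3.0, 4.0]))   # [5.0, 8.0]
```

Level 3 routines are where optimized BLAS implementations (ATLAS, OpenBLAS, vendor libraries) gain the most, since blocked matrix-matrix kernels can keep data in cache and reuse each loaded entry O(n) times.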
This note was uploaded on 04/09/2010 for the course EE 364B at Stanford.