Collecting the multipliers into the unit lower triangular matrix
\[
L = \begin{pmatrix}
1 & & & & \\
l_{21} & 1 & & & \\
l_{31} & l_{32} & 1 & & \\
\vdots & \vdots & & \ddots & \\
l_{n1} & l_{n2} & \cdots & l_{n,n-1} & 1
\end{pmatrix},
\tag{10.39}
\]
where the $l_{ij}$, $j = 1, \ldots, n-1$, $i = j+1, \ldots, n$, are the multipliers (computed after all the rows have been rearranged), we arrive at the anticipated factorization $PA = LU$. Incidentally, up to sign, Gaussian elimination also produces the determinant of $A$, because
\[
\det(PA) = \pm \det(A) = \det(LU) = \det(U) = a^{(1)}_{11} a^{(2)}_{22} \cdots a^{(n)}_{nn},
\tag{10.40}
\]
and so $\det(A)$ is plus or minus the product of all the pivots in the elimination process.

In the implementation of Gaussian elimination, the array storing the augmented matrix $A_b$ is overwritten to save memory. The pseudocode with partial pivoting (assuming $a_{i,n+1} = b_i$, $i = 1, \ldots, n$) is presented in Algorithm 3.

The Cost of Gaussian Elimination

We now do an operation count of Gaussian elimination to solve an $n \times n$ linear system $Ax = b$.

We focus on the elimination, as we already know that the work for the backward substitution step is $O(n^2)$. For each round of elimination, $j = 1, \ldots, n-1$, we need one division to compute each of the $n-j$ multipliers, plus $(n-j)(n-j+1)$ multiplications and $(n-j)(n-j+1)$ additions (subtractions) to perform the eliminations. Thus, the total number of operations is
\[
W(n) = \sum_{j=1}^{n-1} \left[ 2(n-j)(n-j+1) + (n-j) \right] = \sum_{j=1}^{n-1} \left[ 2(n-j)^2 + 3(n-j) \right]
\tag{10.41}
\]
and, using (10.10) and
\[
\sum_{i=1}^{m} i^2 = \frac{m(m+1)(2m+1)}{6},
\tag{10.42}
\]
CHAPTER 10. LINEAR SYSTEMS OF EQUATIONS I

Algorithm 3 Gaussian Elimination with Partial Pivoting
1: for $j = 1, \ldots, n-1$ do
2:   find $p$ such that $|a_{pj}| = \max_{j \le i \le n} |a_{ij}|$  ▷ Find pivot row
3:   if $|a_{pj}| = 0$ then
4:     stop  ▷ Matrix is singular
5:   end if
6:   $a_{jk} \leftrightarrow a_{pk}$, $k = j, \ldots, n+1$  ▷ Exchange rows
7:   for $i = j+1, \ldots, n$ do
8:     $m \leftarrow a_{ij}/a_{jj}$  ▷ Compute multiplier
9:     $a_{ik} \leftarrow a_{ik} - m \, a_{jk}$, $k = j+1, \ldots, n+1$  ▷ Elimination
10:    $a_{ij} \leftarrow m$  ▷ Store multiplier
11:  end for
12: end for
13: for $i = n, n-1, \ldots, 1$ do  ▷ Backward substitution
14:   $x_i \leftarrow \Bigl(a_{i,n+1} - \sum_{j=i+1}^{n} a_{ij} x_j\Bigr) / a_{ii}$
15: end for

we get
\[
W(n) = \frac{2}{3} n^3 + O(n^2).
\tag{10.43}
\]
Thus, Gaussian elimination is computationally rather expensive for large systems of equations.

10.3 LU and Choleski Factorizations

If Gaussian elimination can be performed without row interchanges, then we obtain an $LU$ factorization of $A$, i.e., $A = LU$. This factorization can be advantageous when solving many linear systems with the same $n \times n$ matrix $A$ but different right-hand sides, because we can turn the problem $Ax = b$ into two triangular linear systems, which can be solved much more economically, in $O(n^2)$ operations. Indeed, from $LUx = b$ and setting $y = Ux$, we have
\[
Ly = b,
\tag{10.44}
\]
\[
Ux = y.
\tag{10.45}
\]
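As a concrete sketch of Algorithm 3 (in Python with NumPy; the function name `gauss_solve` and its interface are ours, not from the text), the following routine overwrites the augmented matrix in place, stores the multipliers in the eliminated entries, and tracks the sign of the row permutation so that, as in (10.40), $\det(A)$ is recovered as plus or minus the product of the pivots:

```python
import numpy as np

def gauss_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.

    Returns (x, det), where det is recovered from the pivots as in (10.40).
    A sketch following Algorithm 3; indices are 0-based as usual in Python.
    """
    n = len(b)
    # Augmented matrix: column n holds b (i.e., a_{i,n+1} = b_i in the text).
    a = np.hstack([np.asarray(A, dtype=float),
                   np.asarray(b, dtype=float).reshape(n, 1)])
    sign = 1.0
    for j in range(n - 1):
        # Find the pivot row: largest |a_ij| over i >= j.
        p = j + np.argmax(np.abs(a[j:, j]))
        if a[p, j] == 0.0:
            raise ValueError("matrix is singular")
        if p != j:
            a[[j, p], :] = a[[p, j], :]       # exchange rows j and p
            sign = -sign                       # each swap flips the sign of det
        for i in range(j + 1, n):
            m = a[i, j] / a[j, j]              # compute multiplier
            a[i, j + 1:] -= m * a[j, j + 1:]   # elimination on row i
            a[i, j] = m                        # store multiplier (the l_ij of L)
    det = sign * np.prod(np.diag(a[:, :n]))    # +/- product of the pivots
    # Backward substitution.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (a[i, n] - a[i, i + 1:n] @ x[i + 1:]) / a[i, i]
    return x, det
```

For instance, `gauss_solve([[2, 1], [1, 3]], [3, 5])` returns the solution of that $2 \times 2$ system together with $\det(A) = 5$.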
Given $b$, we can solve the first system for $y$ with forward substitution and then solve the second system for $x$ with backward substitution. Thus, while the $LU$ factorization of $A$ has an $O(n^3)$ cost, subsequent solutions of linear systems with the same matrix $A$ but different right-hand sides can be obtained in $O(n^2)$ operations.

When can we obtain the factorization $A = LU$? The following result provides a useful sufficient condition.
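The two-triangular-system strategy of (10.44)–(10.45) can be sketched as follows (a minimal Python illustration; the function names are ours, and $L$ is assumed unit lower triangular as in (10.39)):

```python
import numpy as np

def forward_substitution(L, b):
    """Solve Ly = b for unit lower triangular L (diagonal entries are 1)."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):
        y[i] = b[i] - L[i, :i] @ y[:i]     # no division: L[i, i] = 1
    return y

def backward_substitution(U, y):
    """Solve Ux = y for upper triangular U."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

def lu_solve(L, U, b):
    """Given A = LU, solve Ax = b in O(n^2) via (10.44)-(10.45)."""
    y = forward_substitution(L, np.asarray(b, dtype=float))  # Ly = b
    return backward_substitution(U, y)                       # Ux = y
```

Each call to `lu_solve` costs only $O(n^2)$, so the $O(n^3)$ price of the factorization is paid once and amortized over all subsequent right-hand sides.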