Linear Algebra and Its Applications
Fourth Edition
Gilbert Strang

Contents

Preface

1 Matrices and Gaussian Elimination
  1.1 Introduction
  1.2 The Geometry of Linear Equations
  1.3 An Example of Gaussian Elimination
  1.4 Matrix Notation and Matrix Multiplication
  1.5 Triangular Factors and Row Exchanges
  1.6 Inverses and Transposes
  1.7 Special Matrices and Applications
  Review Exercises

2 Vector Spaces
  2.1 Vector Spaces and Subspaces
  2.2 Solving Ax = 0 and Ax = b
  2.3 Linear Independence, Basis, and Dimension
  2.4 The Four Fundamental Subspaces
  2.5 Graphs and Networks
  2.6 Linear Transformations
  Review Exercises

3 Orthogonality
  3.1 Orthogonal Vectors and Subspaces
  3.2 Cosines and Projections onto Lines
  3.3 Projections and Least Squares
  3.4 Orthogonal Bases and Gram-Schmidt
  3.5 The Fast Fourier Transform
  Review Exercises

4 Determinants
  4.1 Introduction
  4.2 Properties of the Determinant
  4.3 Formulas for the Determinant
  4.4 Applications of Determinants
  Review Exercises

5 Eigenvalues and Eigenvectors
  5.1 Introduction
  5.2 Diagonalization of a Matrix
  5.3 Difference Equations and Powers A^k
  5.4 Differential Equations and e^{At}
  5.5 Complex Matrices
  5.6 Similarity Transformations
  Review Exercises

6 Positive Definite Matrices
  6.1 Minima, Maxima, and Saddle Points
  6.2 Tests for Positive Definiteness
  6.3 Singular Value Decomposition
  6.4 Minimum Principles
  6.5 The Finite Element Method

7 Computations with Matrices
  7.1 Introduction
  7.2 Matrix Norm and Condition Number
  7.3 Computation of Eigenvalues
  7.4 Iterative Methods for Ax = b

8 Linear Programming and Game Theory
  8.1 Linear Inequalities
  8.2 The Simplex Method
  8.3 The Dual Problem
  8.4 Network Models
  8.5 Game Theory

A Intersection, Sum, and Product of Spaces
  A.1 The Intersection of Two Vector Spaces
  A.2 The Sum of Two Vector Spaces
  A.3 The Cartesian Product of Two Vector Spaces
  A.4 The Tensor Product of Two Vector Spaces
  A.5 The Kronecker Product A ⊗ B of Two Matrices

B The Jordan Form
C Matrix Factorizations
D Glossary: A Dictionary for Linear Algebra
E MATLAB Teaching Codes
F Linear Algebra in a Nutshell

(Figure: the four fundamental subspaces for Ax = b and A^T y = c. The row space C(A^T), of dimension r, and the nullspace N(A), of dimension n − r, live in R^n; the column space C(A), of dimension r, and the left nullspace N(A^T), of dimension m − r, live in R^m. Ax = 0 in the nullspace, A^T y = 0 in the left nullspace.)

Preface

Revising this textbook has been a special challenge, for a very nice reason. So many people have read this book, and taught from it, and even loved it. The spirit of the book could never change. This text was written to help our teaching of linear algebra keep up with the enormous importance of this subject—which just continues to grow.

One step was certainly possible and desirable—to add new problems. Teaching for all these years required hundreds of new exam questions (especially with quizzes going onto the web). I think you will approve of the extended choice of problems. The questions are still a mixture of explain and compute—the two complementary approaches to learning this beautiful subject.

I personally believe that many more people need linear algebra than calculus. Isaac Newton might not agree! But he isn’t teaching mathematics in the 21st century (and maybe he wasn’t a great teacher, but we will give him the benefit of the doubt). Certainly the laws of physics are well expressed by differential equations. Newton needed calculus—quite right. But the scope of science and engineering and management (and life) is now so much wider, and linear algebra has moved into a central place.

May I say a little more, because many universities have not yet adjusted the balance toward linear algebra. Working with curved lines and curved surfaces, the first step is always to linearize. Replace the curve by its tangent line, fit the surface by a plane, and the problem becomes linear. The power of this subject comes when you have ten variables, or 1000 variables, instead of two.

You might think I am exaggerating to use the word “beautiful” for a basic course in mathematics. Not at all. This subject begins with two vectors v and w, pointing in different directions. The key step is to take their linear combinations. We multiply to get 3v and 4w, and we add to get the particular combination 3v + 4w. That new vector is in the same plane as v and w. When we take all combinations, we are filling in the whole plane.
If I draw v and w on this page, their combinations cv + dw fill the page (and beyond), but they don’t go up from the page. In the language of linear equations, I can solve cv + dw = b exactly when the vector b lies in the same plane as v and w.

I will keep going a little more to convert combinations of three-dimensional vectors into linear algebra. If the vectors are v = (1,2,3) and w = (1,3,4), put them into the columns of a matrix:

$$\text{matrix} = \begin{bmatrix} 1 & 1 \\ 2 & 3 \\ 3 & 4 \end{bmatrix}.$$

To find combinations of those columns, “multiply” the matrix by a vector (c,d):

Linear combinations cv + dw:
$$\begin{bmatrix} 1 & 1 \\ 2 & 3 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} c \\ d \end{bmatrix} = c\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} + d\begin{bmatrix} 1 \\ 3 \\ 4 \end{bmatrix}.$$

Those combinations fill a vector space. We call it the column space of the matrix. (For these two columns, that space is a plane.) To decide if b = (2,5,7) is on that plane, we have three components to get right. So we have three equations to solve:

$$\begin{bmatrix} 1 & 1 \\ 2 & 3 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} c \\ d \end{bmatrix} = \begin{bmatrix} 2 \\ 5 \\ 7 \end{bmatrix} \quad\text{means}\quad \begin{aligned} c + d &= 2 \\ 2c + 3d &= 5 \\ 3c + 4d &= 7. \end{aligned}$$

I leave the solution to you. The vector b = (2,5,7) does lie in the plane of v and w. If the 7 changes to any other number, then b won’t lie in the plane—it will not be a combination of v and w, and the three equations will have no solution.
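Readers who like to check such examples numerically can try a minimal MATLAB sketch like the one below; the variable names A, b, and cd are illustrative choices, not part of the text. The backslash operator returns the least-squares coefficients, and a zero residual confirms that b is a combination of the columns.

% Columns v = (1,2,3) and w = (1,3,4); right side b = (2,5,7).
A = [1 1; 2 3; 3 4];
b = [2; 5; 7];

% Backslash solves the three equations in the least-squares sense.
cd = A \ b;                       % coefficients c and d

% If A*cd reproduces b (up to roundoff), b lies in the column space.
residual = norm(A*cd - b);
if residual < 1e-10
    fprintf('b is in the plane of v and w: c = %g, d = %g\n', cd(1), cd(2));
else
    fprintf('b is not a combination of v and w (residual %g)\n', residual);
end

Changing the 7 to any other number makes the residual nonzero, which is exactly the no-solution case just described.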
Now I can describe the first part of the book, about linear equations Ax = b. The matrix A has n columns and m rows. Linear algebra moves steadily to n vectors in m-dimensional space. We still want combinations of the columns (in the column space). We still get m equations to produce b (one for each row). Those equations may or may not have a solution. They always have a least-squares solution.

The interplay of columns and rows is the heart of linear algebra. It’s not totally easy, but it’s not too hard. Here are four of the central ideas:

1. The column space (all combinations of the columns).
2. The row space (all combinations of the rows).
3. The rank (the number of independent columns, or rows).
4. Elimination (the good way to find the rank of a matrix).

I will stop here, so you can start the course.

Web Pages

It may be helpful to mention the web pages connected to this book. So many messages come back with suggestions and encouragement, and I hope you will make free use of everything. You can directly access the course web page, which is continually updated for the course that is taught every semester. Linear algebra is also on MIT’s OpenCourseWare site, where 18.06 became exceptional by including videos of the lectures (which you definitely don’t have to watch...). Here is a part of what is available on the web:

1. Lecture schedule and current homeworks and exams with solutions.
2. The goals of the course, and conceptual questions.
3. Interactive Java demos (audio is now included for eigenvalues).
4. Linear Algebra Teaching Codes and MATLAB problems.
5. Videos of the complete course (taught in a real classroom).

The course page has become a valuable link to the class, and a resource for the students. I am very optimistic about the potential for graphics with sound. The bandwidth for voiceover is low, and FlashPlayer is freely available. This offers a quick review (with active experiment), and the full lectures can be downloaded. I hope professors and students worldwide will find these web pages helpful. My goal is to make this book as useful as possible with all the course material I can provide.

Other Supporting Materials

Student Solutions Manual 0-495-01325-0. The Student Solutions Manual provides solutions to the odd-numbered problems in the text.

Instructor’s Solutions Manual 0-030-10588-4. The Instructor’s Solutions Manual has teaching notes for each chapter and solutions to all of the problems in the text.

Structure of the Course

The two fundamental problems are Ax = b and Ax = λx for square matrices A. The first problem Ax = b has a solution when A has independent columns. The second problem Ax = λx looks for independent eigenvectors. A crucial part of this course is to learn what “independence” means.

I believe that most of us learn first from examples. You can see that

$$A = \begin{bmatrix} 1 & 1 & 2 \\ 1 & 2 & 3 \\ 1 & 3 & 4 \end{bmatrix} \quad\text{does not have independent columns.}$$

Column 1 plus column 2 equals column 3. A wonderful theorem of linear algebra says that the three rows are not independent either. The third row must lie in the same plane as the first two rows. Some combination of rows 1 and 2 will produce row 3. You might find that combination quickly (I didn’t). In the end I had to use elimination to discover that the right combination uses 2 times row 2, minus row 1.
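A short MATLAB sketch can confirm this numerically; again the variable names (rows12, ab) are illustrative. It reports the rank and recovers the combination of rows 1 and 2 that produces row 3.

% The 3 by 3 example above: column 1 + column 2 = column 3.
A = [1 1 2; 1 2 3; 1 3 4];

% The rank comes out as 2, so neither the columns nor the rows are independent.
r = rank(A);

% Find the combination a*(row 1) + b*(row 2) = row 3.
rows12 = A(1:2, :)';            % rows 1 and 2, stored as columns of a 3 by 2 matrix
row3   = A(3, :)';
ab = rows12 \ row3;             % exact here because the system is consistent

fprintf('rank = %d;  row 3 = %g*(row 1) + %g*(row 2)\n', r, ab(1), ab(2));

Running it prints a = −1 and b = 2, the same “2 times row 2, minus row 1” found by elimination.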
Elimination is the simple and natural way to understand a matrix by producing a lot of zero entries. So the course starts there. But don’t stay there too long! You have to get from combinations of the rows, to independence of the rows, to “dimension of the row space.” That is a key goal, to see whole spaces of vectors: the row space and the column space and the nullspace.

A further goal is to understand how the matrix acts. When A multiplies x it produces the new vector Ax. The whole space of vectors moves—it is “transformed” by A. Special transformations come from particular matrices, and those are the foundation stones of linear algebra: diagonal matrices, orthogonal matrices, triangular matrices, symmetric matrices.

The eigenvalues of those matrices are special too. I think 2 by 2 matrices provide terrific examples of the information that eigenvalues can give. Sections 5.1 and 5.2 are worth careful reading, to see how Ax = λx is useful. Here is a case in which small matrices allow tremendous insight.

Overall, the beauty of linear algebra is seen in so many different ways:

1. Visualization. Combinations of vectors. Spaces of vectors. Rotation and reflection and projection of vectors. Perpendicular vectors. Four fundamental subspaces.
2. Abstraction. Independence of vectors. Basis and dimension of a vector space. Linear transformations. Singular value decomposition and the best basis.
3. Computation. Elimination to produce zero entries. Gram-Schmidt to produce orthogonal vectors. Eigenvalues to solve differential and difference equations.
4. Applications. Least-squares solution when Ax = b has too many equations. Difference equations approximating differential equations. Markov probability matrices (the basis for Google!). Orthogonal eigenvectors as principal axes (and more...).

To go further with those applications, may I mention the books published by Wellesley-Cambridge Press. They are all linear algebra in disguise, applied to signal processing and partial differential equations and scientific computing (and even GPS). If you look at the Wellesley-Cambridge Press site, you will see part of the reason that linear algebra is so widely used.

After this preface, the book will speak for itself. You will see the spirit right away. The emphasis is on understanding—I try to explain rather than to deduce. This is a book about real mathematics, not endless drill. In class, I am constantly working with examples to teach what students need.

Acknowledgments

I enjoyed writing this book, and I certainly hope you enjoy reading it. A big part of the pleasure comes from working with friends. I had wonderful help from Brett Coonley and Cordula Robinson and Erin Maneri. They created the LaTeX files and drew all the figures. Without Brett’s steady support I would never have completed this new edition.

Earlier help with the Teaching Codes came from Steven Lee and Cleve Moler. Those follow the steps described in the book; MATLAB and Maple and Mathematica are faster for large matrices. All can be used (optionally) in this course. I could have added “Factorization” to that list above, as a fifth avenue to the understanding of matrices:

[L, U, P] = lu(A) for linear equations
[Q, R] = qr(A) to make the columns orthogonal
[S, E] = eig(A) to find eigenvectors and eigenvalues.

In giving thanks, I never forget the first dedication of this textbook, years ago. That was a special chance to thank my parents for so many unselfish gifts. Their example is an inspiration for my life. And I thank the reader too, hoping you like this book.

Gilbert Strang

Chapter 1
Matrices and Gaussian Elimination

1.1 Introduction

This book begins with the central problem of linear algebra: solving linear equations. The most important case, and the simplest, is when the number of unknowns equals the number of equations. We have n equations in n unknowns, starting with n = 2:

Two equations, two unknowns:
1x + 2y = 3
4x + 5y = 6.     (1)

The unknowns are x and y. I want to describe two ways, elimination and determinants, to solve these equations. Certainly x and y are determined by the numbers 1, 2, 3, 4, 5, 6. The question is how to use those six numbers to solve the system.

1. Elimination. Subtract 4 times the first equation from the second equation. This eliminates x from the second equation, and it leaves one equation for y:

(equation 2) − 4(equation 1):     −3y = −6.     (2)

Immediately we know y = 2. Then x comes from the first equation 1x + 2y = 3:

Back-substitution:     1x + 2(2) = 3     gives     x = −1.     (3)

Proceeding carefully, we check that x and y also solve the second equation. This should work and it does: 4 times (x = −1) plus 5 times (y = 2) equals 6.

2. Determinants. The solution y = 2 depends completely on those six numbers in the equations. There must be a formula for y (and also x). It is a “ratio of determinants” and I hope you will allow me to write it down directly:

$$y = \frac{\begin{vmatrix} 1 & 3 \\ 4 & 6 \end{vmatrix}}{\begin{vmatrix} 1 & 2 \\ 4 & 5 \end{vmatrix}} = \frac{1 \cdot 6 - 3 \cdot 4}{1 \cdot 5 - 2 \cdot 4} = \frac{-6}{-3} = 2. \qquad (4)$$

That could seem a little mysterious, unless you already know about 2 by 2 determinants. They gave the same answer y = 2, coming from the same ratio of −6 to −3. If we stay with determinants (which we don’t plan to do), there will be a similar formula to compute the other unknown, x:

$$x = \frac{\begin{vmatrix} 3 & 2 \\ 6 & 5 \end{vmatrix}}{\begin{vmatrix} 1 & 2 \\ 4 & 5 \end{vmatrix}} = \frac{3 \cdot 5 - 2 \cdot 6}{1 \cdot 5 - 2 \cdot 4} = \frac{3}{-3} = -1. \qquad (5)$$

Let me compare those two approaches, looking ahead to real problems when n is much larger (n = 1000 is a very moderate size in scientific computing). The truth is that direct use of the determinant formula for 1000 equations would be a total disaster. It would use the million numbers on the left sides correctly, but not efficiently. We will find that formula (Cramer’s Rule) in Chapter 4, but we want a good method to solve 1000 equations in Chapter 1.

That good method is Gaussian Elimination. This is the algorithm that is constantly used to solve large systems of equations.
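As a small illustration, here is a MATLAB sketch (variable names xy, x, y are illustrative) that solves this 2 by 2 example both ways: the backslash operator carries out elimination, and the explicit ratios of determinants reproduce formulas (4) and (5).

% The 2 by 2 example: x + 2y = 3, 4x + 5y = 6.
A = [1 2; 4 5];
b = [3; 6];

% Elimination (backslash uses an LU factorization with pivoting).
xy = A \ b;                          % gives x = -1, y = 2

% The same answers from ratios of determinants (Cramer's Rule).
x = det([3 2; 6 5]) / det(A);        % (3*5 - 2*6)/(1*5 - 2*4) =  3/-3 = -1
y = det([1 3; 4 6]) / det(A);        % (1*6 - 3*4)/(1*5 - 2*4) = -6/-3 =  2

fprintf('elimination: x = %g, y = %g;  determinants: x = %g, y = %g\n', ...
        xy(1), xy(2), x, y);

Both routes give (x, y) = (−1, 2) here; the point of the comparison below is what happens when n is large.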
From the examples in a textbook (n = 3 is close to the upper limit on the patience of the author and reader) you might not see much difference. Equations (2) and (4) used essentially the same steps to find y = 2. Certainly x came faster by the back-substitution in equation (3) than the ratio in (5). For larger n there is absolutely no question. Elimination wins (and this is even the best way to compute determinants).

The idea of elimination is deceptively simple—you will master it after a few examples. It will become the basis for half of this book, simplifying a matrix so that we can understand it. Together with the mechanics of the algorithm, we want to explain four deeper aspects in this chapter. They are:

1. Linear equations lead to geometry of planes. It is not easy to visualize a nine-dimensional plane in ten-dimensional space. It is harder to see ten of those planes, intersecting at the solution to ten equations—but somehow this is almost possible. Our example has two lines in Figure 1.1, meeting at the point (x, y) = (−1, 2). Linear algebra moves that picture into ten dimensions, where the intuition has to imagine the geometry (and gets it right).

2. We move to matrix notation, writing the n unknowns as a vector x and the n equations as Ax = b. We multiply A by “elimination matrices” to reach an upper triangular matrix U. Those steps factor A into L times U, where L is lower triangular. I will write down A and its factors for our example, and explain them at the right time:

Factorization
$$A = \begin{bmatrix} 1 & 2 \\ 4 & 5 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 4 & 1 \end{bmatrix} \begin{bmatrix} 1 & 2 \\ 0 & -3 \end{bmatrix} = L \text{ times } U. \qquad (6)$$

(Figure 1.1: The example has one solution. Singular cases have none or too many. Three pictures: the lines x + 2y = 3 and 4x + 5y = 6 meet at the one solution (x, y) = (−1, 2); the parallel lines x + 2y = 3 and 4x + 8y = 6 have no solution; x + 2y = 3 and 4x + 8y = 12 give a whole line of solutions.)

First we have to introduce matrices and vectors and the rules for multiplication. Every matrix has a transpose A^T. This matrix has an inverse A^{-1}.

3. In most cases elimination goes forward without difficulties. The matrix has an inverse and the system Ax = b has one solution. In exceptional cases the method will break down—either the equations were written in the wrong order, which is easily fixed by exchanging them, or the equations don’t have a unique solution. That singular case will appear if 8 replaces 5 in our example:

Singular case. Two parallel lines: 1x + 2y = 3, 4x + 8y...