MIT18_06S10_L24b - 18.06 Linear Algebra, Spring 2010
Transcript – Lecture 24b

OK, this is quiz review day. The quiz coming up is this Wednesday, this hour, one o'clock, in Walker -- top floor of Walker -- closed book, all normal.

I wrote down what we've covered in this second part of the course, and actually I'm impressed as I write it. So that's chapter four on orthogonality, and what this is suggesting is that those columns are orthonormal vectors, and then we call that matrix Q. And how do we state the fact that those columns are orthonormal in terms of Q? It means that Q transpose Q is the identity. So that's the matrix statement of the property that the columns are orthonormal: the dot products are either one or zero. Then we computed the projections onto lines and onto subspaces, and we used that to solve problems Ax = b in the least squares sense -- when there was no solution, we found the best solution. And then finally this Gram-Schmidt idea, which takes independent vectors and lines them up -- subtracts off the projections onto the part you've already done, so that the new part is orthogonal -- and so it takes a basis to an orthonormal basis. Those calculations involve square roots a lot, because you're making things unit vectors, but you should know that step.

OK, for determinants, the big picture is the properties of the determinant: properties one, two and three define the determinant, and then four through ten were consequences. Then the big formula that has n factorial terms, half of them with plus signs and half with minus signs, and then the cofactor formula, which led us to a formula for the inverse.
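As a quick check of the chapter four ideas above, here is a minimal NumPy sketch (the matrix A and vector b are made-up example data, not from the lecture): Gram-Schmidt produces a Q with Q transpose Q equal to the identity, and the least squares solution makes the error orthogonal to the column space.

```python
import numpy as np

# Hypothetical example data: a tall matrix with independent columns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

def gram_schmidt(A):
    """Classical Gram-Schmidt: subtract projections onto the columns
    already done, then normalize (this is where the square roots come in)."""
    Q = np.zeros_like(A)
    for j in range(A.shape[1]):
        v = A[:, j].copy()
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

Q = gram_schmidt(A)

# Orthonormal columns means Q^T Q = I.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# Least squares: the best x solves the normal equations A^T A x = A^T b,
# and the error b - A x_hat is orthogonal to the column space.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
e = b - A @ x_hat
print(np.allclose(A.T @ e, 0))  # True
```

The `np.linalg.solve` call stands in for the hand computation; the point is that the two checks printed above are exactly the Q^T Q = I and orthogonal-error facts from the lecture.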
And finally, just so you know what's covered from chapter six: it's sections six point one and two. So that's the basic idea of eigenvalues and eigenvectors, the equation det(A - lambda I) = 0 for the eigenvalues -- the mechanical step -- and then the diagonalization, which is really Ax = lambda x for all n eigenvectors at once, if we have n independent eigenvectors, and then using that to compute powers of a matrix. You notice differential equations are not on this list, because that's six point three; that's for the third quiz.

OK. What I usually do for review is to take an old exam, try to pick out questions that are significant, and write them quickly on the board -- shall I proceed that way again? This exam is really old: November nineteen eighty-four, so that was before the Web existed. Not only were the lectures not on the Web, nobody even had a Web page, my God. But nevertheless, linear algebra was still as great as ever.
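The eigenvalue-to-powers idea above can be sketched in a few lines of NumPy (the matrix is a made-up example, not from the lecture): with n independent eigenvectors, A = S Lambda S^{-1}, so A^k = S Lambda^k S^{-1}, and each eigenvector just gets scaled by lambda^k.

```python
import numpy as np

# Hypothetical 2x2 example with distinct real eigenvalues (1.0 and 0.7).
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# np.linalg.eig solves det(A - lambda I) = 0 numerically;
# the columns of S are the eigenvectors.
lam, S = np.linalg.eig(A)

# With n independent eigenvectors, A = S Lambda S^{-1},
# so A^k = S Lambda^k S^{-1}: only the eigenvalues get raised to the power.
k = 5
A_k = S @ np.diag(lam**k) @ np.linalg.inv(S)

# Same answer as multiplying A by itself k times.
print(np.allclose(A_k, np.linalg.matrix_power(A, k)))  # True
```

This is exactly why powers of a matrix are on the quiz list: once you have the eigenvalues and eigenvectors, A^k costs no more than one diagonalization.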

This note was uploaded on 08/22/2011 for the course MATH 1806 taught by Professor Strang during the Fall '10 term at MIT.
