18.06 Linear Algebra, Spring 2010 Transcript, Lecture 24

Okay. Here is a lecture on the applications of eigenvalues, and that will be Markov matrices. I'll tell you what a Markov matrix is, so this matrix A will be a Markov matrix, and I'll explain how they come up in applications. And then if I have time, I would like to say a little bit about Fourier series, which is a fantastic application of the projection chapter.

Okay. What's a Markov matrix? Can I just write down a typical Markov matrix? Say the columns are .1, .2, .7, then .01, .99, 0, and, let's say, .3, .3, .4. Okay. There's a totally just invented Markov matrix. What makes it a Markov matrix? Two properties that this matrix has. One: every entry is greater than or equal to zero. All entries greater than or equal to zero. And, of course, when I square the matrix, the entries will still be greater than or equal to zero. I'm going to be interested in the powers of this matrix, and this property, of course, is going to stay there. Really, Markov matrices, you'll see, are connected to probability ideas, and probabilities are never negative. The other property: do you see the other property in there? If I add down the columns, what answer do I get? One. So all columns add to one. And actually, when I square the matrix, that will be true again, so the powers of my matrix are all Markov matrices, and I'm interested, always, in the eigenvalues and the eigenvectors. And this question of steady state will come up. You remember we had steady states for differential equations last time? What was the eigenvalue in the differential equation case that led to a steady state? It was lambda equals zero. You remember that we did an example and one of the eigenvalues was lambda equals zero, so then we had an e to the zero t, a constant one. As time went on, that thing stayed steady.
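The two properties, and the fact that they survive squaring, are easy to check numerically. Here is a minimal sketch, not from the lecture itself; the matrix is the one written on the board (stored row by row, so the board's columns become the columns of the nested lists), and the helper names `is_markov` and `matmul` are my own:

```python
# Strang's example matrix; columns are (.1, .2, .7), (.01, .99, 0), (.3, .3, .4)
A = [[0.1, 0.01, 0.3],
     [0.2, 0.99, 0.3],
     [0.7, 0.0,  0.4]]

def is_markov(M):
    """Check the two Markov properties: all entries >= 0, every column sums to 1."""
    n = len(M)
    nonneg = all(M[i][j] >= 0 for i in range(n) for j in range(n))
    cols_sum_one = all(abs(sum(M[i][j] for i in range(n)) - 1.0) < 1e-12
                       for j in range(n))
    return nonneg and cols_sum_one

def matmul(M, N):
    """Plain 3x3 (or n x n) matrix multiplication."""
    n = len(M)
    return [[sum(M[i][k] * N[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

print(is_markov(A))             # True
print(is_markov(matmul(A, A)))  # True: the square is again a Markov matrix
```

The column-sum property survives squaring because summing a column of A² regroups into a sum of column sums of A, each equal to one.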
Now, in the powers case, it's not a zero eigenvalue. Actually, with powers of a matrix, a zero eigenvalue, that part is going to die right away. It's an eigenvalue of one that's all-important. So this steady state will be totally connected with an eigenvalue of one and its eigenvector. In fact, the steady state will be the eigenvector for that eigenvalue. Okay, so that's what's coming. Now, for some reason that we have to see, this matrix has an eigenvalue of one. This property, that the columns all add to one, turns out to guarantee that one is an eigenvalue, so you can actually find that eigenvalue of a Markov matrix without computing any determinants of A minus lambda I. That matrix will have an eigenvalue of one, and we want to see why. And then the other thing is the key points; let me write these underneath.
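Both claims can be seen numerically. This is a sketch of my own, not part of the lecture: each column of A minus I sums to zero, so the rows of A minus I are dependent and its determinant is zero, which is why one is an eigenvalue; and repeatedly multiplying any probability vector by A settles into the steady state, the eigenvector for lambda equals one.

```python
A = [[0.1, 0.01, 0.3],
     [0.2, 0.99, 0.3],
     [0.7, 0.0,  0.4]]
n = len(A)

# Every column of A - I sums to zero (within roundoff), so A - I is singular
# and lambda = 1 is an eigenvalue, with no determinant computation needed.
for j in range(n):
    col_sum = sum(A[i][j] - (1.0 if i == j else 0.0) for i in range(n))
    assert abs(col_sum) < 1e-12

# Power iteration: start from any probability vector and apply A repeatedly.
# The other eigenvalues of this A have magnitude less than 1, so their
# components die out and only the lambda = 1 eigenvector survives.
u = [1.0, 0.0, 0.0]
for _ in range(1000):
    u = [sum(A[i][j] * u[j] for j in range(n)) for i in range(n)]

Au = [sum(A[i][j] * u[j] for j in range(n)) for i in range(n)]
print(u)  # the steady state, roughly [0.0175, 0.9621, 0.0204]
print(max(abs(Au[i] - u[i]) for i in range(n)) < 1e-12)  # True: A u = u
```

Note that the steady state is still a probability vector: its entries are nonnegative and sum to one, since every multiplication by A preserves the column-sum property.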
This note was uploaded on 08/22/2011 for the course MATH 1806 taught by Professor Strang during the Fall '10 term at MIT.