Stochastic Process                    9/15/2006                    NCTUEE

Lecture 1: Review of Linear Algebra and Probability

Summary

This lecture reviews several fundamental concepts in Linear Algebra and Probability that we will see very often in this course. Specifically, I will discuss:

- Eigenvectors and eigenvalues
- Hermitian matrices
- Singular value decomposition (SVD)
- Random variables
- Conditional probability
- Expectation and conditional expectation

Notation

We will use the following notation rules, unless otherwise noted, throughout this course:

- Boldface upper-case letters represent matrices.
- Boldface lower-case letters represent vectors.
- Superscripts (·)^T and (·)^H denote transpose and Hermitian (conjugate) transpose, respectively.
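To make the last notation rule concrete, here is a minimal NumPy sketch (not part of the original notes; the matrix values are arbitrary) contrasting the plain transpose (·)^T with the Hermitian transpose (·)^H on a complex matrix:

```python
import numpy as np

# A small complex matrix (values chosen arbitrarily for illustration).
A = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 0j]])

A_T = A.T          # transpose: swaps rows and columns only
A_H = A.conj().T   # Hermitian transpose: transpose AND complex conjugate

print(A_T)
print(A_H)
```

For real-valued matrices the two coincide; the difference only appears once the entries are complex.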
1 Linear Algebra

Eigenvector and Eigenvalue

Let A ∈ ℂ^{n×n}. An eigenvector of A is a non-zero vector v ∈ ℂ^{n×1} such that A·v = λ·v. The constant λ ∈ ℂ is called the eigenvalue associated with v.

Finding Eigenvalues

Use the fact that A·v = λ·v holds for some non-zero v if and only if det(A − λ·I) = 0 to find the eigenvalues. Having obtained all the eigenvalues, solve the linear equation (A − λ·I)·v = 0 to determine the associated eigenvectors v.

Matrix Decomposition

Suppose that A ∈ ℂ^{n×n} admits n linearly independent eigenvectors v₁, v₂, …, vₙ with corresponding eigenvalues λ₁, λ₂, …, λₙ. Then we can decompose the matrix A as A = E·Λ·E⁻¹, where E = [v₁, v₂, …, vₙ] is n×n and Λ = diag(λ₁, λ₂, …, λₙ). In this case, we say the matrix A is diagonalizable.

Remark
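The eigenvalue definition and the decomposition A = E·Λ·E⁻¹ above can be checked numerically. The following NumPy sketch (not from the original notes; the test matrix is an arbitrary choice) computes the eigen-pairs of a small matrix, verifies A·v = λ·v for each pair, and reconstructs A from its decomposition:

```python
import numpy as np

# A small diagonalizable test matrix (chosen arbitrarily for illustration).
# Its characteristic polynomial is (4-l)(3-l) - 2 = (l-5)(l-2),
# so the eigenvalues are 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix E whose
# columns are the corresponding eigenvectors.
eigvals, E = np.linalg.eig(A)

# Verify the defining relation A @ v = lambda * v for each pair.
for lam, v in zip(eigvals, E.T):
    assert np.allclose(A @ v, lam * v)

# Reconstruct A from the decomposition A = E @ Lambda @ E^{-1}.
Lam = np.diag(eigvals)
A_reconstructed = E @ Lam @ np.linalg.inv(E)
assert np.allclose(A, A_reconstructed)

print(sorted(eigvals))
```

Note that `np.linalg.eig` returns eigenvectors as the *columns* of E, matching the construction E = [v₁, v₂, …, vₙ] in the notes, and that the order of the eigenvalues it returns is not guaranteed.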