CS205 Review Session #2 Notes

Symmetric matrices and dot product

Lemma. An $n \times n$ matrix $A$ is symmetric $\iff$ $\forall x, y \in \mathbb{R}^n$, $x \cdot Ay = y \cdot Ax$.

Proof. In the forward direction, if $A$ is symmetric we have
$$x \cdot Ay = x^T A y = (x^T A y)^T = y^T A^T x = y^T A x = y \cdot Ax,$$
where the second step uses the fact that a scalar equals its own transpose.

For the converse, recall that $e_i$ is the $i$th Cartesian basis vector. Taking $x = e_i$, $y = e_j$, and then the reverse, we have
$$e_i \cdot A e_j = e_i^T A e_j = A_{ij}, \qquad e_j \cdot A e_i = e_j^T A e_i = A_{ji}.$$
Thus $A_{ij} = A_{ji}$ for all $i, j$, and $A$ is symmetric. $\blacksquare$

Lemma. If $A, B$ are symmetric $n \times n$ matrices and $x^T A x = x^T B x$ for all $x \in \mathbb{R}^n$, then $A = B$.

Proof. First, take $x = e_i$:
$$e_i^T A e_i = e_i^T B e_i \implies A_{ii} = B_{ii}.$$
Similarly, taking $x = e_i + e_j$ gives
$$(e_i + e_j)^T A (e_i + e_j) = (e_i + e_j)^T B (e_i + e_j)$$
$$A_{ii} + A_{jj} + A_{ij} + A_{ji} = B_{ii} + B_{jj} + B_{ij} + B_{ji}.$$
Since the diagonal entries agree and both matrices are symmetric ($A_{ij} = A_{ji}$ and $B_{ij} = B_{ji}$), this reduces to $2 A_{ij} = 2 B_{ij}$, so $A_{ij} = B_{ij}$ for all $i, j$ and $A = B$. $\blacksquare$

Fundamental Subspaces

Recall that the columns of an $n \times n$ matrix $A$ span a vector space. If the columns of $A$ are all linearly independent, they form a basis for $\mathbb{R}^n$. If some columns are linear combinations of others, then a maximal linearly independent subset of the columns forms a basis for a proper subspace. The subspace spanned by the columns is called the column space of the matrix, and is given as col($A$) = ...
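The two lemmas above lend themselves to a quick numerical sanity check. The following NumPy sketch is not part of the original notes; the matrix and vector values are arbitrary. It verifies the dot-product identity for a symmetric matrix, and recovers the entries of $A$ from its quadratic form using the same basis-vector choices as the proofs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric matrix A (M + M^T is symmetric by construction).
M = rng.standard_normal((4, 4))
A = M + M.T

# First lemma: x . (A y) == y . (A x) for symmetric A.
x = rng.standard_normal(4)
y = rng.standard_normal(4)
assert np.isclose(x @ (A @ y), y @ (A @ x))

# Second lemma's proof technique: recover A's entries from the
# quadratic form via the test vectors e_i and e_i + e_j.
e = np.eye(4)
for i in range(4):
    # e_i^T A e_i = A_ii picks out the diagonal entries.
    assert np.isclose(e[i] @ A @ e[i], A[i, i])
    for j in range(i + 1, 4):
        v = e[i] + e[j]
        # v^T A v = A_ii + A_jj + 2 A_ij for symmetric A.
        assert np.isclose(v @ A @ v, A[i, i] + A[j, j] + 2 * A[i, j])

print("both lemmas verified numerically")
```

Because the quadratic form determines every $A_{ii}$ and every off-diagonal sum $2A_{ij}$, two symmetric matrices with identical quadratic forms must agree entry by entry, which is exactly the second lemma.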
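As a concrete illustration of the column-space discussion, here is a small hypothetical example (not from the original notes): a $3 \times 3$ matrix whose third column is the sum of the first two, so a maximal independent subset has only two columns and col($B$) is a 2-dimensional subspace of $\mathbb{R}^3$.

```python
import numpy as np

# Third column = first column + second column, so the columns are
# linearly dependent and span only a 2-D subspace of R^3.
B = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 0.]])

# The rank equals the dimension of the column space.
assert np.linalg.matrix_rank(B) == 2
```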
This note was uploaded on 01/29/2008 for the course CS 205A taught by Professor Fedkiw during the Fall '07 term at Stanford.