EECS 227A: Nonlinear and Convex Optimization, Fall 2009
Linear Algebra / Analysis and Calculus Review

Chapter 1: Linear Algebra

1.1 Matrices

1.1.1 Basics

Nullspace. The nullspace (or kernel) of an $m \times n$ matrix $A$ is the following subspace of $\mathbb{R}^n$:

    \mathcal{N}(A) := \{ x \in \mathbb{R}^n : Ax = 0 \}.

Range and rank. The range (or image) of an $m \times n$ matrix $A$ is defined as the following subset of $\mathbb{R}^m$:

    \mathcal{R}(A) := \{ Ax : x \in \mathbb{R}^n \}.

The range is simply the span of the columns of $A$. The dimension of the range is called the rank of the matrix. As we will see later, the rank cannot exceed either dimension of $A$: $r \le \min(m, n)$. It is equal to $n$ minus the dimension of the nullspace.

A basic result of linear algebra states that any vector in $\mathbb{R}^n$ can be decomposed as $x = y + z$, with $y \in \mathcal{N}(A)$, $z \in \mathcal{R}(A^T)$, and $y, z$ orthogonal. (One way to prove this is via the singular value decomposition, seen later.)

Symmetric matrices. A square matrix $A \in \mathbb{R}^{n \times n}$ is symmetric if and only if $A = A^T$. The set of symmetric $n \times n$ matrices is denoted $S^n$.

Orthogonal matrices. A square $n \times n$ matrix $U = [u_1, \ldots, u_n]$ is orthogonal if its columns form an orthonormal basis. The condition $u_i^T u_j = 0$ if $i \ne j$, and $1$ otherwise, translates in matrix terms to $U^T U = I_n$, with $I_n$ the $n \times n$ identity matrix.

Unitary matrices. An $n \times n$ matrix $U$ is unitary if $UU^* = U^*U = I$, where $U^*$ is the conjugate transpose of $U$.

Normal matrices. An $n \times n$ matrix $A$ is normal if $AA^* = A^*A$.

1.1.2 Eigenvalue decomposition

A fundamental result of linear algebra states that any symmetric matrix can be decomposed as a weighted sum of normalized dyads that are orthogonal to each other. Precisely, for every $A \in S^n$, there exist numbers $\lambda_1, \ldots, \lambda_n$ and an orthonormal basis $(u_1, \ldots, u_n)$ such that

    A = \sum_{i=1}^n \lambda_i u_i u_i^T.

In more compact matrix notation, we have $A = U \Lambda U^T$, with $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ and $U = [u_1, \ldots, u_n]$.

The numbers $\lambda_1, \ldots, \lambda_n$ are called the eigenvalues of $A$, and are the roots of the characteristic equation

    \det(\lambda I_n - A) = 0,

where $I_n$ is the $n \times n$ identity matrix. Eigenvalues and eigenvectors satisfy $A u_i = \lambda_i u_i$. Some other properties of the eigenvalues:

    \det(A) = \prod_{i=1}^n \lambda_i,    \mathrm{Tr}(A) = \sum_{i=1}^n \lambda_i.

For arbitrary square matrices, the eigenvalues can be complex. In the symmetric case, the eigenvalues are always real. There are only $n$ (possibly repeated) solutions to the characteristic equation.

It is interesting to see what the eigenvalue decomposition of a given symmetric matrix $A$ tells us about the corresponding quadratic form $q_A(x) := x^T A x$. With $A = U \Lambda U^T$, we have

    q_A(x) = (U^T x)^T \Lambda (U^T x) = \sum_{i=1}^n \lambda_i (u_i^T x)^2.
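The short sketch below is not part of the original notes; it is a hedged numerical check of the rank/nullspace facts above, assuming NumPy as the tool and using an arbitrary random matrix A as the example. The nullspace basis is read off the SVD, as the text suggests, and the decomposition x = y + z with y in N(A) and z in R(A^T) is verified directly.

```python
# Illustrative sketch (not from the notes): rank, nullspace dimension, and the
# orthogonal decomposition x = y + z with y in N(A), z in R(A^T), via NumPy.
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.standard_normal((m, n))          # a generic m x n example matrix

rank = np.linalg.matrix_rank(A)          # dim R(A)
print(rank <= min(m, n))                 # True: r <= min(m, n)

# Rank-nullity: dim N(A) = n - rank. An orthonormal basis of N(A) can be read
# off the SVD: the rows of Vt beyond the first `rank` span the nullspace.
U, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:].T                 # n x (n - rank); columns span N(A)
print(null_basis.shape[1] == n - rank)   # True

# Decompose an arbitrary x in R^n as x = y + z, with y in N(A), z in R(A^T).
x = rng.standard_normal(n)
y = null_basis @ (null_basis.T @ x)      # orthogonal projection of x onto N(A)
z = x - y                                # lies in R(A^T), the complement of N(A)
print(np.allclose(A @ y, 0))             # y is in the nullspace
print(np.isclose(y @ z, 0.0))            # y and z are orthogonal
```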
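Similarly, the following minimal sketch, again assuming NumPy and an arbitrary symmetric test matrix, illustrates the decomposition A = U Lambda U^T, the determinant and trace identities, and the quadratic-form identity q_A(x) = sum_i lambda_i (u_i^T x)^2 derived above.

```python
# Illustrative sketch (not from the notes): symmetric eigenvalue decomposition
# and the induced quadratic form, via NumPy.
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                               # symmetrize: A is in S^n

lam, U = np.linalg.eigh(A)                      # real eigenvalues, orthonormal eigenvectors
print(np.allclose(U.T @ U, np.eye(n)))          # U is orthogonal: U^T U = I_n
print(np.allclose(A, U @ np.diag(lam) @ U.T))   # A = U Lambda U^T

# det(A) = prod of eigenvalues, Tr(A) = sum of eigenvalues
print(np.isclose(np.linalg.det(A), np.prod(lam)))
print(np.isclose(np.trace(A), np.sum(lam)))

# Quadratic form: x^T A x = sum_i lam_i (u_i^T x)^2
x = rng.standard_normal(n)
lhs = x @ A @ x
rhs = np.sum(lam * (U.T @ x) ** 2)
print(np.isclose(lhs, rhs))
```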