SVD: minimization, rank, null space, and computer vision applications

…structure of a matrix. Let

    σ1 ≥ · · · ≥ σr > σr+1 = · · · = σp = 0.

Then

    rank(A) = r
    null(A) = span{vr+1, …, vn}
    ran(A)  = span{u1, …, ur}

[slide 12/17]

Geometric interpretation of SVD

Consider Ax = b, where A ∈ ℝ^(3×2), x ∈ ℝ^(2×1), b ∈ ℝ^(3×1), and A = UΣVᵀ.

  - Apply a rotation to x using the right singular vectors: ξ = Vᵀx
  - Scale with Σ: η = Σξ = ΣVᵀx
  - Apply a rotation using the left singular vectors: b = Uη = UΣVᵀx

Best approximation with r singular values and vectors in the 2-norm (Eckart–Young).

[slide 13/17]

SVD expansion

We can decompose A in terms of its singular values and vectors:

    A = UΣVᵀ = ∑_{i=1}^{r} σi ui viᵀ = ∑_{i=1}^{r} σi (ui ⊗ vi),

where ⊗ denotes the outer (Kronecker) product of ui and vi.

Matrix 2-norm and Frobenius norm

    ‖A‖F = √(σ1² + · · · + σp²),  p = min(m, n)
    ‖A‖2 = max_{x≠0} ‖Ax‖2 / ‖x‖2 = σ1
    min_{x≠0} ‖Ax‖2 / ‖x‖2 = σn  (for m ≥ n)
    |det(A)| = ∏_{i=1}^{n} σi  (for square A)

Closely related to eigenvalues, the eigendecomposition, and principal component analysis.

[slide 14/17]

Applications of SVD

  - Matrix algebra: pseudo-inverse, solving homogeneous linear equations, least-squares minimization, rank, null space, etc.
  - Computer vision: denoising, eigenfaces, eigent…

Short NumPy sketches illustrating these claims follow.
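A minimal NumPy sketch of the rank/null-space/range claims on slide 12; the test matrix, its dimensions, and the rank tolerance are illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a 5x4 matrix of rank 2, so sigma_3 = sigma_4 = 0 (up to rounding).
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))

U, s, Vt = np.linalg.svd(A)            # A = U @ diag(s) @ Vt
tol = max(A.shape) * np.finfo(float).eps * s[0]   # common default-style tolerance
r = int(np.sum(s > tol))               # rank(A) = number of nonzero singular values

range_basis = U[:, :r]                 # ran(A)  = span{u1, ..., ur}
null_basis = Vt[r:, :].T               # null(A) = span{v_{r+1}, ..., vn}

print(r, np.linalg.matrix_rank(A))     # both 2
print(np.allclose(A @ null_basis, 0))  # True: null-space vectors are annihilated
```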
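The rotate–scale–rotate reading of Ax = UΣVᵀx (slide 13) can be traced one step at a time; the 3×2 matrix below is random, chosen only to match the slide's dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2))        # A in R^(3x2), as on the slide
x = rng.standard_normal(2)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
xi = Vt @ x        # rotate/reflect in the input space with V^T
eta = s * xi       # scale coordinate i by sigma_i (the action of Sigma)
b = U @ eta        # rotate into the output space with U

print(np.allclose(b, A @ x))   # True: Ax = U Sigma V^T x
```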
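A sketch of the SVD expansion, the Eckart–Young best rank-r approximation, and the Frobenius-norm identity (slides 13–14); the matrix size and the choice r = 2 are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((8, 6))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

r = 2
# Truncated expansion: A_r = sum_{i<r} sigma_i * u_i v_i^T (outer products)
A_r = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))

# Eckart-Young: ||A - A_r||_2 equals the first discarded singular value
print(np.linalg.norm(A - A_r, 2), s[r])                  # equal up to rounding
# Frobenius norm identity: ||A||_F = sqrt(sigma_1^2 + ... + sigma_p^2)
print(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(s**2)))   # equal up to rounding
```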
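Finally, a sketch of the applications slide: the pseudo-inverse built from the SVD, a least-squares solve, and the homogeneous problem min ‖Ax‖2 subject to ‖x‖2 = 1, whose solution is the last right singular vector (a standard step in computer vision, e.g. fitting a homography). All inputs are random illustrative data:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Pseudo-inverse: A+ = V diag(1/sigma_i) U^T (invert only nonzero sigmas)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T
print(np.allclose(A_pinv, np.linalg.pinv(A)))   # True

# Least squares: x = A+ b minimizes ||Ax - b||_2
b = rng.standard_normal(6)
print(np.allclose(A_pinv @ b, np.linalg.lstsq(A, b, rcond=None)[0]))  # True

# Homogeneous system: argmin ||Ax||_2 with ||x||_2 = 1 is v_n,
# the right singular vector for the smallest singular value
x = Vt[-1, :]
print(np.linalg.norm(A @ x), s[-1])   # residual equals sigma_n
```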