6.3 Orthogonal Projections


Review: If $\{u_1, \dots, u_n\}$ is an orthogonal basis for $\mathbb{R}^n$ and $y \in \mathbb{R}^n$, then
\[ y = c_1 u_1 + \cdots + c_n u_n, \qquad c_i = \frac{y \cdot u_i}{u_i \cdot u_i}. \]
If $\{u_1, \dots, u_n\}$ is an orthonormal basis, this simplifies to
\[ y = (y \cdot u_1)u_1 + (y \cdot u_2)u_2 + \cdots + (y \cdot u_n)u_n. \]

Theorem: An $m \times n$ matrix $U$ has orthonormal columns $\iff U^T U = I$.

Why? If $U = [\, u_1 \mid \cdots \mid u_n \,]$, then the rows of $U^T$ are $u_1^T, \dots, u_n^T$, so the $(i,j)$ entry of $U^T U$ is $u_i^T u_j$, which is $1$ when $i = j$ and $0$ otherwise:
\[ U^T U = \begin{bmatrix} u_1^T \\ \vdots \\ u_n^T \end{bmatrix} [\, u_1 \mid \cdots \mid u_n \,] = \begin{bmatrix} 1 & & 0 \\ & \ddots & \\ 0 & & 1 \end{bmatrix}. \]

Theorem: If the $m \times n$ matrix $U$ has orthonormal columns, then for any $x, y \in \mathbb{R}^n$:
a. $\|Ux\| = \|x\|$ (preserves lengths)
b. $(Ux) \cdot (Uy) = x \cdot y$ (preserves angles, since $\cos\theta = \dfrac{Ux \cdot Uy}{\|Ux\|\,\|Uy\|} = \dfrac{x \cdot y}{\|x\|\,\|y\|}$)
c. $Ux \cdot Uy = 0 \iff x \cdot y = 0$ (preserves orthogonality)

Definition: An $n \times n$ matrix $U$ is an orthogonal matrix if its columns are orthonormal.

Question: Which basic matrices in $\mathbb{R}^2$ are orthogonal?
\[ S_c = \begin{bmatrix} 1 & 0 \\ 0 & k \end{bmatrix} \quad R_o = \begin{bmatrix} c & -s \\ s & c \end{bmatrix} \quad S_h = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} \quad P = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \quad R_e = \begin{bmatrix} 2c^2 - 1 & 2cs \\ 2cs & 2s^2 - 1 \end{bmatrix} \]
(with $c = \cos\theta$, $s = \sin\theta$). The rotation $R_o$ and the reflection $R_e$ are orthogonal; the scaling $S_c$ is orthogonal only when $|k| = 1$, while the shear $S_h$ and the projection $P$ are not.

EX: Note $U^{-1} = U^T$ ← important
\[ U = \begin{bmatrix} 1/\sqrt{3} & 1/\sqrt{2} & 1/\sqrt{6} \\ 1/\sqrt{3} & 0 & -2/\sqrt{6} \\ 1/\sqrt{3} & -1/\sqrt{2} & 1/\sqrt{6} \end{bmatrix} \]

Definition: Let $W$ be a subspace of $\mathbb{R}^n$ with orthogonal basis $\{u_1, \dots, u_p\}$. For any $y \in \mathbb{R}^n$, define the orthogonal projection of $y$ onto $W$ as
\[ \hat{y} = \operatorname{Proj}_W y = c_1 u_1 + c_2 u_2 + \cdots + c_p u_p, \qquad c_i = \frac{y \cdot u_i}{u_i \cdot u_i}, \]
or $c_i = y \cdot u_i$ if the basis is orthonormal.

Orthogonal Decomposition Theorem: Let $W$ be a subspace of $\mathbb{R}^n$ with orthogonal basis $\{u_1, \dots, u_p\}$, and for $y \in \mathbb{R}^n$ write $y = \hat{y} + z$ (i.e., $z = y - \hat{y}$). Then $z \in W^\perp$.

Proof: For all $i$,
\[ z \cdot u_i = (y - \hat{y}) \cdot u_i = y \cdot u_i - \hat{y} \cdot u_i = y \cdot u_i - (c_1 u_1 + \cdots + c_p u_p) \cdot u_i = y \cdot u_i - c_i\, u_i \cdot u_i = 0. \]

EX: In $\mathbb{R}^2$, let $W = \operatorname{Span}\{u_1\}$ with $u_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $y = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$. Then
\[ \hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1}\, u_1 = \frac{3}{2}\, u_1 = \begin{bmatrix} 3/2 \\ 3/2 \end{bmatrix}, \qquad z = y - \hat{y} = \begin{bmatrix} 1 \\ 2 \end{bmatrix} - \begin{bmatrix} 3/2 \\ 3/2 \end{bmatrix} = \begin{bmatrix} -1/2 \\ 1/2 \end{bmatrix}. \]

[Figure: $y$, $\hat{y}$, and $z = y - \hat{y}$ in the plane, with $\hat{y}$ lying along $u_1$.]

EX: Let
\[ u_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \quad u_2 = \begin{bmatrix} 1 \\ -1 \\ 1 \end{bmatrix}, \quad y = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}. \]
Then
\[ \hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1}\, u_1 + \frac{y \cdot u_2}{u_2 \cdot u_2}\, u_2 = \frac{3}{2}\, u_1 + \frac{2}{3}\, u_2 = \begin{bmatrix} 3/2 + 2/3 \\ 3/2 - 2/3 \\ 0 + 2/3 \end{bmatrix} = \begin{bmatrix} 13/6 \\ 5/6 \\ 2/3 \end{bmatrix}, \]
\[ z = y - \hat{y} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} - \begin{bmatrix} 13/6 \\ 5/6 \\ 2/3 \end{bmatrix} = \begin{bmatrix} -7/6 \\ 7/6 \\ 7/3 \end{bmatrix}. \]
So $z \cdot u_1 = 0$ and $z \cdot u_2 = 0$.

In general,
\[ \hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1}\, u_1 + \frac{y \cdot u_2}{u_2 \cdot u_2}\, u_2 = \hat{y}_1 + \hat{y}_2. \]

Figure 1: The orthogonal projection of $y$ is the sum of its projections onto one-dimensional subspaces that are mutually orthogonal.

EX: In the example above, $\hat{y}_1 = \frac{3}{2} u_1$ and $\hat{y}_2 = \frac{2}{3} u_2$.

Properties of orthogonal projections

1. Best approximation theorem: If $W$ is a subspace of $\mathbb{R}^n$, $y \in \mathbb{R}^n$, and $\hat{y} = \operatorname{Proj}_W(y)$, then $\hat{y}$ is the vector in $W$ closest to $y$, in the sense that
\[ \|y - \hat{y}\| < \|y - v\| \quad \text{for all } v \in W,\ v \neq \hat{y}. \]

Question: Why?

Answer: Take any $v \in W$ with $v \neq \hat{y}$. Then $\hat{y} - v \in W$, so it is orthogonal to $y - \hat{y} \in W^\perp$, and by the Pythagorean Theorem
\[ \|y - v\|^2 = \|y - \hat{y}\|^2 + \|\hat{y} - v\|^2 > \|y - \hat{y}\|^2. \]

Figure 2: The orthogonal projection of $y$ onto $W$ is the closest point in $W$ to $y$.

2. If $y \in W$, then $\hat{y} = \operatorname{Proj}_W y = y$ itself.
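
The theorems above are easy to check numerically. Here is a minimal NumPy sketch using the $3 \times 3$ matrix $U$ from the example; the random test vectors and seed are arbitrary choices for illustration:

```python
import numpy as np

# The 3x3 example matrix U with orthonormal columns.
U = np.array([
    [1/np.sqrt(3),  1/np.sqrt(2),  1/np.sqrt(6)],
    [1/np.sqrt(3),  0.0,          -2/np.sqrt(6)],
    [1/np.sqrt(3), -1/np.sqrt(2),  1/np.sqrt(6)],
])

# U^T U = I  <=>  the columns of U are orthonormal.
print(np.allclose(U.T @ U, np.eye(3)))        # True

# For a square U this also gives U^{-1} = U^T.
print(np.allclose(np.linalg.inv(U), U.T))     # True

# Length and dot-product preservation for arbitrary x, y.
rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)
print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))  # (a) ||Ux|| = ||x||
print(np.isclose((U @ x) @ (U @ y), x @ y))                  # (b) (Ux).(Uy) = x.y
```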
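
The same $U^T U = I$ test settles the question about the basic matrices in $\mathbb{R}^2$. A sketch, with sample values $\theta = 0.7$ and $k = 3$ chosen only for illustration:

```python
import numpy as np

def is_orthogonal(M, tol=1e-12):
    """A square matrix M is orthogonal iff M^T M = I."""
    return np.allclose(M.T @ M, np.eye(M.shape[0]), atol=tol)

c, s, k = np.cos(0.7), np.sin(0.7), 3.0
Sc = np.array([[1, 0], [0, k]])                             # scaling
Ro = np.array([[c, -s], [s, c]])                            # rotation
Sh = np.array([[1, 1], [0, 1]])                             # shear
P  = np.array([[1, 0], [0, 0]])                             # projection
Re = np.array([[2*c**2 - 1, 2*c*s], [2*c*s, 2*s**2 - 1]])   # reflection

for name, M in [("Sc", Sc), ("Ro", Ro), ("Sh", Sh), ("P", P), ("Re", Re)]:
    print(name, is_orthogonal(M))
# Only Ro and Re pass (Sc would pass as well if |k| = 1).
```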
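
The projection formula $\hat{y} = \sum_i \frac{y \cdot u_i}{u_i \cdot u_i} u_i$ translates directly into code. The helper name `proj_W` below is just illustrative, not from the notes; the sketch reproduces both worked examples:

```python
import numpy as np

def proj_W(y, basis):
    """Orthogonal projection of y onto W = Span(basis).

    `basis` must be mutually orthogonal vectors; each term is
    (y.u_i / u_i.u_i) u_i, exactly as in the definition above.
    """
    return sum((y @ u) / (u @ u) * u for u in basis)

# Example in R^2: W = Span{u1}.
u1 = np.array([1.0, 1.0])
y  = np.array([1.0, 2.0])
y_hat = proj_W(y, [u1])
print(y_hat, y - y_hat)        # [1.5 1.5] [-0.5  0.5]

# Example in R^3: W = Span{u1, u2}.
u1 = np.array([1.0,  1.0, 0.0])
u2 = np.array([1.0, -1.0, 1.0])
y  = np.array([1.0,  2.0, 3.0])
y_hat = proj_W(y, [u1, u2])
z = y - y_hat
print(y_hat)                   # [13/6, 5/6, 2/3]
print(z @ u1, z @ u2)          # both ~0: z is in W-perp
```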
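
Finally, the best approximation theorem can be checked by comparing $\hat{y}$ against random points $v \in W$; this sketch verifies the Pythagorean identity from the proof on the $\mathbb{R}^3$ example (the seed and sample count are arbitrary):

```python
import numpy as np

def proj_W(y, basis):
    return sum((y @ u) / (u @ u) * u for u in basis)

u1 = np.array([1.0,  1.0, 0.0])
u2 = np.array([1.0, -1.0, 1.0])
y  = np.array([1.0,  2.0, 3.0])
y_hat = proj_W(y, [u1, u2])

rng = np.random.default_rng(1)
for _ in range(5):
    a, b = rng.standard_normal(2)
    v = a * u1 + b * u2                      # a random point of W
    # Pythagorean identity: ||y-v||^2 = ||y-y_hat||^2 + ||y_hat-v||^2,
    # so no v in W beats y_hat.
    lhs = np.linalg.norm(y - v)**2
    rhs = np.linalg.norm(y - y_hat)**2 + np.linalg.norm(y_hat - v)**2
    print(np.isclose(lhs, rhs), lhs >= np.linalg.norm(y - y_hat)**2)
```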