
16 Pages

### LeastSquares

Course: EE 520, Fall 2009
School: Iowa State

Word Count: 1160


#### Unformatted Document Excerpt

least squares estimation — Namrata Vaswani, namrata@iastate.edu

#### 1. Recall: Geometric Intuition for Least Squares

Minimize $J(x) = \|y - Hx\|^2$. The solution satisfies the normal equations
$$H^T H \hat{x} = H^T y, \qquad \text{i.e.} \qquad \hat{x} = (H^T H)^{-1} H^T y.$$
So $H^T (y - H\hat{x}) = 0$: the least-squares error $y - H\hat{x}$ is orthogonal to the column space of $H$. Think in 3D: the minimum error is always perpendicular to the plane of projection.

#### 2. Weighted Least Squares

Model: $y = Hx + e$. Minimize
$$J(x) = (y - Hx)^T W (y - Hx) \triangleq \|y - Hx\|_W^2. \quad (1)$$
Solution:
$$\hat{x} = (H^T W H)^{-1} H^T W y. \quad (2)$$
Given that $E[e] = 0$ and $E[ee^T] = V$, the minimum-variance unbiased linear estimator of $x$ is obtained by choosing $W = V^{-1}$ in (2). (Minimum variance for a vector estimate means the variance in every direction is minimized.)

#### 3. Proof (skip if you want to)

Given $\hat{x} = Ly$, find $L$ such that $E[Ly] = E[LHx] = E[x]$, so $LH = I$ (unbiasedness). Let
$$L_0 = (H^T V^{-1} H)^{-1} H^T V^{-1}.$$
The error variance is
$$E[(x - \hat{x})(x - \hat{x})^T] = E[(x - LHx - Le)(x - LHx - Le)^T] = E[L e e^T L^T] = L V L^T.$$
Write $L = (L - L_0) + L_0$.
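As a quick numerical check (not part of the original slides), the ordinary and weighted least squares formulas above can be verified with a small NumPy sketch; the problem sizes, noise covariance, and true parameter vector below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem: y = H x + e, with E[e] = 0 and E[e e^T] = V.
n, p = 50, 3
H = rng.standard_normal((n, p))
x_true = np.array([1.0, -2.0, 0.5])
V = np.diag(rng.uniform(0.5, 2.0, n))          # known diagonal noise covariance
e = rng.multivariate_normal(np.zeros(n), V)
y = H @ x_true + e

# Ordinary least squares: solve H^T H x = H^T y (the normal equations).
x_ols = np.linalg.solve(H.T @ H, H.T @ y)

# Weighted least squares with W = V^{-1}, the minimum-variance (BLUE) choice.
W = np.linalg.inv(V)
x_wls = np.linalg.solve(H.T @ W @ H, H.T @ W @ y)

# Orthogonality: the residual y - H x_ols is perpendicular to the columns of H.
residual = H.T @ (y - H @ x_ols)
print(np.allclose(residual, 0.0, atol=1e-8))   # True
```

The orthogonality print-out is exactly the geometric statement above: the normal equations force $H^T(y - H\hat{x}) = 0$.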
Since $LH = I$ and $L_0 H = I$, we have $(L - L_0) H = 0$. Then
$$L V L^T = L_0 V L_0^T + (L - L_0) V (L - L_0)^T + 2 L_0 V (L - L_0)^T,$$
and the cross term vanishes because
$$L_0 V (L - L_0)^T = (H^T V^{-1} H)^{-1} H^T (L - L_0)^T = 0 \quad \text{(since } (L - L_0) H = 0\text{)}.$$
So
$$L V L^T = L_0 V L_0^T + (L - L_0) V (L - L_0)^T \succeq L_0 V L_0^T,$$
where $\succeq$ denotes the positive-semidefinite ordering for matrices. Thus $L_0$ is the optimal estimator.

#### 4. Regularized Least Squares

Minimize
$$J(x) = (x - x_0)^T \Pi_0^{-1} (x - x_0) + (y - Hx)^T W (y - Hx). \quad (3)$$
With the change of variables $\tilde{x} \triangleq x - x_0$ and $\tilde{y} \triangleq y - H x_0$, the cost becomes a single weighted quadratic:
$$J = \tilde{x}^T \Pi_0^{-1} \tilde{x} + (\tilde{y} - H\tilde{x})^T W (\tilde{y} - H\tilde{x}) = z^T M^{-1} z, \qquad z = \begin{bmatrix} I \\ H \end{bmatrix} \tilde{x} - \begin{bmatrix} 0 \\ \tilde{y} \end{bmatrix}, \qquad M = \begin{bmatrix} \Pi_0 & 0 \\ 0 & W^{-1} \end{bmatrix}.$$

#### 5. Solution

Use the weighted least squares formula with
$$\tilde{y} = \begin{bmatrix} 0 \\ y - H x_0 \end{bmatrix}, \qquad \tilde{H} = \begin{bmatrix} I \\ H \end{bmatrix}, \qquad \tilde{W} = M^{-1}$$
to get
$$\hat{x} = x_0 + (\Pi_0^{-1} + H^T W H)^{-1} H^T W (y - H x_0).$$
Advantages: improves the condition number of $H^T H$, and incorporates prior knowledge about the distance from $x_0$.

#### 6. Recursive Least Squares

Use it in one of the following situations:

- The number of equations is much larger than the number of variables (storage problem).
- Data arrives sequentially and we do not want to re-solve the full problem each time.
- The dimension of $x$ is large and we want to avoid inverting large matrices.

Goal: at step $i-1$ we have $\hat{x}_{i-1}$, the minimizer of
$$(x - x_0)^T \Pi_0^{-1} (x - x_0) + \|H_{i-1} x - Y_{i-1}\|_{W_{i-1}}^2, \qquad Y_{i-1} = [y_1, \ldots, y_{i-1}]^T.$$
Find $\hat{x}_i$, the minimizer of
$$(x - x_0)^T \Pi_0^{-1} (x - x_0) + \|H_i x - Y_i\|_{W_i}^2.$$

#### 7. Deriving the Recursion

Here
$$H_i = \begin{bmatrix} H_{i-1} \\ h_i \end{bmatrix} \;\; (h_i \text{ a row vector}), \qquad Y_i = [y_1, \ldots, y_i]^T \;\; (\text{a column vector}).$$
For simplicity of notation, assume $x_0 = 0$ and $W_i = I$.
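The equivalence between the regularized cost (3) and a stacked weighted least squares problem can be checked numerically. This is an illustrative sketch only; the dimensions, the diagonal prior covariance, and the identity data weight are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes; Pi0 is the prior covariance, W the data weight.
n, p = 20, 4
H = rng.standard_normal((n, p))
y = rng.standard_normal(n)
x0 = rng.standard_normal(p)
Pi0 = 2.0 * np.eye(p)
W = np.eye(n)

# Closed form: x_hat = x0 + (Pi0^{-1} + H^T W H)^{-1} H^T W (y - H x0)
A = np.linalg.inv(Pi0) + H.T @ W @ H
x_hat = x0 + np.linalg.solve(A, H.T @ W @ (y - H @ x0))

# Same answer by solving the stacked weighted LS problem:
# rows [I; H], observations [0; y - H x0], weight blockdiag(Pi0^{-1}, W).
H_t = np.vstack([np.eye(p), H])
y_t = np.concatenate([np.zeros(p), y - H @ x0])
W_t = np.block([[np.linalg.inv(Pi0), np.zeros((p, n))],
                [np.zeros((n, p)), W]])
x_stack = x0 + np.linalg.solve(H_t.T @ W_t @ H_t, H_t.T @ W_t @ y_t)

print(np.allclose(x_hat, x_stack))  # True
```

The stacked system is exactly the $\tilde{H}$, $\tilde{y}$, $\tilde{W}$ substitution from the slides, so the two solutions agree to numerical precision.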
Then
$$H_i^T H_i = H_{i-1}^T H_{i-1} + h_i^T h_i,$$
so
$$\hat{x}_i = (\Pi_0^{-1} + H_i^T H_i)^{-1} H_i^T Y_i = (\Pi_0^{-1} + H_{i-1}^T H_{i-1} + h_i^T h_i)^{-1} (H_{i-1}^T Y_{i-1} + h_i^T y_i).$$
Define
$$P_i \triangleq (\Pi_0^{-1} + H_i^T H_i)^{-1}, \qquad P_0 = \Pi_0,$$
so that
$$P_i = [P_{i-1}^{-1} + h_i^T h_i]^{-1}.$$

#### 8. Apply the Matrix Inversion Identity

$$(A + BCD)^{-1} = A^{-1} - A^{-1} B (C^{-1} + D A^{-1} B)^{-1} D A^{-1}.$$
This gives
$$P_i = P_{i-1} - K_i h_i P_{i-1}, \qquad K_i = P_{i-1} h_i^T (1 + h_i P_{i-1} h_i^T)^{-1}.$$
Thus $\hat{x}_0 = 0$ and
$$\hat{x}_i = P_i H_i^T Y_i = [P_{i-1} - K_i h_i P_{i-1}] [H_{i-1}^T Y_{i-1} + h_i^T y_i] = \hat{x}_{i-1} + K_i (y_i - h_i \hat{x}_{i-1}). \quad (4)$$
The last equality uses the facts that (i) $\hat{x}_{i-1} = P_{i-1} H_{i-1}^T Y_{i-1}$, and (ii) $[P_{i-1} - K_i h_i P_{i-1}] h_i^T y_i = K_i y_i$ (expand $K_i$; this follows after a few manipulations).

#### 9. General Weights and Vector Observations

Here we took the weight $W_i = I$. If $W_i \neq I$, replace $y_i$ by $w_i^{1/2} y_i$ and $h_i$ by $w_i^{1/2} h_i$, where $w_i = (W_i)_{i,i}$; the gain becomes
$$K_i = P_{i-1} h_i^T (w_i^{-1} + h_i P_{i-1} h_i^T)^{-1}. \quad (5)$$
Also, we took $y_i$ to be a scalar and $h_i$ a row vector. In general $y_i$ can be a $k$-dimensional vector and $h_i$ a matrix with $k$ rows; the formulae above still apply with $1$ replaced by $I$ everywhere.

**RLS with forgetting factor.** Weight older data less:
$$J(x) = \sum_{j=1}^{i} \lambda(i,j) \, (y_j - h_j x)^2.$$
Exponential forgetting: $\lambda(i,j) = \lambda^{i-j}$ with $\lambda < 1$. Moving average: $\lambda(i,j) = 1$ if $|i - j|$ is within a fixed window and $0$ otherwise.

#### 10. Summarizing Recursive LS

In general we can assume that $y_i$ is $k$-dimensional, so $h_i$ has $k$ rows, and the weight matrix satisfies $(W_i)_{i,i} = w_i$.
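The recursion (4) can be sanity-checked against the batch regularized LS solution it was derived from. The sketch below uses $x_0 = 0$ and $W_i = I$, as assumed on the slides, on made-up random scalar observations:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative stream of scalar observations y_i = h_i x + noise.
p, steps = 3, 30
x_true = rng.standard_normal(p)
Pi0 = np.eye(p)

x_hat = np.zeros(p)      # x0 = 0, as on the slides
P = Pi0.copy()           # P_0 = Pi0
H_rows, Y = [], []

for _ in range(steps):
    h = rng.standard_normal(p)                 # row vector h_i
    y = h @ x_true + 0.1 * rng.standard_normal()
    # RLS update (W_i = I): K_i = P_{i-1} h_i^T (1 + h_i P_{i-1} h_i^T)^{-1}
    K = P @ h / (1.0 + h @ P @ h)
    x_hat = x_hat + K * (y - h @ x_hat)        # innovation update, eq. (4)
    P = P - np.outer(K, h) @ P                 # P_i = (I - K_i h_i) P_{i-1}
    H_rows.append(h)
    Y.append(y)

# Batch regularized LS on all the data should match the recursion exactly.
H = np.array(H_rows)
Y = np.array(Y)
x_batch = np.linalg.solve(np.linalg.inv(Pi0) + H.T @ H, H.T @ Y)
print(np.allclose(x_hat, x_batch))  # True
```

Note that the recursion never inverts anything larger than a scalar, which is the whole point when $p$ is large or data arrives sequentially.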
The solution is:
$$\hat{x}_0 = x_0, \qquad P_0 = \Pi_0,$$
$$K_i = P_{i-1} h_i^T (w_i^{-1} + h_i P_{i-1} h_i^T)^{-1},$$
$$P_i = (I - K_i h_i) P_{i-1},$$
$$\hat{x}_i = \hat{x}_{i-1} + K_i (y_i - h_i \hat{x}_{i-1}). \quad (6)$$
This is a recursive way to compute the regularized LS solution
$$\hat{x}_i = x_0 + (\Pi_0^{-1} + H_i^T W_i H_i)^{-1} H_i^T W_i (Y_i - H_i x_0), \quad (7)$$
with $H_i = [h_1^T, h_2^T, \ldots, h_i^T]^T$ and $Y_i = [y_1^T, y_2^T, \ldots, y_i^T]^T$.

#### 11. Connection with Kalman Filtering

The recursion above is also the Kalman filter estimate of the state for the system model
$$x_i = x_{i-1}, \qquad y_i = h_i x_i + v_i, \quad v_i \sim \mathcal{N}(0, R_i), \quad w_i = R_i^{-1}. \quad (8)$$

#### 12. Kalman Filter Motivation

- RLS is for static data: estimate the signal $x$ better and better as more data comes in, e.g. estimating the mean intensity of an object from a video sequence.
- RLS with a forgetting factor assumes a slowly time-varying $x$.
- The Kalman filter applies when the signal is time-varying and we know (statistically) the dynamical model it follows, e.g. tracking a moving object:
$$x_0 \sim \mathcal{N}(0, \Pi_0), \qquad x_i = F_i x_{i-1} + v_{x,i}, \quad v_{x,i} \sim \mathcal{N}(0, Q_i).$$

The observation model is as before: $y_i = h_i x_i + v_i$, $v_i \sim \mathcal{N}(0, R_i)$.

#### 13. Goal

Get the best (minimum mean square error) estimate of $x_i$ from $Y_i$. The cost is
$$J(\hat{x}_i) = E[\|x_i - \hat{x}_i\|^2 \mid Y_i],$$
and the minimizer is the conditional mean $\hat{x}_i = E[x_i \mid Y_i]$. This is also the MAP estimate, i.e. $\hat{x}_i$ also maximizes $p(x_i \mid Y_i)$.

#### 14. Example Applications

Recursive LS:

- Adaptive noise cancellation.
- Channel equalization using a training sequence.
- Object intensity estimation: $x$ = intensity, $y_i$ = vector of intensities of the object region in frame $i$, $h_i = 1_m$ (a column vector of $m$ ones).
- Keep updating the location estimate of a static object from a sequence of location observations arriving sequentially.

Recursive LS with forgetting factor:

- The object is not static but drifts very slowly (e.g. a floating object), or the object intensity changes very slowly.

Kalman filter:

- Track a moving object (estimate its location and velocity at each time), when the acceleration is assumed i.i.d. Gaussian.

Material adapted from Chapters 2 and 3 of *Linear Estimation* by Kailath, Sayed, and Hassibi.
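To make the Kalman filter discussion concrete, here is a minimal sketch for the linear-Gaussian model above, applied to the moving-object tracking example. The constant-velocity state, the particular $F$, $Q$, $R$ values, and the `kalman_step` helper are illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sketch of the model x_i = F x_{i-1} + v_{x,i}, y_i = h x_i + v_i.
# Illustrative constant-velocity tracking: state = [position, velocity].
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
Q = 0.01 * np.eye(2)          # assumed process noise covariance
h = np.array([1.0, 0.0])      # observe position only
R = 0.25                      # assumed observation noise variance

def kalman_step(x_hat, P, y):
    # Predict through the dynamics.
    x_pred = F @ x_hat
    P_pred = F @ P @ F.T + Q
    # Update with the new scalar observation (gain is a vector here).
    S = h @ P_pred @ h + R
    K = P_pred @ h / S
    x_new = x_pred + K * (y - h @ x_pred)
    P_new = P_pred - np.outer(K, h) @ P_pred
    return x_new, P_new

# Simulate a moving object and track it.
x = np.array([0.0, 1.0])
x_hat, P = np.zeros(2), np.eye(2)
for _ in range(100):
    x = F @ x + rng.multivariate_normal(np.zeros(2), Q)
    y = h @ x + np.sqrt(R) * rng.standard_normal()
    x_hat, P = kalman_step(x_hat, P, y)

print(abs(x_hat[0] - x[0]))   # small position error after 100 steps
```

Setting $F = I$ and $Q = 0$ in this sketch collapses the predict step to a no-op, recovering exactly the RLS recursion of equation (6), which is the connection stated in (8).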