As seen before, if the error $e$ is random, this bound is a bit pessimistic. Specifically, if each entry of $e$ is an independent identically distributed Normal random variable with mean zero and variance $\nu^2$, then the expected noise error in the reconstruction will be
$$
\mathrm{E}\big[\|\text{Noise error}\|_2^2\big] \;=\; \frac{1}{M}\left(\frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2} + \cdots + \frac{1}{\sigma_R^2}\right)\mathrm{E}\big[\|e\|_2^2\big].
$$

Georgia Tech ECE 6250 Fall 2019; Notes by J. Romberg and M. Davenport. Last updated 21:13, November 3, 2019
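This formula is easy to check numerically. The sketch below (our own sanity check, not from the notes) assumes the reconstruction is $\hat{x} = A^\dagger y$ with $y = Ax + e$, so the noise error is $A^\dagger e$, and compares a Monte Carlo average against the formula; all variable names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 30, 10                                # A has full column rank, so R = N
A = rng.standard_normal((M, N))
sigma = np.linalg.svd(A, compute_uv=False)   # singular values sigma_1, ..., sigma_R
nu = 0.1                                     # per-entry noise standard deviation
Apinv = np.linalg.pinv(A)

# Monte Carlo estimate of E[ ||A^+ e||_2^2 ], the noise error in the reconstruction
trials = 5000
E = nu * rng.standard_normal((M, trials))    # each column is one noise draw e
empirical = np.mean(np.sum((Apinv @ E) ** 2, axis=0))

# The formula: (1/M)(1/sigma_1^2 + ... + 1/sigma_R^2) E[||e||^2], with E[||e||^2] = M nu^2
predicted = (1.0 / M) * np.sum(1.0 / sigma**2) * (M * nu**2)

print(empirical, predicted)                  # these should agree to within a few percent
```

Note that $\mathrm{E}[\|e\|_2^2] = M\nu^2$, so the $1/M$ factor cancels and the prediction reduces to $\nu^2 \sum_k 1/\sigma_k^2$, which is dominated by the smallest singular values.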
Stable Reconstruction using Tikhonov Regularization

Tikhonov$^2$ regularization is another way to stabilize the least-squares recovery. It has the nice features that: 1) it can be interpreted using optimization, and 2) it can be computed without direct knowledge of the SVD of $A$. Recall that we motivated the pseudo-inverse by showing that $\hat{x}_{\mathrm{LS}} = A^\dagger y$ is a solution to
$$
\operatorname*{minimize}_{x\in\mathbb{R}^N} \; \|y - Ax\|_2^2. \tag{5}
$$
When $A$ has full column rank, $\hat{x}_{\mathrm{LS}}$ is the unique solution; otherwise it is the solution with smallest energy. When $A$ has full column rank but has singular values which are very small, huge variations in $x$ (in directions of the singular vectors $v_k$ corresponding to the tiny $\sigma_k$) can have very little effect on the residual $\|y - Ax\|_2^2$. As such, the solution to (5) can have wildly inaccurate components in the presence of even mild noise.

One way to counteract this problem is to modify (5) with a regularization term that penalizes the size of the solution $\|x\|_2^2$ as well as the residual error $\|y - Ax\|_2^2$:
$$
\operatorname*{minimize}_{x\in\mathbb{R}^N} \; \|y - Ax\|_2^2 + \delta\|x\|_2^2. \tag{6}
$$
The parameter $\delta > 0$ gives us a trade-off between accuracy and regularization; we want to choose $\delta$ small enough so that the residual

$^2$ Andrey Tikhonov (1906–1993) was a 20th century Russian mathematician.
for the solution of (6) is close to that of (5), and large enough so that the problem is well-conditioned.

Just as with (5), which is solved by applying the pseudo-inverse to $y$, we can write the solution to (6) in closed form. To see this, recall that we can decompose any $x \in \mathbb{R}^N$ as
$$
x = V\alpha + V_0\alpha_0,
$$
where $V$ is the $N \times R$ matrix (with orthonormal columns) used in the SVD of $A$, and $V_0$ is an $N \times (N - R)$ matrix whose columns are an orthonormal basis for the null space of $A$. This means that the columns of $V_0$ are orthogonal to each other and to all of the columns of $V$. Similarly, we can decompose $y$ as
$$
y = U\beta + U_0\beta_0,
$$
where $U$ is the $M \times R$ matrix used in the SVD of $A$, and the columns of $U_0$ are an orthonormal basis for the left null space of $A$ (everything in $\mathbb{R}^M$ that is not in the range of $A$). For any $x$, we can write
$$
y - Ax = U\beta + U_0\beta_0 - U\Sigma V^T(V\alpha + V_0\alpha_0) = U(\beta - \Sigma\alpha) + U_0\beta_0.
$$
Since the columns of $U$ are orthonormal, $U^T U = I$, and also $U_0^T U_0 = I$ and $U^T U_0 = 0$, we have
$$
\|y - Ax\|_2^2 = \langle U(\beta - \Sigma\alpha) + U_0\beta_0,\; U(\beta - \Sigma\alpha) + U_0\beta_0 \rangle = \|\beta - \Sigma\alpha\|_2^2 + \|\beta_0\|_2^2,
$$
and
$$
\|x\|_2^2 = \|\alpha\|_2^2 + \|\alpha_0\|_2^2.
$$
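These decompositions are easy to verify numerically. The short sketch below (our own, not from the notes) builds a rank-$R$ matrix, extracts $(V, V_0)$ and $(U, U_0)$ from the full SVD, and checks both energy identities:

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, R = 8, 6, 4
A = rng.standard_normal((M, R)) @ rng.standard_normal((R, N))   # rank-R matrix

Ufull, s, Vtfull = np.linalg.svd(A)      # full SVD
U, U0 = Ufull[:, :R], Ufull[:, R:]       # range of A / left null space of A
V, V0 = Vtfull[:R].T, Vtfull[R:].T       # row space of A / null space of A
Sigma = np.diag(s[:R])

x = rng.standard_normal(N)
y = rng.standard_normal(M)

alpha, alpha0 = V.T @ x, V0.T @ x        # x = V alpha + V0 alpha0
beta,  beta0  = U.T @ y, U0.T @ y        # y = U beta  + U0 beta0

# ||y - Ax||^2 = ||beta - Sigma alpha||^2 + ||beta0||^2
lhs = np.linalg.norm(y - A @ x)**2
rhs = np.linalg.norm(beta - Sigma @ alpha)**2 + np.linalg.norm(beta0)**2
print(np.isclose(lhs, rhs))              # True

# ||x||^2 = ||alpha||^2 + ||alpha0||^2
print(np.isclose(np.linalg.norm(x)**2,
                 np.linalg.norm(alpha)**2 + np.linalg.norm(alpha0)**2))  # True
```

Both checks are just the Pythagorean theorem in the orthonormal bases supplied by the SVD.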
Using these facts, we can write the functional in (6) as
$$
\|y - Ax\|_2^2 + \delta\|x\|_2^2 = \|\beta - \Sigma\alpha\|_2^2 + \|\beta_0\|_2^2 + \delta\|\alpha\|_2^2 + \delta\|\alpha_0\|_2^2. \tag{7}
$$
We want to choose $\alpha$ and $\alpha_0$ to minimize (7). It is clear that, just as in the standard least-squares problem, we need $\alpha_0 = 0$. The part of the functional that depends on $\alpha$ is $\|\beta - \Sigma\alpha\|_2^2 + \delta\|\alpha\|_2^2$.
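The notes point out that the solution to (6) can be computed without direct knowledge of the SVD; the standard closed form is $\hat{x}_{\mathrm{tik}} = (A^T A + \delta I)^{-1} A^T y$. The sketch below (hypothetical variable names, our own illustration) computes this and confirms the two conclusions above: $\hat{x}_{\mathrm{tik}}$ minimizes the functional, and its null-space component $\alpha_0 = V_0^T \hat{x}_{\mathrm{tik}}$ is zero.

```python
import numpy as np

rng = np.random.default_rng(2)
M, N, R = 8, 6, 4
A = rng.standard_normal((M, R)) @ rng.standard_normal((R, N))   # rank-deficient A
y = rng.standard_normal(M)
delta = 0.1

# Closed-form Tikhonov solution: minimizes ||y - Ax||^2 + delta ||x||^2,
# computed without the SVD via the regularized normal equations
x_tik = np.linalg.solve(A.T @ A + delta * np.eye(N), A.T @ y)

def J(x):
    return np.linalg.norm(y - A @ x)**2 + delta * np.linalg.norm(x)**2

# Any perturbation of x_tik strictly increases the (strictly convex) functional
perturbed = [J(x_tik + 0.1 * rng.standard_normal(N)) for _ in range(100)]
print(all(J(x_tik) < p for p in perturbed))     # True

# alpha_0 = 0: x_tik has no component in the null space of A
_, _, Vt = np.linalg.svd(A)
V0 = Vt[R:].T                                   # columns span null(A)
print(np.allclose(V0.T @ x_tik, 0))             # True
```

The second check reflects the derivation above: since $\delta\|\alpha_0\|_2^2$ is the only term in (7) involving $\alpha_0$, the minimizer puts no energy in the null space of $A$.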