Symmetric and Diagonal Matrices - Version 073 EXAM04


Version 073 – EXAM04 – gilbert – (56780)

This print-out should have 14 questions. Multiple-choice questions may continue on the next column or page – find all choices before answering. (Matrices below are written row by row, with rows separated by semicolons.)

001 10.0 points

If B = P D P^(-1) with P an orthogonal matrix and D a diagonal matrix, then B is a symmetric matrix. True or False?

1. FALSE

2. TRUE correct

Explanation: Note that for B to be symmetric we need B^T = B. Since P is orthogonal, P^(-1) = P^T, so B = P D P^T; and every diagonal matrix is by definition symmetric, so D^T = D. Consider B^T:

    B^T = (P D P^T)^T = (P^T)^T D^T P^T = P D^T P^T = P D P^T = B.

Thus B^T = B, and B is by definition symmetric. Consequently, the statement is TRUE.

002 10.0 points

Find the least-squares solution of Ax = b when

    A = [1 0; 2 -1; 1 0],    b = [6; 7; -3].

1. x̂ = (1/3)[3; -8]

2. x̂ = (1/2)[3; -8] correct

3. x̂ = [-5; 9]

4. x̂ = [17; -7]

5. x̂ = (1/2)[8; -17]

Explanation: The normal equations for a least-squares solution of Ax = b are by definition A^T A x = A^T b. Now

    A^T A = [1 2 1; 0 -1 0][1 0; 2 -1; 1 0] = [6 -2; -2 1]

and

    A^T b = [1 2 1; 0 -1 0][6; 7; -3] = [17; -7].

Hence the least-squares solution of Ax = b is the solution x of

    [6 -2; -2 1] x = [17; -7].

This can be solved with row reduction or inverse matrices to determine that the solution is

    x̂ = (A^T A)^(-1)(A^T b) = (1/2)[1 2; 2 6][17; -7] = (1/2)[3; -8].

Consequently, the least-squares solution of Ax = b is x̂ = (1/2)[3; -8].

003 10.0 points

Determine the singular value σ1 for the matrix

    A = [2 0; 0 -2; 1 2].

1. σ1 = √7

2. σ1 = 2√2

3. σ1 = √5

4. σ1 = 3 correct

5. σ1 = √6

Explanation: By definition, σ is a singular value of A when σ = √λ and λ is an eigenvalue of A^T A; σ1 is the largest of these singular values. Now

    A^T A = [2 0 1; 0 -2 2][2 0; 0 -2; 1 2] = [5 2; 2 8].

But then

    det(A^T A - λI) = (5 - λ)(8 - λ) - 4 = λ^2 - 13λ + 36 = (λ - 9)(λ - 4),

so the eigenvalues of A^T A are 9 and 4. Consequently, σ1 = √9 = 3.
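The exam itself contains no code, but the normal-equation computation in Problem 002 and the singular value in Problem 003 are easy to verify numerically. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

# Problem 002: least-squares solution of Ax = b via the normal equations
# (A^T A) x = A^T b.
A = np.array([[1.0, 0.0],
              [2.0, -1.0],
              [1.0, 0.0]])
b = np.array([6.0, 7.0, -3.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                          # -> 1.5, -4.0, i.e. (1/2)(3, -8)

# Cross-check against NumPy's built-in least-squares routine.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_hat, x_lstsq)

# Problem 003: sigma_1 is the square root of the largest eigenvalue of
# A^T A; np.linalg.svd returns the singular values in decreasing order.
M = np.array([[2.0, 0.0],
              [0.0, -2.0],
              [1.0, 2.0]])
sigma = np.linalg.svd(M, compute_uv=False)
print(sigma[0])                       # -> 3.0
```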
004 10.0 points

When

    u1 = [1; -2],    u2 = [2; 1]

are eigenvectors of a symmetric 2 × 2 matrix A corresponding to eigenvalues

    λ1 = 2,    λ2 = -3,

find matrices D and P in an orthogonal diagonalization of A.

1. D = [2 0; 0 -3], P = (1/√5)[1 -2; 2 1]

2. D = [2 0; 0 -3], P = [1 2; -2 1]

3. D = [2 0; 0 -3], P = [1 -2; 2 1]

4. D = [-3 0; 0 2], P = [1 2; -2 1]

5. D = [-3 0; 0 2], P = (1/√5)[1 2; -2 1]

6. D = [2 0; 0 -3], P = (1/√5)[1 2; -2 1] correct

Explanation: When

    D = [λ1 0; 0 λ2],    Q = [u1 u2],

then Q has orthogonal columns and A = Q D Q^(-1) is a diagonalization of A, but it is not an orthogonal diagonalization because Q is not an orthogonal matrix. We have to normalize u1 and u2: set

    v1 = u1/||u1|| = (1/√5)[1; -2],    v2 = u2/||u2|| = (1/√5)[2; 1].

Then

    P = [v1 v2] = (1/√5)[1 2; -2 1]

is an orthogonal matrix, and A = P D P^(-1) = P D P^T is an orthogonal diagonalization of A when D = [2 0; 0 -3].

005 10.0 points

A vector space V is a subspace of itself. True or False?

1. FALSE

2. TRUE correct

Explanation: A set H is a subspace of a vector space V when
(i) H contains the zero vector 0,
(ii) the sum u + v of any u, v in H is in H,
(iii) the scalar multiple cu of any scalar c and any u in H is in H.
Since V has these three properties, V is a subspace of itself. Consequently, the statement is TRUE.
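In Problem 004 the matrix A is never written out, but it can be reconstructed from the orthogonal diagonalization A = P D P^T and tested against the stated eigenpairs. A sketch, assuming NumPy:

```python
import numpy as np

# Problem 004: rebuild A from its orthogonal diagonalization A = P D P^T,
# where P's columns are the normalized eigenvectors u1/|u1| and u2/|u2|.
P = np.array([[1.0, 2.0],
              [-2.0, 1.0]]) / np.sqrt(5)
D = np.diag([2.0, -3.0])

A = P @ D @ P.T
print(A)                              # -> [[-2, -2], [-2, 1]]

# P is orthogonal (P^T P = I), so P^{-1} = P^T and the factorization
# really is an orthogonal diagonalization.
assert np.allclose(P.T @ P, np.eye(2))

# A sends each original eigenvector to the matching multiple of itself.
u1 = np.array([1.0, -2.0])
u2 = np.array([2.0, 1.0])
assert np.allclose(A @ u1, 2.0 * u1)
assert np.allclose(A @ u2, -3.0 * u2)
```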
006 10.0 points

Make an orthogonal change of variables x = P y that reduces

    x^T A x = -2 x1^2 + 16 x1 x2 - 14 x2^2 = 5

to a quadratic equation in y1, y2 with no cross-product term, given that λ1 ≥ λ2.

1. 2 y1^2 - 18 y2^2 = -5

2. 2 y1^2 + 18 y2^2 = -5

3. 2 y1^2 - 18 y2^2 = 5 correct

4. 2 y1^2 + 18 y2^2 = 5

Explanation: In matrix terms,

    x^T A x = -2 x1^2 + 16 x1 x2 - 14 x2^2 = [x1 x2][-2 8; 8 -14][x1; x2],    x = [x1; x2].

The eigenvalues λ1, λ2 of A are the solutions of

    det[-2-λ, 8; 8, -14-λ] = λ^2 + 16λ - 36 = (λ - 2)(λ + 18) = 0,

i.e., λ1 = 2 and λ2 = -18. Associated eigenvectors are

    u1 = [2; 1],    u2 = [-1; 2],

and these are orthogonal since λ1 ≠ λ2. The normalized eigenvectors

    v1 = (1/√5)[2; 1],    v2 = (1/√5)[-1; 2]

are thus orthonormal, and

    P = [v1 v2] = (1/√5)[2 -1; 1 2]

is an orthogonal matrix such that A = P [2 0; 0 -18] P^(-1) = P D P^T is an orthogonal diagonalization of A. Now set x = P y. Then

    x^T A x = (P y)^T (P D P^T)(P y) = y^T (P^T P) D (P^T P) y = y^T [2 0; 0 -18] y = 2 y1^2 - 18 y2^2.

Consequently, when x = P y,

    -2 x1^2 + 16 x1 x2 - 14 x2^2 = 2 y1^2 - 18 y2^2 = 5.

007 10.0 points

If A^T = A and if vectors u and v satisfy Au = 3u and Av = 4v, then u · v = 0. True or False?

1. TRUE correct

2. FALSE

Explanation: The given vectors u and v are eigenvectors of A corresponding to eigenvalues λ1 = 3 and λ2 = 4. But when A is symmetric, any two eigenvectors from different eigenspaces are orthogonal. Thus u and v must be orthogonal, that is, u · v = 0. Consequently, the statement is TRUE.

008 10.0 points

Find the x-intercept of the Least Squares Regression line y = mx + b that best fits the data points

    (-2, 3), (-1, -2), (0, 1), (1, -3).

1. x-intercept = -13/15

2. x-intercept = -4/5

3. x-intercept = -14/15

4. x-intercept = -2/3 correct

5. x-intercept = -11/15

Explanation: The design matrix and list of observed values for the data are given by

    A = [1 -2; 1 -1; 1 0; 1 1],    b = [3; -2; 1; -3].

The least squares regression line for this data is y = mx + b where x̂ = [b; m] is the solution of the normal equation A^T A x̂ = A^T b. Now

    A^T A = [4 -2; -2 6],    A^T b = [-1; -7].

Thus the normal equation is [4 -2; -2 6][b; m] = [-1; -7], so

    x̂ = (1/20)[6 2; 2 4][-1; -7] = [-1; -3/2].

Consequently, the Least Squares Regression line is

    y = -(3/2) x - 1,

and so its x-intercept solves 0 = -(3/2) x - 1, namely x = -2/3.
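Both computations above lend themselves to a quick numerical check: the change of variables in Problem 006 should diagonalize the quadratic form, and the regression line in Problem 008 follows from the same normal equations used earlier. A sketch, assuming NumPy:

```python
import numpy as np

# Problem 006: with P built from the normalized eigenvectors, P^T A P
# must be diagonal with the eigenvalues 2 and -18 on the diagonal.
A = np.array([[-2.0, 8.0],
              [8.0, -14.0]])
P = np.array([[2.0, -1.0],
              [1.0, 2.0]]) / np.sqrt(5)
assert np.allclose(P.T @ A @ P, np.diag([2.0, -18.0]))

# Problem 008: least-squares regression line through the four data points.
xs = np.array([-2.0, -1.0, 0.0, 1.0])
ys = np.array([3.0, -2.0, 1.0, -3.0])
X = np.column_stack([np.ones_like(xs), xs])     # design matrix [1, x]
b_hat, m_hat = np.linalg.solve(X.T @ X, X.T @ ys)
print(b_hat, m_hat)                   # -> -1.0, -1.5: y = -(3/2)x - 1
print(-b_hat / m_hat)                 # -> -0.666..., the x-intercept -2/3
```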
009 10.0 points

If A is symmetric, then the change of variable x = P y transforms Q(x) = x^T A x into a quadratic form with no cross-product term for any orthogonal matrix P. True or False?

1. TRUE

2. FALSE correct

Explanation: When P is orthogonal and x = P y, then

    Q(x) = x^T A x = (P y)^T A (P y) = y^T P^T A P y = y^T (P^T A P) y.

But this will contain cross-product terms unless P^T A P is a diagonal matrix, i.e., unless A is orthogonally diagonalized by P. Consequently, the statement is FALSE.

010 10.0 points

When A = [2 0; 3 2], which of the following could be a choice for the matrix V in a Singular Value Decomposition A = U Σ V^T of A?

1. V = (1/√5)[2 -1; 1 2] correct

2. V = (1/√5)[1 2; -2 1]

3. V = (1/√5)[2 1; -1 2]

4. V = [2 -1; 1 2]

Explanation: Since

    A^T A = [2 3; 0 2][2 0; 3 2] = [13 6; 6 4],

the eigenvalues of A^T A are the solutions of

    (13 - λ)(4 - λ) - 36 = λ^2 - 17λ + 16 = (λ - 16)(λ - 1) = 0,

i.e., λ1 = 16, λ2 = 1. Eigenvectors x1 and x2 associated with λ1 and λ2 are

    x1 = [2; 1],    x2 = [-1; 2];

these are orthogonal. Associated orthonormal eigenvectors are thus

    v1 = (1/√5)[2; 1],    v2 = (1/√5)[-1; 2].

Consequently, one choice for V is

    V = [v1 v2] = (1/√5)[2 -1; 1 2].

Since multiples of x1 and x2 are again eigenvectors of A^T A corresponding respectively to λ1 = 16 and λ2 = 1, there are other choices for V. But these are not among the choices listed.

011 10.0 points

If A is an m × n matrix and b is in R^m, the general least-squares problem is to find an x that makes Ax as close as possible to b. True or False?

1. TRUE correct

2. FALSE

Explanation: A least squares solution of Ax = b is an x̂ in R^n such that ||b - A x̂|| ≤ ||b - A x|| for all x in R^n. Note that ||b - A x|| is the distance from Ax to b; thus the goal is to minimize that distance. Consequently, the statement is TRUE.

012 10.0 points

Which one of the following is the graph of

    5 x1^2 - 4 x1 x2 + 2 x2^2 = 16 ?

(All axes drawn to the same scale.)

[The six answer choices are graphs, not reproduced here; choice 5 is marked correct.]
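The SVD in Problem 010 can likewise be checked numerically; note that numerical routines determine each singular-vector pair only up to a simultaneous sign flip, so directions rather than entries should be compared. A sketch, assuming NumPy:

```python
import numpy as np

# Problem 010: SVD of A. The eigenvalues of A^T A are 16 and 1, so the
# singular values (their square roots, in decreasing order) are 4 and 1.
A = np.array([[2.0, 0.0],
              [3.0, 2.0]])
U, s, Vt = np.linalg.svd(A)
print(s)                              # -> 4.0, 1.0

# The rows of Vt (columns of V) are unit eigenvectors of A^T A,
# determined only up to sign.
v1 = np.array([2.0, 1.0]) / np.sqrt(5)
v2 = np.array([-1.0, 2.0]) / np.sqrt(5)
assert np.isclose(abs(Vt[0] @ v1), 1.0)
assert np.isclose(abs(Vt[1] @ v2), 1.0)

# U Sigma V^T reproduces A.
assert np.allclose(U @ np.diag(s) @ Vt, A)
```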
Explanation: In matrix terms,

    5 x1^2 - 4 x1 x2 + 2 x2^2 = x^T A x    with x = [x1; x2], A = [5 -2; -2 2].

But if λ1, λ2 are the eigenvalues of A and v1, v2 are respective corresponding normalized eigenvectors, then by the Principal Axes theorem,

    x^T A x = y^T [λ1 0; 0 λ2] y = λ1 y1^2 + λ2 y2^2,

where P = [v1 v2], y = [y1; y2], and x = P y. Now

    det[A - λI] = (5 - λ)(2 - λ) - 4 = λ^2 - 7λ + 6 = (λ - 6)(λ - 1),

so λ1 = 6, λ2 = 1 and

    v1 = (1/√5)[2; -1],    v2 = (1/√5)[1; 2].

Thus the graph of 5 x1^2 - 4 x1 x2 + 2 x2^2 = 16 is that of the ellipse

    6 y1^2 + y2^2 = 16

with respect to the y1 y2-axes. Since x = P y = y1 v1 + y2 v2, the y1-axis is the line t v1, while the y2-axis is the line t v2. Consequently, the graph of the ellipse is the one shown in choice 5.

013 10.0 points

Use an orthogonal matrix P to identify

    x^2 + 4xy - 2y^2 = 0

as a conic section in x1, y1 without cross-term when [x; y] = P [x1; y1].

1. ellipse 2 x1^2 - 3 y1^2 = -1

2. hyperbola 2 x1^2 - 3 y1^2 = -1

3. ellipse 2 x1^2 + 3 y1^2 = 1

4. straight lines 2 x1^2 - 3 y1^2 = 0 correct

5. point (0, 0)

6. hyperbola 2 x1^2 + 3 y1^2 = 1

Explanation: The quadratic relation

    x^2 + 4xy - 2y^2 = 0

can be written in matrix terms as

    x^T A x = [x y][1 2; 2 -2][x; y] = 0,    x = [x; y].

To eliminate the cross-term we orthogonally diagonalize A by finding the eigenvalues λ1, λ2 and corresponding normalized eigenvectors v1, v2 of A. For then A = P D P^T with

    P = [v1 v2],    D = [λ1 0; 0 λ2],

and, setting x = P y with y = [x1; y1],

    x^T A x = y^T D y = λ1 x1^2 + λ2 y1^2.

Since A is symmetric, P will be orthogonal if λ1 ≠ λ2. But

    det[A - λI] = (1 - λ)(-2 - λ) - 4 = λ^2 + λ - 6 = (λ - 2)(λ + 3) = 0.
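The classification in Problems 012 and 013 rests entirely on the signs of the eigenvalues of the symmetric coefficient matrix, which is easy to confirm numerically. A sketch, assuming NumPy:

```python
import numpy as np

# Problem 012: coefficient matrix of 5x1^2 - 4x1x2 + 2x2^2. Both
# eigenvalues positive -> 6y1^2 + y2^2 = 16 is an ellipse.
A12 = np.array([[5.0, -2.0],
                [-2.0, 2.0]])
print(np.linalg.eigvalsh(A12))        # -> 1.0, 6.0 (ascending order)

# Problem 013: coefficient matrix of x^2 + 4xy - 2y^2. Eigenvalues of
# opposite sign with constant term 0 -> 2x1^2 - 3y1^2 = 0, a degenerate
# conic: a pair of straight lines.
A13 = np.array([[1.0, 2.0],
                [2.0, -2.0]])
print(np.linalg.eigvalsh(A13))        # -> -3.0, 2.0
```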
Thus λ1 = 2 and λ2 = -3, so that in x1, y1 coordinates the quadratic relation becomes

    2 x1^2 - 3 y1^2 = 0,

which as a degenerate conic section is the familiar form of a pair of straight lines.

014 10.0 points

Compute the component of the spectral decomposition determined by λ1 when A is the symmetric 2 × 2 matrix with eigenvalues

    λ1 = 1,    λ2 = -9

and corresponding eigenvectors

    x1 = [2; -4],    x2 = [2; 1].

1. (9/5)[4 2; 2 1]

2. -(1/5)[1 -2; -2 4]

3. (1/5)[1 -2; -2 4] correct

4. -(9/5)[4 2; 2 1]

5. -(1/5)[4 2; 2 1]

6. (1/5)[4 2; 2 1]

Explanation: When v1, v2 are orthonormal eigenvectors of A corresponding to respective eigenvalues λ1 and λ2, the spectral decomposition of A is given by

    A = λ1 v1 v1^T + λ2 v2 v2^T,

and the component determined by λ1 is λ1 v1 v1^T. Now the given eigenvectors x1, x2 are orthogonal, but not orthonormal, so set

    v1 = x1/||x1|| = (1/(2√5))[2; -4] = (1/√5)[1; -2],
    v2 = x2/||x2|| = (1/√5)[2; 1].

Then

    v1 v1^T = (1/5)[1; -2][1 -2] = (1/5)[1 -2; -2 4],
    v2 v2^T = (1/5)[2; 1][2 1] = (1/5)[4 2; 2 1].

Thus

    A = (λ1/5)[1 -2; -2 4] + (λ2/5)[4 2; 2 1] = (1/5)[1 -2; -2 4] - (9/5)[4 2; 2 1].

Consequently, the component of the spectral decomposition of A determined by λ1 is

    (1/5)[1 -2; -2 4].
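The spectral decomposition in Problem 014 can be verified by assembling A from its two rank-one components and checking the stated eigenpairs. A sketch, assuming NumPy:

```python
import numpy as np

# Problem 014: spectral decomposition A = l1 v1 v1^T + l2 v2 v2^T built
# from the given (orthogonal but not orthonormal) eigenvectors.
l1, l2 = 1.0, -9.0
v1 = np.array([2.0, -4.0]); v1 = v1 / np.linalg.norm(v1)   # (1,-2)/sqrt(5)
v2 = np.array([2.0, 1.0]);  v2 = v2 / np.linalg.norm(v2)   # (2,1)/sqrt(5)

comp1 = l1 * np.outer(v1, v1)         # component determined by l1
comp2 = l2 * np.outer(v2, v2)
print(comp1)                          # -> (1/5) [1 -2; -2 4]

# The sum must be a symmetric matrix with the stated eigenpairs.
A = comp1 + comp2
assert np.allclose(A, A.T)
assert np.allclose(A @ v1, l1 * v1)
assert np.allclose(A @ v2, l2 * v2)
```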