15 In the jth component of Ax_1, 2 sin(jπ/(n+1)) − sin((j−1)π/(n+1)) − sin((j+1)π/(n+1)) = λ_1 sin(jπ/(n+1)). The last two sines combine into −2 sin(jπ/(n+1)) cos(π/(n+1)), so λ_1 = 2 − 2 cos(π/(n+1)).

16 A = [2 1; 1 2] produces u_0 = [1; 0], u_1 = [2; 1], u_2 = [5; 4], u_3 = [14; 13]. This is converging to the eigenvector direction [1; 1] with largest eigenvalue λ = 3. Divide u_k by ‖u_k‖.
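A few lines of NumPy reproduce the iterates of exercise 16 (an illustrative sketch, not code from the book):

```python
import numpy as np

# Power iteration for A = [[2, 1], [1, 2]] starting from u0 = (1, 0).
# Each step multiplies by A; the iterates line up with the (1, 1)
# eigenvector belonging to the largest eigenvalue lambda = 3.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
u = np.array([1.0, 0.0])
for k in range(3):
    u = A @ u
    print(u)                      # (2, 1), then (5, 4), then (14, 13)
# Dividing u_k by its norm gives a direction approaching (1, 1)/sqrt(2):
print(u / np.linalg.norm(u))
```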
17 A⁻¹ = (1/3) [2 −1; −1 2] gives u_1 = (1/3) [2; −1], u_2 = (1/9) [5; −4], u_3 = (1/27) [14; −13] → u_∞ = [1/2; −1/2].
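Inverse iteration in exercise 17 can be sketched the same way (an illustrative sketch; the starting vector u_0 = (1, 0) is assumed, as in exercise 16):

```python
import numpy as np

# Inverse iteration u_{k+1} = A^{-1} u_k for A = [[2, 1], [1, 2]].
# The component along the eigenvector for the smallest eigenvalue of A
# (lambda = 1) survives; the other component decays by 1/3 each step,
# so the iterates approach (1/2, -1/2).
A = np.array([[2.0, 1.0], [1.0, 2.0]])
u = np.array([1.0, 0.0])
for k in range(30):
    u = np.linalg.solve(A, u)    # solve A u_new = u rather than forming A^{-1}
print(u)                          # close to (0.5, -0.5)
```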
18 R = QᵀA = [1  cos θ sin θ; 0  sin²θ] and A_1 = RQ = [cos θ (1 + sin²θ)  sin³θ; sin³θ  −cos θ sin²θ].
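This QR step is easy to verify numerically. The sketch below assumes A = [cos θ  sin θ; sin θ  0] from the exercise statement (not shown in this excerpt); note how the off-diagonal entry shrinks from sin θ in A to sin³θ in A_1:

```python
import numpy as np

# One QR step for exercise 18. Gram-Schmidt on the columns of
# A = [[c, s], [s, 0]] gives Q = [[c, s], [s, -c]]; then R = Q^T A
# is upper triangular and A1 = R Q matches the closed form above.
t = 0.3
c, s = np.cos(t), np.sin(t)
A = np.array([[c, s], [s, 0.0]])
Q = np.array([[c, s], [s, -c]])              # orthonormal columns
R = Q.T @ A
assert np.allclose(R, [[1, c * s], [0, s**2]])
A1 = R @ Q
assert np.allclose(A1, [[c * (1 + s**2), s**3],
                        [s**3, -c * s**2]])
print("off-diagonal shrank from", s, "to", s**3)
```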
19 If A is orthogonal then Q = A and R = I. Therefore A_1 = RQ = A again, and the QR method doesn't move from A. But shift A slightly and the method goes quickly to Λ.

20 If A − cI = QR then A_1 = RQ + cI = Q⁻¹(QR + cI)Q = Q⁻¹AQ. No change in eigenvalues because A_1 is similar to A.

21 Multiply Aq_j = b_{j−1} q_{j−1} + a_j q_j + b_j q_{j+1} by q_jᵀ to find q_jᵀ A q_j = a_j (because the q's are orthonormal). The matrix form (multiplying by columns) is AQ = QT where T is tridiagonal. The entries down the diagonals of T are the a's and b's.

22 Theoretically the q's are orthonormal. In reality this important algorithm is not very stable. We must stop every few steps to reorthogonalize, or find another more stable way to orthogonalize q, Aq, A²q, …
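The recursion in 21 is the Lanczos iteration. A minimal sketch (an assumed implementation, not the book's code) that builds the q's and checks QᵀAQ = T:

```python
import numpy as np

# Lanczos: A q_j = b_{j-1} q_{j-1} + a_j q_j + b_j q_{j+1}, so that
# Q^T A Q = T is tridiagonal with the a's and b's on its diagonals.
def lanczos(A, q1, m):
    n = len(q1)
    Q = np.zeros((n, m))
    a = np.zeros(m)
    b = np.zeros(m - 1)
    Q[:, 0] = q1 / np.linalg.norm(q1)
    for j in range(m):
        v = A @ Q[:, j]
        a[j] = Q[:, j] @ v
        v -= a[j] * Q[:, j]              # subtract the q_j component
        if j > 0:
            v -= b[j - 1] * Q[:, j - 1]  # and the q_{j-1} component
        if j < m - 1:
            b[j] = np.linalg.norm(v)
            Q[:, j + 1] = v / b[j]
    return Q, a, b

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M + M.T                              # symmetric test matrix (arbitrary)
Q, a, b = lanczos(A, np.ones(6), 4)
T = Q.T @ A @ Q                          # tridiagonal: diag a, off-diag b
```

In floating point the q's slowly lose orthogonality, which is exactly the instability exercise 22 warns about; practical codes reorthogonalize.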
23 If A is symmetric then A_1 = Q⁻¹AQ = QᵀAQ is also symmetric. A_1 = RQ = R(QR)R⁻¹ = RAR⁻¹ has R and R⁻¹ upper triangular, so A_1 cannot have nonzeros on a lower diagonal than A. If A is tridiagonal and symmetric then (by using symmetry for the upper part of A_1) the matrix A_1 = RAR⁻¹ is also tridiagonal.

24 The proof of |λ| < 1 when every absolute row sum is < 1 uses |λ| |x_i| = |Σ_j a_ij x_j| ≤ Σ_j |a_ij| |x_i| < |x_i|. (Here x_i is the largest component of x.) The application to the Gershgorin circle theorem (very useful) is printed after its statement in this problem.
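The Gershgorin circles are easy to check numerically. This sketch assumes K is the 3 × 3 matrix with −1, 2, −1 down its diagonals, as in exercise 25:

```python
import numpy as np

# Gershgorin check for the 3x3 "-1, 2, -1" matrix K: every eigenvalue
# lies in some circle |lam - a_ii| <= r_i, and the widest circle
# |lam - 2| <= 2 contains all of 2 - sqrt(2), 2, and 2 + sqrt(2).
K = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])
radii = np.sum(np.abs(K), axis=1) - np.abs(np.diag(K))  # off-diagonal sums
eigs = np.linalg.eigvalsh(K)                            # ascending order
print(eigs)                       # 2 - sqrt(2), 2, 2 + sqrt(2)
print(all(abs(lam - 2.0) <= 2.0 + 1e-12 for lam in eigs))
```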
25 For A and K, the maximum row sums give all |λ| ≤ 1 and all |λ| ≤ 4. The circles |λ − .5| ≤ .5 and |λ − .4| ≤ .6 around the diagonal entries of A give tighter bounds. The circle |λ − 2| ≤ 2 for K contains the circle |λ − 2| ≤ 1 and all three eigenvalues 2 + √2, 2, and 2 − √2.

26 With diagonal dominance a_ii > r_i, the circles |λ − a_ii| ≤ r_i don't include λ = 0 (so A is invertible!). Notice that the −1, 2, −1 matrix is also invertible even though its diagonals are only weakly dominant: they equal the off-diagonal row sums, 2 = 2,
except in the first and last rows, and more care is needed to prove invertibility.

27 From the last line of code, q_2 is in the direction of v = Aq_1 − h_11 q_1 = Aq_1 − (q_1ᵀ A q_1) q_1. The dot product with q_1 is zero. This is Gram-Schmidt with Aq_1 as the second input vector.

28 Note: The five lines in Solutions to Selected Exercises prove two key properties of conjugate gradients: the residuals r_k = b − Ax_k are orthogonal, and the search directions are A-orthogonal (p_iᵀ A p_j = 0). Then each new guess x_{k+1} is the closest vector to x among all combinations of b, Ab, …, A^k b. Ordinary iteration S x_{k+1} = T x_k + b does not find this best possible combination.
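The projection step in exercise 27 can be sketched directly (illustrative code with an arbitrary test matrix, not the book's code):

```python
import numpy as np

# Sketch of exercise 27: q2 is the normalized part of A q1 orthogonal
# to q1 -- Gram-Schmidt with A q1 as the second input vector.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))       # arbitrary test matrix
q1 = np.ones(4) / 2.0                 # any unit vector
h11 = q1 @ A @ q1
v = A @ q1 - h11 * q1                 # subtract the q1 component
q2 = v / np.linalg.norm(v)
print(abs(q1 @ q2))                   # dot product with q1 is (numerically) zero
```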
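A textbook conjugate-gradient loop (a sketch, not the book's five lines) makes the two properties in exercise 28 visible on a small symmetric positive definite system:

```python
import numpy as np

# Conjugate gradients on a 3x3 SPD system (matrix and b are arbitrary
# illustrative choices): residuals r_k stay orthogonal and search
# directions stay A-orthogonal, and the solve finishes in <= 3 steps.
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = np.zeros(3)
r = b - A @ x
p = r.copy()
residuals, directions = [r.copy()], [p.copy()]
for k in range(3):
    alpha = (r @ r) / (p @ A @ p)     # step length along p
    x = x + alpha * p
    r_new = r - alpha * (A @ p)       # new residual
    beta = (r_new @ r_new) / (r @ r)  # makes p_new A-orthogonal to p
    p = r_new + beta * p
    r = r_new
    residuals.append(r.copy())
    directions.append(p.copy())
print(np.allclose(A @ x, b))                     # solved in <= 3 steps
print(abs(residuals[0] @ residuals[1]))          # orthogonal residuals
print(abs(directions[0] @ A @ directions[1]))    # A-orthogonal directions
```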
This note was uploaded on 09/25/2012 for the course PHY 103 taught by Professor Minki during the Spring '12 term at Korea Advanced Institute of Science and Technology.