TrefethenLecture2

Lecture 2. Orthogonal Vectors and Matrices

Since the 1960s, many of the best algorithms of numerical linear algebra have been based in one way or another on orthogonality. In this lecture we present the ingredients: orthogonal vectors and orthogonal (unitary) matrices.

Adjoint

The complex conjugate of a scalar $z$, written $\bar{z}$ or $z^*$, is obtained by negating its imaginary part. For real $z$, $\bar{z} = z$. The hermitian conjugate or adjoint of an $m \times n$ matrix $A$, written $A^*$, is the $n \times m$ matrix whose $i,j$ entry is the complex conjugate of the $j,i$ entry of $A$. For example,

$$A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ a_{31} & a_{32} \end{bmatrix} \quad\Longrightarrow\quad A^* = \begin{bmatrix} \bar{a}_{11} & \bar{a}_{21} & \bar{a}_{31} \\ \bar{a}_{12} & \bar{a}_{22} & \bar{a}_{32} \end{bmatrix}.$$

If $A = A^*$, $A$ is hermitian. By definition, a hermitian matrix must be square.

For real $A$, the adjoint simply interchanges the rows and columns of $A$. In this case, the adjoint is also known as the transpose, and is written $A^T$. If a real matrix is hermitian, that is, $A = A^T$, then it is also said to be symmetric.

Most textbooks of numerical linear algebra assume that the matrices under discussion are real and thus use principally $T$ instead of $*$. Since most of the ideas to be dealt with are not intrinsically restricted to the reals, however, we have followed the other course. Thus, for example, in this book a row vector will usually be denoted by, say, $a^*$ rather than $a^T$. The reader who prefers to imagine that all quantities are real and that $*$ is a synonym for $T$ will rarely get into trouble.

Inner Product

The inner product of two column vectors $x, y \in \mathbb{C}^m$ is the product of the adjoint of $x$ by $y$:

$$x^* y = \sum_{i=1}^{m} \bar{x}_i y_i. \qquad (2.1)$$

The Euclidean length of $x$ may be written $\|x\|$ (vector norms such as this are discussed systematically in the next lecture), and can be defined as the square root of the inner product of $x$ with itself:

$$\|x\| = \sqrt{x^* x} = \Big( \sum_{i=1}^{m} |x_i|^2 \Big)^{1/2}. \qquad (2.2)$$
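These definitions map directly onto NumPy, where the adjoint is `.conj().T` and `np.vdot` conjugates its first argument, matching the $x^* y$ convention used here. A quick numerical sketch (the particular array values are arbitrary, chosen only for illustration):

```python
import numpy as np

# A 3x2 complex matrix and its adjoint (hermitian conjugate) A*.
A = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 0j],
              [4 - 3j, 1 + 1j]])
A_star = A.conj().T          # n x m: entry i,j is the conjugate of entry j,i of A
assert A_star.shape == (2, 3)
assert A_star[0, 2] == np.conj(A[2, 0])

# Inner product (2.1): x*y = sum of conj(x_i) * y_i.
x = np.array([1 + 1j, 2 - 1j, 0 + 3j])
y = np.array([2 + 0j, 1 + 1j, 1 - 1j])
inner = np.vdot(x, y)                        # conjugates the first argument
assert np.isclose(inner, np.sum(x.conj() * y))

# Euclidean length (2.2): ||x|| = sqrt(x*x).
norm_x = np.sqrt(np.vdot(x, x).real)
assert np.isclose(norm_x, np.linalg.norm(x))
```

Note the asymmetry: `np.dot(x, y)` would compute $x^T y$ without conjugation, which is not an inner product on $\mathbb{C}^m$.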
The cosine of the angle $\alpha$ between $x$ and $y$ can also be expressed in terms of the inner product:

$$\cos \alpha = \frac{x^* y}{\|x\|\,\|y\|}. \qquad (2.3)$$

At various points of this book, as here, we mention geometric interpretations of algebraic formulas. For these geometric interpretations, the reader should think of the vectors as real rather than complex, although usually the interpretations can be carried over in one way or another to the complex case too.

The inner product is bilinear, which means that it is linear in each vector separately:

$$(x_1 + x_2)^* y = x_1^* y + x_2^* y,$$
$$x^* (y_1 + y_2) = x^* y_1 + x^* y_2,$$
$$(\alpha x)^* (\beta y) = \bar{\alpha} \beta\, x^* y.$$

We shall also frequently use the easily proved property that for any matrices or vectors $A$ and $B$ of compatible dimensions,

$$(AB)^* = B^* A^*. \qquad (2.4)$$

This is analogous to the equally important formula for products of invertible square matrices,

$$(AB)^{-1} = B^{-1} A^{-1}. \qquad (2.5)$$

The notation $A^{-*}$ is a shorthand for $(A^*)^{-1}$ or $(A^{-1})^*$; these two are equal, as can be verified by applying (2.4) with $B = A^{-1}$.

Orthogonal Vectors

A pair of vectors $x$ and $y$ are said to be orthogonal if $x^* y = 0$. If $x$ and $y$ are real, this means they lie at right angles to each other in $\mathbb{R}^m$. Two sets of vectors $X$ and $Y$ are orthogonal (also stated "$X$ is orthogonal to $Y$") if every $x \in X$ is orthogonal to every $y \in Y$.

A set of nonzero vectors $S$ is orthogonal if its elements are pairwise orthogonal, i.e., if for $x, y \in S$, $x \neq y \Rightarrow x^* y = 0$. A set of vectors is orthonormal if it is orthogonal and, in addition, every $x \in S$ has $\|x\| = 1$.

Theorem 2.1. The vectors in an orthogonal set $S$ are linearly independent.

Proof. If the vectors in $S$ are not independent, then some $v_k \in S$ can be expressed as a linear combination of other members $v_1, \ldots, v_n \in S$,

$$v_k = \sum_{i=1,\, i \neq k}^{n} c_i v_i.$$

Since $v_k \neq 0$, $v_k^* v_k = \|v_k\|^2 > 0$. Using the bilinearity of inner products and the orthogonality of $S$, we calculate

$$v_k^* v_k = \sum_{i=1,\, i \neq k}^{n} c_i v_k^* v_i = 0,$$

which contradicts the assumption that the vectors in $S$ are nonzero.

As a corollary of Theorem 2.1 it follows that if an orthogonal set $S \subseteq \mathbb{C}^m$ contains $m$ vectors, then it is a basis for $\mathbb{C}^m$.

Components of a Vector

The most important idea to draw from the concepts of inner products and orthogonality is this: inner products can be used to decompose arbitrary vectors into orthogonal components. For example, suppose that $\{q_1, q_2, \ldots, q_n\}$ is an orthonormal set, and let $v$ be an arbitrary vector. The quantity $q_i^* v$ is a scalar. Utilizing these scalars as coordinates in an expansion, we find that the vector

$$r = v - (q_1^* v) q_1 - (q_2^* v) q_2 - \cdots - (q_n^* v) q_n \qquad (2.6)$$

is orthogonal to $\{q_1, q_2, \ldots, q_n\}$. This can be verified by computing $q_i^* r$:

$$q_i^* r = q_i^* v - (q_1^* v)(q_i^* q_1) - \cdots - (q_n^* v)(q_i^* q_n).$$

This sum collapses, since $q_i^* q_j = 0$ for $i \neq j$:

$$q_i^* r = q_i^* v - (q_i^* v)(q_i^* q_i) = 0.$$

Thus we see that $v$ can be decomposed into $n + 1$ orthogonal components:

$$v = r + \sum_{i=1}^{n} (q_i^* v) q_i = r + \sum_{i=1}^{n} (q_i q_i^*) v. \qquad (2.7)$$

In this decomposition, $r$ is the part of $v$ orthogonal to the set of vectors $\{q_1, q_2, \ldots, q_n\}$, or equivalently to the subspace spanned by this set of vectors, and $(q_i^* v) q_i$ is the part of $v$ in the direction of $q_i$. If $\{q_i\}$ is a basis for $\mathbb{C}^m$, then $n$ must be equal to $m$ and $r$ must be the zero vector, so $v$ is completely decomposed into $m$ orthogonal components in the directions of the $q_i$:

$$v = \sum_{i=1}^{m} (q_i^* v) q_i = \sum_{i=1}^{m} (q_i q_i^*) v. \qquad (2.8)$$

In both (2.7) and (2.8) we have written the formula in two different ways, once with $(q_i^* v) q_i$ and again with $(q_i q_i^*) v$. These expressions are equal, but they have different interpretations. In the first case, we view $v$ as a sum of coefficients $q_i^* v$ times vectors $q_i$. In the second, we view $v$ as a sum of orthogonal projections of $v$ onto the various directions $q_i$. The $i$th projection operation is achieved by the very special rank-one matrix $q_i q_i^*$.
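The decomposition into orthogonal components is easy to verify numerically. The following sketch assumes NumPy, building an orthonormal set $\{q_1, q_2\}$ in $\mathbb{C}^5$ from the columns of a QR factorization of a random matrix (the dimensions and random seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 2

# Orthonormal set {q_1, ..., q_n}: columns of the Q factor of a random matrix.
M = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
Q, _ = np.linalg.qr(M)            # Q has orthonormal columns
v = rng.standard_normal(m) + 1j * rng.standard_normal(m)

# r = v - sum_i (q_i* v) q_i, as in the residual formula above.
r = v - sum(np.vdot(Q[:, i], v) * Q[:, i] for i in range(n))

# r is orthogonal to every q_i ...
assert all(np.isclose(np.vdot(Q[:, i], r), 0) for i in range(n))

# ... and v = r + sum of the rank-one projections (q_i q_i*) v.
proj = sum(np.outer(Q[:, i], Q[:, i].conj()) @ v for i in range(n))
assert np.allclose(v, r + proj)
```

Note that `np.outer(q, q.conj())` forms the rank-one projector $q q^*$, the second of the two equivalent viewpoints discussed above.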
We shall discuss this and other projection processes in Lecture 6.

Unitary Matrices

A square matrix $Q \in \mathbb{C}^{m \times m}$ is unitary (in the real case we also say orthogonal) if $Q^* = Q^{-1}$, i.e., if $Q^* Q = I$. In terms of the columns $q_1, \ldots, q_m$ of $Q$, this product can be written

$$\begin{bmatrix} q_1^* \\ q_2^* \\ \vdots \\ q_m^* \end{bmatrix} \begin{bmatrix} q_1 & q_2 & \cdots & q_m \end{bmatrix} = \begin{bmatrix} 1 & & \\ & \ddots & \\ & & 1 \end{bmatrix}.$$

In other words, $q_i^* q_j = \delta_{ij}$, and the columns of a unitary matrix $Q$ form an orthonormal basis of $\mathbb{C}^m$. The symbol $\delta_{ij}$ is the Kronecker delta, equal to 1 if $i = j$ and 0 if $i \neq j$.

Multiplication by a Unitary Matrix

In the last lecture we discussed the interpretation of matrix-vector products $Ax$ and $A^{-1} b$. If $A$ is a unitary matrix $Q$, these products become $Qx$ and $Q^* b$, and the same interpretations are of course still valid. As before, $Qx$ is the linear combination of the columns of $Q$ with coefficients $x$. Conversely, $Q^* b$ is the vector of coefficients of the expansion of $b$ in the basis of columns of $Q$. Schematically, the situation looks like this:

      b : coefficients of the expansion of b in {e_1, ..., e_m}

        multiplication by Q* |          ^ multiplication by Q
                             v          |

    Q*b : coefficients of the expansion of b in {q_1, ..., q_m}

These processes of multiplication by a unitary matrix or its adjoint preserve geometric structure in the Euclidean sense, because inner products are preserved. That is, for unitary $Q$,

$$(Qx)^* (Qy) = x^* y, \qquad (2.9)$$

as is readily verified by (2.4). The invariance of inner products means that angles between vectors are preserved, and so are their lengths:

$$\|Qx\| = \|x\|. \qquad (2.10)$$

In the real case, multiplication by an orthogonal matrix $Q$ corresponds to a rigid rotation (if $\det Q = 1$) or reflection (if $\det Q = -1$) of the vector space.

Exercises

2.1. Show that if a matrix $A$ is both triangular and unitary, then it is diagonal.

2.2. The Pythagorean theorem asserts that for a set of $n$ orthogonal vectors $x_1, \ldots, x_n$,

$$\Big\| \sum_{i=1}^{n} x_i \Big\|^2 = \sum_{i=1}^{n} \|x_i\|^2.$$

(a) Prove this in the case $n = 2$ by an explicit computation of $\|x_1 + x_2\|^2$.
(b) Show that this computation also establishes the general case, by induction.
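The preservation properties (2.9) and (2.10) are easy to confirm numerically. A sketch assuming NumPy, obtaining a unitary $Q$ from the QR factorization of a random complex matrix (the dimension and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
m = 4
Z = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
Q, _ = np.linalg.qr(Z)

# Q is unitary: its columns are orthonormal, q_i* q_j = delta_ij.
assert np.allclose(Q.conj().T @ Q, np.eye(m))

x = rng.standard_normal(m) + 1j * rng.standard_normal(m)
y = rng.standard_normal(m) + 1j * rng.standard_normal(m)

# (2.9): inner products are preserved ...
assert np.isclose(np.vdot(Q @ x, Q @ y), np.vdot(x, y))
# (2.10): ... and hence so are lengths.
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```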
2.3. Let $A \in \mathbb{C}^{m \times m}$ be hermitian. An eigenvector of $A$ is a nonzero vector $x \in \mathbb{C}^m$ such that $Ax = \lambda x$ for some $\lambda \in \mathbb{C}$, the corresponding eigenvalue.
(a) Prove that all eigenvalues of $A$ are real.
(b) Prove that if $x$ and $y$ are eigenvectors corresponding to distinct eigenvalues, then $x$ and $y$ are orthogonal.

2.4. What can be said about the eigenvalues of a unitary matrix?

2.5. Let $S \in \mathbb{C}^{m \times m}$ be skew-hermitian, i.e., $S^* = -S$.
(a) Show by using Exercise 2.3 that the eigenvalues of $S$ are pure imaginary.
(b) Show that $I - S$ is nonsingular.
(c) Show that the matrix $Q = (I - S)^{-1} (I + S)$, known as the Cayley transform of $S$, is unitary. (This is a matrix analogue of a linear fractional transformation $(1 + s)/(1 - s)$, which maps the left half of the complex $s$-plane conformally onto the unit disk.)

2.6. If $u$ and $v$ are $m$-vectors, the matrix $A = I + uv^*$ is known as a rank-one perturbation of the identity. Show that if $A$ is nonsingular, then its inverse has the form $A^{-1} = I + \alpha uv^*$ for some scalar $\alpha$, and give an expression for $\alpha$. For what $u$ and $v$ is $A$ singular? If it is singular, what is $\mathrm{null}(A)$?

2.7. A Hadamard matrix is a matrix whose entries are all $\pm 1$ and whose transpose is equal to its inverse times a constant factor. It is known that if $A$ is a Hadamard matrix of dimension $m > 2$, then $m$ is a multiple of 4, but it is an unsolved problem whether there is a Hadamard matrix for every such $m$, though examples are known for all cases $m \le 424$. Show that the following recursive description provides a Hadamard matrix of each dimension $m = 2^k$, $k = 0, 1, 2, \ldots$:

$$H_0 = \begin{bmatrix} 1 \end{bmatrix}, \qquad H_{k+1} = \begin{bmatrix} H_k & H_k \\ H_k & -H_k \end{bmatrix}.$$
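Two of the exercises above lend themselves to quick numerical spot checks (which, of course, are illustrations rather than proofs). The sketch below assumes NumPy: it verifies on a random example that the Cayley transform of a skew-hermitian matrix is unitary, and that the recursive Hadamard construction satisfies $H_k^T H_k = 2^k I$; dimensions and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
m = 4

# A random skew-hermitian matrix: S* = -S.
B = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
S = (B - B.conj().T) / 2
assert np.allclose(S.conj().T, -S)

# Its eigenvalues are pure imaginary (real parts vanish to rounding error).
assert np.allclose(np.linalg.eigvals(S).real, 0)

# The Cayley transform Q = (I - S)^{-1} (I + S) is unitary.
I = np.eye(m)
Q = np.linalg.solve(I - S, I + S)
assert np.allclose(Q.conj().T @ Q, I)

# The recursive Hadamard construction: H_{k+1} = [[H, H], [H, -H]].
H = np.array([[1]])
for _ in range(3):
    H = np.block([[H, H], [H, -H]])
# After k = 3 doublings, H is 8x8 with entries +-1 and H^T H = 8 I.
assert H.shape == (8, 8)
assert np.allclose(H.T @ H, 8 * np.eye(8))
```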