Notes on function spaces, Hermitian operators, and Fourier series

S. G. Johnson, MIT Applied Mathematics

November 21, 2007

1 Introduction

In 18.06, we mainly worry about matrices and column vectors: finite-dimensional linear algebra. But into the syllabus pops an odd topic: Fourier series. What do these have to do with linear algebra? Where do their interesting properties, like orthogonality, come from? In these notes, written to accompany 18.06 lectures in Fall 2007, we discuss these mysteries: Fourier series come from taking concepts like eigenvalues, eigenvectors, and Hermitian matrices and applying them to functions instead of finite column vectors. In this way, we see that important properties like the orthogonality of the Fourier series arise not by accident, but as a special case of a much more general fact, analogous to the fact that Hermitian matrices have orthogonal eigenvectors.

This material is important in at least two other ways. First, it shows you that the things you learn in 18.06 are not limited to matrices; they are tremendously more general than that. Second, in practice most large linear-algebra problems in science and engineering come from differential operators on functions, and the best way to analyze these problems in many cases is to apply the same linear-algebra concepts to the underlying function spaces.

2 Review: Finite-dimensional linear algebra

Most of 18.06 deals with finite-dimensional linear algebra. In particular, let's focus on the portion of the course having to do with square matrices and eigenproblems. There, we have:

- Vectors $x$: column vectors in $\mathbb{R}^n$ (real) or $\mathbb{C}^n$ (complex).

- Dot products $x \cdot y = x^H y$. These have the key properties: $x \cdot x = \|x\|^2 > 0$ for $x \neq 0$; $x \cdot y = \overline{y \cdot x}$; $x \cdot (y + z) = x \cdot y + x \cdot z$.

- $n \times n$ matrices $A$. The key fact is that we can multiply $A$ by a vector to get a new vector, and matrix-vector multiplication is linear: $A(x + y) = Ax + Ay$.

- Transposes $A^T$ and adjoints $A^H = \overline{A}^T$.
The key property here is that $x \cdot (Ay) = (A^H x) \cdot y$: the whole reason that adjoints show up is to move matrices from one side to the other in dot products.

- Hermitian matrices $A = A^H$, for which $x \cdot (Ay) = (Ax) \cdot y$. Hermitian matrices have three key consequences for their eigenvalues/eigenvectors: the eigenvalues are real; the eigenvectors are orthogonal; and the matrix is diagonalizable (in fact, the eigenvectors can be chosen in the form of an orthonormal basis).

Now, we wish to carry over these concepts to functions instead of column vectors, and we will see that we arrive at Fourier series and many more remarkable things.

3 A vector space of functions

First, let us define a new vector space: the space of functions $f(x)$ defined on $x \in [0, 1]$, with the boundary conditions $f(0) = f(1) = 0$. For simplicity, we'll restrict ourselves to real $f(x)$. We've seen similar vector spaces a few times, in class and on problem....
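As an aside (not part of the original notes), the finite-dimensional facts reviewed above are easy to check numerically. The following is a minimal sketch assuming NumPy: it verifies the adjoint identity $x \cdot (Ay) = (A^H x) \cdot y$ for a random complex matrix, then builds a random Hermitian matrix and confirms the three consequences (real eigenvalues, orthonormal eigenvectors, diagonalizability).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A random complex matrix and random complex vectors.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Adjoint identity: x . (M y) == (M^H x) . y, with x . y = x^H y.
# np.vdot conjugates its first argument, matching this convention.
assert np.isclose(np.vdot(x, M @ y), np.vdot(M.conj().T @ x, y))

# Build a Hermitian matrix: A = M + M^H satisfies A = A^H.
A = M + M.conj().T

# eigh is NumPy's eigensolver specialized to Hermitian matrices.
evals, V = np.linalg.eigh(A)

# 1. Real eigenvalues: the general solver returns (numerically)
#    zero imaginary parts for this Hermitian A.
assert np.allclose(np.linalg.eigvals(A).imag, 0.0, atol=1e-10)

# 2. Orthonormal eigenvectors: V^H V = I.
assert np.allclose(V.conj().T @ V, np.eye(n))

# 3. Diagonalizable in that basis: A = V diag(evals) V^H.
assert np.allclose(V @ np.diag(evals) @ V.conj().T, A)
```

Of course, a numerical check on one random matrix is not a proof; the point is only that these properties are concrete and mechanically verifiable, which is a useful sanity check before we generalize them to function spaces below.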