18.03 Lecture #25, Nov. 4, 2009: notes

Topics for today are (first) delta functions, (second) convolution, and (third) Duhamel's principle. [I didn't get to the third in lecture.]

Despite the name, delta functions are not functions. They were invented by the physicist Dirac in the 1930s, and they quickly became a fundamental tool in theoretical physics. They were nevertheless mostly scorned for the next fifteen years by mathematicians, who said either (a) this idea makes no sense and necessarily leads to contradictions, or (b) this is an old mathematical idea, which we have known since the nineteenth century. Finally delta functions were (re?)invented by the mathematician Laurent Schwartz in 1945, and they quickly became a fundamental tool in theoretical mathematics.

So if it isn't a function, what is it? (If the following explanation seems too esoterically mathematical, try EP 4.6 instead.) The idea is that for many physical quantities that we describe as "functions of time," what we actually observe (what we can actually measure) is not individual values of the function, but rather some sort of average over a (short) time interval. For example, if we want to know the speed of a car at some time t (measured in seconds), we might measure the position p(t) at time t, then the position p(t+3) three seconds later, and compute

    (p(t+3) - p(t))/3.

This is not equal to the velocity v(t) = dp/dt(t), but rather it is the average value

    (1/3) \int_t^{t+3} v(s) ds.

For a car moving down a city street, this might be a perfectly reasonable approximation to the velocity. In the middle of a drag race, we might prefer to use a time interval of a tenth of a second rather than three seconds, but the principle is the same: what we can measure more directly is positions, and these can give only average values of velocity.
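A minimal numerical sketch of this point (the position function p(t) = t^2 is a hypothetical example, not from the notes): the measurable difference quotient (p(t+3) - p(t))/3 reproduces the average of v over [t, t+3], not the instantaneous velocity v(t).

```python
def p(t):
    return t ** 2          # hypothetical position function (meters)

def v(t):
    return 2 * t           # its exact derivative: the true velocity

def average_velocity(t, h=3.0, n=100_000):
    """(1/h) * integral of v from t to t+h, via a midpoint Riemann sum."""
    dt = h / n
    return sum(v(t + (k + 0.5) * dt) for k in range(n)) * dt / h

t = 1.0
measured = (p(t + 3) - p(t)) / 3    # what we can actually measure
print(measured)                     # 5.0
print(v(t))                         # 2.0, the instantaneous velocity
print(average_velocity(t))          # ~5.0, matching the measured quotient
```

The two prints agree because the difference quotient is exactly the integral average; neither is the instantaneous v(t) unless the time window shrinks to zero.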
The idea of Schwartz is that often we cannot know the values of a function f(t), but rather only some weighted averages

    f(\phi) = \int_{-\infty}^{\infty} f(s) \phi(s) ds.    (Weighted average of f)

Here \phi is called a "test function"; it is supposed to be very nice, meaning for example that it has lots of continuous derivatives. In the definition \phi can be anything, but you should think of it as being zero except close to some value of t that we are interested in.

Key fact. If f is a piecewise continuous function defined on the real numbers, then we can recover the values of f (except at the jump discontinuities) from knowing the numbers f(\phi) in (Weighted average of f) for every test function \phi.

Because of Key fact, the following definition makes sense, and actually generalizes the notion of (piecewise continuous) function....
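The Key fact can be illustrated numerically. In this sketch (my own example, not from the notes) the test function is a narrow normalized "tent" centered at t0; a true Schwartz test function would be smooth, but the tent is simpler and makes the same point: as the tent narrows, the weighted average f(phi) converges to f(t0).

```python
import math

def f(s):
    return math.sin(s)     # a sample continuous f to recover

def tent(s, t0, eps):
    """Tent of width 2*eps centered at t0, normalized to integrate to 1.
    (Not smooth, unlike Schwartz's test functions, but fine for a sketch.)"""
    if abs(s - t0) >= eps:
        return 0.0
    return (eps - abs(s - t0)) / eps ** 2

def weighted_average(f, t0, eps, n=20_000):
    """f(phi) = integral of f(s)*phi(s) ds, via a midpoint Riemann sum."""
    ds = 2 * eps / n
    total = 0.0
    for k in range(n):
        s = t0 - eps + (k + 0.5) * ds
        total += f(s) * tent(s, t0, eps)
    return total * ds

t0 = 1.0
print(weighted_average(f, t0, eps=0.01))   # ~ sin(1) = 0.8414...
```

Shrinking eps recovers f(t0) to any desired accuracy, which is exactly why knowing f(phi) for all test functions phi determines f away from its jumps.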
This note was uploaded on 05/06/2010 for the course 18.03, taught during the Fall '09 term at MIT.