CHAPTER 11  NOISE AND NOISE REJECTION

INTRODUCTION

In general, noise is any unsteady component of a signal which causes the instantaneous value to differ from the true value. (Finite response time effects, leading to dynamic error, are part of an instrument's response characteristics and are not considered to be noise.) In electrical signals, noise often appears as a highly erratic component superimposed on the desired signal. If the noise amplitude is generally lower than the desired signal amplitude, the result may look like the signal shown in Figure 1.

Figure 1: Sinusoidal Signal with Noise.

Noise is often random in nature, and thus it is described in terms of its average behavior (see the last section of Chapter 8). In particular, we describe a random signal in terms of its power spectral density, $\Phi_x(f)$, which shows how the average signal power is distributed over a range of frequencies, or in terms of its average power, or mean square value. Since we take the average signal power to be the power dissipated when the signal voltage is connected across a 1 Ω resistor, the numerical values of signal power and signal mean square value are equal; only the units differ. To determine the signal power we can use either the time history or the power spectral density (Parseval's theorem). Let the signal be x(t); then the average signal power, or mean square voltage, is:

$$\overline{x^2(t)} = \frac{1}{T}\int_{t-T/2}^{t+T/2} x^2(t)\,dt = \int_0^\infty \Phi_x(f)\,df \qquad (1)$$
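The equality of the time-domain and frequency-domain power computations in Eq. (1) can be checked numerically. The following is a minimal sketch (not from the original notes); the sampling rate, record length, and signal amplitude are arbitrary choices, and the discrete form of Parseval's theorem for the FFT is used in place of the continuous PSD integral.

```python
import numpy as np

# Hypothetical example: a 50 Hz sinusoid of amplitude A, sampled at fs
# over an integer number of cycles.
fs = 1000.0          # sampling rate, Hz (assumed)
T = 2.0              # record length, s (assumed)
A = 3.0              # signal amplitude, V (assumed)
t = np.arange(0, T, 1.0 / fs)
x = A * np.sin(2 * np.pi * 50.0 * t)

# Mean square value from the time history (left-hand form of Eq. 1).
p_time = np.mean(x ** 2)

# Same quantity from the frequency domain: by the discrete Parseval
# relation, sum(|X_k|^2) / N^2 equals the mean square of the samples.
X = np.fft.fft(x)
p_freq = np.sum(np.abs(X) ** 2) / len(x) ** 2

print(p_time, p_freq)  # both equal A**2 / 2 = 4.5 up to rounding
```

Both routes give the same number, consistent with the claim that signal power may be computed from either the time history or the spectrum.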
Note: the bar notation, $\overline{(\,\cdot\,)}$, denotes a time average taken over many oscillations of the signal.

Consider now the case shown in Figure 1, where x(t) = s(t) + n(t). Here s(t) is the signal we wish to measure, n(t) is the noise signal, and x(t) is the signal we actually measure. If s(t) and n(t) are independent of one another, the mean square voltage of x(t) is the sum of the mean square voltage of s(t) and the mean square voltage of n(t):

$$\overline{x^2(t)} = \underbrace{\overline{s^2(t)}}_{\text{desired signal power}} + \underbrace{\overline{n^2(t)}}_{\text{noise power}} \qquad (2)$$

Here we have assumed that the noise is added to the desired signal. This may not always be the case; however, in this chapter we will generally assume that the noise is additive.

Often noise, and indeed other signals, are described in terms of their root mean square (rms) voltage. This is the square root of the mean square value:

$$n_{rms} = \sqrt{\overline{n^2(t)}} \qquad (3)$$

When the noise is harmonic in nature, i.e., if $n(t) = A\sin(\omega t + \varphi)$, where $\varphi$ is an arbitrary phase, as would be the case if we were picking up line noise in the measurement circuit, then:

$$n_{rms} = \frac{A}{\sqrt{2}} \qquad (4)$$

(The proof is left as an exercise for the student.)

When we take measurements of random signals we can calculate the mean, variance, and standard deviation. How do these relate to the mean square and root mean square voltages? Recall that the variance of a signal is:

$$\sigma_n^2 = E[(n - \mu_n)^2] = E[n^2 - 2n\mu_n + \mu_n^2] = E[n^2] - 2\mu_n E[n] + \mu_n^2 \qquad (5)$$

where $E[\cdot]$ denotes the expected (average) value. Recognizing that $E[n^2] = \overline{n^2(t)}$ and $\mu_n = E[n]$, we see that:

$$\sigma_n^2 = \overline{n^2(t)} - \mu_n^2 \qquad (6)$$

Hence, if the signal has a mean value of zero $(\mu_n = 0)$, the variance is equal to the mean square value. In general:

$$\overline{n^2(t)} = \sigma_n^2 + \mu_n^2 \qquad (7)$$
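Equations (2) and (4) can both be illustrated with a short numerical sketch (not from the original notes). The sinusoid parameters and the noise level are arbitrary assumptions; a seeded random generator stands in for the independent noise n(t).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: sinusoidal signal s(t) plus independent,
# zero-mean Gaussian noise n(t); x(t) = s(t) + n(t) is what we measure.
fs, T, A = 1000.0, 10.0, 2.0          # sampling rate, duration, amplitude (assumed)
t = np.arange(0, T, 1.0 / fs)
s = A * np.sin(2 * np.pi * 50.0 * t)
n = rng.normal(0.0, 0.5, size=t.size)  # sigma = 0.5 V (assumed)
x = s + n

# Eq. (4): rms of a harmonic signal is A / sqrt(2).
rms_s = np.sqrt(np.mean(s ** 2))
print(rms_s)                           # approximately A / sqrt(2) ~ 1.414

# Eq. (2): for independent s and n, the mean square values add.
# The small discrepancy is the finite-record cross term 2*mean(s*n).
print(np.mean(x ** 2), np.mean(s ** 2) + np.mean(n ** 2))
```

Because the record is finite, the two printed power values agree only approximately; the residual cross term shrinks as the averaging time grows, which is exactly why Eq. (2) is stated in terms of long-time averages.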
The rms value of the signal is:

$$n_{rms} = \sqrt{\overline{n^2(t)}} = \sqrt{\sigma_n^2 + \mu_n^2} \qquad (8)$$
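The relation between mean square value, variance, and mean in Eqs. (7) and (8) can be verified directly on sampled data. This is a minimal sketch (not from the original notes); the offset and spread of the noise are arbitrary assumptions, and `np.var` with its default `ddof=0` matches the population variance used in Eq. (5).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: noise with a nonzero mean (a DC offset).
n = rng.normal(2.0, 0.5, size=100_000)  # mu = 2.0 V, sigma = 0.5 V (assumed)

mean_sq = np.mean(n ** 2)               # mean square value
var = np.var(n)                         # sample estimate of sigma_n^2
mu = np.mean(n)                         # sample estimate of mu_n

# Eq. (7): mean square = variance + mean^2 (an exact identity for ddof=0).
print(mean_sq, var + mu ** 2)

# Eq. (8): rms is the square root of the mean square value,
# approximately sqrt(0.5**2 + 2.0**2) here.
print(np.sqrt(mean_sq))
```

Note that Eq. (7) holds exactly for the sample statistics (it is an algebraic identity), while the sample values themselves only approximate the true $\sigma_n$ and $\mu_n$ for a finite record.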
This note was uploaded on 12/26/2011 for the course ME 365 taught by Professor Merkle during the Fall '07 term at Purdue University-West Lafayette.
