3 Moments of Random Variables

3.1 Introduction
3.2 Expectation
3.3 Expectation of Nonnegative Random Variables
3.4 Moments of Random Variables and the Variance
3.5 Conditional Expectations
3.6 The Chebyshev Inequality
3.7 The Markov Inequality
3.8 Chapter Summary
3.9 Problems

3.1 Introduction

Given the set of data X_1, X_2, ..., X_N, we know that the arithmetic average (or arithmetic mean) is given by

    X̄ = (X_1 + X_2 + ... + X_N) / N

When the above numbers occur with different frequencies, we usually assign weights w_1, w_2, ..., w_N to them, and the so-called weighted arithmetic mean becomes

    X̄ = (w_1 X_1 + w_2 X_2 + ... + w_N X_N) / (w_1 + w_2 + ... + w_N)

The average is a value that is representative or typical of a set of data and tends to lie centrally within the data when they are arranged according to magnitude. Thus, it is usually called a measure of central tendency.

The term expectation is used for the process of averaging when a random variable is involved. It is a number used to locate the "center" of the distribution of a random variable. In many situations we are primarily interested in the central tendency of a random variable, and as will be seen later in this chapter, the expectation (or mean, or average) of a random variable can be likened to the weighted arithmetic mean defined above.

A second parameter of interest is the variance, which measures the degree to which a random variable is spread out about its mean; unlike the expectation, it is a measure of dispersion rather than central tendency. This chapter deals with how the expectation and variance of a random variable can be computed.
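The two averages above can be sketched directly in code. This is a minimal illustration (the function name `weighted_mean` is ours, not the textbook's); when every weight is equal, the weighted mean reduces to the ordinary arithmetic mean, as the second call shows.

```python
def weighted_mean(values, weights):
    """Weighted arithmetic mean: sum(w_i * x_i) / sum(w_i)."""
    if len(values) != len(weights):
        raise ValueError("values and weights must have the same length")
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# Weights (1, 1, 2): (1*1 + 1*2 + 2*3) / 4 = 9/4
print(weighted_mean([1, 2, 3], [1, 1, 2]))        # 2.25

# Equal weights reduce to the plain arithmetic mean: (1+2+3+4)/4
print(weighted_mean([1, 2, 3, 4], [1, 1, 1, 1]))  # 2.5
```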
3.2 Expectation

If X is a random variable, then the expectation (or expected value, or mean) of X, denoted by E[X], is defined by

    E[X] = Σ_i x_i p_X(x_i)              X discrete
    E[X] = ∫_{-∞}^{∞} x f_X(x) dx        X continuous

Thus, the expected value of X is a weighted average of the possible values that X can take, where each value is weighted by the probability that X takes that value. The expected value of X is sometimes denoted by X̄.

Example 3.1  Find the expected value of the random variable X whose PDF is defined by

    f_X(x) = 0            x < a
             1/(b - a)    a ≤ x ≤ b
             0            x > b

Solution  The PDF of X is as shown in Figure 3.1.

    E[X] = ∫_{-∞}^{∞} x f_X(x) dx
         = ∫_a^b x/(b - a) dx
         = [x² / (2(b - a))] from a to b
         = (b² - a²) / (2(b - a))
         = (b - a)(b + a) / (2(b - a))
         = (a + b)/2    ▲

[Figure 3.1: PDF of X in Example 3.1]

Example 3.2  Find the expected value of the discrete random variable X with the following PMF:

    p_X(x) = 1/3    x = 0
             2/3    x = 2

Solution

    E[X] = 0(1/3) + 2(2/3) = 4/3    ▲

Example 3.3  Find the expected value of the random variable K with the following PMF:

    p_K(k) = λ^k e^{-λ} / k!, ...
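Both branches of the definition can be checked numerically. The sketch below (helper names `expectation` and `uniform_mean_numeric` are ours, not the book's) computes the discrete sum for the PMF of Example 3.2, and approximates the integral of Example 3.1 with a midpoint Riemann sum to confirm that the mean of a uniform density on [a, b] is (a + b)/2.

```python
from fractions import Fraction

def expectation(pmf):
    """Discrete expectation E[X] = sum_i x_i * p_X(x_i).

    pmf: dict mapping each value x_i to its probability p_X(x_i).
    """
    return sum(x * p for x, p in pmf.items())

# Example 3.2: p_X(0) = 1/3, p_X(2) = 2/3  ->  E[X] = 4/3
print(expectation({0: Fraction(1, 3), 2: Fraction(2, 3)}))  # 4/3

def uniform_mean_numeric(a, b, n=100_000):
    """Midpoint Riemann sum for E[X] = integral of x * 1/(b - a) over [a, b]."""
    dx = (b - a) / n
    return sum((a + (i + 0.5) * dx) / (b - a) * dx for i in range(n))

# Example 3.1 with a = 2, b = 5: should be close to (2 + 5)/2 = 3.5
print(uniform_mean_numeric(2.0, 5.0))  # ≈ 3.5
```

Using `Fraction` for the discrete case keeps the probabilities exact, so the result prints as 4/3 rather than a rounded float.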
Fall '07, Carlton