LECTURE 2: STATISTICS REVIEW

RANDOM VARIABLES

X is a random variable if it takes different values according to some probability distribution.

Types of Random Variables:
• Discrete Random Variable
  o Takes on a finite or countably infinite number of values
  o Example: outcome of a coin toss
• Continuous Random Variable
  o Takes on any value in a real interval
  o Each specific value has zero probability
  o Example: height of an individual at UCLA

A probability distribution is best described by the corresponding probability density function and cumulative distribution function.

PROBABILITY DENSITY FUNCTION (PDF)

A probability density function summarizes the information concerning the possible outcomes of X and the corresponding probabilities.

• The PDF of a discrete random variable X that takes on the values x_1, x_2, ..., x_p is defined as

  f(x_j) = Pr(X = x_j) = p_j   for j = 1, ..., p
  f(x) = 0                     for x not in {x_1, ..., x_p}

  Example: Coin Toss
    f(Heads) = 0.5
    f(Tails) = 0.5

• The PDF of a continuous random variable is similar, except we now measure the probability that the random variable falls in a certain range or interval rather than at a single point. It is defined as the derivative of the cumulative distribution function (CDF), f(x) = dF(x)/dx, so that Pr(a ≤ X ≤ b) = ∫_a^b f(x) dx.
  [Figure: graph of a continuous PDF, omitted in this version]

FEATURES OF PROBABILITY DISTRIBUTIONS

MOMENTS

Moments are summary statistics of the distribution.
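As a concrete illustration (a sketch not in the original notes), the discrete PDF of the coin-toss example can be written in Python, coding Tails as 0 and Heads as 1:

```python
# Hypothetical sketch: the PDF (probability mass function) of a fair coin toss.
# Coding is an assumption: Tails -> 0, Heads -> 1.
pmf = {0: 0.5, 1: 0.5}  # f(x_j) = Pr(X = x_j) = p_j

def f(x):
    """Discrete PDF: p_j if x is a possible value x_j, 0 otherwise."""
    return pmf.get(x, 0.0)

# A valid PDF has nonnegative probabilities that sum to one.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-12

print(f(1), f(0), f(7))  # 0.5 0.5 0.0
```

Note that f returns 0 for any value outside {x_1, ..., x_p}, matching the second line of the definition above.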
• Expected Value (also Expectation or Mean)

  μ_x ≝ E(X) = Σ_{j=1}^{p} x_j f(x_j)   for X discrete
  μ_x ≝ E(X) = ∫ x f(x) dx              for X continuous

  o Measure of central tendency
  o Weighted average of the possible values of X, with the probabilities f(x) serving as weights
  o The population mean value

• Variance

  σ_x² ≝ Var(X) = E[(X − μ_x)²]
    = Σ_{j=1}^{p} (x_j − μ_x)² f(x_j)   for X discrete
    = ∫ (x − μ_x)² f(x) dx              for X continuous

  o Measure of dispersion: how the values of X are spread around its mean
  o Never negative (zero only when X is constant)

The square root of the variance is called the standard deviation: σ_x ≝ √Var(X).

PROPERTIES OF EXPECTATION AND VARIANCE

Properties of Expectation
1. E[a] = a
2. E[aX] = aE[X]
3. E[aX + b] = aE[X] + b
4. E[X + Y] = E[X] + E[Y]
5. E[(aX)²] = a²E[X²]
6. If X and Y are independent, then E[XY] = E[X]E[Y]

Properties of Variance
1. Var[a] = 0
2. Var[aX + b] = a²Var[X]
3. Var[X + Y] = Var[X] + Var[Y] + 2Cov[X, Y]
4. Var[X − Y] = Var[X] + Var[Y] − 2Cov[X, Y]
5. Var[aX + bY] = a²Var[X] + b²Var[Y] + 2abCov[X, Y]

RELATIONSHIP BETWEEN RANDOM VARIABLES

In economics we are usually interested in phenomena that involve more than one random variable. Thus we have to study joint (multivariate) distributions.

JOINT PROBABILITY DENSITY FUNCTION (PDF)

• The joint probability density function of two discrete random variables X and Y is the function f(x, y) such that, for any point (x, y) in the X–Y plane,

  f(x, y) = Pr(X = x, Y = y)

• Properties of the Joint Probability Density Function:
  o f(x, y) = P(X = x and Y = y) ≥ 0 for all pairs (x, y)
  o Σ_x Σ_y f(x, y) = 1

  For the continuous case: ∬ f(x, y) dx dy = 1.
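The expectation and variance formulas and their properties can be checked numerically. The sketch below (an illustration not in the original notes, using a made-up three-point distribution) computes E(X) and Var(X) from the discrete definitions and verifies E[aX + b] = aE[X] + b and Var[aX + b] = a²Var[X]:

```python
import math

# Hypothetical discrete distribution: values x_j with probabilities f(x_j).
xs = [1.0, 2.0, 5.0]
ps = [0.2, 0.5, 0.3]

def E(vals):
    """E(X) = sum over j of x_j * f(x_j)."""
    return sum(v * p for v, p in zip(vals, ps))

def Var(vals):
    """Var(X) = E[(X - mu_x)^2], computed from the same probabilities."""
    mu = E(vals)
    return sum((v - mu) ** 2 * p for v, p in zip(vals, ps))

a, b = 3.0, 4.0
aXb = [a * x + b for x in xs]  # the transformed variable aX + b

# Property 3 of expectation: E[aX + b] = a E[X] + b
assert math.isclose(E(aXb), a * E(xs) + b)
# Property 2 of variance: Var[aX + b] = a^2 Var[X] (the shift b drops out)
assert math.isclose(Var(aXb), a ** 2 * Var(xs))
```

The same pattern extends to the two-variable properties (e.g. Var[X + Y]) once a joint distribution supplies Cov[X, Y].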
Winter '07, Sandra Black
