# notes09 - STATISTICS 330 Mathematical Statistics...

STATISTICS 330 Mathematical Statistics
Supplementary Lecture Notes

Cyntha A. Struthers
Dept. of Statistics and Actuarial Science
University of Waterloo
Waterloo, Ontario, Canada
Fall 2009

## Contents

1 PREVIEW
&nbsp;&nbsp;1.1 Example
&nbsp;&nbsp;1.2 Example
2 RANDOM VARIABLES
&nbsp;&nbsp;2.1 Introduction
&nbsp;&nbsp;2.2 Discrete Random Variables
&nbsp;&nbsp;2.3 Continuous Random Variables
&nbsp;&nbsp;2.4 Functions of a Random Variable
&nbsp;&nbsp;2.5 Expectation
&nbsp;&nbsp;2.6 Moment Generating Functions
&nbsp;&nbsp;2.7 Location and Scale Parameters
&nbsp;&nbsp;2.8 Calculus Review
3 Joint Distributions
&nbsp;&nbsp;3.1 Joint and Marginal CDF's
&nbsp;&nbsp;3.2 Joint Discrete Random Variables
&nbsp;&nbsp;3.3 Joint Continuous Random Variables
&nbsp;&nbsp;3.4 Independent Random Variables
&nbsp;&nbsp;3.5 Conditional Distributions
&nbsp;&nbsp;3.6 Joint Expectations
&nbsp;&nbsp;3.7 Conditional Expectation
&nbsp;&nbsp;3.8 Joint Moment Generating Functions
&nbsp;&nbsp;3.9 Bivariate Normal Distribution
4 Functions of Random Variables
&nbsp;&nbsp;4.1 Introduction
&nbsp;&nbsp;4.2 One-to-One Bivariate Transformations
&nbsp;&nbsp;4.3 Moment Generating Function Method
5 Limiting or Asymptotic Distributions
&nbsp;&nbsp;5.1 Convergence in Distribution
&nbsp;&nbsp;5.2 Convergence in Probability
&nbsp;&nbsp;5.3 Limit Theorems
6 Estimation
&nbsp;&nbsp;6.1 Introduction
&nbsp;&nbsp;6.2 Method of Maximum Likelihood - One Parameter Case
&nbsp;&nbsp;6.3 Method of Maximum Likelihood - Multiparameter Case
&nbsp;&nbsp;6.4 Asymptotic Properties of M.L. Estimators - One Parameter Case
&nbsp;&nbsp;6.5 Interval Estimators
&nbsp;&nbsp;6.6 Relative Likelihood
&nbsp;&nbsp;6.7 Asymptotic Properties of M.L. Estimators - Multiparameter Case
&nbsp;&nbsp;6.8 Confidence Regions
7 Hypothesis Tests
&nbsp;&nbsp;7.1 Introduction
&nbsp;&nbsp;7.2 Likelihood Ratio Tests for Simple Hypotheses
&nbsp;&nbsp;7.3 Likelihood Ratio Tests for Composite Hypotheses

## Chapter 1 PREVIEW

The following examples will illustrate the ideas and concepts we will study in STAT 330.

### 1.1 Example

The following table gives the number of fumbles in a game made by 110 Division A football teams during one weekend:

| No. of fumbles: $x$   | 0 | 1  | 2  | 3  | 4  | 5  | 6 | 7 | 8 | Total |
|-----------------------|---|----|----|----|----|----|---|---|---|-------|
| Obs. frequency: $f_x$ | 8 | 24 | 27 | 20 | 17 | 10 | 3 | 1 | 0 | 110   |

It is believed that a Poisson model will fit these data well. Why might this be a reasonable assumption? (PROBABILITY MODELS)

If we let the random variable $X$ = number of fumbles in a game and assume that the Poisson model is reasonable, then the probability function (p.f.) of $X$ is given by

$$P(X = x) = \frac{\mu^x e^{-\mu}}{x!}, \qquad x = 0, 1, \ldots$$

where $\mu$ is a parameter of the model which represents the mean number of fumbles in a game. (RANDOM VARIABLES, PROBABILITY FUNCTIONS, EXPECTATION, MODEL PARAMETERS)

Since $\mu$ is unknown we might estimate it using the sample mean

$$\bar{x} = \frac{8(0) + 24(1) + \cdots + 1(7)}{110} = \frac{281}{110} \approx 2.55.$$

(POINT ESTIMATION)

The estimate $\hat{\mu} = \bar{x}$ is the maximum likelihood (M.L.) estimate of $\mu$. It is the value of $\mu$ which maximizes the likelihood function. (MAXIMUM LIKELIHOOD ESTIMATION) The likelihood function is the probability of the observed data as a function of the unknown parameter(s) in the model. The M.L. estimate is thus the value of $\mu$ which maximizes the probability of the observed data.
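As a quick illustration (a sketch, not part of the original notes), the sample-mean calculation above can be carried out directly from the frequency table; the variable names are my own:

```python
# Sketch: computing the sample mean from the fumble frequency table.
# Under the Poisson model this is also the M.L. estimate of mu.
x_values = range(9)                              # number of fumbles: 0, 1, ..., 8
frequencies = [8, 24, 27, 20, 17, 10, 3, 1, 0]   # observed counts (110 games total)

n = sum(frequencies)                                               # 110 games
total_fumbles = sum(x * f for x, f in zip(x_values, frequencies))  # 281 fumbles

mu_hat = total_fumbles / n    # sample mean = M.L. estimate of mu
print(f"mu_hat = {mu_hat:.4f}")   # mu_hat = 2.5545
```

This reproduces the hand calculation: 281 total fumbles over 110 games gives $\bar{x} \approx 2.55$.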
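To make "maximizes the probability of the observed data" concrete, the following sketch (my own illustration, not the notes' method) evaluates the Poisson log-likelihood of the frequency table over a grid of candidate $\mu$ values and confirms that the maximizer sits at the sample mean:

```python
import math

# Sketch: grid search over the Poisson log-likelihood of the fumble data.
# l(mu) = sum over x of f_x * log P(X = x; mu), with P the Poisson p.f.
frequencies = [8, 24, 27, 20, 17, 10, 3, 1, 0]   # observed counts for x = 0..8

def log_likelihood(mu):
    """Log-likelihood of the observed frequencies under a Poisson(mu) model."""
    return sum(f * (x * math.log(mu) - mu - math.lgamma(x + 1))
               for x, f in enumerate(frequencies))

# Evaluate on a fine grid and take the maximizing value of mu.
grid = [m / 1000 for m in range(1000, 4001)]    # candidate mu in [1.0, 4.0]
mu_ml = max(grid, key=log_likelihood)
print(f"grid maximizer = {mu_ml:.3f}")   # close to 281/110, i.e. about 2.55
```

The grid maximizer agrees (to grid resolution) with the closed-form answer $\hat{\mu} = \bar{x} = 281/110$, which Chapter 6 derives analytically.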
