AMS 572 Lecture Notes #11
Nov. 1st, 2010

Ch. 9. Categorical Data Analysis

Quantitative R.V.: numbers associated with the measurements are meaningful.
- Continuous R.V.: height, weight, IQ, age, etc.
- Discrete R.V.: time (in days), # of successes, etc.

Qualitative R.V.: numbers associated with the measurements are not meaningful.

A natural categorical variable: eye color

Eye color   Code   Percentage   Count
Brown        1        60%        1200
Blue         2        10%         200
Green        3
Gray         4
Hazel        5
Others       6
Total                100%        2000

Sometimes we categorize quantitative data.
E.g. Age group: Children: < 17 years; Young adults: [17, 35]; Middle-aged adults: [36, 55]; Elderly adults: > 55
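As an illustration only (not part of the original notes), here is a minimal pandas sketch of the two ideas above: attaching numeric codes to a qualitative variable (eye color) and binning a quantitative variable (age) into the age groups just defined. The data and variable names are invented for the example.

```python
# Illustrative sketch only; assumes pandas. Data and names are made up.
import pandas as pd

# Qualitative R.V.: eye color. The attached numeric codes are labels only.
eye_color = pd.Series(["Brown", "Blue", "Brown", "Green", "Hazel", "Brown"],
                      dtype="category")
print(eye_color.cat.codes)                       # arbitrary integer codes
print(eye_color.value_counts(normalize=True))    # sample percentages per category

# Categorizing quantitative data: age -> age group, using the cutpoints above.
ages = pd.Series([12, 25, 40, 70, 33, 58])
age_group = pd.cut(
    ages,
    bins=[0, 16, 35, 55, 120],   # <17, [17, 35], [36, 55], >55 (integer ages)
    labels=["Children", "Young adults", "Middle-aged adults", "Elderly adults"],
)
print(age_group.value_counts())
```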
1. Inference on One Population Proportion

* A special categorical R.V. -- binary random variables

E.g. Jerry has nothing to do. He decided to toss a coin 1000 times to see whether it is a fair coin. Of the 1000 tosses, he got 510 heads and 490 tails. Is it a fair coin?

H_0: p = 1/2 vs. H_a: p \neq 1/2

Here the outcome variables are X_i = 1 (heads) or 0 (tails), i = 1, 2, ..., 1000.
The total number of heads is X = X_1 + X_2 + ... + X_{1000} (= 510 in this example).

* Binomial Experiment and the Binomial Distribution

Def: A binomial experiment consists of n trials such that
- each trial results in 1 of 2 possible outcomes, say "S" and "F";
- the probability of obtaining an "S" remains the same from trial to trial, say p (so the probability of obtaining an "F" is 1 - p);
- the trials are independent (previous outcomes do not influence future outcomes).

Let X = # of "S". Then X ~ Bin(n, p), where p is the population proportion.

Sample proportion: \hat{p} = X / n

P(X = x) = \binom{n}{x} p^x (1-p)^{n-x}, x = 0, 1, 2, ..., n

m.g.f. of X:
M_X(t) = E(e^{tX}) = \sum_{x=0}^{n} e^{tx} P(X = x) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x (1-p)^{n-x}
       = \sum_{x=0}^{n} \binom{n}{x} (e^t p)^x (1-p)^{n-x} = (e^t p + 1 - p)^n

[Note: \sum_{x=0}^{n} \binom{n}{x} a^x b^{n-x} = (a+b)^n, Newton's binomial theorem.]
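As a hedged aside (not part of the original notes), the coin example above can be checked numerically. The sketch below assumes SciPy >= 1.7 (for scipy.stats.binomtest) and simply plugs in Jerry's counts; it is one possible implementation, not the test procedure derived in the notes.

```python
# Illustrative sketch only (not the notes' derivation); assumes SciPy >= 1.7.
from scipy.stats import binom, binomtest

n, x, p0 = 1000, 510, 0.5          # Jerry's experiment: 510 heads in 1000 tosses

# Exact binomial test of H0: p = 1/2 vs. Ha: p != 1/2
result = binomtest(k=x, n=n, p=p0, alternative="two-sided")
print(result.pvalue)               # well above 0.05 -- no evidence against a fair coin

# The pmf P(X = x) = C(n, x) p^x (1-p)^(n-x) is exactly what binom.pmf evaluates:
print(binom.pmf(x, n, p0))

# Sample proportion p_hat = X / n
print(x / n)                       # 0.51
```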
E.g. Let X ~ Bin(n_1, p) and Y ~ Bin(n_2, p), with X and Y independent. What is the distribution of X + Y?

Solution:
M_{X+Y}(t) = M_X(t) M_Y(t) = [e^t p + (1-p)]^{n_1 + n_2}

Hence X + Y ~ Bin(n_1 + n_2, p).

When n is large (n \geq 30), by the CLT,

\hat{p} = X/n = \frac{1}{n}\sum_{i=1}^{n} X_i \approx N\left(p, \frac{p(1-p)}{n}\right) as n \to \infty.

(Note: here the random sample is X_1, X_2, ..., X_n; they are i.i.d. Bernoulli(p) R.V.'s.)

Bernoulli Distribution

Toss a coin repeatedly and record the results, e.g., Head (H), H, Tail (T), T, T, H, ...
Let X_i = 1 if the i-th toss is a head and X_i = 0 otherwise; then the X_i are i.i.d. Bernoulli(p) R.V.'s with P(X_i = 1) = p.
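Again as an aside (not in the notes), a short NumPy/SciPy simulation can check both results: that the sum of independent Bin(n_1, p) and Bin(n_2, p) variables behaves like Bin(n_1 + n_2, p), and that for large n the sample proportion \hat{p} is approximately N(p, p(1-p)/n). All parameter values below are chosen only for illustration.

```python
# Simulation sketch only; parameter values are made up for illustration.
import numpy as np
from scipy.stats import binom, norm

rng = np.random.default_rng(0)
reps = 100_000

# 1) X ~ Bin(n1, p), Y ~ Bin(n2, p) independent  =>  X + Y ~ Bin(n1 + n2, p)
n1, n2, p = 30, 50, 0.3
s = rng.binomial(n1, p, reps) + rng.binomial(n2, p, reps)
print(s.mean(), (n1 + n2) * p)                # empirical vs. theoretical mean
print(s.var(), (n1 + n2) * p * (1 - p))       # empirical vs. theoretical variance

# 2) CLT: for large n, p_hat = X/n is approximately N(p, p(1-p)/n)
n = 1000
p_hat = rng.binomial(n, p, reps) / n
print(p_hat.std(), np.sqrt(p * (1 - p) / n))  # both close to 0.0145

# Exact vs. normal-approximation tail probability for P(p_hat >= 0.32):
print(binom.sf(319, n, p))                                   # exact P(X >= 320)
print(norm.sf(0.32, loc=p, scale=np.sqrt(p * (1 - p) / n)))  # normal approximation
```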