Week07Notes - Stat 311 Introduction to Mathematical Statistics

Stat 311 Introduction to Mathematical Statistics
Zhengjun Zhang
Department of Statistics, University of Wisconsin, Madison, WI 53706, USA
October 13-15, 2009

Example
Suppose we roll two fair six-sided dice, one red and one blue. Let $A$ be the event that the two dice show the same value, $B$ the event that the sum of the two dice equals 12, $C$ the event that the red die shows 4, and $D$ the event that the blue die shows 4. Answer the following questions (a sketch that checks them by enumeration appears at the end of these notes):
(a) Are $A$ and $B$ independent?
(b) Are $A$ and $C$ independent?
(c) Are $A$ and $D$ independent?
(d) Are $C$ and $D$ independent?
(e) Are $A$, $C$, and $D$ all independent?

The joint discrete distribution function
Let $X_1, X_2, \dots, X_n$ be $n$ discrete random variables all defined on the same probability space. The joint discrete distribution function of $X_1, X_2, \dots, X_n$, denoted by $p_{X_1,X_2,\dots,X_n}(x_1, x_2, \dots, x_n)$, is the following function:
$$p_{X_1,X_2,\dots,X_n} : \mathbb{R}^n \to \mathbb{R}$$
$$p_{X_1,X_2,\dots,X_n}(x_1, x_2, \dots, x_n) = P[X_1 = x_1, X_2 = x_2, \dots, X_n = x_n]$$

Let $X_1, X_2, \dots, X_n$ be random variables associated with an experiment. Suppose that the sample space (i.e., the set of possible outcomes) of $X_i$ is the set $R_i$. Then the joint random variable $X = (X_1, X_2, \dots, X_n)$ is defined to be the random variable whose outcomes consist of ordered $n$-tuples of outcomes, with the $i$th coordinate lying in the set $R_i$. The sample space $\Omega$ of $X$ is the Cartesian product of the $R_i$'s:
$$\Omega = R_1 \times R_2 \times \cdots \times R_n.$$
The joint distribution function of $X$ is the function which gives the probability of each of the outcomes of $X$.

Mutually independent random variables
The random variables $X_1, X_2, \dots, X_n$ are mutually independent if
$$P(X_1 = r_1, X_2 = r_2, \dots, X_n = r_n) = P(X_1 = r_1)\,P(X_2 = r_2) \cdots P(X_n = r_n)$$
for any choice of $r_1, r_2, \dots, r_n$. Thus, if $X_1, X_2, \dots, X_n$ are mutually independent, then the joint distribution function of the random variable $X = (X_1, X_2, \dots, X_n)$ is just the product of the individual distribution functions. When two random variables are mutually independent, we shall say more briefly that they are independent.

Bayes' theorem
$$P(A \mid B) = \frac{P(A)\,P(B \mid A)}{P(B)} = \frac{P(A \cap B)}{P(B)}.$$

Remark
Bayes' theorem is also called the inverse probability law. It is named after Rev. Thomas Bayes (1702-1761), an 18th-century Presbyterian minister who first formulated it.
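To check the dice example above, here is a minimal Python sketch (not part of the original notes) that enumerates all 36 equally likely outcomes and tests each question by comparing $P(E \cap F)$ against $P(E)\,P(F)$:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))  # (red, blue): 36 equally likely pairs

def prob(event):
    # Probability of an event (a predicate on outcomes) under the uniform measure.
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] == o[1]       # both dice show the same value
B = lambda o: o[0] + o[1] == 12  # the sum equals 12
C = lambda o: o[0] == 4          # the red die shows 4
D = lambda o: o[1] == 4          # the blue die shows 4

def indep(e, f):
    # Two events are independent iff P(E and F) = P(E) * P(F).
    return prob(lambda o: e(o) and f(o)) == prob(e) * prob(f)

print(indep(A, B))  # (a) False: P(A and B) = 1/36, but P(A)P(B) = (1/6)(1/36)
print(indep(A, C))  # (b) True:  P(A and C) = 1/36 = (1/6)(1/6)
print(indep(A, D))  # (c) True, by the same computation with the blue die
print(indep(C, D))  # (d) True:  P(C and D) = 1/36 = (1/6)(1/6)
# (e) False: A, C, D are pairwise independent, yet
# P(A and C and D) = 1/36 while P(A)P(C)P(D) = 1/216.
print(prob(lambda o: A(o) and C(o) and D(o)) == prob(A) * prob(C) * prob(D))
```

Part (e) is the instructive case: pairwise independence of $A$, $C$, and $D$ does not imply that all three are mutually independent.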
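To make the joint discrete distribution function concrete, the sketch below builds the joint pmf of $X_1$ = value of the red die and $X_2$ = sum of the two dice; this particular choice of random variables is my own illustration, not from the notes.

```python
from itertools import product
from fractions import Fraction

# Joint pmf p_{X1,X2}(x1, x2) = P[X1 = x1, X2 = x2] for
# X1 = red die value and X2 = sum of the two dice.
joint = {}
for red, blue in product(range(1, 7), repeat=2):
    key = (red, red + blue)  # one outcome of the joint random variable X = (X1, X2)
    joint[key] = joint.get(key, Fraction(0)) + Fraction(1, 36)

print(joint[(4, 10)])       # P[X1 = 4, X2 = 10] = P[red = 4, blue = 6] = 1/36
print(sum(joint.values()))  # the joint pmf sums to 1 over the sample space
```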
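The factorization property of mutually independent random variables can be checked the same way, again with the two dice as an assumed example: the red and blue die values are independent, so the joint pmf should equal the product of the marginals at every point.

```python
from itertools import product
from fractions import Fraction

die = {v: Fraction(1, 6) for v in range(1, 7)}  # marginal pmf of a single fair die

# Joint pmf of (red, blue): all 36 ordered pairs are equally likely.
joint = {(r, b): Fraction(1, 36) for r, b in product(range(1, 7), repeat=2)}

# Mutual independence: P(X1 = r1, X2 = r2) = P(X1 = r1) P(X2 = r2) for every choice.
print(all(joint[r, b] == die[r] * die[b] for r, b in joint))  # True
```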
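Finally, a numeric sketch of Bayes' theorem. The disease-testing numbers below are illustrative assumptions, not values from the notes.

```python
# Hypothetical numbers: 1% prevalence, 95% sensitivity, 5% false-positive rate.
p_A = 0.01             # P(A): prior probability of the disease
p_B_given_A = 0.95     # P(B | A): positive test given disease
p_B_given_notA = 0.05  # P(B | not A): positive test given no disease

# Law of total probability: P(B) = P(B|A) P(A) + P(B|not A) P(not A).
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' theorem: P(A|B) = P(A) P(B|A) / P(B).
p_A_given_B = p_A * p_B_given_A / p_B
print(round(p_A_given_B, 3))  # 0.161: most positive tests here are false positives
```

This is the "inverse probability" reading of the theorem: it converts the easily measured $P(B \mid A)$ into the quantity of interest, $P(A \mid B)$.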