LectureNotes10

Multivariate Probability Distributions: Part I
Cyr Emile M'LAN, Ph.D.
[email protected]

Introduction

♠ Text Reference: Introduction to Probability and Its Applications, Chapter 6.
♠ Reading Assignment: Sections 6.1-6.3, April 6

So far we have studied probability models for a single random variable. Many problems in probability and statistics lead to models involving several random variables simultaneously.

For example, let y_1, y_2, ..., y_n be n discrete observations representing our data, and let Y_1, Y_2, ..., Y_n be the sequence of discrete random variables thought to have generated these data.

At the heart of statistical inference is the computation of the probability of the event that is the intersection of the n events (Y_1 = y_1), (Y_2 = y_2), ..., (Y_n = y_n). We denote this event simply by (Y_1 = y_1, Y_2 = y_2, ..., Y_n = y_n), and it involves the joint behavior of many random variables.

Thus, in these lecture notes, we discuss some probability models for the joint behavior of several random variables. We revisit and reemphasize the notions of independence and of conditional probability, but in the context of two or more random variables.

Joke

Two men are having a good time in a bar. Outside, there is a terrible thunderstorm. Finally, one of the men thinks that it is time to leave. Since he has been drinking, he decides to walk home.

"But aren't you afraid of being struck by lightning?" his friend asks.

"Not at all. Statistics shows that, in this part of the country, one person per year gets struck by lightning, and that one person died in the hospital three weeks ago."

Jointly Distributed Discrete Random Variables

The probability mass function of a single random variable X specifies how much probability mass is placed on each possible value x of X.

The Joint Probability Mass Function of Two Discrete Random Variables

The joint probability mass function of two discrete random variables X and Y describes how much probability mass is placed on each possible pair of values (x, y).

Definition 6.1: Let X and Y be two discrete random variables associated with a random experiment, each assuming values in the sample spaces N and N, respectively. The joint (or bivariate) probability mass function, p(x, y), for each pair of numbers (x, y) with x ∈ N, y ∈ N, is defined by

    p(x, y) = P(X = x, Y = y).
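To make Definition 6.1 concrete, here is a minimal sketch in Python of a joint pmf stored as a lookup table. The probabilities, the support {0, 1, 2}, and the names joint_pmf and p are made up for illustration and are not taken from the lecture or the textbook.

```python
# A minimal sketch of Definition 6.1: the joint pmf p(x, y) = P(X = x, Y = y)
# of two discrete random variables, stored as a table keyed by the pair (x, y).
# All numbers below are hypothetical.

# Hypothetical joint pmf of X and Y, each taking values in {0, 1, 2}.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.05, (0, 2): 0.05,
    (1, 0): 0.10, (1, 1): 0.20, (1, 2): 0.10,
    (2, 0): 0.05, (2, 1): 0.15, (2, 2): 0.20,
}

# A valid joint pmf is nonnegative and its probabilities sum to 1.
assert all(prob >= 0 for prob in joint_pmf.values())
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12

def p(x, y):
    """Return p(x, y) = P(X = x, Y = y); pairs outside the table get mass 0."""
    return joint_pmf.get((x, y), 0.0)

# Probability mass placed on a single pair of values.
print("P(X = 1, Y = 2) =", p(1, 2))

# Probability of an event involving both variables, obtained by summing
# the joint pmf over the pairs (x, y) that belong to the event.
print("P(X + Y <= 2) =", sum(p(x, y) for (x, y) in joint_pmf if x + y <= 2))
```

The second print statement illustrates the main use of a joint pmf: the probability of any event defined through X and Y together is found by summing p(x, y) over the pairs in that event, just as probabilities for a single discrete variable are found by summing its pmf.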