
Lecture 26

Agenda
1. Mixed random variables and the importance of the distribution function
2. Joint probability distribution for discrete random variables

Mixed random variables and the importance of the distribution function

Any numerical quantity associated with a random experiment is called a random variable. Mathematically, any function $X : S \to \mathbb{R}$ is a random variable. We already defined random variables this way at the beginning of the course, and then we considered a special class of random variables, namely the discrete random variables. We know that for a discrete random variable $X$, when somebody asks me about its probability distribution, I have to provide two things:

1. $\mathrm{Range}(X) = \{x_1, x_2, x_3, \dots\}$, the set of values that $X$ takes,
2. and, for each $x_i \in \mathrm{Range}(X)$, the probability $P(X = x_i)$.

Then we learned how to calculate the expectation, variance, etc. of a discrete random variable and studied various examples such as the Binomial, Geometric, and Negative Binomial distributions.

After that we considered continuous random variables, for which $P(X = x) = 0$ for all $x \in \mathbb{R}$. We saw that if someone asks me about the distribution of a continuous random variable $X$, I have to give the density $f_X : \mathbb{R} \to [0, \infty)$ such that

1. for all $a < b$, $P(a < X < b) = \int_a^b f_X(x)\,dx$,
2. and $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$.

Then, similarly to discrete random variables, we learned how to calculate the mean, variance, etc. of a continuous random variable. We also saw various examples of continuous random variables such as the exponential, gamma, and beta distributions.
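To make the two ingredients of a discrete distribution concrete, here is a minimal Python sketch (not part of the lecture) for the Binomial case; the parameters n = 10 and p = 0.3 are hypothetical, chosen only for illustration. It lists the range, computes P(X = k) for each value, checks that the probabilities sum to 1, and evaluates E[X] and Var(X) directly from the definitions.

```python
import math

# Sketch (illustrative, not from the lecture): the two ingredients of a
# discrete distribution for X ~ Binomial(n, p) -- the range {0, 1, ..., n}
# and the probability P(X = k) for each k in the range.
n, p = 10, 0.3   # hypothetical parameters chosen for illustration

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

support = range(n + 1)                                # Range(X) = {0, 1, ..., n}
pmf = {k: binom_pmf(k, n, p) for k in support}

total = sum(pmf.values())                             # must be 1
mean = sum(k * pmf[k] for k in support)               # E[X] = sum k * P(X = k)
var = sum((k - mean)**2 * pmf[k] for k in support)    # Var(X) = E[(X - E[X])^2]

print(total)   # ~1.0
print(mean)    # equals n*p = 3.0
print(var)     # equals n*p*(1-p) = 2.1
```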
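Similarly, a minimal sketch for the continuous case (again not from the lecture, with a hypothetical rate lam = 2.0): it approximates the integral of the exponential density over an interval with a Riemann sum, compares it with the closed-form value $e^{-\lambda a} - e^{-\lambda b}$, and checks that the total integral of the density is approximately 1.

```python
import math

# Sketch (illustrative): the distribution of a continuous random variable is
# given by a density f_X.  Here X ~ Exponential(lam) with hypothetical lam = 2.0.
lam = 2.0

def f_X(x):
    """Density of X ~ Exponential(lam): lam * exp(-lam * x) for x >= 0, else 0."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def riemann(f, a, b, n=100_000):
    """Crude midpoint Riemann-sum approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Property 1: P(a < X < b) is the integral of f_X from a to b.
a, b = 0.5, 1.5
approx = riemann(f_X, a, b)
exact = math.exp(-lam * a) - math.exp(-lam * b)   # closed form for the exponential
print(approx, exact)                               # the two agree closely

# Property 2: the integral of f_X over the whole line is 1.
print(riemann(f_X, 0.0, 50.0))                     # ~1.0 (the tail beyond 50 is negligible)
```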
