AMS 311, Fall Semester, 2010
Chapter Six: Jointly Distributed Random Variables

6.3. Sums of Independent Random Variables

In a gambling problem, let X be the winnings from one play of a game of chance, with pdf f_X, and let Y be the winnings from one play of another game of chance, with pdf f_Y, where X and Y are independent. The random variable S = X + Y represents the total winnings from the two games. Then

F_S(s) = F_{X+Y}(s) = P\{X + Y \le s\} = \int_{-\infty}^{\infty} F_X(s - y) f_Y(y) \, dy.

Differentiating under the integral sign gives the density of the sum (the convolution of f_X and f_Y):

f_{X+Y}(s) = \int_{-\infty}^{\infty} f_X(s - y) f_Y(y) \, dy.

Example 3a (Sum of two independent uniform random variables). If X and Y are two independent random variables, both uniformly distributed on (0, 1), calculate the probability density of S = X + Y.

Proposition 3.1. If X and Y are independent gamma random variables with respective parameters (s, \lambda) and (t, \lambda), then S = X + Y is a gamma random variable with parameters (s + t, \lambda). If X_i, i = 1, ..., n, are independent gamma random variables with respective parameters...
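The convolution formula above can be checked numerically for Example 3a. The sketch below approximates f_S(s) = \int f_X(s - y) f_Y(y) dy with a simple Riemann sum (the grid size is an arbitrary choice, not from the notes) and compares it to the known triangular density of the sum of two Uniform(0, 1) variables:

```python
# Numerical check of Example 3a via the convolution formula
#   f_S(s) = integral over y of f_X(s - y) * f_Y(y) dy.

def f_uniform(x):
    """pdf of Uniform(0, 1)."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_sum(s, n=10_000):
    """Approximate f_S(s) by a midpoint Riemann sum over y in (0, 1),
    where f_Y(y) = 1 on that interval."""
    dy = 1.0 / n
    return sum(f_uniform(s - (k + 0.5) * dy) * dy for k in range(n))

# The exact answer worked out in Example 3a is the triangular density:
# f_S(s) = s on (0, 1) and f_S(s) = 2 - s on (1, 2).
for s in (0.25, 0.5, 1.0, 1.5, 1.75):
    exact = s if s <= 1.0 else 2.0 - s
    print(f"s = {s}: numeric {f_sum(s):.4f}, exact {exact:.4f}")
```

The integrand is just the indicator that s - y falls in (0, 1), so the sum measures the overlap of two unit intervals, which is exactly the triangular shape.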
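Proposition 3.1 can be illustrated by simulation: the sum of independent Gamma(s, λ) and Gamma(t, λ) draws should have the mean (s + t)/λ and variance (s + t)/λ² of a Gamma(s + t, λ) variable. The parameter values and sample size below are illustrative choices, not from the notes; note that Python's random.gammavariate takes a shape and a *scale* parameter, so scale = 1/λ.

```python
# Simulation sketch of Proposition 3.1: X ~ Gamma(s, lam), Y ~ Gamma(t, lam)
# independent  =>  X + Y ~ Gamma(s + t, lam).
import random

random.seed(0)
s, t, lam = 2.0, 3.0, 1.5   # illustrative parameter choices
n = 200_000

# gammavariate(shape, scale); our rate parameter lam gives scale = 1 / lam.
samples = [random.gammavariate(s, 1 / lam) + random.gammavariate(t, 1 / lam)
           for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

# Compare against the Gamma(s + t, lam) moments.
print(f"sample mean {mean:.3f} vs (s+t)/lam   = {(s + t) / lam:.3f}")
print(f"sample var  {var:.3f} vs (s+t)/lam^2 = {(s + t) / lam**2:.3f}")
```

Matching the first two moments does not by itself prove the sum is gamma, but it is a quick sanity check on the parameter bookkeeping in the proposition; the full proof goes through the convolution integral (or moment generating functions).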
This note was uploaded on 04/13/2010 for the course AMS 311 taught by Professor Tucker,a during the Spring '08 term at SUNY Stony Brook.