# MA2216/ST2131 Probability Notes 7: Sums of Independent Random Variables

Very often we are interested in the sum of independent random variables. When $X$ and $Y$ are independent, we would like to know the distribution of $X + Y$. In the following, we will deal with continuous as well as discrete distributions, and derive some very important properties.

## § 1. Continuous, Independent R.V.'s

Under the assumption of independence of $X$ and $Y$, we have

$$f_{X,Y}(x, y) = f_X(x)\, f_Y(y), \quad \text{for } x, y \in \mathbb{R}.$$

In order for us to derive the p.d.f. of $X + Y$, we need to find the distribution function of $X + Y$ first. For $w \in \mathbb{R}$,

$$
\begin{aligned}
F_{X+Y}(w) &= \mathbb{P}(X + Y \le w) \\
&= \iint_{\{(x,y):\, x + y \le w\}} f_{X,Y}(x, y)\, dx\, dy \\
&= \iint_{\{(x,y):\, x + y \le w\}} f_X(x)\, f_Y(y)\, dx\, dy \\
&= \int_{-\infty}^{\infty} \left[ \int_{-\infty}^{w-y} f_X(x)\, dx \right] f_Y(y)\, dy \\
&= \int_{-\infty}^{\infty} F_X(w - y)\, f_Y(y)\, dy.
\end{aligned}
$$

Similarly, one can show that

$$F_{X+Y}(w) = \int_{-\infty}^{\infty} \left[ \int_{-\infty}^{w-x} f_Y(y)\, dy \right] f_X(x)\, dx = \int_{-\infty}^{\infty} F_Y(w - x)\, f_X(x)\, dx.$$

**Summary.**

$$F_{X+Y}(w) = \int_{-\infty}^{\infty} F_X(w - y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} F_Y(w - x)\, f_X(x)\, dx. \tag{1.1}$$

It then follows that the p.d.f. $f_{X+Y}(w)$ of $W = X + Y$ is given by

$$f_{X+Y}(w) = \frac{d}{dw} F_{X+Y}(w) = \int_{-\infty}^{\infty} \frac{d}{dw} F_X(w - y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} f_X(w - y)\, f_Y(y)\, dy.$$

**Summary.**

$$f_{X+Y}(w) = \int_{-\infty}^{\infty} f_X(w - y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx. \tag{1.2}$$

**Remark.** Let $g$ and $h$ be two "nice" functions on $\mathbb{R}$ (such as integrable functions). The convolution of $g$ and $h$, denoted by $g * h$, is defined to be

$$(g * h)(z) \stackrel{\text{def.}}{=} \int_{-\infty}^{\infty} g(z - y)\, h(y)\, dy.$$

It can be shown that $g * h$ is also given by

$$(g * h)(z) = \int_{-\infty}^{\infty} g(x)\, h(z - x)\, dx.$$

In other words, the convolution $*$ is a *commutative operation*, i.e., $g * h = h * g$. Thus, the p.d.f. $f_{X+Y}$ is the convolution of the p.d.f.'s $f_X$ and $f_Y$.

We now turn to a few examples.

**1. Sum of 2 Independent Exponential Random Variables.** Suppose that $X$ and $Y$ are independent with a common exponential distribution with parameter $\lambda > 0$.
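The convolution formula (1.2) can be checked numerically. The sketch below (not part of the original notes) approximates $\int f_X(w-y) f_Y(y)\,dy$ by a Riemann sum for two independent Uniform$(0,1)$ variables, a case chosen because the exact answer is known in closed form: the triangular density, $f_{X+Y}(w) = w$ for $0 \le w \le 1$ and $2 - w$ for $1 \le w \le 2$.

```python
import numpy as np

def f_uniform(t):
    """p.d.f. of the Uniform(0, 1) distribution."""
    return np.where((t >= 0.0) & (t <= 1.0), 1.0, 0.0)

def convolve_pdf(f, g, w, grid):
    """Riemann-sum approximation of (f * g)(w) = integral of f(w - y) g(y) dy."""
    dy = grid[1] - grid[0]
    return float(np.sum(f(w - grid) * g(grid)) * dy)

# Fine grid covering the support of the integrand.
grid = np.linspace(-1.0, 3.0, 40001)

# Compare the numerical convolution against the exact triangular density.
for w, exact in [(0.5, 0.5), (1.0, 1.0), (1.5, 0.5)]:
    approx = convolve_pdf(f_uniform, f_uniform, w, grid)
    print(f"w = {w}: numeric {approx:.4f}, exact {exact:.4f}")
```

The same `convolve_pdf` helper works for any pair of densities, which makes it a convenient sanity check for the worked examples that follow.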
Find the p.d.f. of $X + Y$.

*Derivation.* Note that $X + Y$ takes values in $(0, \infty)$. For $w \le 0$, it then follows that $f_{X+Y}(w) = 0$. For $0 < w < \infty$,

$$f_{X+Y}(w) = \int_{-\infty}^{\infty} f_X(w - y)\, f_Y(y)\, dy = \int_0^w \lambda e^{-\lambda(w - y)} \cdot \lambda e^{-\lambda y}\, dy = \int_0^w \lambda^2 e^{-\lambda w}\, dy = \lambda^2 w\, e^{-\lambda w}.$$

In summary,

$$f_{X+Y}(w) = \begin{cases} \lambda^2 w\, e^{-\lambda w}, & \text{for } 0 < w < \infty, \\ 0, & \text{elsewhere.} \end{cases}$$

Obviously, the distribution of $X + Y$ is a gamma distribution of parameters $2$ and $\lambda$. Such a result can be generalized to a sum of independent gammas having the same second parameter, which is to be dealt with next.

**2. Sum of 2 Independent Gammas.** Assume that $X \sim \Gamma(\alpha, \lambda)$ and $Y \sim \Gamma(\beta, \lambda)$, and $X$ and $Y$ are mutually independent. Then, $X + Y \sim \Gamma(\alpha + \beta, \lambda)$. …
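The gamma additivity result can be illustrated by simulation. The sketch below (an illustration, not from the original notes) draws independent $\Gamma(\alpha, \lambda)$ and $\Gamma(\beta, \lambda)$ samples and checks that their sum has the mean $(\alpha + \beta)/\lambda$ and variance $(\alpha + \beta)/\lambda^2$ of a $\Gamma(\alpha + \beta, \lambda)$ distribution. Note that NumPy parameterizes the gamma by shape and *scale*, so the rate $\lambda$ enters as `scale = 1/lam`.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, lam, n = 2.0, 3.0, 1.5, 1_000_000

# Independent draws: X ~ Gamma(alpha, lam), Y ~ Gamma(beta, lam).
x = rng.gamma(shape=alpha, scale=1.0 / lam, size=n)
y = rng.gamma(shape=beta, scale=1.0 / lam, size=n)
s = x + y

# If X + Y ~ Gamma(alpha + beta, lam), then:
#   mean     = (alpha + beta) / lam
#   variance = (alpha + beta) / lam**2
print(f"sample mean     {s.mean():.3f}  vs  theory {(alpha + beta) / lam:.3f}")
print(f"sample variance {s.var():.3f}  vs  theory {(alpha + beta) / lam**2:.3f}")
```

Matching the first two moments does not by itself prove the distributional identity, but together with a histogram against the $\Gamma(\alpha + \beta, \lambda)$ density it gives a quick empirical confirmation of the theorem.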
## This note was uploaded on 03/19/2012 for the course SCIENCE ST2131 taught by Professor Forgot during the Fall '08 term at National University of Singapore.
