Sum of Independent Binomial RVs
• Let X and Y be independent random variables
  X ~ Bin(n1, p) and Y ~ Bin(n2, p)  ⇒  X + Y ~ Bin(n1 + n2, p)
• Intuition:
  X has n1 trials and Y has n2 trials
  o Each trial has the same "success" probability p
  Define Z to be n1 + n2 trials, each with success prob. p
  Z ~ Bin(n1 + n2, p), and also Z = X + Y
• More generally: Xi ~ Bin(ni, p) for 1 ≤ i ≤ N

    Σ_{i=1}^{N} Xi ~ Bin(Σ_{i=1}^{N} ni, p)

Sum of Independent Poisson RVs
• Let X and Y be independent random variables
  X ~ Poi(λ1) and Y ~ Poi(λ2)  ⇒  X + Y ~ Poi(λ1 + λ2)
• Proof: (just for reference)
  Rewrite (X + Y = n) as (X = k, Y = n − k) where 0 ≤ k ≤ n

    P(X + Y = n) = Σ_{k=0}^{n} P(X = k, Y = n − k)
                 = Σ_{k=0}^{n} P(X = k) P(Y = n − k)          (by independence)
                 = Σ_{k=0}^{n} [e^{−λ1} λ1^k / k!] · [e^{−λ2} λ2^{n−k} / (n − k)!]
                 = e^{−(λ1 + λ2)} Σ_{k=0}^{n} λ1^k λ2^{n−k} / (k! (n − k)!)
                 = (e^{−(λ1 + λ2)} / n!) Σ_{k=0}^{n} [n! / (k! (n − k)!)] λ1^k λ2^{n−k}

  Noting the Binomial theorem:

    Σ_{k=0}^{n} (n choose k) λ1^k λ2^{n−k} = (λ1 + λ2)^n

  we get:

    P(X + Y = n) = e^{−(λ1 + λ2)} (λ1 + λ2)^n / n!

  so X + Y ~ Poi(λ1 + λ2)

Reference: Sum of Independent RVs
• Let X and Y be independent Binomial RVs
  X ~ Bin(n1, p) and Y ~ Bin(n2, p)  ⇒  X + Y ~ Bin(n1 + n2, p)
  More generally, let Xi ~ Bin(ni, p) for 1 ≤ i ≤ N, then

    Σ_{i=1}^{N} Xi ~ Bin(Σ_{i=1}^{N} ni, p)

• Let X and Y be independent Poisson RVs
  X ~ Poi(λ1) and Y ~ Poi(λ2)  ⇒  X + Y ~ Poi(λ1 + λ2)
  More generally, let Xi ~ Poi(λi) for 1 ≤ i ≤ N, then

    Σ_{i=1}^{N} Xi ~ Poi(Σ_{i=1}^{N} λi)

Expected Values of Sums
• Let g(X, Y) = X + Y. Compute E[g(X, Y)] = E[X + Y]
  E[X + Y] = E[X] + E[Y]
• Generalized:

    E[Σ_{i=1}^{n} Xi] = Σ_{i=1}^{n} E[Xi]

  Holds regardless of dependency between the Xi's
  We'll prove this next time
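The closure properties above can be checked numerically: the PMF of X + Y is the convolution of the individual PMFs, so we can compare that convolution against the claimed Bin(n1 + n2, p) and Poi(λ1 + λ2) PMFs. The sketch below uses only the standard library; the helper names (`binom_pmf`, `poisson_pmf`, `convolve`) and the specific parameter values are illustrative choices, not from the slides.

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    # P(X = k) for X ~ Bin(n, p); comb(n, k) is 0 when k > n
    return comb(n, k) * p**k * (1 - p)**(n - k) if k <= n else 0.0

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poi(lam)
    return exp(-lam) * lam**k / factorial(k)

def convolve(pmf_x, pmf_y, n):
    # P(X + Y = n) = sum over k of P(X = k) * P(Y = n - k), 0 <= k <= n
    return sum(pmf_x(k) * pmf_y(n - k) for k in range(n + 1))

# Bin(3, 0.4) + Bin(5, 0.4) should match Bin(8, 0.4) at every value
n1, n2, p = 3, 5, 0.4
for n in range(n1 + n2 + 1):
    lhs = convolve(lambda k: binom_pmf(k, n1, p),
                   lambda k: binom_pmf(k, n2, p), n)
    assert abs(lhs - binom_pmf(n, n1 + n2, p)) < 1e-12

# Poi(1.5) + Poi(2.5) should match Poi(4.0) at every value checked
l1, l2 = 1.5, 2.5
for n in range(20):
    lhs = convolve(lambda k: poisson_pmf(k, l1),
                   lambda k: poisson_pmf(k, l2), n)
    assert abs(lhs - poisson_pmf(n, l1 + l2)) < 1e-12

print("both closure properties verified")
```

The Poisson check mirrors the proof on the slide: the inner sum over k is exactly the convolution that the Binomial theorem collapses to (λ1 + λ2)^n / n! times e^{−(λ1 + λ2)}.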