THE UNIVERSITY OF HONG KONG
DEPARTMENT OF STATISTICS AND ACTUARIAL SCIENCE
STAT1301 PROBABILITY AND STATISTICS I
EXAMPLE CLASS 5

Review

Some common discrete distributions

Table of Important Discrete Distributions
(throughout, p is the success probability and q = 1 - p)

Bernoulli
  x = 0, 1;  p(x) = p^x q^(1-x)
  E(X) = p,  Var(X) = pq
  One trial; 1 if success, 0 if failure.

Binomial
  x = 0, 1, ..., n;  p(x) = C(n, x) p^x q^(n-x)
  E(X) = np,  Var(X) = npq
  Number of successes in n Bernoulli trials.

Geometric
  x = 1, 2, 3, ...;  p(x) = q^(x-1) p
  E(X) = 1/p,  Var(X) = q/p^2
  Number of Bernoulli trials needed to get one success.

Pascal (Negative Binomial)
  x = r, r + 1, ...;  p(x) = C(x-1, r-1) p^r q^(x-r)
  E(X) = r/p,  Var(X) = rq/p^2
  Number of Bernoulli trials needed to get r successes.

Poisson
  x = 0, 1, 2, ...;  p(x) = λ^x e^(-λ) / x!
  E(X) = λ,  Var(X) = λ
  Approximates the binomial for n large when λ = np is not large.

Hypergeometric
  max(0, n - N + m) ≤ x ≤ min(n, m);  p(x) = C(m, x) C(N-m, n-x) / C(N, n)
  E(X) = nm/N,  Var(X) = [n(N-n)/(N-1)] (m/N)(1 - m/N)
  Number of red balls chosen when n balls are chosen from N balls, m of which are red.

Uniform
  x = 1, 2, ..., n;  p(x) = 1/n
  E(X) = (n+1)/2,  Var(X) = (n^2 - 1)/12
  Choose one of the numbers 1, 2, ..., n at random.

Definition of PDF

f_X(x) = dF_X(x)/dx
       = lim_{dx → 0} [P(X ≤ x + dx) - P(X ≤ x)] / dx
       = lim_{dx → 0} [P(X < x + dx) - P(X ≤ x)] / dx
       = lim_{dx → 0} [P(X ≤ x + dx) - P(X < x)] / dx
       = lim_{dx → 0} [P(X < x + dx) - P(X < x)] / dx

Properties of PDF
1. f_X(x) ≥ 0
2. ∫_{-∞}^{+∞} f_X(x) dx = 1
3. F_X(x) = ∫_{-∞}^{x} f_X(t) dt
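As a quick sanity check on the table above, the following sketch (Python, illustrative only; not part of the original handout) computes E(X) and Var(X) of a Binomial(n, p) directly from its pmf and compares them with the closed forms np and npq:

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for X ~ Binomial(n, p): C(n, x) p^x (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Example parameters (arbitrary choices for illustration)
n, p = 10, 0.3

# First and second moments computed directly from the pmf
mean = sum(x * binomial_pmf(x, n, p) for x in range(n + 1))
var = sum(x**2 * binomial_pmf(x, n, p) for x in range(n + 1)) - mean**2

print(round(mean, 6), n * p)            # both 3.0
print(round(var, 6), n * p * (1 - p))   # both 2.1
```

The same moment-from-pmf check works for any of the discrete distributions in the table, by swapping in the appropriate pmf and support.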
4. P(a < X ≤ b) = P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X < b) = ∫_a^b f_X(x) dx = F_X(b) - F_X(a)

Markov's inequality

If X is a nonnegative random variable with finite mean E(X), then for any c > 0,
    P(X ≥ c) ≤ E(X) / c.

Proof: Note that since X ≥ 0, we have c I{X ≥ c} ≤ X (the left side equals c when X ≥ c and 0 otherwise, and in either case is at most X). Taking expectations on both sides gives c P(X ≥ c) ≤ E(X), and the inequality follows on dividing by c.

Chebyshev's inequality

If the random variable X has finite mean μ and finite variance σ^2, then for any real number k > 0,
    P(|X - μ| ≥ kσ) ≤ 1/k^2.

Proof: By Markov's inequality applied to the nonnegative random variable (X - μ)^2,
    P(|X - μ| ≥ kσ) = P((X - μ)^2 ≥ k^2 σ^2) ≤ E((X - μ)^2) / (k^2 σ^2) = 1/k^2.

Problems

This note was uploaded on 01/16/2012 for the course STAT 1301 taught by Professor Smslee during the Fall '08 term at HKU.
