Probability

0 ≤ P(A) ≤ 1 ;  P(S) = 1 ;  P(φ) = 0 ;  P(A') = 1 − P(A)

P((A ∪ B)') = P(A' ∩ B') ;  P((A ∩ B)') = P(A' ∪ B')  ....De Morgan's Laws

P(A ∩ B') = P(A) − P(A ∩ B) ;  P(A' ∩ B) = P(B) − P(A ∩ B)

P(A/B) = P(A ∩ B) / P(B) ;  P(B/A) = P(A ∩ B) / P(A)  ....Conditional Probability

P(B) = P(B/A1)·P(A1) + P(B/A2)·P(A2) + … + P(B/An)·P(An)  ....Theorem of Total Probability
(here A1, A2, …, An form a partition of the sample space)

P(Aj/B) = P(B/Aj)·P(Aj) / [P(B/A1)·P(A1) + P(B/A2)·P(A2) + … + P(B/An)·P(An)]  ....Bayes' Theorem

Probability Distribution

A random variable X is a rule that assigns a numerical value to each outcome in the sample space of an experiment. A discrete random variable can take on only specific, isolated numerical values, like the outcome of a roll of a die or the number of dollars in a randomly chosen bank account. A continuous random variable can take on any value within a continuum or an interval (a, b), like the temperature in Central Park or the height of an athlete in centimeters.

The probability distribution of a discrete random variable is the list of probabilities associated with each of its possible values. It is also called the probability function or the probability mass function (p.m.f.); for a continuous random variable, the corresponding object is the probability density function (p.d.f.).
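The Theorem of Total Probability and Bayes' Theorem above can be checked numerically. The sketch below uses a hypothetical setup (not from these notes): a part comes from one of three machines A1, A2, A3, which partition the sample space, and B is the event "the part is defective"; the priors P(Ai) and likelihoods P(B/Ai) are made-up illustrative numbers. Exact fractions are used so the identities hold with no floating-point error.

```python
from fractions import Fraction as F

# Hypothetical priors P(Ai) for a partition A1, A2, A3 (must sum to 1).
prior = {1: F(1, 2), 2: F(3, 10), 3: F(1, 5)}
# Hypothetical likelihoods P(B/Ai): probability a part is defective
# given which machine produced it.
likelihood = {1: F(1, 100), 2: F(3, 100), 3: F(1, 20)}

# Theorem of Total Probability: P(B) = sum over i of P(B/Ai) * P(Ai)
p_b = sum(likelihood[i] * prior[i] for i in prior)

# Bayes' Theorem: P(Aj/B) = P(B/Aj) * P(Aj) / P(B)
posterior = {j: likelihood[j] * prior[j] / p_b for j in prior}

# 1/2*1/100 + 3/10*3/100 + 1/5*1/20 = 5/1000 + 9/1000 + 10/1000 = 24/1000
print(p_b)                       # 3/125
print(posterior[1])              # P(A1/B) = (5/1000) / (24/1000) = 5/24
print(sum(posterior.values()))   # posteriors over a partition sum to 1
```

Note the design choice: because the Ai partition the sample space, the denominator of Bayes' Theorem is exactly the total-probability sum, so the posteriors are guaranteed to sum to 1.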
This note was uploaded on 12/14/2011 for the course ECE 604 taught by Professor Kumar during the Spring '11 term at Indian Institute of Technology, Kharagpur.