Waterloo SOS
STAT 230 Final Review Package
Prepared by Grace Gu
Spring 2010

Table of Contents
Important formulas
Chapter 3 - Counting Techniques
Chapter 4 – Probability Rules and Conditional Probability
Chapter 5 – Discrete Distributions
Chapter 7 – Expectation, Averages and Variability
Chapter 8 – Discrete Multivariate Distributions
Chapter 9 – Continuous Distributions
Extra Practice for Final

Important formulas
1. n^(r) = n!/(n − r)! = n(n − 1)(n − 2)…(n − r + 1)
2. If X and Y are independent, then Cov(X, Y) = 0
3. The correlation coefficient of X and Y is ρ(X, Y) = Cov(X, Y)/(σX σY)
4. Var(a1X1 + … + anXn) = Σ ai² Var(Xi) + 2 Σ_{i<j} ai aj Cov(Xi, Xj); if we have n identically distributed random variables and ai = 1 for all i = 1, …, n, this becomes Var(X1 + … + Xn) = n Var(X1) + 2 Σ_{i<j} Cov(Xi, Xj)

Chapter 3 - Counting Techniques
Definitions
The Addition Rule: If we can do A in p ways and B in q ways, then we can do either A or B, but not both, in p + q ways.
The Multiplication Rule: If we can do job A in p ways, and for each of these ways we can do job B in q ways, then we can do both A and B in p × q ways.
Permutation Rules:
a) The number of ways to arrange n distinct objects in a row is n!
= n(n − 1)(n − 2)…(1)
b) The number of ways to arrange r objects selected from n distinct objects is n^(r) = n!/(n − r)! = n(n − 1)(n − 2)…(n − r + 1)
c) The number of distinct arrangements of n objects when n1 are alike of one type, n2 alike of a second type, …, nk alike of a kth type (where n1 + n2 + … + nk = n) is n!/(n1! n2! … nk!)
Combination Rules: The number of ways to choose r objects from n is denoted by C(n, r) = n!/(r!(n − r)!) = n^(r)/r!
The r! in this formula removes the redundant options from the permutations: orderings of the same r objects are no longer counted separately.
*Please do not confuse permutations and combinations. Combinations do not keep track of order, so a combination count is always less than or equal to the corresponding permutation count.

Chapter 3 - Distinguishing permutations and combinations
Is the following a permutation or a combination?
1. A group with 30 members chooses a President, Vice President, Treasurer and Secretary.
Permutation - order does matter, because you care which person is chosen for which office.
2. A group with 30 members chooses 4 members to serve on a public relations committee.
Combination - order does NOT matter; you don't care which member was chosen first or last.
3. In a race with 9 horses, how many ways are there for any three horses to finish in win, place or show?
Permutation - order does matter, because you care which horse wins, comes in second or comes in third.
4. A pizza restaurant stocks 11 different toppings. How many ways are there to order a 3-topping pizza?
Combination - you care whether or not you got the three toppings you wanted, but you don't care in which order the pizza guy put the ingredients on the dough.

Chapter 3 - Counting Technique Examples
1. True or False: If A and B are mutually exclusive events with P(A) > 0, P(B) > 0, then P(AB) = P(A)P(B).
FALSE. Mutual exclusiveness gives P(AB) = 0, while P(A)P(B) > 0.
2. Two boys and two girls randomly select one of five sections of STAT 230. Find the probability of the following events.
a) All four students pick the same section.
Number of ways for the students to choose sections: 5^4 = 625. Number of ways for all to choose the same one: 5. So P(all same) = 5/625 = 1/125 = 0.008
b) All four students pick different sections.
Number of ways for all to choose different sections: 5 × 4 × 3 × 2 = 120. So P(all different) = 120/625 = 0.192
c) No student picks section one.
Number of ways if section one is eliminated: 4^4 = 256. So P(no section one) = 256/625 = 0.4096
d) The two boys pick different sections and the two girls pick the same section (which may be the same as one of the boys.)
For the boys, there are 2 × C(5, 2) = 20 ways to have them in different sections. The girls can both be in any one of the five sections. So the number of ways this can occur: 20 × 5 = 100, and P(boys different, girls same) = 100/625 = 4/25 = 0.16
e) If two sections are cancelled, what is the probability of all the students selecting different sections?
If two sections are cancelled, there are only three sections and four students. So there is no way they can all be in different sections! The probability is 0.

Chapter 4 – Probability Rules and Conditional Probability
Rules
1) 0 ≤ P(A) ≤ 1 and P(S) = 1
2) P(Ā) = 1 − P(A)
3) P(A ∪ B) = P(A) + P(B) − P(AB)
4) If A and B are mutually exclusive, P(A ∪ B) = P(A) + P(B)
5) P(A | B) = P(AB)/P(B), provided P(B) > 0; equivalently P(AB) = P(B)P(A | B) = P(A)P(B | A)
6) A and B are independent if and only if: P(AB) = P(A)P(B), or equivalently P(A | B) = P(A)

Chapter 4 Examples
1. If A and B are events with P(A) = 0.3, P(B̄) = 0.6 and P(AB̄) = 0.2, then
a) P(B) = 1 − 0.6 = 0.4
b) P(AB) = P(A) − P(AB̄) = 0.3 − 0.2 = 0.1, so P(A ∪ B) = P(A) + P(B) − P(AB) = 0.3 + 0.4 − 0.1 = 0.6
c) P(A | B) = P(AB)/P(B) = 0.1/0.4 = 0.25
d) A and B are independent: FALSE, since P(A)P(B) = 0.12 ≠ 0.1 = P(AB)
2. Suppose A, B and C are three events such that A and B are mutually exclusive, and C is independent of both A and B. Also, P(A) = 0.3, P(B) = 0.5 and P(C) = 0.4. Calculate
a) P(A ∪ B) = 0.3 + 0.5 = 0.8
b) P(AC) = 0.3 × 0.4 = 0.12
c) P(A ∪ C) = 0.3 + 0.4 − 0.12 = 0.58
d) P(A ∪ C | B) = P(AB ∪ CB)/P(B) = (0 + 0.5 × 0.4 − 0)/0.5 = 0.4

Chapter 5 – Discrete Distributions
Definitions
Random Variable - a function that assigns a real number to each point in a sample space S.
Probability function (p.f.)
of a discrete random variable X - the function f(x) = P(X = x).
Cumulative distribution function (cdf) of a random variable X - the function F(x) = P(X ≤ x).
2. The following are the properties of a cdf F(x):
a. F(x) is non-decreasing: if x1 < x2, then F(x1) ≤ F(x2)
b. 0 ≤ F(x) ≤ 1 for all x
c. F(x) → 0 as x → −∞ and F(x) → 1 as x → ∞
3. Distributions
A) Discrete Uniform Distribution. If X takes on values a, a+1, a+2, …, b with all values being equally likely, then X has a discrete uniform distribution on [a, b], with f(x) = 1/(b − a + 1) for x = a, a+1, …, b.
B) Hypergeometric Distribution. We pick n objects at random without replacement from a collection of N items, and X is the number of successes among the n objects picked. Then, X has a hypergeometric distribution:
f(x) = C(r, x) C(N − r, n − x) / C(N, n)
Intuition: Numerator: We use the counting techniques from Chapter 3. We have r "success items" within the collection of N items. We select x objects out of the r objects, and select the remaining n − x objects out of the N − r "failure items". Denominator: If we don't impose any restrictions, we can choose all n objects from any of the N items available.
C) Binomial Distribution. Suppose we conduct an experiment that results in either Success or Failure (a Bernoulli r.v.). Let the probability of success be p and the probability of failure be 1 − p. We then repeat the experiment n independent times. Let X be the number of successes obtained. Then X has a binomial distribution:
f(x) = C(n, x) p^x (1 − p)^(n − x), x = 0, 1, …, n
Intuition: The x successes can happen in any of the n trials, and the x successes and n − x failures act as two types of repeated objects. Thus, by the counting techniques from Chapter 3, we can arrange them in C(n, x) ways. Since each one of the n trials is independent, by the multiplication rule from Chapter 4, we can simply multiply all the probabilities together.
D) Geometric Distribution. Suppose we conduct an experiment that can either result in success (with probability p) or failure (with probability 1 − p). We keep repeating the experiment independently until we obtain a success. Let X be the number of failures obtained before the first success. Then X has a geometric distribution.
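The failure-counting setup above can be checked numerically. The sketch below (illustrative Python, not part of the original package; function names are my own) simulates the number of failures before the first success and compares the empirical frequency of X = 2 with the geometric probability function f(x) = p(1 − p)^x:

```python
import random

def failures_before_success(p, rng):
    """Count failures before the first success in repeated Bernoulli(p) trials."""
    count = 0
    while rng.random() >= p:  # a draw >= p counts as a failure (probability 1 - p)
        count += 1
    return count

def empirical_pf(p, x, trials=200_000, seed=1):
    """Estimate P(X = x) by simulation, with a fixed seed for repeatability."""
    rng = random.Random(seed)
    hits = sum(failures_before_success(p, rng) == x for _ in range(trials))
    return hits / trials

p = 0.3
exact = p * (1 - p) ** 2       # f(2) = p(1 - p)^2 = 0.147
approx = empirical_pf(p, 2)
print(exact, approx)           # the two values typically agree to about two decimals
```

With 200 000 trials the standard error of the estimate is under 0.001, so the simulated and exact values should be close.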
Intuition: We use the multiplication rule from Chapter 4 for independent events. We have x failures before obtaining the first success, so we multiply the probabilities for all of these events together:
f(x) = p(1 − p)^x, x = 0, 1, 2, …
E) Negative Binomial Distribution (a generalization of the geometric distribution). Suppose we conduct an experiment which results in success (probability p) or failure (probability 1 − p). We keep repeating the experiment independently until we obtain k successes. Let X be the number of failures obtained before the kth success. Then X has a negative binomial distribution:
f(x) = C(x + k − 1, x) p^k (1 − p)^x, x = 0, 1, 2, …
Intuition: When we have x failures before obtaining the kth success, we have a total of x + k trials. Note that the x failures and the previous k − 1 successes can happen in any order within the first x + k − 1 trials (the last trial must be the kth success). Thus, by the counting techniques from Chapter 3, we can arrange them in C(x + k − 1, x) ways. Since each one of the x + k trials is independent, by the multiplication rule in Chapter 4 we can simply multiply the corresponding probabilities.
F) Poisson Distribution. There are two ways to derive a Poisson distribution, one using the binomial approximation which we will see later, and another using the conditions of a Poisson process. We will see the latter first.
3 conditions of a Poisson process occurring randomly over time (or space):
a. Independence: The numbers of events in non-overlapping intervals are independent.
b. Individuality: For a very small time interval, the probability of 2 events occurring in the same interval is close to 0. In other words, no more than 1 event can take place at one exact point in time.
c. Homogeneity/Uniformity: events occur at a uniform rate of λ per unit time.
An event of some type occurs according to a Poisson process as defined above. Let X be the number of event occurrences in a time period of length t.
Then, X has a Poisson distribution with μ = λt:
f(x) = e^(−μ) μ^x / x!, x = 0, 1, 2, …
Note: By the uniformity argument, if the mean number of occurrences over a time period of length t1 is μ, then the mean number of occurrences over a time period of length t2 is μ t2/t1.
G) Approximations
a. Approximation of the Hypergeometric Distribution using the Binomial distribution. Recall, if X has a Hypergeometric distribution, f(x) = C(r, x) C(N − r, n − x)/C(N, n). If N and r are large relative to n, we can approximate this using a binomial distribution with p = r/N. Then, we have f(x) ≈ C(n, x) (r/N)^x (1 − r/N)^(n − x).
b. Approximation of the Binomial Distribution using the Poisson distribution. Suppose that X has a Binomial (n, p) distribution. If we let n → ∞ and p → 0, while keeping np fixed at some μ, then we can use a Poisson distribution with μ = np to approximate X. Thus, we have f(x) ≈ e^(−μ) μ^x / x!.

Example 1. In some "random digit dialling" surveys, a computer phones randomly selected telephone numbers. However, not all numbers are "active" (belong to a telephone account) and they may belong to businesses as well as to individuals or residences. Suppose that for a given large set of telephone numbers, 52% are active residential or individual numbers. We will call these "personal" numbers. Suppose that we wish to interview (over the phone) 1000 persons in a survey.
(a) Suppose that the probability a call to a personal number is answered is 0.75, and that the probability the person answering agrees to be interviewed is 0.55. Give the probability distribution for X, the number of calls needed to obtain 1000 interviews.
(b) Suppose instead that 3200 randomly selected numbers were dialled. Give the probability distribution for Y, the number of interviews obtained, and find an expression for a suitable approximation of P(Y = 1000).
Solutions: (a) From the question, it seems that we will need to use the negative binomial distribution. However, we first need to find the probability of success, or in other words, P(interview). We realize that to get an interview, we need one exact sequence of events.
We first need to reach a personal number; the call to the personal number must be answered; and the person answering needs to agree to be interviewed. We define the following events:
Pers => reached a personal number
A => call was answered
I => agrees to be interviewed
Thus, P(interview) = P(Pers) P(A | Pers) P(I | Pers, A) = (0.52)(0.75)(0.55) = 0.2145.
Now we can use the negative binomial distribution. However, note that X is the total number of calls needed to obtain 1000 interviews, not the number of failures. Hence, if we have X = x, then the number of failures will be x − 1000. Thus,
P(X = x) = C(x − 1, 999) (0.2145)^1000 (0.7855)^(x − 1000), x = 1000, 1001, …
(b) Since the event of getting an interview is either a success or a failure (Bernoulli) and we have 3200 independent repetitions of this, Y has a Binomial (3200, 0.2145) distribution. Thus,
P(Y = 1000) = C(3200, 1000) (0.2145)^1000 (0.7855)^2200
A suitable approximation (see Chapter 9) replaces Y with a normal random variable W having the same mean μ = 3200(0.2145) = 686.4 and variance σ² = 3200(0.2145)(0.7855) ≈ 539.2, with a continuity correction: P(Y = 1000) ≈ P(999.5 ≤ W ≤ 1000.5).

Example 2:
a) Consider a random variable X taking positive integer values with cumulative distribution function F(x) = 1 − 1/x, for x = 1, 2, 3, … Then P(X ≥ 3) = 1 − F(2) = 1 − (1 − 1/2) = 1/2
b) A random variable X has probability function f(x) = …, x = 1, 2, … Find the cumulative distribution function.
F(x) = 0 if x < 1, and F(x) = … for x ≥ 1.

Chapter 7 – Expectation, Averages and Variability
Definitions:
1. The median of a sample is a value such that half the results are below it and half the results are above it, when the results are arranged in numerical order.
2. The mode of the sample is the value which occurs most often. It is possible to have more than 1 mode in a sample.
3. The expected value of a discrete random variable X with probability function f(x) is E(X) = Σ_x x f(x).
4. Suppose that the random variable X has probability function f(x). Then the expected value of some function g(X) of X is given by E[g(X)] = Σ_x g(x) f(x).
Linear property of expectations: E(aX + b) = aE(X) + b for constants a and b.
5. The variance of a r.v. X is given by Var(X) = E[(X − μ)²], where μ = E(X). We also have the following two expressions for variance, which are usually handier:
Var(X) = E(X²) − μ²
Var(X) = E[X(X − 1)] + μ − μ²
6. The standard deviation of a random variable X is σ = sd(X) = √Var(X).
7. Means and variances of special discrete distributions:
a.
Binomial distribution: If X is a Binomial (n, p) random variable, then E(X) = np and Var(X) = np(1 − p).
b. Poisson distribution: If Y is a Poisson (μ) random variable, then E(Y) = Var(Y) = μ.
8. The moment generating function (m.g.f.) of a discrete r.v. X with p.f. f(x) is given by: M(t) = E(e^(tX)) = Σ_x e^(tx) f(x)
9. The moments of the distribution of a random variable X can be derived from its m.g.f. as follows: E(X^r) = M^(r)(0). In other words, the rth moment of X can be obtained by differentiating M(t) r times and evaluating it at t = 0.

Example 1. A free casino game has six envelopes each with cash amounts of 1, 2, 4, 8, 10, and 20 dollar(s). You are randomly given an envelope and receive the cash.
a) What are your expected winnings?
E(X) = (1 + 2 + 4 + 8 + 10 + 20)/6 = 7.5
b) Show that Var(X) = E((X − μ)²) = E(X²) − μ².
Var(X) = E(X² − 2μX + μ²) = E(X²) − 2μE(X) + μ² = E(X²) − 2μ² + μ² = E(X²) − μ²
c) Calculate the variance of your winnings.
E(X²) = (1 + 4 + 16 + 64 + 100 + 400)/6 = 97.5
Var(X) = 97.5 − 7.5² = 97.5 − 56.25 = 41.25
d) Winnings from casino games are taxed at a rate of ln(x). What are your expected taxes?
E(ln X) = (ln 1 + ln 2 + ln 4 + ln 8 + ln 10 + ln 20)/6 = 1.5762
e) For a one-time event, the casino squares all the cash amounts in the envelopes. What is your expected after-tax profit in this case?
The after-tax profit is t(X) = X² − ln(X²) = X² − 2 ln X, so E(t(X)) = 97.5 − 2(1.5762) = 94.35

Example 2. A random variable X has a Poisson distribution with f(x) = e^(−μ) μ^x / x!, x = 0, 1, 2, … Let g(X) = … Find E(g(X)).
E(g(X)) = Σ_x g(x) e^(−μ) μ^x / x!

Chapter 8 – Discrete Multivariate Distributions
Definitions
1. Suppose that there are two random variables X and Y. We define f(x, y), the joint probability function of (X, Y), as: f(x, y) = P(X = x, Y = y). And as usual, Σ_x Σ_y f(x, y) = 1.
2. We define the marginal distributions of X and Y as: f1(x) = Σ_y f(x, y) and f2(y) = Σ_x f(x, y). In words, this means that to find the marginal distribution of X, sum the joint p.f. over all values of Y; and similarly, to find the marginal distribution of Y, sum the joint p.f. over all values of X.
3. Let X1, X2, …, Xn be some collection of random variables. We say that X1, X2, …, Xn are independent if and only if: f(x1, x2, …, xn) = f1(x1) f2(x2) … fn(xn) for all (x1, x2, …, xn).
Note: This is similar to the definition of independent events we saw in Chapter 4.
4.
The conditional probability function of X given Y = y is f(x | y) = f(x, y)/f2(y). Similarly, the conditional probability function of Y given X = x is f(y | x) = f(x, y)/f1(x).
5. If we have a new variable U, such that U = g(X, Y), then the probability function for U is f_U(u) = Σ f(x, y), the sum being over all (x, y) with g(x, y) = u.
6. The multinomial distribution: Similar to the binomial case, we conduct an experiment which has k different outcomes (instead of just two), with probabilities p1, p2, …, pk (p1 + p2 + … + pk = 1). This experiment is repeated independently n times. Let X1 be the number of times outcome 1 occurs, X2 be the number of times outcome 2 occurs, …, Xk be the number of times outcome k occurs. Then, (X1, X2, …, Xk) has a multinomial distribution with the joint probability function:
f(x1, x2, …, xk) = [n!/(x1! x2! … xk!)] p1^x1 p2^x2 … pk^xk, where x1 + x2 + … + xk = n
Intuition: Think about the multinomial in the same way as you did about the binomial distribution. There are n objects, and we have k repeating types among the n objects. There are xi objects of the ith repeating type, so by our repetition rule for permutations we can arrange these objects in n!/(x1! x2! … xk!) ways. Due to independence, we can simply multiply the probability of each type occurring xi times.
7. The expected value of a function of discrete rv's X and Y, g(X, Y), is: E[g(X, Y)] = Σ_x Σ_y g(x, y) f(x, y). This can be extended beyond two variables X and Y.
8. Property of Multivariate Expectation: E[a g1(X, Y) + b g2(X, Y)] = a E[g1(X, Y)] + b E[g2(X, Y)]
9. The covariance of X and Y, denoted Cov(X, Y), is E[(X − μX)(Y − μY)]. Note: A handier formula for covariance is Cov(X, Y) = E(XY) − E(X)E(Y).
10. If X and Y are independent, then Cov(X, Y) = 0.
11. Suppose X and Y are independent random variables. Then, if g and h are any two functions, E[g(X) h(Y)] = E[g(X)] E[h(Y)].
12. The correlation coefficient of X and Y is ρ = Cov(X, Y)/(σX σY). Note: This is a measure of the strength of the relationship between X and Y that lies in the interval [−1, 1].
13. Properties of Covariances:
a. Cov(X, X) = Var(X)
b. Cov(aX + bY, cU + dV) = ac Cov(X, U) + ad Cov(X, V) + bc Cov(Y, U) + bd Cov(Y, V)
Intuition: Think of this as multiplying the two terms (aX + bY) and (cU + dV) together. (Which is exactly how it is derived using the definition.)
14.
Variance of a linear combination: Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y). In fact, more generally, if we have n r.v.'s X1, X2, …, Xn,
Var(a1X1 + … + anXn) = Σ ai² Var(Xi) + 2 Σ_{i<j} ai aj Cov(Xi, Xj)
If we have n identically distributed random variables, and ai = 1 for all i = 1, …, n,
Var(X1 + … + Xn) = n Var(X1) + 2 Σ_{i<j} Cov(Xi, Xj)
Note: This general formula is very useful in problems involving indicator random variables. If all n random variables are independent, then Var(X1 + … + Xn) = Σ Var(Xi).

Example 1. X and Y are two random variables that take on integer values from 0 to 2. You are given the following information about the distribution of X and Y:
(a) What is the correlation between X and Y?
(b) Suppose that you were not given E(Y), but were instead told that X and Y are independent. You are also given two additional random variables U and V, such that U = 5X and V = 3X − 2Y². Calculate Cov(U, V).
Solutions
(a) We want to draw a table for the joint distribution of X and Y. But first, we set P(X = 0) = p, giving P(X = 0, Y = 1) = 0.4p. Using the given information, we fill in the following table:

        X = 0        X = 1        X = 2        Sum
Y = 0   0.12         0.27 − 0.4p  0.2          0.59 − 0.4p
Y = 1   0.4p         0.23         0.13 − 0.4p  0.36
Y = 2   0.6p − 0.12  0.4p         0.17 − 0.6p  0.4p + 0.05
Sum     p            0.5          0.5 − p      1

Note: The remaining entries were calculated by using the fact that each row and each column must add up to its marginal total, together with simple subtractions and additions. Make sure that you can do them.
We are also given E(Y) = 0.66; since E(Y) = 0.36 + 2(0.4p + 0.05) = 0.46 + 0.8p, this gives p = 0.25. Filling in the table for p = 0.25, we have the following:

        X = 0  X = 1  X = 2  Sum
Y = 0   0.12   0.17   0.2    0.49
Y = 1   0.1    0.23   0.03   0.36
Y = 2   0.03   0.1    0.02   0.15
Sum     0.25   0.5    0.25   1

Now, recall that the correlation between X and Y is given by ρ = Cov(X, Y)/(σX σY). From the table, we can calculate Var(X) and Var(Y):
E(X) = 0.5 + 2(0.25) = 1, E(X²) = 0.5 + 4(0.25) = 1.5, Var(X) = 1.5 − 1 = 0.5
E(Y) = 0.36 + 2(0.15) = 0.66, E(Y²) = 0.36 + 4(0.15) = 0.96, Var(Y) = 0.96 − 0.66² = 0.5244
Now, from the table, E(XY) = 0.23 + 2(0.1) + 2(0.03) + 4(0.02) = 0.57, so Cov(X, Y) = E(XY) − E(X)E(Y) = 0.57 − (1)(0.66) = −0.09. And ρ = −0.09/√((0.5)(0.5244)) ≈ −0.18.
(b) From the first table we made in part (a), using the independence of X and Y we can solve for p, giving p = 0.2568. Plugging this into the table in part (a), we have:

        X = 0    X = 1    X = 2    Sum
Y = 0   0.12     0.16728  0.2      0.48728
Y = 1   0.10272  0.23     0.02728  0.36
Y = 2   0.03408  0.10272  0.01592  0.15272
Sum     0.2568   0.5      0.2432   1

Now, Cov(U, V) = Cov(5X, 3X − 2Y²) = 15 Cov(X, X) − 10 Cov(X, Y²)
Cov(X, X) = Var(X) = E(X²) − E(X)². Using the same method as in part (a), we get Var(X) = 0.4998.
Cov(X, Y²) = E(XY²) − E(X)E(Y²) = 0.8228 − (0.9864)(0.97088) = −0.1349 (Verify this.)
Thus, Cov(U, V) = 15(0.4998) − 10(−0.1349) = 8.846

Example 2. The proportions of cats with blood types A, B, and AB respectively in a large population are 0.7, 0.2 and 0.1. Let (X1, X2, X3) denote the frequencies of these three types in a random sample of size 20 taken from the population. Find the conditional distribution of X3 given X1 = 12.
Given X1 = 12, the remaining 8 cats in the sample are of type B or AB, with conditional probabilities 0.2/0.3 = 2/3 and 0.1/0.3 = 1/3. So for x3 = 0, 1, …, 8,
P(X3 = x3 | X1 = 12) = C(8, x3) (1/3)^x3 (2/3)^(8 − x3)
that is, X3 given X1 = 12 has a Binomial (8, 1/3) distribution.

Chapter 9 – Continuous Distributions
Note: In this chapter, we present relevant examples after each group of definitions, as there is a lot of material to cover.
Definitions
1. The probability density function (p.d.f.) f(x) for a continuous random variable X is the derivative f(x) = dF(x)/dx, where F(x) is the c.d.f. for X.
2. The following are properties of a p.d.f.:
a. f(x) ≥ 0 for all x
b. the total area under f(x) is 1
c. F(x) = ∫ from −∞ to x of f(u) du
d. P(a < X < b) = ∫ from a to b of f(x) dx = F(b) − F(a)
3. When X is continuous, we define E[g(X)] = ∫ g(x) f(x) dx.
Example 1. Let f(x) = … be a pdf. Find: a) k b) F(x) c) P(1/4 < X < 5/4) d) Var(X)
Solutions:
a) When finding the area of a region bounded by different functions, we split the integral into pieces.
b) We start with the easy part, which we forget all too often: F(x) = 0 if x ≤ 0 and F(x) = 1 if x ≥ 2.
c) …
d) …
Definitions (continued)
4. Change of variables: When we know the pdf or cdf of a continuous random variable X, and want to find the pdf or cdf of another random variable Y, we follow the steps below:
a. Write the cdf of Y as a function of X.
b.
Use the cdf of X to find the cdf of Y. Then, if you want the p.d.f., differentiate the cdf of Y.
c. Find the range of values of y.
Example 2. Suppose that the random variable X has the pdf f(x) = … Let Y = 5X². Find the probability density function of Y.
Solution: Integrating f(x), we find the cdf F(x). Writing the cdf of Y in terms of X, F_Y(y) = P(Y ≤ y) = P(5X² ≤ y), we can express it through F and then differentiate.
Note: We can obtain the range for y by plugging the range for x into the equation Y = 5X².
Definitions (continued)
Special distributions:
5. Suppose that X takes values in some interval [a, b] with equal probability across all equal-length subintervals. Then X has a continuous uniform distribution. We write X ~ U[a, b].
The probability density function of X is: f(x) = 1/(b − a) for a ≤ x ≤ b (and 0 otherwise).
And the cdf of X is: F(x) = 0 for x < a; (x − a)/(b − a) for a ≤ x ≤ b; 1 for x > b.
The mean and variance of X are given by: E(X) = (a + b)/2 and Var(X) = (b − a)²/12.
6. If the continuous random variable X has an exponential distribution with mean θ, then its pdf is of the form: f(x) = (1/θ) e^(−x/θ) for x > 0.
And its cdf is of the form: F(x) = 1 − e^(−x/θ) for x > 0.
Its mean and variance are: E(X) = θ and Var(X) = θ².
7. Generation of Random Variables: To generate a random value from a r.v. X which has cdf F(x), generate a random number U between 0 and 1, and find X = F^(−1)(U).
Example 3. Suppose that the waiting times in a doctor's office follow an exponential distribution with mean 1/2 hour. The doctor will receive a complaint if a waiting time is greater than 2 hours. What is the probability that he receives his 3rd complaint after seeing the 10th patient?
Solution: The last sentence of the problem indicates that this involves a negative binomial distribution. However, we need to calculate p first. This is an exponential probability:
p = P(waiting time > 2) = 1 − F(2) = e^(−2/(1/2)) = e^(−4)
The 3rd complaint comes with the 10th patient when exactly 2 of the first 9 patients wait more than 2 hours and the 10th patient does as well:
P = C(9, 2) p³ (1 − p)⁷, with p = e^(−4)
Definitions (continued)
8. If the continuous random variable X has a normal distribution, then it has mean E(X) = μ and variance Var(X) = σ², and we write X ~ N(μ, σ²). The pdf of this distribution cannot be integrated in closed form by hand, so we use the standard normal distribution instead. To transform the normally distributed rv X into the standard normal r.v.
Z, we use the transformation: Z = (X − μ)/σ.
In other words, if X ~ N(μ, σ²) and Z = (X − μ)/σ, then Z ~ N(0, 1). Note: P(X ≤ x) can also be written as P(Z ≤ (x − μ)/σ).
Notes on reading probabilities for Z from the standard normal table (for P(Z ≤ z)): Only values for z > 0 are shown, so we use the fact that N(0, 1) is symmetric about 0.
a. If we want P(Z ≤ z) and z < 0, we use the fact that P(Z ≤ z) = P(Z ≥ −z) = 1 − P(Z ≤ −z).
b. Similarly, if we want P(Z ≥ z) and z < 0, we use P(Z ≥ z) = P(Z ≤ −z).
c. If we want P(−q ≤ Z ≤ p) where q, p > 0, then we use P(Z ≤ p) − P(Z ≤ −q) = P(Z ≤ p) − [1 − P(Z ≤ q)].
d. If we have Q where P(Z ≤ z) = Q and Q < 0.5, we use P(Z ≤ −z) = 1 − Q to find z. Use a similar approach if you have R where P(Z ≥ z) = R and R > 0.5.
e. If we have Q where P(−z ≤ Z ≤ z) = Q, use P(Z ≤ z) = Q + 0.5(1 − Q) = 0.5Q + 0.5.
I recommend that you understand these properties using a graph of a normal distribution rather than memorize them.
9. Linear combinations of independent normal random variables:
a. If X ~ N(μ, σ²) and Y = aX + b, where a and b are constants, then Y ~ N(aμ + b, a²σ²).
b. If X1 ~ N(μ1, σ1²), …, Xn ~ N(μn, σn²) are independent, then a1X1 + … + anXn ~ N(Σ ai μi, Σ ai² σi²), where a1, …, an are constants.
c. Let X1, X2, …, Xn be independent N(μ, σ²) random variables. Then ΣXi ~ N(nμ, nσ²) and the sample mean X̄ ~ N(μ, σ²/n).
10. Normal Distribution in Approximations
Central Limit Theorem: Let X1, X2, …, Xn be independent identically distributed r.v.'s with mean μ and variance σ². As the number of random variables n → ∞, the cdfs of
(ΣXi − nμ)/(σ√n) and (X̄ − μ)/(σ/√n)
approach the N(0, 1) cdf.
Notes
a. The CLT can be applied to a Poisson distribution with a large μ by thinking of it as a sum of independent Poisson r.v.'s with parameter 1.
b. The CLT can be applied to a Binomial distribution with a large n by thinking of it as a sum of n independent Bernoulli distributions with parameter p.
c. Continuity corrections: Think of this as going "halfway towards the next probability". Essentially, you increase the range that you are calculating the probability for by 0.5 in each direction of consideration.
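The effect of the continuity correction in note (c) can be checked numerically. The sketch below (illustrative Python, not part of the original package) compares the exact probability P(18 ≤ X ≤ 24) for X ~ Binomial(50, 0.4) with the normal approximation taken over the widened interval [17.5, 24.5]:

```python
from math import comb, erf, sqrt

def phi(z):
    """Standard normal cdf, written in terms of the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def binom_cdf_range(n, p, a, b):
    """Exact P(a <= X <= b) for X ~ Binomial(n, p)."""
    return sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(a, b + 1))

n, p = 50, 0.4
mu, sigma = n * p, sqrt(n * p * (1 - p))
exact = binom_cdf_range(n, p, 18, 24)
# Continuity correction: widen the range by 0.5 in each direction.
approx = phi((24.5 - mu) / sigma) - phi((17.5 - mu) / sigma)
print(round(exact, 4), round(approx, 4))
```

The two printed values should be close, illustrating why the half-unit widening is worth the small extra effort.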
Sketch a histogram to help in understanding the following:
Example 4. Suppose that X and Y are independent random variables, where X ~ N(…) and Y ~ N(…). Calculate the probability P(X > Y − X + 3).
Solution: Note that X > Y − X + 3 can be rewritten as 2X − Y > 3. So, we first find the distribution for 2X − Y: it is normal with mean 2E(X) − E(Y) and variance 4Var(X) + Var(Y). Recall that P(W > w) can be rewritten in terms of a standard normal distribution as:
P(W > w) = P(Z > (w − E(W))/sd(W))
Thus, P(2X − Y > 3) can be read from the standard normal table.
Example 5. An astrophysicist suspects that the number of volcano eruptions per month on one of Jupiter's moons can be modelled with a Poisson distribution. However, he isn't sure what the parameter of the model should be. His data showed that in the last 10 years, there were only 3 months with more than 100 eruptions. What are the possible parameters of the Poisson distribution? (Use a continuity correction.)
Solution: This problem requires us to work backwards from the probability using the Central Limit Theorem. In 10 years there are 120 months, and more than 100 eruptions occurred in only 3 of them, so we estimate P(X > 100) = 3/120 = 0.025. With a continuity correction, this is P(X ≥ 100.5) = 0.025. Converting to a standard normal probability (X has mean μ and variance μ),
P(Z ≥ (100.5 − μ)/√μ) = 0.025, so (100.5 − μ)/√μ = 1.96.
Thus, solving μ + 1.96√μ − 100.5 = 0 for √μ gives √μ ≈ 9.09, i.e. μ ≈ 82.7.

Extra Practice for Final
1. Suppose X has a normal distribution with mean 2 and variance 4. Find the constant c so that P(|X − 2| ≤ c) = 0.99.
P(−c ≤ X − 2 ≤ c) = P(−c/2 ≤ Z ≤ c/2) = 0.99, so c/2 = 2.57 or 2.58, and c = 5.14 or 5.16
2. A random variable X has probability function
x:        0    1    2
P(X = x): 0.5  0.3  0.2
Find the moment generating function of X.
M(t) = E(e^(tX)) = 0.5 + 0.3e^t + 0.2e^(2t)
3. Use an expression for P(A ∪ B) (the additivity rule of probability) to prove that P(AB) ≥ P(A) + P(B) − 1.
P(A ∪ B) = P(A) + P(B) − P(AB). Therefore P(AB) = P(A) + P(B) − P(A ∪ B) ≥ P(A) + P(B) − 1, since P(A ∪ B) ≤ 1.
4. If P(A) = 0.6, P(B) = 0.5 and P(B̄ | A) = 0.5, find P(A ∪ B). Are A, B independent?
P(AB) = P(A) − P(AB̄) = P(A) − P(A)P(B̄ | A) = 0.6 − 0.3 = 0.3
P(A ∪ B) = P(A) + P(B) − P(AB) = 0.6 + 0.5 − 0.3 = 0.8
Since P(AB) = 0.3 = P(A)P(B), the events are independent.
5. Suppose X and Y are discrete random variables with joint probability function f(x, y) = … Are X and Y independent?
Try any pair of values. For example, f(1, 1) = … But f1(1) f2(1) = … Since f(1, 1) ≠ f1(1) f2(1), they are NOT INDEPENDENT.
6.
Suppose X, Y are independent normal random variables with E(X) = E(Y) = 1 and Var(X) = Var(Y) = 5. Find P(X − 2Y + 3 > 1).
E(X − 2Y + 3) = 1 − 2 + 3 = 2
Var(X − 2Y + 3) = Var(X) + 4 Var(Y) = 5 + 20 = 25
P(X − 2Y + 3 > 1) = P(Z > (1 − 2)/5) = P(Z > −0.2), where Z is standard normal, = 0.5793
7. Suppose X and Y are independent geometric random variables, each with the same probability function f(x) = p(1 − p)^x, x = 0, 1, 2, …, and define a random variable T = X + Y. Is the correlation coefficient between X and T positive, negative or zero? Explain why.
Large values of X will tend to result in larger values of T, and small values of X in smaller values of T. Therefore ρ_XT is positive.
8. Each day the price of a particular stock can move "up" or "down" relative to the price of the stock on the previous day. The probability of an "up" move on any given day is p, and the probability of a "down" move is 1 − p, where 0 < p < 1, independently of the stock movements on other days. For each t = 1, 2, … define the random variable
Xt = 1 if the move on day t is "up", Xt = 0 if the move on day t is "down"
Is Xt a Markov Chain? Explain why or why not, and if it is, give its transition probability matrix.
Since the probability of a jump at time t + 1 given the value of Xt does not depend on the values Xt−1, Xt−2, …, the chain is a Markov Chain. The transition matrix is
P = ( 1 − p  p )
    ( 1 − p  p )
9. Four people are listed as friends on Facebook according to the diagram:
a. Find …
b. Find E(X) using the relationship between X and the indicator variables Xi.
X = ΣXi, so E(X) = 0.01 + 0.01 + 0.001 + 0.001 = 0.022
c. Find Var(Xi).
Var(Xi) = pi(1 − pi), giving 0.0099, 0.0099, 0.00099 and 0.00099
d. Find all Cov(Xi, Xj).
Cov(Xi, Xj) = E(XiXj) − E(Xi)E(Xj) = 0.00009 (same for the other Cov(Xi, Xj)'s)
e. Find Var(X).
Var(X) = ΣVar(Xi) + 2ΣCov(Xi, Xj)
= 0.0099 + 0.0099 + 0.00099 + 0.00099 + 2(0 + 0.00099 + 0.000009)
= 0.022536

Statistics 230, Winter 2010
Midterm Test 1
January 26, 2010
Duration: 75 Minutes
Family Name: Given Name: ID #:
1. [9 marks] Consider the experiment of rolling a single fair die with sides labelled {1, 2, 3, 4, 5, 6}.
Suppose A and B are two events.
(a) If A and B are mutually exclusive, what can you say about P(A ∩ B)?
(b) If A and B are independent, what can you say about P(A ∩ B)?
(c) How many events have probability 2/3? Give your answer as a combinatorial number, and verify it by listing all such events.
(d) Suppose A is the event that the die is odd (so A = {1, 3, 5} and P(A) = 1/2). In each of the following parts, find an event B with the listed properties, or show that no such event exists.
i. B and A are mutually exclusive, and P(B) = 1/2.
ii. B and A are mutually exclusive, and P(B) > 1/2.
iii. B and A are independent, and P(B) = 1/2.
iv. B and A are independent, and 1/2 < P(B) < 1. (Hint: Look at your list from part (c).)
v. B and A are independent and mutually exclusive.
2. [4 marks] Suppose that A and B are two events. Draw a Venn diagram and use it to explain how P(A ∩ B) can be computed if all you know are P(A), P(B), and P(A ∪ B).
3. [11 marks] A deck of Euchre cards consists of 24 cards, divided into 4 suits {clubs, diamonds, hearts, spades}, with each suit consisting of 6 ranks {9, 10, Jack, Queen, King, Ace}. There are, for example, four Queens in the deck. Suppose that 5 cards are dealt to form a hand. When calculating the following probabilities, fractions involving combinatorial numbers are preferred to decimals, but you should simplify where convenient. You cannot earn part marks unless you explain your answers.
(a) What is the probability that the hand contains no Kings?
(b) What is the probability that the hand contains at least one Queen?
(c) What is the probability that the hand contains no Kings and no Queens?
(d) What is the probability that the hand contains at least one King and at least one Queen?
(e) What is the probability that the hand contains at least one Queen, given that it contains at least one King?
(f) Is the answer in 3e less than, greater than, or equal to the answer in 3b? Could this be determined without computing either probability explicitly?
4. [6 marks] Suppose that you are given four identical-looking coins. Three of the coins are fair (they produce heads and tails with equal probability), but the remaining coin is biased, and produces heads with probability P(heads) = 3/5. You flip all of the coins at the same time.
(a) What is the probability that three of the coins are tails, and the remaining coin is a head?
(b) If you repeat the experiment until you obtain the outcome in (a) (i.e. 3 tails and 1 head), what is the probability that the single head is the biased coin?

Statistics 230, Winter 2010
Midterm Test 2
March 9, 2010
Duration: 75 Minutes
Family Name: Given Name: ID #:
1. [8 marks] Recall that if X is a random variable, then the cumulative distribution function of X is the function F(x) defined by F(x) = P(X ≤ x). For each of the following functions, either explain why the function cannot be the cumulative distribution function of a random variable, or find the probability that the variable is between 1/4 and 3/4.
(a) F(x) = 0 if x < 0; x² if 0 ≤ x ≤ 0.8; 0.64 if 0.8 < x < 1.5; 1 if x ≥ 1.5
(b) F(x) = 0 if x < 0; x + (1/4) sin(2πx) if 0 ≤ x < 1; 1 if x ≥ 1
(c) F(x) = 0 if x < 0; 1 − e^(−x) otherwise
(d) F(x) = 1/4 if x < 1/2; 3/4 otherwise
2. [3 marks] Suppose that the random variable X has the cumulative distribution function:
F(x) = 0 if x < 0; 0.3 if 0 ≤ x < 2; 0.5 if 2 ≤ x < 3.5; 1 if 3.5 ≤ x
(a) Sketch the cumulative distribution function of X on the provided axes.
(b) Determine the following probabilities:
i. P(X = 2)
ii. P(X = 3)
iii. P(X = 4)
iv. P(X ≤ 3)
3. [5 marks] Consider three random variables, X ~ Bi(12, 1/3), Y ~ Bi(120 000, 1/30 000), and Z ~ Poisson(μ).
(a) What are the expected values of X, Y, and Z?
(b) What are the variances of X, Y, and Z?
(c) Determine exact expressions for P(X = 5), P(Y = 5), and P(Z = 5). (Do not evaluate the expressions.)
(d) By picking an appropriate value of µ, use your knowledge of Z to estimate P(Y = 5) to two decimal places.
(e) Why is Z not useful for approximating P(X = 5)?

4. [3 marks] Suppose that X is a random variable with probability function

    f(x) = P(X = x) = (1 − e^(−1))^x / x,    for x = 1, 2, 3, . . .

Find the expected value of X. (Hint: It may help to make the substitution r = 1 − e^(−1).)

5. [4 marks] The score of a hockey game can be estimated by assuming that each team scores according to a Poisson process, and determining appropriate intensities from available data. Suppose that Canada plays a game against Russia. Assume that Canada scores at a rate of 3.5 goals per hour of play, and that Russia scores independently at a rate of 2.5 goals per hour of play.

(a) What is the probability that there are a total of 3 goals during the first 20 minutes of the game?
(b) What is the probability that Canada scores 4 goals and Russia scores 2 goals during the first 60 minutes of the game?
(c) If a total of 6 goals are scored during the first 50 minutes of the game, what is the probability that Canada scores exactly 4 of them?

6. [6 marks] Recall that if X is a random variable, then its moment generating function M(t) is defined by M(t) = E(e^(tX)).

(a) Supposing that you know M(t), how could you determine the expected value of X^12?
(b) For a particular random variable X, it has been determined that M_X(t) = E(e^(tX)) = 1/(1 − 6t). Compute the following:
    i. E(X)
    ii. E(X^2)
    iii. Var(X)
(c) If someone tells you that they have determined that the moment generating function of a different random variable, Y, is M_Y(t) = E(e^(tY)) = cos(t), why should you not believe them?

7. [5 marks] Two random variables, X and Y, satisfy E(X) = 3, E(Y) = 5, E(X^2) = 21, and E(Y^2) = 28.
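Question 5's Poisson-process answers can be sketched numerically. This is my own check, not part of the original paper; it uses the standard facts that independent Poisson processes superpose (combined rate 3.5 + 2.5 = 6 goals per hour) and that, conditional on the total count, each goal is Canada's independently with probability 3.5/6.

```python
from math import comb, exp, factorial

def poisson_pmf(k, mu):
    # P(N = k) for N ~ Poisson(mu)
    return exp(-mu) * mu**k / factorial(k)

# (a) 20 minutes = 1/3 hour, so the combined goal count is Poisson(6 * 1/3 = 2):
p_a = poisson_pmf(3, 2.0)
# (b) independent team counts over a full hour:
p_b = poisson_pmf(4, 3.5) * poisson_pmf(2, 2.5)
# (c) given 6 total goals, Canada's count is Binomial(6, 3.5/6 = 7/12),
# regardless of the 50-minute window:
p_c = comb(6, 4) * (3.5 / 6) ** 4 * (2.5 / 6) ** 2
print(p_a, p_b, p_c)
```

Note in (c) that the window length cancels out: thinning a Poisson process splits each goal between the teams in the ratio of their rates.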
(a) What additional information do you need if you want to compute the correlation coefficient of X and Y?
(b) What are the values of Var(X) and Var(Y)?
(c) Suppose, in addition, that E(XY) = 9. Compute Cov(X, Y).
(d) In fact, it is not possible for E(XY) to be less than 9. What is the largest possible value of E(XY)? (Hint: Try computing ρ, the correlation coefficient.)

8. [6 marks] A jar contains a large number of jelly beans. Suppose half of the jelly beans are red, one quarter are green, and the rest are blue. You plan to randomly select 20 beans from the jar. Let R, G, and B be the numbers of red, green, and blue beans you select.

(a) What is the probability that R = 9 and B = 4?
(b) You plan to buy the red beans, at a cost of 5 cents each, and the blue beans, at a cost of 8 cents each (you will not buy the green beans), so your total cost will be C = 5R + 8B cents.
    i. Compute E(C).
    ii. What are Var(R), Var(B), and Cov(R, B)?
    iii. Give an expression for Var(C) involving Var(R), Var(B), and Cov(R, B), and use it to compute Var(C).
    iv. Would the variance increase, decrease, or stay the same if there were only 40 beans in the jar?
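Question 8 treats (R, G, B) as Multinomial(20; 1/2, 1/4, 1/4), since the jar is large enough that draws are effectively with replacement. The sketch below is my own computation using the standard multinomial moment formulas (E(R) = np_r, Var(R) = np_r(1 − p_r), Cov(R, B) = −np_r p_b); the numbers are worth double-checking by hand.

```python
from fractions import Fraction
from math import factorial

# (R, G, B) ~ Multinomial(20; 1/2, 1/4, 1/4)
n = 20
pr, pg, pb = Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)

# (a) R = 9 and B = 4 forces G = 7; apply the multinomial probability function
p_a = (Fraction(factorial(n), factorial(9) * factorial(7) * factorial(4))
       * pr**9 * pg**7 * pb**4)

# (b) C = 5R + 8B, using multinomial moments
mean_c = 5 * n * pr + 8 * n * pb                        # E(C) = 5*10 + 8*5
var_r = n * pr * (1 - pr)                               # Var(R)
var_b = n * pb * (1 - pb)                               # Var(B)
cov_rb = -n * pr * pb                                   # Cov(R, B), negative
# Var(C) = 25 Var(R) + 64 Var(B) + 2*5*8 Cov(R, B)
var_c = 25 * var_r + 64 * var_b + 2 * 5 * 8 * cov_rb
print(p_a, mean_c, var_c)
```

For part (b)(iv), the multinomial model no longer applies with only 40 beans: draws become without replacement, which shrinks each count's variance, so this sketch only covers the large-jar case.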