#### b.lect14

Penn State, STAT 401
Excerpt: ... Outline: The Joint Probability Density Function; Statistics and Sampling Distributions. Lecture 14, Chapter 4: Multivariate Variables and Their Distribution. Michael Akritas. The Joint Probability Density Function: Definition and Basic Properties; Marginal Probability Density Functions; Conditional Probability Density Functions; Independence. Statistics and Sampling Distributions: Definitions and Examples. The pdf of a univariate r.v. X is a function f(x) such that probabilities are represented as areas under the curve. ...

#### Lecture 5

Maryland, ECON 422
Excerpt: ... Lecture 5, February 8, 2007. Review of Probability and Statistics. Two random variables: Joint Distribution; joint probability density function; marginal probability density function; independence. Why important? Definition of independence; useful conclusion if X, Y are independent; proof. Conditional Distribution: when the two random variables are dependent, we can describe how X affects Y with the conditional distribution (and in discrete terms). Conditional Expectation: the conditional expectation of Y given X (discrete); useful properties of conditional expectations if X and Y are independent. Population Covariance. Definition: expected value of the product of differences from the mean; property of the covariance formula; note the role of measurements. Covariance and independence: the converse is not true! A covariance of zero does not imply independence. Sample covariance. Population Correlation. Formula: easier to interpret than covariance; if equal to 1, then perfect pos ...
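The excerpt's warning that a covariance of zero does not imply independence can be checked exactly with a small discrete example. The example below is our own illustration, not from the lecture: take X uniform on {−1, 0, 1} and Y = X².

```python
from fractions import Fraction

# X is uniform on {-1, 0, 1}; Y = X^2. The joint pmf puts mass 1/3 on
# each of (-1, 1), (0, 0), (1, 1).
pmf = {(-1, 1): Fraction(1, 3), (0, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}

ex = sum(p * x for (x, y), p in pmf.items())       # E[X]
ey = sum(p * y for (x, y), p in pmf.items())       # E[Y]
exy = sum(p * x * y for (x, y), p in pmf.items())  # E[XY]
cov = exy - ex * ey                                # Cov(X,Y) = E[XY] - E[X]E[Y]

# Marginals, for the independence check P(X=x, Y=y) = P(X=x) P(Y=y)
px = {x: sum(p for (a, b), p in pmf.items() if a == x) for x in (-1, 0, 1)}
py = {y: sum(p for (a, b), p in pmf.items() if b == y) for y in (0, 1)}

print(cov)                         # 0: X and Y are uncorrelated
print(pmf[(0, 0)], px[0] * py[0])  # 1/3 vs 1/9: not independent
```

Y is a deterministic function of X, yet every cross-moment cancels by symmetry, which is exactly the "converse is not true" point on the slide.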

#### stat400lec18

UIllinois, STAT 400
Excerpt: ... m variables X and Y are independent if P(X=x, Y=y) = P(X=x)P(Y=y) or, equivalently, f(x,y) = f1(x)f2(y). Example: f(x,y) = (x+y)/21 for x = 1, 2, 3 and y = 1, 2. (a) Find the marginal pmf of X. (b) Are X and Y independent? Ping Ma, Lecture 18, Fall 2005. Mathematical Expectation. Definition: If f(x,y) is the p.m.f. of discrete random variables X and Y, then the mathematical expectation (or expected value) of a function u(X,Y) is E[u(X,Y)] = Σ_x Σ_y u(x,y) f(x,y). Example: Let the pmf of X1 and X2 be f(x1,x2) = (3 − x1 − x2)/8 for x1 = 0, 1 and x2 = 0, 1. Find E[X1 + X2]. Example: A DNA sequence consists of four nucleotides A, T, C and G. A CpG island, a special region of a DNA sequence, has 40% C, 40% G, and 20% A and T. If we choose 15 nucleotides randomly for a study, what is the probability that we have 6 Cs and 7 Gs? Continuous random variables. Joint probability density function. Properties of a joint p.d.f. f(x,y): (a) f(x,y) ≥ 0; (b) ∫∫ f(x,y) dx dy = 1; (c) P(a<X<b, c<Y<d) = ∫_a^b ...
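The two discrete examples in this excerpt can be worked exactly. The following sketch (ours, with exact rational arithmetic) computes the marginal pmf of X for f(x,y) = (x+y)/21, runs the independence check, and evaluates the CpG-island question as a multinomial probability with cell probabilities 0.4, 0.4, 0.2:

```python
from fractions import Fraction
from math import factorial

# Joint pmf f(x, y) = (x + y)/21 for x in {1,2,3}, y in {1,2}
f = {(x, y): Fraction(x + y, 21) for x in (1, 2, 3) for y in (1, 2)}

# (a) Marginal pmfs: f1(x) = sum over y of f(x, y), and similarly f2(y)
f1 = {x: f[(x, 1)] + f[(x, 2)] for x in (1, 2, 3)}
f2 = {y: sum(f[(x, y)] for x in (1, 2, 3)) for y in (1, 2)}

# (b) Independent iff f(x, y) = f1(x) f2(y) in every cell
independent = all(f[(x, y)] == f1[x] * f2[y] for (x, y) in f)

# CpG island: P(6 Cs, 7 Gs, 2 others) in 15 draws is multinomial
p = (factorial(15) // (factorial(6) * factorial(7) * factorial(2))) \
    * Fraction(4, 10) ** 6 * Fraction(4, 10) ** 7 * Fraction(2, 10) ** 2

print(f1)           # marginal pmf of X: 5/21, 7/21, 9/21
print(independent)  # False
print(float(p))     # about 0.048
```

Because f(1,1) = 2/21 while f1(1)·f2(1) = (5/21)(9/21), the product test already fails on the first cell, so X and Y are dependent.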

#### Exam4_06

University of Florida , STA 4321
Excerpt: ... e sure to watch your time. Do not turn the page or begin the exam until you are told that you may do so. 1. A mountain rescue service studied the behavior of lost hikers so that more effective search strategies could be devised. They decided to determine both the direction traveled and the experience level. From this study, it is known that the probabilities of a hiker being experienced or not and of going uphill, downhill, or remaining in the same place are as shown in the table below:

| Direction | Uphill | Downhill | Remain in Same Place |
| --- | --- | --- | --- |
| Novice | 0.10 | 0.25 | 0.25 |
| Experienced | 0.05 | 0.10 | 0.25 |

a. (8 points) Find the marginal distribution of the direction a lost hiker travels. b. (8 points) Find the conditional distribution of direction given that the hiker is experienced. c. (3 points) Are experience and the direction a hiker walks independent? Justify your answer. 2. For the joint probability density function f(x1, x2) = 6(1 − x1 − x2) for x1 > 0, x2 > 0, x1 + x2 ≤ 1, and 0 otherwise, the marginal density func ...
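Parts a–c of the hiker problem are direct table manipulations, sketched below in our own notation (the exam itself expects them by hand):

```python
# Joint probabilities from the table: P(experience level, direction)
joint = {
    ("novice", "uphill"): 0.10, ("novice", "downhill"): 0.25, ("novice", "same"): 0.25,
    ("experienced", "uphill"): 0.05, ("experienced", "downhill"): 0.10, ("experienced", "same"): 0.25,
}
directions = ("uphill", "downhill", "same")
levels = ("novice", "experienced")

# a. Marginal distribution of direction: sum out the experience level
p_dir = {d: sum(joint[(l, d)] for l in levels) for d in directions}

# b. P(direction | experienced) = P(experienced, direction) / P(experienced)
p_exp = sum(joint[("experienced", d)] for d in directions)
cond = {d: joint[("experienced", d)] / p_exp for d in directions}

# c. Independence would require P(l, d) = P(l) P(d) in every cell
independent = all(
    abs(joint[(l, d)] - sum(joint[(l, dd)] for dd in directions) * p_dir[d]) < 1e-12
    for l in levels for d in directions
)

print(p_dir)        # uphill 0.15, downhill 0.35, same place 0.50
print(cond)         # uphill 0.125, downhill 0.25, same place 0.625
print(independent)  # False
```

The first cell already breaks independence: P(novice)·P(uphill) = 0.6 × 0.15 = 0.09 ≠ 0.10.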

#### hw04_11_06

Michigan, STAT 425
Excerpt: ... Stat 425, Problems due 4/11/06. Chapter 7: 9a (there can be more than one ball in an urn), 11, 32, 34. A) An environmental engineer measures the amount (by weight) of particulate pollution in air samples of a certain volume collected over the smokestack of a coal-operated power plant. Let Y1 denote the amount of pollutant per sample collected when the cleaning device on the stack is not operating and let Y2 denote the amount of pollutant per sample collected under the same environmental conditions when the cleaning device is operating. It is observed that the joint probability density function of Y1 and Y2 can be modeled by f(y1, y2) = K if 0 ≤ y1 ≤ 2, 0 ≤ y2 ≤ 1, and 2y2 ≤ y1; 0 elsewhere. (That is, Y1 and Y2 are uniformly distributed over the region inside the triangle bounded by y1 = 2, y2 = 0 and 2y2 = y1.) 1. Find the value of K that makes this function a joint probability density function. 2. Find P(Y1 ≥ 3Y2). (That is, find the probability that the cleaning device will reduce the amount of pollutant ...
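Both parts can be sanity-checked numerically. Geometrically (our own working, not the assignment's solution key): the triangle with vertices (0,0), (2,0), (2,1) has area 1, so K = 1, and the sub-region where y1 ≥ 3y2 has area 2/3. A seeded rejection sampler confirms the second number:

```python
import random

random.seed(0)

# Rejection-sample uniform points from the triangle
# 0 <= y1 <= 2, 0 <= y2 <= 1, 2*y2 <= y1
n, inside, event = 200_000, 0, 0
while inside < n:
    y1, y2 = random.uniform(0, 2), random.uniform(0, 1)
    if 2 * y2 <= y1:          # point lies inside the triangle
        inside += 1
        if y1 >= 3 * y2:      # event {Y1 >= 3 Y2}
            event += 1

# The triangle's area is 1, so the uniform height K = 1/area = 1.
K = 1.0
p_est = event / n  # Monte Carlo estimate of P(Y1 >= 3 Y2) = 2/3

print(K, round(p_est, 3))
```

With 200,000 accepted points the standard error is about 0.001, so the estimate lands close to the analytic 2/3.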

#### lecture22-312

Wisconsin, STAT 312
Excerpt: ... freshmen? x: 39, 43, 21, 64, 57, 47, 28, 75, 34, 52. y: 65, 78, 52, 82, 92, 89, 73, 98, 56, 75. x<-c(39,43,21,64,57,47,28,75,34,52) y<-c(65,78,52,82,92,89,73,98,56,75) > cor(x,y) [1] 0.8397859 We are interested in testing H0: ρ = 0 vs. H1: ρ ≠ 0. In order to do this, we need to know the distribution of the sample correlation coefficient r under the null assumption. 2. Suppose the joint probability density function f of X and Y is the bivariate normal such that X ~ N(μX, σX²), Y ~ N(μY, σY²) and ρ(X, Y) = ρ. In this situation, it can be shown that E(Y | X = x) = β0 + β1·x, where β1 = ρ·σY/σX. This is the regression model we studied before. Note that ρ = 0 if and only if β1 = 0. Hence testing H0: ρ = 0 vs. H1: ρ ≠ 0 is equivalent to H0: β1 = 0 vs. H1: β1 ≠ 0. So the appropriate test statistic is T = (β̂1 − β1)/S_β̂1 = β̂1/S_β̂1 ~ t_{n−2}. This is an asymptotic result for large samples. >fisher.z<-function(r){atanh(r)} >cor.test(x,y,"two.sided","pearson" ,conf.level ...
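The R call above can be reproduced by hand. This sketch (ours) computes r directly and forms the equivalent statistic t = r·√(n−2)/√(1−r²), which equals β̂1/S_β̂1, then compares it with the two-sided 5% critical value t(8, 0.025) ≈ 2.306:

```python
from math import sqrt

x = [39, 43, 21, 64, 57, 47, 28, 75, 34, 52]
y = [65, 78, 52, 82, 92, 89, 73, 98, 56, 75]
n = len(x)

mx, my = sum(x) / n, sum(y) / n
sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
sxx = sum((a - mx) ** 2 for a in x)
syy = sum((b - my) ** 2 for b in y)

r = sxy / sqrt(sxx * syy)               # matches R's cor(x, y)
t = r * sqrt(n - 2) / sqrt(1 - r * r)   # ~ t_{n-2} under H0: rho = 0

print(round(r, 7))  # 0.8397859
print(round(t, 3))  # exceeds the 5% critical value 2.306: reject H0
```

Since t is well above 2.306, the test and the `cor.test` output agree: the achievement scores and calculus grades are significantly correlated.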

#### quiz-4

UCSC, AMS 131
Excerpt: ... Name: Discussion Section: AMS-131: Introduction to Probability Theory. Quiz 4 (Thursday May 28, 2009). Quizzes are closed-book, closed-notes, with the exception of one (letter size) piece of paper with formulas on both sides. Please show all your work. Unsupported answers will receive little (or no) credit, even if they are correct. Good luck! Suppose that X and Y have a continuous joint distribution for which the joint probability density function is given by f(x, y) = (3/8)·x²·e^(−y) for 0 ≤ x ≤ 2 and y > 0, and 0 otherwise. (a) (4 points) Compute the probability Pr(X > 1 and Y < 1). (b) (6 points) Obtain the marginal probability density functions f1(x) and f2(y) for X and Y, respectively. Are X and Y independent? Justify your answer. ...
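Because the density factors over a rectangular support, f1(x) = (3/8)x² on [0,2] and f2(y) = e^(−y), so X and Y are independent and Pr(X > 1, Y < 1) = (7/8)(1 − e^(−1)). That closed form is our own working; the sketch below cross-checks it against a midpoint-rule integration of the stated density:

```python
from math import exp

def f(x, y):
    # Quiz density: (3/8) x^2 e^{-y} on [0,2] x (0, inf)
    return (3 / 8) * x ** 2 * exp(-y) if 0 <= x <= 2 and y > 0 else 0.0

# Closed form (our working): integral of (3/8)x^2 over [1,2] is 7/8,
# and P(Y < 1) = 1 - e^{-1}, so the answer factors.
exact = (7 / 8) * (1 - exp(-1))

# Midpoint-rule check of the double integral over [1,2] x (0,1)
m = 400
h = 1 / m
approx = sum(
    f(1 + (i + 0.5) * h, (j + 0.5) * h) * h * h
    for i in range(m) for j in range(m)
)

print(round(exact, 4))   # about 0.5531
print(round(approx, 4))  # agrees with the closed form
```

The factorization argument also answers part (b): f(x,y) = f1(x)·f2(y) everywhere, so X and Y are independent.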

#### S11

ASU, CHAPTER 591
Excerpt: ... 3-11 CORRELATION AND INDEPENDENCE 3-11.1 Correlation joint probability density function joint probability mass function covariance 3-11.2 Independence ...

#### LectureNotes10

UConn, STAT 230
Excerpt: ... probability mass function of Y1 is: p1(y1) = Σ_{y2=0}^{2} p(y1, y2) = C(3, y1)·C(3, 2−y1) / C(6, 2), for y1 = 0, 1, 2. Similarly, the marginal probability mass function of Y2 is p2(y2) = Σ_{y1=0}^{2} p(y1, y2) = C(2, y2)·C(4, 2−y2) / C(6, 2), for y2 = 0, 1, 2. (Multivariate Probability Distributions: Part I.) Jointly Distributed Discrete Random Variables. All this information can be summarized in the table below.

| y1 \ y2 | 0 | 1 | 2 | p1(y1) |
| --- | --- | --- | --- | --- |
| 0 | 0 | 2/15 | 1/15 | 3/15 |
| 1 | 3/15 | 6/15 | 0 | 9/15 |
| 2 | 3/15 | 0 | 0 | 3/15 |
| p2(y2) | 6/15 | 8/15 | 1/15 | 1 |

Jointly Distributed Continuous Random Variables. Definition 5.4: Let Y1 and Y2 be two continuous random variables associated to a random experiment. The joint probability density function f(y1, y2) for Y1 and Y2 is the function such that for any two-dimensional set A, P((Y1, Y2) ∈ A) = ∫∫_{(y1,y2)∈A} f(y1, y2) dy1 dy2. In particular, P(a ≤ Y1 ≤ b, c ≤ Y2 ≤ d) = ∫_a^b ∫_c^d f(y1, y2) dy ...
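The table's entries are consistent with drawing 2 items from 6 (three of one type, two of another, one of a third), with Y1 and Y2 counting the first two types — that joint pmf is our reading of the garbled slide, so treat it as an inferred reconstruction. The sketch below recovers both marginals from it:

```python
from fractions import Fraction
from math import comb

def C(n, k):
    # Binomial coefficient that is 0 outside 0 <= k <= n
    # (math.comb raises ValueError for negative k, so guard it)
    return comb(n, k) if 0 <= k <= n else 0

# Inferred joint pmf: choose 2 from 3 + 2 + 1 = 6 objects;
# Y1 = count from the group of 3, Y2 = count from the group of 2.
def p(y1, y2):
    return Fraction(C(3, y1) * C(2, y2) * C(1, 2 - y1 - y2), C(6, 2))

p1 = {y1: sum(p(y1, y2) for y2 in range(3)) for y1 in range(3)}
p2 = {y2: sum(p(y1, y2) for y1 in range(3)) for y2 in range(3)}

print(p1)  # 3/15, 9/15, 3/15 (reduced: 1/5, 3/5, 1/5)
print(p2)  # 6/15, 8/15, 1/15
```

Summing rows and columns of the joint pmf reproduces exactly the p1 and p2 columns of the table, including the zero cells.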

#### SampleFinalExamQuestionsStat230

UConn, STAT 230
Excerpt: ... SOME SAMPLE EXAM PROBLEMS. Problem 1: Suppose two fair dice are rolled. Let X be the sum of the numbers showing, and let Y be the larger of the two. (a) Compute the joint probability mass function. (b) Compute Cov(X, Y). (c) Compute Corr(X, Y). Problem 2: A study claims that the daily number of hours, X, a teenager watches television and the daily number of hours, Y, he works on his homework are approximated by the joint probability density function f(x, y) = x·y·e^(−(x+y)) if x ≥ 0 and y ≥ 0, and 0 otherwise. What is the probability a teenager chosen at random spends at least twice as much time watching television as he does working on his homework? Problem 3: Let X and Y be two random variables with joint probability distribution function F(x, y) = (1 − e^(−x))(1 − e^(−y)) if x > 0 and y > 0, and 0 otherwise. (a) Find the joint probability density function of X and Y. (b) Find the marginal distributions of X and Y. (c) Are X and Y independent random variables? What is Corr(X, Y)? (d) Compute E(XY). Problem 4: Sup ...
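Problem 1 is small enough to solve by brute-force enumeration of the 36 equally likely rolls; this sketch (ours, in exact arithmetic) gets the moments the exam asks for:

```python
from fractions import Fraction
from itertools import product
from math import sqrt

# Enumerate the 36 rolls; X = sum, Y = larger of the two dice
outcomes = [(d1 + d2, max(d1, d2)) for d1, d2 in product(range(1, 7), repeat=2)]
w = Fraction(1, 36)  # each outcome has probability 1/36

ex  = sum(w * x for x, y in outcomes)
ey  = sum(w * y for x, y in outcomes)
exy = sum(w * x * y for x, y in outcomes)
ex2 = sum(w * x * x for x, y in outcomes)
ey2 = sum(w * y * y for x, y in outcomes)

cov  = exy - ex * ey                  # Cov(X, Y)
corr = float(cov) / sqrt(float(ex2 - ex ** 2) * float(ey2 - ey ** 2))

print(ex, ey)              # E[X] = 7, E[Y] = 161/36
print(cov, round(corr, 4))
```

The same enumeration gives the full joint pmf for part (a): count how often each (x, y) pair occurs and divide by 36.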

#### lecture22 notes

Wisconsin, STAT 312
Textbook: Probability and Statistics for Engineering and the Sciences (with CD-ROM and InfoTrac )
Excerpt: ... Stat 312: Lecture 22. Inference on Correlation. Moo K. Chung, mchung@stat.wisc.edu, April 17, 2003. Concepts. 1. Assume that X and Y are two random variables whose joint probability density function f is the bivariate normal such that X ~ N(μX, σX²), Y ~ N(μY, σY²) and ρ(X, Y) = ρ. In this situation, we can show that E(Y | X = x) = β0 + β1·x, where β1 = ρ·σY/σX. This is the regression model we studied! 2. Testing H0: ρ = 0 vs. H1: ρ ≠ 0 is equivalent to H0: β1 = 0 vs. H1: β1 ≠ 0, so the appropriate test statistic for testing H0: ρ = 0 is ... Example. Test for the dependence of the mathematical achievement test scores (xi) and calculus grades (yi) for 10 college freshmen, assuming the (xi, yi) are from a bivariate normal. x: 39, 43, 21, 64, 57, 47, 28, 75, 34, 52. y: 65, 78, 52, 82, 92, 89, 73, 98, 56, 75. Solution. There are two ways to solve this problem; either procedure ends up with the same result. The first approach might be much simpler conceptually: the only thing you need is to compute the correlation coeffi ...

#### wk08.qu.txt

East Los Angeles College, MATH 230
Excerpt: ... Questions for Math 230 Probability: Week 8. Q 8.1 (WSwk08) Suppose X1, X2 and X3 are independent random variables with expectations 3, −1 and 2, and variances 5, 8 and 9 respectively. Find the means, variances and covariance of Y1 = 3X1 − 2X2 + X3 and Y2 = X2 − X3. Q 8.2 Let X = (X1, X2, X3) have variance matrix [[4, 2, 0], [2, 6, 0], [0, 0, 4]]. Find the variance matrix of Y = AX where A = [[1, −1, 0], [1, 0, −1], [0, 1, −1]]. Q 8.3 The random variables X, Y ~ Exp(λ), and X and Y are independent. Find the joint and the marginal distributions of S = X/(X+Y) and T = X + Y. Are these independent? Q 8.4 Exam 2001. Let X1 and X2 be two continuous random variables with joint probability density function f(x1, x2) = 1/2 if |x1| + |x2| ≤ 1, and 0 otherwise. (i) Sketch the area where f(x1, x2) = 1/2. (ii) Are X1 and X2 independent? (iii) Show that the marginal probability density function of X1 is f(x1) = 1 − |x1| for −1 ≤ x1 ≤ 1, and 0 otherwise. (iv) Derive the joint probability density function of S = X1 + X2 ...

#### exam3handout06

Iowa State, STAT 341
Excerpt: ... Statistics 341, Exam 3, Friday, April 14, 2006, Michael Larsen. Closed book. No notes. Calculator permitted. Read directions carefully. Show work. Answers without adequate work and explanation might receive reduced or no credit. DISTRIBUTION FORMULAS PROVIDED. NORMAL TABLE PROVIDED. NAME:

| | Parts | Points | Score |
| --- | --- | --- | --- |
| Problem 1 | 3 | 20 | |
| Problem 2 | 2 | 13 | |
| Problem 3 | 3 | 20 | |
| Problem 4 | 2 | 14 | |
| Problem 5 | 2 | 13 | |
| Problem 6 | 4 | 20 | |
| Total | 16 | 100 | |

1. (20 points) Let X have the following probability density function (pdf): f(x) = 12x(1 − x)² for 0 < x < 1 and f(x) = 0 elsewhere. (a) Show that the function is, in fact, a probability density function (pdf). (b) What is the mean of X? (c) What is the probability that X has a value between −0.1 and 0.2? 2. (13 points) Consider the following joint probability density function: f(x, y) = (2/3)·y·e^(−x/3) for 0 < y < 1 and 0 < x < ∞, and f(x, y) = 0 otherwise. (a) Are X and Y independent? Why or why not? (b) What is E(XY)? Show work or briefly explain how you arrive at your answer. 3. (20 points) Let rando ...
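Problem 1 can be done entirely from the antiderivative F(x) = 6x² − 8x³ + 3x⁴ (our own computation of ∫12x(1−x)² dx). The sketch below verifies all three parts in exact arithmetic:

```python
from fractions import Fraction

# pdf f(x) = 12 x (1 - x)^2 on (0, 1); antiderivative (our computation):
def F(x):
    x = Fraction(x)
    return 6 * x ** 2 - 8 * x ** 3 + 3 * x ** 4

# (a) total probability: F(1) - F(0) should be 1
total = F(1) - F(0)

# (b) mean: E[X] = integral of 12 x^2 (1-x)^2 = 12 (x^3/3 - x^4/2 + x^5/5)
def G(x):
    x = Fraction(x)
    return 12 * (x ** 3 / 3 - x ** 4 / 2 + x ** 5 / 5)

mean = G(1) - G(0)

# (c) P(-0.1 < X < 0.2) = P(0 < X < 0.2) because f = 0 for x <= 0
prob = F(Fraction(1, 5)) - F(0)

print(total, mean, prob)  # 1 2/5 113/625
```

The part-(c) trick is that the density vanishes below 0, so the interval (−0.1, 0.2) contributes only over (0, 0.2).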

#### f00hw11

Purdue, IE 230
Excerpt: ... Homework #11. Due November 8, 2000. IE 230. Textbook: D.C. Montgomery and G.C. Runger, Applied Statistics and Probability for Engineers, John Wiley & Sons, New York, 1999. Chapter 6, Sections 6.3-6.4. Pages 15-16 of the concise notes. 1. (Problem 6.34, M&R Textbook.) Determine the value of c such that the function f(x, y) = cxy for 0 < x < 3 and 0 < y < 3 satisfies the properties of a joint probability density function. 2. (Problem 6.35, M&R Textbook.) Continuation of Problem 1. Determine the following. (a) P(X < 2.5, Y < 3). (b) P(X < 2.5). (c) P(1 < Y < 2.5). (d) P(X > 1.8, Y < 2.5). (e) E(X). (f) E(Y). 3. (Problem 6.36, M&R Textbook.) Continuation of Problem 1. Determine the following. (a) Marginal probability distribution of the random variable X. (b) Conditional probability distribution of the random variable Y given that X = 1.5. (c) E(Y | X = 1.5). (d) Conditional probability distribution of X given that Y = 2. 4. (Problem 6.54, M&R Textbook.) The time between surface problems in a galvanizing p ...
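Problem 1 and a couple of the Problem 2 parts have short closed forms, sketched here as our own working rather than the textbook's solutions. Since ∫₀³ x dx = 9/2, the double integral of cxy over the square is c·(9/2)², forcing c = 4/81, and the density then factors into independent marginals (2/9)x and (2/9)y:

```python
from fractions import Fraction

# f(x, y) = c x y on (0,3) x (0,3): the normalizing constant
c = 1 / (Fraction(9, 2) * Fraction(9, 2))   # c = 4/81

# The density factors as (2/9)x * (2/9)y, so X and Y are independent
# with marginal f_X(x) = (2/9) x on (0, 3).
def P_X_below(b):
    # P(X < b) = integral of (2/9) x from 0 to b = b^2 / 9
    return Fraction(b) ** 2 / 9

p_a = P_X_below(Fraction(5, 2))          # P(X < 2.5, Y < 3) = P(X < 2.5)
e_x = Fraction(2, 9) * Fraction(27, 3)   # E[X] = (2/9) * int_0^3 x^2 dx = 2

print(c, p_a, e_x)  # 4/81 25/36 2
```

Part (a) collapses to part (b) because Y < 3 happens with probability 1, and by the x/y symmetry E(Y) = E(X) = 2 as well.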

#### hw7

SUNY Stony Brook, AMS 311
Excerpt: ... AMS 311 (Fall, 2008), Joe Mitchell. PROBABILITY THEORY. Homework Set # 7. Due at the beginning of class on Tuesday, November 25, 2008. Reminder: Show your reasoning! You do not need to evaluate arithmetic expressions or integrals if they are fully specified. For example, you may leave ∫₀^0.5 ∫ₓ^1 x²·e^y dy dx in this form. Read: Ross, Chapter 6, Sections 6.4, 6.5; Chapter 7, Section 7.5 and Section 7.7; and handout Notes on Expectation, Moment Generating Functions, Variance, Covariance. SPECIFICS OF READING ASSIGNMENT: Examples to read carefully: Chapter 6: 4a, 4b, 5a, 5b. Chapter 7: 5a, 5b, 5c, 5d, 5k, 5l, 7a, 7b, 7d, 7e, 7f, 7g, 7h. (1). (15 points) If E(3X) = var(X/2) and var(2X) = 3, find (a) E[(2 + X)²] and (b) var(4 + 3X). (2). (15 points) The random variables X and Y have a joint density function given by f(x, y) = … (density omitted in the excerpt). Compute cov(2X, Y + 3). (3). (30 points) Let X and Y be continuous random variables with joint probability density function given by f(x, y) = Cx² if x ≥ 0, x < 4, x ≤ y, ...
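Problem (1) is pure moment algebra: var(2X) = 4·var(X) and var(X/2) = var(X)/4 pin down var(X) and E(X), after which E[(2+X)²] expands by linearity. A worked sketch (ours) in exact arithmetic:

```python
from fractions import Fraction

# var(2X) = 4 var(X) = 3  =>  var(X) = 3/4
var_x = Fraction(3, 4)

# E(3X) = var(X/2) = var(X)/4  =>  3 E(X) = 3/16  =>  E(X) = 1/16
e_x = var_x / 4 / 3

# (a) E[(2 + X)^2] = 4 + 4 E(X) + E(X^2), with E(X^2) = var(X) + E(X)^2
e_x2 = var_x + e_x ** 2
part_a = 4 + 4 * e_x + e_x2

# (b) var(4 + 3X) = 9 var(X): adding a constant leaves variance unchanged
part_b = 9 * var_x

print(e_x, part_a, part_b)  # 1/16 1281/256 27/4
```

The only facts used are var(aX + b) = a²·var(X) and E(aX) = a·E(X), both of which the Ross reading covers.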

#### prob3

East Los Angeles College, EC 961
Excerpt: ... P(a ≤ X ≤ b and c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx. This function is called a joint probability density function. The joint probability density function must also be nonnegative and integrate to one. The cumulative joint distribution function is given by: F(x, y) = P(X ≤ x and Y ≤ y) = P(−∞ < X ≤ x and −∞ < Y ≤ y) = ∫_{−∞}^x ∫_{−∞}^y f(s, t) dt ds. Properties of joint cdfs: 1. F(−∞, −∞) = 0. 2. F(∞, ∞) = 1. 3. ∂²F(x, y)/∂x∂y = f(x, y). 4. P(a ≤ X ≤ b and c ≤ Y ≤ d) = F(b, d) − F(a, d) − F(b, c) + F(a, c). Marginal Distributions and Independence. To recover the marginal distribution in the continuous case, we simply integrate out the relevant variable. Thus: g(x) = ∫_{−∞}^∞ f(x, y) dy and h(y) = ∫_{−∞}^∞ f(x, y) dx. Recall that we said that two events were independent if P(AB) = P(A)P(B). Similarly, two random variables are independent if f(x, y) = g(x)h(y), that is, if we can write the joint probability density function as the product of the marginal distributions. Expectation and Variance. I will only consider the continuous cas ...
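Property 4 (the rectangle rule) can be checked on a concrete cdf. Here we pick F(x, y) = (1 − e^(−x))(1 − e^(−y)), the joint cdf of two independent unit exponentials — our choice of example, not the notes' — where each interval probability also factors directly:

```python
from math import exp

def F(x, y):
    # Joint cdf of two independent Exp(1) random variables
    return (1 - exp(-x)) * (1 - exp(-y)) if x > 0 and y > 0 else 0.0

a, b, c, d = 0.5, 2.0, 1.0, 3.0

# Property 4: P(a <= X <= b, c <= Y <= d) by inclusion-exclusion on the cdf
rect = F(b, d) - F(a, d) - F(b, c) + F(a, c)

# Independence check: the same probability as a product of interval probs,
# since P(a <= X <= b) = e^{-a} - e^{-b} for a unit exponential
direct = (exp(-a) - exp(-b)) * (exp(-c) - exp(-d))

print(round(rect, 6), round(direct, 6))  # the two values coincide
```

The four-term alternating sum is exactly two-dimensional inclusion-exclusion: the big corner rectangle, minus the two strips, plus the doubly subtracted corner.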

#### my.lecture13

Texas A&M, STAT 211
Excerpt: ... Lecture 13. Jointly Distributed Random Variables. 1. Discrete Random Variables. Joint Probability Mass Function: the joint probability mass function of two random variables X and Y is p(x, y) = P(X = x, Y = y), with P((X, Y) ∈ A) = Σ_{(x,y)∈A} p(x, y). Marginal Probability Mass Function: the marginal probability mass functions of X and Y are, respectively, PX(x) = Σ_{all y} P(X = x, Y = y) and PY(y) = Σ_{all x} P(X = x, Y = y). Examples 5.1 and 5.2. 2. Continuous Random Variables. Joint Probability Density Function: f(x, y) is the joint probability density function of the continuous random variables X and Y if P((X, Y) ∈ A) = ∫∫_{(x,y)∈A} f(x, y) dx dy. Especially, P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx. Marginal Probability Density Function: the marginal probability density functions of X and Y are, respectively, fX(x) = ∫ f(x, y) dy and fY(y) = ∫ f(x, y) dx. Examples 5.3 and 5.4. 3. Independent Random Variables: X and Y are independent if p(x, y) = PX(x)·PY(y), f ...

#### exam3-review

Oakland University, MATH 30530
Excerpt: ... Math 30530-02, Fall 2007. Review for Exam 3. Exam 3, Tuesday, November 27 at 8 a.m. in DBRT 140, will cover sections 7.1-7.4 and 8.1-8.3. Of course this material builds on the earlier material. Section 9.1 won't be covered on this exam but is good reinforcement for 8.1-8.3. You are allowed to use an 8 1/2 × 11 sheet of paper with notes in your own handwriting on one side (or an index card up to 5 × 8 with notes on both sides). A table of values of Φ will be provided. Here is an outline of the topics. 1. Specific types of continuous distributions: Uniform; Normal; Normal approximation to the binomial distribution; Correction for continuity; Exponential; Gamma. 2. Jointly distributed random variables: Joint cumulative probability distribution function (joint distribution function); Marginal distribution functions; Joint probability mass function; Relation with probability mass functions; Jointly continuous random variables; Joint probability density function (joint densit ...

#### hw10

Midwestern State University, MATH 443
Excerpt: ... Math/Stat 443, Fall 2008. Homework 10, due December 3, Wednesday. 1. Exercise 6.1a. Do this by the method of distribution functions. 2. Exercise 6.9a. This is the method of transformations applied to Exercise 6.1a; you should get the same answers! 3. Exercise 6.4. Use any method. 4. Exercise 6.7. Use any method. 5. Exercise 6.11. Use the method of transformation (Jacobian). Define a second random variable V = X2. Note that ∂x/∂u = 1/v, ∂x/∂v = −u/v², etc. So, the Jacobian will be a function of u and v. 6. Exercise 6.16. 7. Exercise 6.31. Hints: Use the mgfs to get the joint probability function of X1 and X2. Use the definition of conditional probability. 8. Suppose X and Y have joint probability density function f(x, y) = 1/(x²y²) for x, y ≥ 1. Find the joint probability density function of U = XY and V = X/Y. ...
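For problem 8, inverting the transformation gives x = √(uv), y = √(u/v) with Jacobian 1/(2v), so f(u, v) = 1/(2u²v) on {u ≥ 1, 1/u ≤ v ≤ u}; integrating out v leaves f_U(u) = ln(u)/u², hence P(U ≤ 2) = (1 − ln 2)/2. That derivation is our own, not an answer key, so the sketch below double-checks it by simulation: since f(x, y) = 1/(x²y²) factors into two Pareto-type densities with cdf 1 − 1/t, we can sample X and Y by inversion.

```python
import random
from math import log

random.seed(1)

# f(x, y) = 1/(x^2 y^2) for x, y >= 1 factors into two independent
# densities 1/t^2 with cdf F(t) = 1 - 1/t, so simulate by inversion.
def heavy_tail():
    return 1 / (1 - random.random())  # random() < 1, so no division by zero

n = 200_000
hits = sum(1 for _ in range(n) if heavy_tail() * heavy_tail() <= 2)  # U = XY

# Our derived marginal f_U(u) = ln(u)/u^2 for u >= 1 gives
# P(U <= 2) = (1 - ln 2)/2, using the antiderivative -(ln u + 1)/u.
analytic = (1 - log(2)) / 2

print(round(hits / n, 4), round(analytic, 4))
```

As a consistency check, the derived joint density also integrates to 1 over its wedge-shaped support, since ∫₁^∞ ln(u)/u² du = 1.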