NTHU MATH 2820, 2008
Solution to Homework 10
NTHU MATH 2820, 2008
Solution to Homework 9
NTHU MATH 2820, 2008
Solution to Homework 7
NTHU MATH 2820, 2008
Solution to Homework 6
NTHU MATH 2820, 2008
Solution to Homework 4
NTHU MATH 2820, 2008
Solution to Homework 3
NTHU MATH 2820, 2008
Solution to Homework 2

3-70. Let A = {X_(1) ≤ x} and B = {X_(n) ≤ y}. For x < y,
F(x, y) = Pr(X_(1) ≤ x, X_(n) ≤ y)
        = Pr({X_(1) ≤ x} ∩ {X_(n) ≤ y})
        = Pr({X_(n) ≤ y}) − Pr({X_(1) > x} ∩ {X_(n) ≤ y})
        = Pr(X_1, X_2, ..., X_n ≤ y) − Pr(x < X_1, X_2, ..., X_n ≤ y)
        = F(y)^n − [F(y) − F(x)]^n.

3-77. U_1, ...
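As a sanity check on the closed form above, here is a small Monte Carlo sketch for n iid Uniform(0,1) variables, where F(y) = y on (0,1); the function names are ours, not from the solution:

```python
import random

def joint_cdf_minmax_mc(n, x, y, trials=200_000, seed=0):
    """Monte Carlo estimate of P(X_(1) <= x, X_(n) <= y) for n iid Uniform(0,1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample = [rng.random() for _ in range(n)]
        if min(sample) <= x and max(sample) <= y:
            hits += 1
    return hits / trials

def joint_cdf_minmax_exact(n, x, y):
    """Closed form F(y)^n - (F(y) - F(x))^n with F(t) = t on (0,1), for x < y."""
    return y**n - (y - x)**n

n, x, y = 5, 0.3, 0.8
mc = joint_cdf_minmax_mc(n, x, y)
exact = joint_cdf_minmax_exact(n, x, y)
```

The two estimates agree to within Monte Carlo error, which supports the subtraction step in the derivation.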
NTHU MATH 2820, 2008
Solution to Homework 1
Proof.
(n − 1)S^2 = Σ_{i=1}^n [(X_i − μ) − (X̄_n − μ)]^2
           = Σ_{i=1}^n (X_i − μ)^2 + Σ_{i=1}^n (X̄_n − μ)^2 − 2(X̄_n − μ) [Σ_{i=1}^n (X_i − μ)]
           = Σ_{i=1}^n (X_i − μ)^2 + n(X̄_n − μ)^2 − 2n(X̄_n − μ)^2
           = Σ_{i=1}^n (X_i − μ)^2 − n(X̄_n − μ)^2.
Therefore, S^2 = [Σ_{i=1}^n (X_i − μ)^2 − n(X̄_n − μ)^2] / (n − 1).
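The algebraic identity above holds exactly for any sample and any μ; a quick numerical sketch (helper names are ours):

```python
import random
import statistics

def check_variance_identity(xs, mu):
    """Compare sum (Xi - mu)^2 - n (Xbar - mu)^2 with (n-1) S^2 for a sample xs."""
    n = len(xs)
    xbar = sum(xs) / n
    lhs = sum((x - mu) ** 2 for x in xs) - n * (xbar - mu) ** 2
    rhs = (n - 1) * statistics.variance(xs)  # sample variance S^2
    return lhs, rhs

rng = random.Random(42)
xs = [rng.gauss(10, 2) for _ in range(50)]
lhs, rhs = check_variance_identity(xs, mu=10)
```

Both sides equal Σ(X_i − X̄_n)^2, so they match up to floating-point rounding.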
Note. The previous three corollaries also hold ...
Expectation
Recall. Expectation for a univariate random variable.
Theorem. For random variables X = (X_1, ..., X_n) with joint pmf p_X / pdf f_X, the expectation of a univariate random variable Y, where Y = g(X_1, ..., X_n), g: R^n → R^1, is
E(Y) = Σ_y y p_Y(y) = Σ_{(x_1, ..., x_n)} g(x_1, ..., x_n) p_X(x_1, ..., x_n)
in the discrete case, and similarly with f_X and integrals in the continuous case.
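The two sums in the theorem can be checked exactly on a small discrete example. The sketch below (our own toy setup, two fair dice with Y = X_1 + X_2) computes E(Y) both ways:

```python
from collections import defaultdict
from fractions import Fraction

# Joint pmf of two fair dice: p_X(x1, x2) = 1/36 on {1,...,6}^2.
p_X = {(x1, x2): Fraction(1, 36) for x1 in range(1, 7) for x2 in range(1, 7)}
g = lambda x1, x2: x1 + x2  # Y = g(X1, X2)

# Right-hand side: sum g(x) p_X(x) over the joint range, no p_Y needed.
e_via_joint = sum(g(*x) * p for x, p in p_X.items())

# Left-hand side: build p_Y first, then sum y p_Y(y).
p_Y = defaultdict(Fraction)
for x, p in p_X.items():
    p_Y[g(*x)] += p
e_via_marginal = sum(y * p for y, p in p_Y.items())
```

Both routes give E(Y) = 7 exactly, illustrating that the theorem lets us avoid deriving p_Y.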
Method of probability density function
Theorem. Let X = (X_1, ..., X_n) be continuous random variables with the joint pdf f_X. Let Y = (Y_1, ..., Y_n) = g(X), where g is 1-to-1, so that its inverse exists and is denoted by x = g^{-1}(y) = w(y) = (w_1(y), w_2(y), ..., w_n(y)). Then the joint pdf of Y is f_Y(y) = f_X(w(y)) |J(y)|, where J(y) is the Jacobian determinant of w.

Proof. Let A_i(y) = {x : g_i(x) ≤ y}, i = 1, ..., n. Then
F_Y(y_1, ..., y_n) = P(Y_1 ≤ y_1, ..., Y_n ≤ y_n)
                   = P(X_1 ∈ A_1(y_1), ..., X_n ∈ A_n(y_n))
                   = P(X_1 ∈ A_1(y_1)) ··· P(X_n ∈ A_n(y_n))
                   = P(Y_1 ≤ y_1) ··· P(Y_n ≤ y_n)
                   = F_{Y_1}(y_1) ··· F_{Y_n}(y_n).
Theorem. X = (X_1, ..., X_n) are independent ...
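A one-dimensional instance of the change-of-variables formula can be verified numerically. In this sketch (our own example, not from the notes), X ~ Uniform(0,1) and Y = g(X) = e^X, so w(y) = ln y and f_Y(y) = f_X(w(y))·|w′(y)| = 1/y on (1, e):

```python
import math
import random

# X ~ Uniform(0,1), Y = exp(X) is 1-to-1 with inverse w(y) = ln(y).
def f_Y(y):
    """Change-of-variables pdf: f_X(w(y)) * |w'(y)| = 1 * (1/y) on (1, e)."""
    return 1.0 / y if 1.0 < y < math.e else 0.0

# Cross-check via the CDF: F_Y(y) = integral of f_Y from 1 to y = ln(y).
rng = random.Random(1)
ys = [math.exp(rng.random()) for _ in range(100_000)]
y0 = 2.0
empirical = sum(1 for y in ys if y <= y0) / len(ys)
theoretical = math.log(y0)
```

The empirical CDF of the simulated Y matches the integral of the transformed pdf, as the theorem predicts.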
There are
(n over n_1, ..., n_m) = n! / (n_1! ··· n_m!)
ways.
Example: MISSISSIPPI: (11 over 4, 1, 2, 4) = 11! / (4! 1! 2! 4!).
Example (Die Rolling). Q: If a balanced (6-sided) die is rolled 12 times, P(each face appears twice) = ?
Sample space of rolling the die once (basic experiment): Ω_0 = {1, 2, 3, 4, 5, 6}.
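Both examples can be computed directly from the multinomial coefficient; a short sketch (the helper name is ours):

```python
from math import factorial, prod

def multinomial(n, *parts):
    """n! / (n1! ... nm!): number of distinct arrangements of n items in groups."""
    assert sum(parts) == n
    return factorial(n) // prod(factorial(k) for k in parts)

# MISSISSIPPI: 11 letters with multiplicities 4 (I), 1 (M), 2 (P), 4 (S).
arrangements = multinomial(11, 4, 1, 2, 4)

# Die rolling: each outcome sequence has probability 1/6^12, and the
# favorable sequences are the arrangements of the pattern (2,2,2,2,2,2).
p_each_twice = multinomial(12, 2, 2, 2, 2, 2, 2) / 6**12
```

This gives 34,650 arrangements of MISSISSIPPI and a probability of roughly 0.0034 for the die question.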
Jointly Distributed Random Variables
Recall. In Chapters 4 and 5, the focus was on a univariate random variable. However, a single experiment will often have more than one random variable of interest.
[Figure: a probability measure P on the sample space, with random variables X_1, X_2, ..., X_n each mapping into R.]
Definition. Given a sample space ...
A special case of the gamma distribution occurs when α = n/2 and λ = 1/2 for some positive integer n. This is known as the chi-squared distribution with n degrees of freedom (Chapter 6).
Summary for X ~ Gamma(α, λ)
Pdf: f(x) = λ^α x^{α−1} e^{−λx} / Γ(α), if x ≥ 0; f(x) = 0, if x < 0.
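The pdf and the chi-squared special case can be written out directly; a minimal sketch using only the standard library (function names are ours):

```python
import math

def gamma_pdf(x, alpha, lam):
    """Gamma(alpha, lambda) density in the rate parametrization used above."""
    if x < 0:
        return 0.0
    return (lam ** alpha) * (x ** (alpha - 1)) * math.exp(-lam * x) / math.gamma(alpha)

def chi2_pdf(x, n):
    """Chi-squared with n degrees of freedom = Gamma(n/2, 1/2)."""
    return gamma_pdf(x, n / 2, 0.5)
```

For n = 2 the chi-squared reduces to Exponential(1/2), which gives an easy spot check: chi2_pdf(x, 2) = (1/2)e^{−x/2}.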
Example (Uniform Distributions). If
f_X(x) = 1/(β − α), if α < x ≤ β; 0, otherwise,
then E(X) = (α + β)/2.

Some properties of expectation
Expectation of Transformation. If Y = g(X), then
E(Y) = ∫_{−∞}^{∞} y f_Y(y) dy = ∫_{−∞}^{∞} g(x) f_X(x) dx,
provided that the integral converges absolutely.
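The transformation rule means E(g(X)) can be estimated without ever deriving f_Y. A Monte Carlo sketch for X ~ Uniform(α, β) with g(x) = x² (our own example values):

```python
import random

alpha, beta = 2.0, 5.0
g = lambda x: x * x

# Monte Carlo estimate of E(g(X)) by averaging g over draws of X.
rng = random.Random(7)
n = 200_000
mc = sum(g(rng.uniform(alpha, beta)) for _ in range(n)) / n

# Closed form: E(X^2) = integral of x^2 / (beta - alpha) over (alpha, beta)
#             = (beta^3 - alpha^3) / (3 (beta - alpha)).
exact = (beta**3 - alpha**3) / (3 * (beta - alpha))
```

With α = 2 and β = 5 the exact value is 13, and the simulation agrees to within sampling error.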
Continuous Random Variables
Recall: For discrete random variables, only a finite or countably infinite number of possible values occur with positive probability. Often, there is interest in random variables that can take on (at least theoretically) an uncountable ...
Note: For X ~ Binomial(n, p), where (i) n is large and (ii) p is small, the distribution of X ≈ Poisson(np):
E(X) = np, the mean of the Poisson; Var(X) = np(1 − p) ≈ np, the variance of the Poisson.
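The quality of the Poisson approximation is easy to inspect by comparing the two pmfs pointwise; a sketch with illustrative values of n and p:

```python
import math

def binom_pmf(x, n, p):
    """Binomial(n, p) pmf."""
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

def poisson_pmf(x, mu):
    """Poisson(mu) pmf."""
    return math.exp(-mu) * mu**x / math.factorial(x)

n, p = 1000, 0.003  # n large, p small
mu = n * p
max_gap = max(abs(binom_pmf(x, n, p) - poisson_pmf(x, mu)) for x in range(20))
```

With n = 1000 and p = 0.003 the largest pointwise gap between the two pmfs is well under 0.01, consistent with the note above.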
Poisson Process Example: (1) # of earthquakes occurring during some fixed time span; (2) # ...
Summary for X ~ Binomial(n, p)
Range: 𝒳 = {0, 1, 2, ..., n}
Pmf: f_X(x) = (n choose x) p^x (1 − p)^{n−x}, for x ∈ 𝒳
Parameters: n ∈ {1, 2, 3, ...} and 0 ≤ p ≤ 1
Mean: E(X) = np
Variance: Var(X) = np(1 − p)

Geometric and Negative Binomial Distributions
Experiment: A b...
Expectation (Mean) and Variance
Q: We often characterize a person by his/her height, weight, hair color, etc. How can we "roughly" characterize a distribution?
Definition: If X is a discrete r.v. with pmf f_X and range 𝒳, then the expectation (or mean) of X is
E(X) = Σ_{x ∈ 𝒳} x f_X(x).
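As a concrete instance of the definition, the expectation of a fair die roll can be computed exactly (a toy example of ours, kept exact with fractions):

```python
from fractions import Fraction

# Fair die: f_X(x) = 1/6 on the range {1, ..., 6}.
f_X = {x: Fraction(1, 6) for x in range(1, 7)}

# E(X) = sum over the range of x * f_X(x).
expectation = sum(x * p for x, p in f_X.items())
```

The sum is (1 + 2 + ... + 6)/6 = 7/2, the familiar 3.5.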
Random Variables
A Motivating Example
Experiment: Sample k students without replacement from the population of all n students (labeled 1, 2, ..., n, respectively) in our class.
Ω = {all combinations} = {{i_1, ..., i_k} : 1 ≤ i_1 < ··· < i_k ≤ n}
A probability measure ...
Some Notes. The mgf is a function of the variable t. The mgf may exist only for some particular values of t.
Example. If X is a discrete r.v. taking on values x_i with probability p_i, i = 1, 2, 3, ..., then M_X(t) = Σ_i e^{t x_i} p_i.
If X ~ Poisson(λ), then for −∞ < t < ∞,
M_X(t) = Σ_{x=0}^∞ e^{tx} e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^∞ (λ e^t)^x / x! = e^{λ(e^t − 1)}.
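The closed form of the Poisson mgf can be confirmed against the defining series directly; a sketch with illustrative λ and t:

```python
import math

lam, t = 2.0, 0.5

# Closed form: M_X(t) = exp(lam * (e^t - 1)).
closed = math.exp(lam * (math.exp(t) - 1))

# Defining series: sum_{x>=0} e^{tx} e^{-lam} lam^x / x!,
# truncated once the terms are numerically negligible.
series = sum(
    math.exp(t * x) * math.exp(-lam) * lam**x / math.factorial(x)
    for x in range(60)
)
```

The truncated series and the closed form agree to machine precision, confirming the summation step above.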
NTHU MATH 2810
Midterm Examination
Oct 30, 2007
Note. There are 7 problems in total. To be considered for partial credit, write down intermediate steps where necessary.
1. (a) (3pts) Two years ago, Santa Claus was distributing 10 identical candies
NTHU MATH 2810
Final Examination Solution
Jan 8, 2008

1. (16pts, 2pts for each)
(a) True.
(b) False. The values of a pdf can be larger than 1. However, the integral of a pdf over any region must have a value between 0 and 1.
(c) False. It must be a one- ...