18.05 Lecture 32 May 2, 2005
Two-sample t-test: X1, ..., Xm ~ N(μ1, σ²),
Y1, ..., Yn ~ N(μ2, σ²).
Samples are independent.
Compare the means of the two distributions:
H1: μ1 = μ2 vs. H2: μ1 ≠ μ2 (two-sided), or
H1: μ1 ≤ μ2 vs. H2: μ1 > μ2 (one-sided).
By properties of the Normal distribution, a…
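The pooled two-sample t statistic from this setup can be sketched as follows; the data values are made up purely for illustration, and this assumes the equal-variance form of the test the lecture describes.

```python
import math

def two_sample_t(xs, ys):
    """Pooled two-sample t statistic for H1: mu1 = mu2,
    assuming both samples share the same (unknown) variance sigma^2."""
    m, n = len(xs), len(ys)
    xbar = sum(xs) / m
    ybar = sum(ys) / n
    # pooled estimate of the common variance sigma^2
    ssx = sum((x - xbar) ** 2 for x in xs)
    ssy = sum((y - ybar) ** 2 for y in ys)
    sp2 = (ssx + ssy) / (m + n - 2)
    # under H1 this statistic has a t distribution with m + n - 2 d.f.
    return (xbar - ybar) / math.sqrt(sp2 * (1 / m + 1 / n))

t = two_sample_t([5.1, 4.9, 5.3, 5.0], [4.2, 4.5, 4.1, 4.4])
```

A large |t| (compared to the t_{m+n−2} quantile for the chosen level α) leads to rejecting H1.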
18.05 Lecture 31 April 29, 2005
t-test: X1, ..., Xn — a random sample from N(μ, σ²). Two-sided hypothesis test: H1: μ = μ0 vs. H2: μ ≠ μ0. Two-sided hypothesis — the parameter can be greater or less than μ0. Take α ∈ (0, 1) — the level of significance (probability of a type 1 error). Construct a con…
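The one-sample t statistic for this test can be sketched as below; the sample values and μ0 are illustrative only.

```python
import math

def one_sample_t(xs, mu0):
    """t statistic for H1: mu = mu0 vs. H2: mu != mu0.
    Under H1 it has a t distribution with n - 1 degrees of freedom."""
    n = len(xs)
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)  # sample variance
    return (xbar - mu0) * math.sqrt(n) / math.sqrt(s2)

t = one_sample_t([2.1, 1.9, 2.0, 2.2, 1.8], 2.0)
```

Here the sample mean equals μ0 exactly, so t = 0 and H1 would not be rejected at any level α.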
18.05 Lecture 26 April 13, 2005
Confidence intervals for parameters of Normal distribution.
Confidence intervals for μ0, σ0² in N(μ0, σ0²). Estimates: μ̂ = x̄, σ̂² = (1/n)Σxi² − (x̄)². We have μ̂ ≈ μ0, σ̂² ≈ σ0² with large n, but how close exactly?
You can guarantee that the mean or vari…
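For the simplest case — σ known — the confidence interval for μ0 can be sketched as below. The data, σ, and the hard-coded 1.96 (the two-sided 95% standard Normal quantile) are illustrative assumptions.

```python
import math

def mean_ci(xs, sigma, z=1.96):
    """Confidence interval for mu0 in N(mu0, sigma^2) with sigma known:
    xbar +/- z * sigma / sqrt(n).  z = 1.96 gives a 95% interval."""
    n = len(xs)
    xbar = sum(xs) / n
    half = z * sigma / math.sqrt(n)
    return (xbar - half, xbar + half)

lo, hi = mean_ci([9.8, 10.1, 10.0, 10.3], sigma=0.5)
```

Note the half-width shrinks like 1/√n, which quantifies "how close" the estimate is for large n.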
18.05 Lecture 25 April 11, 2005
Maximum Likelihood Estimators. X1, ..., Xn have distribution Pθ0 ∈ {Pθ : θ ∈ Θ}. Joint p.f. or p.d.f.: f(x1, ..., xn | θ) = f(x1|θ) ··· f(xn|θ) = φ(θ) — the likelihood function. If Pθ is discrete, then f(x|θ) = Pθ(X = x), and φ(θ) — the probabili…
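A minimal sketch of maximizing the likelihood, using a Bernoulli sample (the data and the grid search are illustrative — for Bernoulli the MLE is known in closed form to be the sample mean, which the grid recovers):

```python
import math

def log_likelihood(p, xs):
    """Bernoulli log-likelihood: log of f(x1|p) * ... * f(xn|p)."""
    return sum(x * math.log(p) + (1 - x) * math.log(1 - p) for x in xs)

xs = [1, 0, 1, 1, 0, 1, 1, 0]  # 5 successes in 8 trials (made-up data)
# maximize phi(p) over a grid of candidate parameter values
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=lambda p: log_likelihood(p, xs))
```

Since the log-likelihood is strictly concave here, the grid maximizer coincides with the closed-form MLE p̂ = Σxi/n = 5/8.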
18.05 Lecture 24 April 8, 2005
Bayes Estimator. Given a prior distribution f(θ), compute the posterior f(θ | X1, ..., Xn). The Bayes estimator = expectation of the posterior. (E(X − a)² is minimized over a at a = EX.) Example: B(p), prior f(p) = Beta(α, β); then f(p | x1, ..., xn) = Beta(α + Σxi, …
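For the Beta–Bernoulli example, the posterior is the standard conjugate update Beta(α + Σxi, β + n − Σxi), and the Bayes estimator is its mean. A minimal sketch (the sample and prior parameters are illustrative):

```python
def bayes_estimator(xs, alpha, beta):
    """Bernoulli data with Beta(alpha, beta) prior: the posterior is
    Beta(alpha + sum(x_i), beta + n - sum(x_i)), and the Bayes
    estimator is the posterior mean a / (a + b)."""
    n, s = len(xs), sum(xs)
    a_post, b_post = alpha + s, beta + n - s
    return a_post / (a_post + b_post)

p_bayes = bayes_estimator([1, 1, 0, 1, 0], alpha=1, beta=1)  # uniform prior
```

With the uniform Beta(1, 1) prior and 3 successes in 5 trials, the posterior is Beta(4, 3) and the estimate is 4/7 — pulled slightly toward 1/2 relative to the MLE 3/5.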
18.05 Lecture 23 April 6, 2005
Estimation Theory: if there are only 2 outcomes, the Bernoulli distribution describes your experiment.
If counting occurrences (e.g., the number of misprints), the Poisson distribution describes the experiment.
You may know the type of distribution, but not the parameters in…
18.05 Lecture 17 March 16, 2005
Properties of Expectation. Law of Large Numbers. E(X1 + ... + Xn) = EX1 + ... + EXn. Matching Problem (n envelopes, n letters):
Expected number of letters in their correct envelopes?
Y — number of matches,
Xi = 1 if letter i matches its envelope, 0 otherwise, so Y = X1 + ... + Xn…
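Since each EXi = 1/n, linearity gives E(Y) = n · (1/n) = 1 regardless of n. This can be checked by brute-force enumeration of all n! permutations for small n (the enumeration itself is just an illustrative verification):

```python
from itertools import permutations

def expected_matches(n):
    """Average number of fixed points (letter i in envelope i)
    over all n! equally likely assignments."""
    perms = list(permutations(range(n)))
    total = sum(sum(1 for i, v in enumerate(p) if i == v) for p in perms)
    return total / len(perms)
```

The average comes out to exactly 1 for every n, matching the linearity-of-expectation argument.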
18.05 Lecture 15 March 9, 2005
Review for Exam 1 Practice Test 1: 1. In the set of all green envelopes, only 1 card can be green.
Similarly, in the set of red envelopes, only 1 card can be red.
Sample Space: 10! ways to put cards into envelopes, treati…
18.05 Lecture 14 March 7, 2005
Linear transformations of random vectors: Y = r(X), i.e. (y1, ..., yn)ᵀ = A(x1, ..., xn)ᵀ, where A is an n×n matrix. If det A ≠ 0, then X = A⁻¹Y. Write A⁻¹ = B = (bij), so x1 = b11 y1 + ... + b1n yn, etc., where the b's are the partial derivatives of the components of the inverse map with respect to the yj…
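The resulting density formula f_Y(y) = f_X(A⁻¹y)·|det A⁻¹| can be sketched for the 2×2 case; the choice of a standard 2-d Normal for f_X and the particular matrix are illustrative assumptions.

```python
import math

def density_Y(y1, y2, A, f_X):
    """p.d.f. of Y = A X for an invertible 2x2 matrix A:
    f_Y(y) = f_X(A^{-1} y) * |det A^{-1}| = f_X(A^{-1} y) / |det A|."""
    (a, b), (c, d) = A
    det = a * d - b * c
    # entries of B = A^{-1} are the partial derivatives dx_i / dy_j
    x1 = (d * y1 - b * y2) / det
    x2 = (-c * y1 + a * y2) / det
    return f_X(x1, x2) / abs(det)

def std_normal2(x1, x2):
    """Standard 2-d Normal density, used here as an example f_X."""
    return math.exp(-(x1 ** 2 + x2 ** 2) / 2) / (2 * math.pi)

val = density_Y(0.0, 0.0, [[2.0, 0.0], [0.0, 3.0]], std_normal2)
```

At y = 0 this gives f_X(0, 0)/|det A| = 1/(12π), showing how the Jacobian factor rescales the density.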
18.05 Lecture 13 March 4, 2005
Functions of random variables. If (X, Y) has joint p.d.f. f(x, y), consider Z = X + Y. The p.d.f. of Z: f(z) = ∫_{-∞}^{∞} f(x, z − x) dx. If X and Y are independent: f(z) = ∫_{-∞}^{∞} f1(x) f2(z − x) dx.
X, Y independent, uniform on [0, 1]…
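The convolution integral for two independent Uniform[0, 1] variables can be evaluated numerically; the Riemann-sum approach below is an illustrative sketch (the exact answer is the triangular density: z on [0, 1], 2 − z on [1, 2]).

```python
def conv_density(z, n=10000):
    """Numerical convolution f(z) = integral f1(x) f2(z - x) dx
    for X, Y independent Uniform[0, 1], via a midpoint Riemann sum
    over the support [0, 1] of f1."""
    f = lambda u: 1.0 if 0.0 <= u <= 1.0 else 0.0
    dx = 1.0 / n
    return sum(f(i * dx + dx / 2) * f(z - (i * dx + dx / 2))
               for i in range(n)) * dx
```

Checking against the known triangular shape: f(0.5) ≈ 0.5 and f(1.5) ≈ 2 − 1.5 = 0.5.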
18.05 Lecture 12 March 2, 2005
Functions of Random Variables. X — a continuous random variable with p.d.f. f(x);
Y = r(X).
Y doesn't have to be continuous; if it is, find its p.d.f.
To find the p.d.f., first find the c.d.f.:
P(Y ≤ y) = P(r(X) ≤ y) = P(x : r(X…
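The c.d.f. method can be sketched with the standard example Y = X² for X ~ Uniform[0, 1] (an illustrative choice): P(Y ≤ y) = P(X ≤ √y) = √y on [0, 1], and differentiating gives the p.d.f. 1/(2√y).

```python
import math

def cdf_Y(y):
    """c.d.f. of Y = X^2 for X ~ Uniform[0, 1]:
    P(Y <= y) = P(X <= sqrt(y)) = sqrt(y) on [0, 1]."""
    if y <= 0:
        return 0.0
    if y >= 1:
        return 1.0
    return math.sqrt(y)

def pdf_Y(y, h=1e-6):
    """p.d.f. recovered as the numerical derivative of the c.d.f.;
    analytically it equals 1 / (2 * sqrt(y)) on (0, 1)."""
    return (cdf_Y(y + h) - cdf_Y(y - h)) / (2 * h)
```

At y = 0.25 the analytic density is 1/(2·0.5) = 1, which the numerical derivative reproduces.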
18.05 Lecture 9 February 23, 2005
Cumulative distribution function (c.d.f.):
F(x) = P(X ≤ x), x ∈ ℝ.
1. If x1 ≤ x2, then {X ≤ x1} ⊆ {X ≤ x2}, so P(X ≤ x1) ≤ P(X ≤ x2): F is a non-decreasing function. 2. lim_{x→−∞} F(x) = P(X ≤ −∞) = 0, lim_{x→+∞} F(x) = P(X ≤ +∞) = 1. A random variable o…
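Both properties can be verified numerically on a concrete c.d.f., e.g. the standard Normal (expressed via the error function — an illustrative choice, not from the lecture):

```python
import math

def normal_cdf(x):
    """Standard Normal c.d.f. F(x) = P(X <= x), via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# property 1: non-decreasing on a grid of points
xs = [i / 10 for i in range(-100, 101)]
vals = [normal_cdf(x) for x in xs]
monotone = all(a <= b for a, b in zip(vals, vals[1:]))
```

Property 2 shows up as F(x) ≈ 0 far to the left and F(x) ≈ 1 far to the right.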