18.05 Lecture 10 February 25, 2005
In the continuous case: F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(s, t) dt ds.
Marginal Distributions: Given the joint distribution of (X, Y), the individual distributions of X and Y
are the marginal distributions.
Discrete (X, Y): ma
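In the discrete case the marginal p.f. of X is obtained by summing the joint p.f. over y, and vice versa. A minimal numerical sketch of that summation (the joint table below is made up for illustration):

```python
import numpy as np

# Marginals of a discrete joint p.f. f(x, y): sum the joint table over the
# other variable. The probabilities here are illustrative, not from the notes.
joint = np.array([[0.10, 0.20],    # rows: values of X, columns: values of Y
                  [0.30, 0.40]])

f_X = joint.sum(axis=1)   # marginal p.f. of X: sum over y
f_Y = joint.sum(axis=0)   # marginal p.f. of Y: sum over x
print(f_X, f_Y)           # marginals [0.3 0.7] and [0.4 0.6]
```

Each marginal sums to 1, as the full joint table does.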
18.05 Lecture 14 March 7, 2005
Linear transformations of random vectors: Y = r(X), where (y_1, ..., y_n)ᵀ = A(x_1, ..., x_n)ᵀ and A is an n × n matrix. If det A ≠ 0, then X = A⁻¹Y. Write A⁻¹ = B with entries b_{ij}, so x_1 = b_{11} y_1 + ... + b_{1n} y_n, and so on, where the b_{ij} are the partial derivatives ∂x_i/∂y_j.
18.05 Lecture 9 February 23, 2005
Cumulative distribution function (c.d.f):
F(x) = P(X ≤ x), x ∈ ℝ
Properties:
1. If x_1 ≤ x_2, then {X ≤ x_1} ⊆ {X ≤ x_2}, so P(X ≤ x_1) ≤ P(X ≤ x_2): F is a non-decreasing function. 2. lim_{x→−∞} F(x) = P(X ≤ −∞) = 0, lim_{x→+∞} F(x) = P(X ≤ +∞) = 1. A random variable o
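The two properties can be checked numerically for a concrete c.d.f. A small sketch using F(x) = 1 − e^{−x}, the Exponential(1) c.d.f. (the choice of distribution is illustrative):

```python
import numpy as np

# Check the c.d.f. properties for F(x) = 1 - e^{-x} (Exponential(1)):
# F is non-decreasing, and its limits at -inf / +inf are 0 and 1.
def F(x):
    # np.maximum avoids overflow in exp for large negative x
    return np.where(x >= 0, 1 - np.exp(-np.maximum(x, 0)), 0.0)

xs = np.linspace(-5, 20, 1_000)
assert np.all(np.diff(F(xs)) >= 0)         # property 1: non-decreasing
print(float(F(-100.0)), float(F(100.0)))   # property 2: limits 0.0 and 1.0
```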
18.06CI Hints for speakers
Hints for Class Presentations
1. Good timing is very important. Don't try to include more than you can explain
in 25 min.
2. When choosing the contents of your talk, priority should be given to examples
over formal
PROPERTIES OF SIMPLE ROOTS
YOUR NAME HERE
18.099 18.06 CI.
Due on Monday, May 10 in class.
Write a paper proving the statements and working through the examples
formulated below. Add your own examples, asides and discussions whenever
needed.
Let V be a Eu
SIMPLE AND POSITIVE ROOTS
YOUR NAME HERE
18.099 18.06 CI.
Due on Monday, May 10 in class.
Write a paper proving the statements and working through the examples
formulated below. Add your own examples, asides and discussions whenever
needed.
Let V be a Euc
ABSTRACT ROOT SYSTEMS
YOUR NAME HERE
18.099 18.06 CI.
Due on Monday, May 10 in class.
Write a paper proving the statements formulated below. Add your own
examples, asides and discussions whenever needed.
Let V be a Euclidean space, that is, a finite dimensi
CARTAN MATRICES, DYNKIN DIAGRAMS AND
CLASSIFICATION
YOUR NAME HERE
18.099 18.06 CI.
Due on Monday, May 10 in class.
Write a paper proving the statements and working through the examples
formulated below. Add your own examples, asides and discussions whene
REFLECTIONS IN A EUCLIDEAN SPACE
YOUR NAME HERE
18.099 18.06 CI.
Due on Monday, May 10 in class.
Write a paper proving the statements formulated below. Add your own
examples, asides and discussions whenever needed.
Let V be a finite dimensional real linear
CARTAN MATRIX OF A ROOT SYSTEM
YOUR NAME HERE
18.099 18.06 CI.
Due on Monday, May 10 in class.
Write a paper proving the statements and working through the examples
formulated below. Add your own examples, asides and discussions whenever
needed.
Let V be
18.700. Problem Set 7
Due date: November 18 (Friday) in lecture or in my office
before noon on due date. Late homeworks will be accepted only with
a medical note or for some other MIT approved reason. You may work
with others, but the final write-up should be
18.700. Problem Set 6
Due date: November 7 (Monday) in lecture or in my office
before noon on due date. Late homeworks will be accepted only with
a medical note or for some other MIT approved reason. You may work
with others, but the final write-up should be
18.700. Problem Set 5
Due date: October 24 (Monday) in lecture or in my office
before noon on due date. Late homeworks will be accepted only with
a medical note or for some other MIT approved reason. You may work
with others, but the final write-up should be
18.700. Problem Set 4
Due date: October 17 (Monday) in lecture or in my office
before noon on due date. Late homeworks will be accepted only with
a medical note or for some other MIT approved reason. You may work
with others, but the final write-up should be
18.05 Lecture 12
March 2, 2005
Functions of Random Variables X - random variable, continuous with p.d.f. f(x)
Y = r(X)
Y doesn't have to be continuous; if it is, find its p.d.f.
To find the p.d.f., first find the c.d.f.
P(Y ≤ y) = P(r(X) ≤ y) = P(x : r(X
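The c.d.f. method above can be checked numerically for a concrete transformation. A sketch for r(x) = x² with X standard normal (this particular choice of r and of X is illustrative, not from the notes): F_Y(y) = P(X² ≤ y) = F_X(√y) − F_X(−√y).

```python
import numpy as np
from scipy import stats

# c.d.f. method for Y = r(X), r(x) = x^2, X ~ N(0, 1):
# F_Y(y) = P(X^2 <= y) = F_X(sqrt(y)) - F_X(-sqrt(y)).
# Monte Carlo check against the closed form (data purely illustrative).
rng = np.random.default_rng(4)
x = rng.normal(size=200_000)
y = 0.8

empirical = np.mean(x**2 <= y)
exact = stats.norm.cdf(np.sqrt(y)) - stats.norm.cdf(-np.sqrt(y))
print(abs(empirical - exact) < 0.01)   # the two agree closely
```

Differentiating F_Y gives the chi-square density with one degree of freedom.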
18.05 Lecture 13 March 4, 2005
Functions of random variables. If (X, Y) has joint p.d.f. f(x, y), consider Z = X + Y. The p.d.f. of Z is f(z) = ∫_{−∞}^{∞} f(x, z − x) dx. If X and Y are independent: f(z) = ∫_{−∞}^{∞} f_1(x) f_2(z − x) dx.
Example:
X, Y independent, uniform on [0, 1
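For the uniform example, the convolution integral can be evaluated numerically. A sketch (function names like `uniform_pdf` and `f_Z` are illustrative, not from the notes):

```python
import numpy as np

# Numerical check of the convolution formula f(z) = ∫ f1(x) f2(z - x) dx
# for X, Y independent Uniform[0, 1].
def uniform_pdf(t):
    return np.where((t >= 0.0) & (t <= 1.0), 1.0, 0.0)

def f_Z(z, n=300_000):
    x = np.linspace(-1.0, 2.0, n)         # grid covering both supports
    dx = x[1] - x[0]
    # Riemann-sum approximation of the convolution integral
    return float(np.sum(uniform_pdf(x) * uniform_pdf(z - x)) * dx)

# Z = X + Y has the triangular density: z on [0, 1], 2 - z on [1, 2]
print(round(f_Z(0.5), 3), round(f_Z(1.0), 3), round(f_Z(1.5), 3))
```

The density rises linearly to a peak at z = 1 and falls back to 0 at z = 2, matching the triangular shape derived in lecture.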
18.05 Lecture 22 April 4, 2005
Central Limit Theorem. X_1, ..., X_n independent, identically distributed (i.i.d.), x̄ = (1/n)(X_1 + ... + X_n), μ = EX, σ² = Var(X). Then √n(x̄ − μ)/σ → N(0, 1) as n → ∞. You can use the knowledge of the standard normal distribution to describe you
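The convergence can be seen in simulation: standardize many sample means and check that they behave like N(0, 1). A sketch (Exponential(1), for which μ = σ = 1, is an arbitrary illustrative choice of i.i.d. distribution):

```python
import numpy as np

# Monte Carlo illustration of the CLT: sqrt(n)(x̄ - μ)/σ is close to N(0, 1).
rng = np.random.default_rng(0)
n, reps = 1_000, 20_000
samples = rng.exponential(scale=1.0, size=(reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - 1.0) / 1.0   # mu = sigma = 1

print(round(float(z.mean()), 1), round(float(z.std()), 1))   # near 0 and 1
```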
18.05 Lecture 32 May 2, 2005
Two-sample t-test. X_1, ..., X_m ~ N(μ_1, σ²)
Y_1, ..., Y_n ~ N(μ_2, σ²)
Samples are independent.
Compare the means of the distributions.
Hypothesis tests:
Test 1 (two-sided): H1: μ_1 = μ_2 vs. H2: μ_1 ≠ μ_2
Test 2 (one-sided): H1: μ_1 = μ_2 vs. H2: μ_1 > μ_2
By properties of the Normal distribution a
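The equal-variance (pooled) two-sample t-test described above is available in scipy. A sketch on simulated data (the sample sizes, means, and seed are illustrative, not from the lecture):

```python
import numpy as np
from scipy import stats

# Pooled two-sample t-test, matching the lecture's equal-variance model.
rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=1.0, size=30)   # X_1, ..., X_m ~ N(mu1, sigma^2)
y = rng.normal(loc=2.0, scale=1.0, size=40)   # Y_1, ..., Y_n ~ N(mu2, sigma^2)

t_stat, p_value = stats.ttest_ind(x, y, equal_var=True)
print(p_value < 0.05)   # with means 0 vs 2, H1: mu1 = mu2 is rejected
```

`equal_var=True` gives the pooled-variance statistic with m + n − 2 degrees of freedom; `equal_var=False` would give Welch's variant instead.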
18.05 Lecture 29 April 25, 2005
Score distribution for Test 2: 70-100 A, 40-70 B, 20-40 C, 10-20 D. Average = 45. Hypotheses Testing. X_1, ..., X_n with unknown distribution P. Hypothesis possibilities: H1: P = P_1, H2: P = P_2, ..., Hk: P = P_k. There are k simpl
18.05 Lecture 31 April 29, 2005
t-test. X_1, ..., X_n: a random sample from N(μ, σ²). Two-sided hypothesis test: H1: μ = μ_0, H2: μ ≠ μ_0. Two-sided hypothesis: the parameter can be greater or less than μ_0. Take α ∈ (0, 1), the level of significance (probability of a type 1 error). Construct a con
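The two-sided one-sample t-test is available directly in scipy. A sketch on simulated data (the sample, true mean, and seed are illustrative):

```python
import numpy as np
from scipy import stats

# Two-sided one-sample t-test of H1: mu = mu0 vs. H2: mu != mu0
# at level alpha = 0.05, on simulated data.
rng = np.random.default_rng(5)
x = rng.normal(loc=2.0, scale=1.0, size=25)
mu0 = 0.0

t_stat, p_value = stats.ttest_1samp(x, popmean=mu0)
print(p_value < 0.05)   # for this clearly shifted sample, H1 is rejected
```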
18.05 Lecture 27 April 15, 2005
Take a sample X_1, ..., X_n ~ N(μ, σ²).
A = √n(x̄ − μ)/σ ~ N(0, 1), B = n(\overline{x²} − (x̄)²)/σ² ~ χ²_{n−1}, where \overline{x²} = (1/n)Σ x_i².
A, B are independent.
To determine the confidence interval for μ, we must eliminate σ from A:
A / √(B/(n−1)) ~ t_{n−1},
where Z_0, Z_1, ..., Z_{n−1} ~ N(0, 1). The standard n
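The resulting t-based confidence interval for μ is x̄ ± t_{n−1, 1−α/2} · s/√n. A sketch computing it with scipy's t quantiles on simulated data (the true mean, variance, and seed are illustrative):

```python
import numpy as np
from scipy import stats

# Confidence interval for mu from A / sqrt(B/(n-1)) ~ t_{n-1}:
# x̄ ± t_{n-1, 1-alpha/2} * s / sqrt(n), on simulated N(5, 4) data.
rng = np.random.default_rng(3)
x = rng.normal(loc=5.0, scale=2.0, size=50)
n, alpha = len(x), 0.05

s = x.std(ddof=1)                            # sample standard deviation
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
half = t_crit * s / np.sqrt(n)
lo, hi = x.mean() - half, x.mean() + half
print(lo < 5.0 < hi)   # a 95% interval usually covers the true mean
```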
18.05 Lecture 26 April 13, 2005
Confidence intervals for parameters of Normal distribution.
Confidence intervals for μ_0, σ_0² in N(μ_0, σ_0²). Estimates: μ̂ = x̄, σ̂² = \overline{x²} − (x̄)². We have μ̂ ≈ μ_0 and σ̂² ≈ σ_0² with large n, but how close exactly?
You can guarantee that the mean or vari
18.05 Lecture 24 April 8, 2005
Bayes Estimator. Prior distribution f(θ); compute the posterior f(θ|X_1, ..., X_n). The Bayes estimator is the expectation of the posterior: min_a E(X − a)² is attained at a = EX. Example: B(p), prior f(p) = Beta(α, β), posterior f(p|x_1, ..., x_n) = Beta(α + Σ x_i, β + n − Σ x_i)
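For the Bernoulli/Beta example, the posterior mean has a closed form, (α + Σx_i)/(α + β + n). A sketch (the helper name `bayes_estimate` and the data are illustrative):

```python
# Bayes estimator for Bernoulli(p) with a Beta(alpha, beta) prior:
# the posterior is Beta(alpha + sum(x), beta + n - sum(x)), and the
# Bayes estimator (posterior mean) is (alpha + sum(x)) / (alpha + beta + n).
def bayes_estimate(x, alpha, beta):
    n, s = len(x), sum(x)
    return (alpha + s) / (alpha + beta + n)

x = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]         # 7 successes in 10 trials
print(bayes_estimate(x, alpha=1, beta=1))  # uniform prior: (1+7)/(2+10)
```

With the uniform Beta(1, 1) prior the estimate is pulled slightly toward 1/2 relative to the sample mean 0.7.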
18.05 Lecture 19 March 28, 2005
Covariance and Correlation. Consider 2 random variables X, Y with σ_x² = Var(X), σ_y² = Var(Y).
Definition 1:
The covariance of X and Y is defined as:
Cov(X, Y) = E[(X − EX)(Y − EY)]. It is positive when X and Y tend to deviate high or low together.
Defi
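The defining formula can be checked on a sample. A sketch with Y = 2X + noise, for which Cov(X, Y) = 2 Var(X) (the data and seed are purely illustrative):

```python
import numpy as np

# Sample check of Cov(X, Y) = E[(X - EX)(Y - EY)] for Y = 2X + noise,
# where the covariance should be about 2 * Var(X) = 2.
rng = np.random.default_rng(2)
x = rng.normal(size=100_000)
y = 2 * x + rng.normal(size=100_000)

cov = float(np.mean((x - x.mean()) * (y - y.mean())))
print(round(cov, 1))   # close to 2.0
```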
18.05 Lecture 23 April 6, 2005
Estimation Theory: If only 2 outcomes: Bernoulli distribution describes your experiment.
If counting wrong numbers (e.g., misdialed calls): a Poisson distribution describes the experiment.
May know the type of distribution, but not the parameters in
18.05 Lecture 25 April 11, 2005
Maximum Likelihood Estimators. X_1, ..., X_n have distribution P_θ ∈ {P_θ : θ ∈ Θ}. Joint p.f. or p.d.f.: f(x_1, ..., x_n | θ) = f(x_1|θ) ··· f(x_n|θ) = φ(θ), the likelihood function. If P_θ is discrete, then f(x|θ) = P_θ(X = x), and φ(θ) is the probabili
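For Bernoulli(p), maximizing the likelihood φ(p) = Π p^{x_i}(1 − p)^{1−x_i} gives p̂ = x̄, the sample mean. A sketch confirming this numerically on a grid (the sample is illustrative):

```python
import numpy as np

# MLE for Bernoulli(p): the log-likelihood is
# sum(x) * log(p) + (n - sum(x)) * log(1 - p); its maximizer is x̄.
x = np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])   # 7 successes in 10 trials

p_grid = np.linspace(0.001, 0.999, 999)
log_lik = np.sum(x) * np.log(p_grid) + np.sum(1 - x) * np.log(1 - p_grid)
p_hat = p_grid[np.argmax(log_lik)]
print(round(float(p_hat), 2))   # matches the sample mean, 0.7
```

Maximizing the log-likelihood rather than φ itself is standard: the logarithm is monotone, so the maximizer is unchanged, and products become numerically stable sums.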