18.443. Practice test 1.
Consider the family of distributions with p.d.f.
f(x | θ) = θx^(θ−1), for 0 < x < 1, and θ > 0.    (1)
Consider an i.i.d. sample X1, . . . , Xn from this distribution. As always, the
underlying parameter θ for this sample is unknown. In probl
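The problem text is cut off above, but a standard exercise for this family is the maximum likelihood estimator: the log-likelihood is n ln θ + (θ − 1) Σ ln Xi, and setting its derivative to zero gives θ̂ = −n / Σ ln Xi. A minimal Python sketch on synthetic data (the course itself uses Matlab; names and the chosen θ are illustrative):

```python
import math
import random

random.seed(0)
theta_true = 3.0

# Draw n samples from f(x|theta) = theta * x**(theta - 1) on (0, 1)
# via the inverse CDF: F(x) = x**theta, so X = U**(1/theta).
n = 10_000
xs = [random.random() ** (1.0 / theta_true) for _ in range(n)]

# MLE: theta_hat = -n / sum(log X_i)
theta_hat = -n / sum(math.log(x) for x in xs)
print(theta_hat)  # close to 3.0 for large n
```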
Section 7
Testing hypotheses about parameters
of normal distribution.
T-tests and F-tests.
We will postpone a more systematic approach to hypothesis testing until the following
lectures; in this lecture we will describe, in an ad hoc way, T-tests and F-tests.
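For reference, a one-sample T-test of the kind introduced here can be sketched in Python (the notes use Matlab; the sample below is synthetic and the hypothesized mean 98.6 is purely illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=98.6, scale=0.7, size=30)  # synthetic data

# Test H0: mean = 98.6 against the two-sided alternative.
t_stat, p_value = stats.ttest_1samp(sample, popmean=98.6)
print(t_stat, p_value)
```

Since the data were generated under H0, the p-value will typically be large and H0 is not rejected.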
Section 14
Simple linear regression.
Let us look at the cigarette dataset from [1] (available to download from the journal's website)
and [2]. The cigarette dataset contains measurements of tar, nicotine, weight and carbon
monoxide (CO) content for 25 brands of cigarettes.
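Simple linear regression of one measurement on another reduces to a least-squares line fit. A Python sketch with made-up (tar, CO) pairs (these are NOT values from the actual dataset, just numbers of a similar shape):

```python
import numpy as np

# Hypothetical (tar, CO) pairs standing in for the cigarette data.
tar = np.array([1.0, 4.1, 8.0, 12.0, 15.1, 17.0, 29.8])
co = np.array([1.85, 4.9, 8.5, 12.3, 13.9, 15.4, 23.5])

# Least-squares fit of CO = beta0 + beta1 * tar.
beta1, beta0 = np.polyfit(tar, co, deg=1)
predicted = beta0 + beta1 * tar
print(beta0, beta1)
```

With an intercept in the model, the residuals sum to zero, so the fitted values have the same mean as the observed CO values.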
Section 10
Chi-squared goodness-of-fit test.
Example. Let us start with a Matlab example. Let us generate a vector X of 100 i.i.d.
uniform random variables on [0, 1]:
X=rand(100,1).
Parameters (100, 1) here mean that we generate a 100 × 1 matrix of uniform random variables.
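A Python equivalent of this Matlab call, using numpy (with a fixed seed for reproducibility):

```python
import numpy as np

# Equivalent of Matlab's X = rand(100, 1): a 100 x 1 array of
# i.i.d. Uniform[0, 1] random variables.
X = np.random.default_rng(0).random((100, 1))
print(X.shape)  # (100, 1)
```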
Section 12
Tests of independence and
homogeneity.
In this lecture we will consider a situation when our observations are classified by two different
features and we would like to test if these features are independent. For example, we can ask
if the number of
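The standard tool for this situation is the chi-squared test of independence applied to a contingency table of counts. A Python sketch with a hypothetical 2 × 3 table (the counts are illustrative):

```python
import numpy as np
from scipy import stats

# Hypothetical 2x3 contingency table: rows classify by one feature,
# columns by the other; entries are observed counts.
table = np.array([[20, 30, 25],
                  [25, 25, 30]])

# Chi-squared test of independence of the two features.
chi2, p_value, dof, expected = stats.chi2_contingency(table)
print(chi2, p_value, dof)
```

The degrees of freedom are (rows − 1)(columns − 1), here (2 − 1)(3 − 1) = 2.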
Section 15
Multiple linear regression.
Let us consider a model
Yi = β1 Xi1 + . . . + βp Xip + εi
where random noise variables ε1, . . . , εn are i.i.d. N(0, σ²). We can write this in a matrix
form
Y = Xβ + ε,
where Y and ε are n × 1 vectors, β is a p × 1 vector and X is an n × p matrix.
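In this model the least-squares estimate is β̂ = (X^T X)^(−1) X^T Y. A Python sketch on synthetic data (the true coefficients are chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3
beta_true = np.array([2.0, -1.0, 0.5])

X = rng.normal(size=(n, p))
eps = rng.normal(scale=0.1, size=n)  # i.i.d. N(0, sigma^2) noise
Y = X @ beta_true + eps

# Least-squares estimate: beta_hat = (X^T X)^{-1} X^T Y,
# computed by solving the normal equations.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)  # close to beta_true
```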
Section 13
Kolmogorov-Smirnov test.
Suppose that we have an i.i.d. sample X1 , . . . , Xn with some unknown distribution P and we
would like to test the hypothesis that P is equal to a particular distribution P0 , i.e. decide
between the following hypothe
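As an illustration, the Kolmogorov-Smirnov test of H0: P = N(0, 1) can be run on a synthetic sample (Python sketch; the notes themselves use Matlab):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=0.0, scale=1.0, size=500)

# Kolmogorov-Smirnov test of H0: the sample comes from N(0, 1).
# The statistic is the largest gap between the empirical c.d.f.
# and the c.d.f. of N(0, 1).
ks_stat, p_value = stats.kstest(sample, "norm")
print(ks_stat, p_value)
```

Since the sample really was drawn from N(0, 1), the statistic is small (of order 1/sqrt(n)).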
Lecture 6
Gamma distribution, χ²-distribution,
Student t-distribution,
Fisher F -distribution.
Gamma distribution. Let us take two parameters α > 0 and β > 0. The Gamma function
Γ(α) is defined by
Γ(α) = ∫_0^∞ x^(α−1) e^(−x) dx.
If we divide both sides by Γ(α) we get
1 = ∫_0^∞ (1/Γ(α)) x^(α−1) e^(−x) dx
Lecture 2
Maximum Likelihood Estimators.
Matlab example. As a motivation, let us look at one Matlab example. Let us generate
a random sample of size 100 from beta distribution Beta(5, 2). We will learn the definition
of beta distribution later, at this poin
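A Python analogue of this Matlab experiment: generate the Beta(5, 2) sample, then fit the two shape parameters back by maximum likelihood (scipy's `beta.fit` does the numerical maximization; `floc`/`fscale` pin the support to [0, 1]):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Analogue of generating 100 observations from Beta(5, 2) in Matlab.
X = stats.beta.rvs(5, 2, size=100, random_state=rng)

# Fit the two shape parameters by maximum likelihood,
# keeping location 0 and scale 1 fixed.
a_hat, b_hat, loc, scale = stats.beta.fit(X, floc=0, fscale=1)
print(a_hat, b_hat)  # roughly 5 and 2
```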
Lecture 4
Multivariate normal distribution and
multivariate CLT.
We start with several simple observations. If X = (x1, . . . , xk)^T is a k × 1 random vector
then its expectation is
EX = (Ex1, . . . , Exk)^T
and its covariance matrix is
Cov(X) = E(X − EX)(X − EX)^T
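These two definitions can be illustrated with their sample analogues (Python sketch; the covariance matrix below is chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1000 draws of a 2-dimensional random vector with known covariance.
cov_true = np.array([[2.0, 0.5],
                     [0.5, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov_true, size=1000)

# Sample analogues of EX and Cov(X) = E(X - EX)(X - EX)^T.
mean_hat = X.mean(axis=0)
cov_hat = np.cov(X, rowvar=False)
print(mean_hat)  # near (0, 0)
print(cov_hat)   # near cov_true
```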
Lecture 3
Properties of MLE: consistency,
asymptotic normality.
Fisher information.
In this section we will try to understand why MLEs are good.
Let us recall two facts from probability that will be used often throughout this course.
Law of Large Numbers (
Section 8
Testing simple hypotheses. Bayes
decision rules.
Let us consider an i.i.d. sample X1, . . . , Xn ∈ X with unknown distribution P on X. Suppose
that the distribution P belongs to a set of k specified distributions, P ∈ {P1, . . . , Pk}. Then,
giv
Lecture 5
Confidence intervals for parameters of
normal distribution.
Let us consider a Matlab example based on the dataset of body temperature measurements
of 130 individuals from the article [1]. The dataset can be downloaded from the journal's
website. Th
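The resulting confidence interval for the mean is the usual t-interval. A Python sketch on a synthetic stand-in for the temperature data (the parameters 98.25 and 0.73 are illustrative, not taken from the article):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for the 130 body-temperature measurements.
temps = rng.normal(loc=98.25, scale=0.73, size=130)

n = len(temps)
mean = temps.mean()
sem = temps.std(ddof=1) / np.sqrt(n)  # standard error of the mean

# 95% confidence interval for the mean, using the t-distribution
# with n - 1 degrees of freedom.
t_crit = stats.t.ppf(0.975, df=n - 1)
ci = (mean - t_crit * sem, mean + t_crit * sem)
print(ci)
```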
18.443. Practice test 2.
(1) Given a sample 5, 1, 4, 1, 2, 3 from a Poisson distribution with parameter λ, construct
the most powerful test for
H0 : λ = 1 vs. H1 : λ = 2,
with level of significance α = 0.05. Test H0.
(2) p. 561, no. 1.
(3) p. 574, no. 4.
(4) Suppose that in th
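Problem (1) above can be checked numerically. By the Neyman-Pearson lemma the most powerful test rejects H0 for large values of T = ΣXi, and under H0 the statistic T is Poisson(n λ0) = Poisson(6). A sketch of the non-randomized version of the test (ignoring the randomization needed to attain level 0.05 exactly):

```python
from scipy import stats

sample = [5, 1, 4, 1, 2, 3]
T = sum(sample)  # sufficient statistic: T = 16
n, lam0, alpha = len(sample), 1.0, 0.05

# Under H0, T ~ Poisson(n * lam0) = Poisson(6).
# Reject when T > c, with c chosen so the level is at most alpha.
c = stats.poisson.ppf(1 - alpha, n * lam0)
p_value = stats.poisson.sf(T - 1, n * lam0)  # P(T >= 16 | H0)
print(T, c, p_value)  # T exceeds c, so H0 is rejected
```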
Section 11
Goodness-of-fit for composite
hypotheses.
Example. Let us consider a Matlab example. Let us generate 50 observations from N(1, 2):
X=normrnd(1,2,50,1);
Then, running a chi-squared goodness-of-fit test chi2gof
[H,P,STATS]= chi2gof(X)
outputs
H = 0,
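Matlab's chi2gof estimates the normal parameters from the data and then compares binned counts to expected counts. There is no identical one-liner in scipy, but the same computation can be sketched by hand (the binning choice below is one of many; the degrees of freedom lose one per estimated parameter):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X = rng.normal(loc=1.0, scale=2.0, size=50)  # analogue of normrnd(1,2,50,1)

# Estimate the normal parameters from the data (composite hypothesis),
# then bin the observations and compare to expected bin counts.
mu_hat, sigma_hat = X.mean(), X.std(ddof=0)
edges = np.quantile(X, np.linspace(0, 1, 6))  # 5 roughly equal-count bins
edges[0], edges[-1] = -np.inf, np.inf
observed, _ = np.histogram(X, bins=edges)
probs = np.diff(stats.norm.cdf(edges, mu_hat, sigma_hat))
expected = len(X) * probs

# Degrees of freedom: (bins - 1) - (number of estimated parameters).
chi2_stat = ((observed - expected) ** 2 / expected).sum()
p_value = stats.chi2.sf(chi2_stat, df=len(observed) - 1 - 2)
print(chi2_stat, p_value)  # H0 is accepted when p_value > 0.05
```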