Assignment 4
10/36-702
Due Friday April 10 at 3:00 pm
1. Here we'll study convexity (and concavity) in exponential families and generalized linear models. Consider an exponential family density (or mass) function over y ∈ R^n, of the form

f(y; θ) = exp{ yᵀθ − ψ(θ) } h(y),

where ψ is the log-partition (cumulant) function.
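A key fact behind these convexity questions is that the log-partition function of an exponential family is convex, with second derivative equal to the variance. As a minimal numeric illustration (not part of the assignment), take the Bernoulli family in canonical form, where the log-partition function is ψ(θ) = log(1 + e^θ):

```python
import numpy as np

# For the Bernoulli family in canonical form, the log-partition
# function is psi(theta) = log(1 + exp(theta)). Its second derivative
# equals Var(Y) = p(1 - p) >= 0, so psi is convex.

def psi(theta):
    return np.log1p(np.exp(theta))

theta = np.linspace(-5, 5, 201)
h = theta[1] - theta[0]

# Central second differences approximate psi''(theta).
second_diff = (psi(theta[2:]) - 2 * psi(theta[1:-1]) + psi(theta[:-2])) / h**2

# Convexity <=> nonnegative second derivative (here, second differences).
assert np.all(second_diff >= 0)

# The second derivative matches the Bernoulli variance p(1 - p),
# with p = sigmoid(theta).
p = 1 / (1 + np.exp(-theta[1:-1]))
assert np.allclose(second_diff, p * (1 - p), atol=1e-3)
```

The same variance identity ψ''(θ) = Var(Y) is what makes the negative log-likelihood of a canonical GLM convex in its parameters.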

Assignment 2
10/36-702
Due Friday Feb 13 3:00 pm
1. [20 points = 10+10]
Recall that the total variation distance between distributions P and Q is
TV(P, Q) = sup_A |P(A) − Q(A)|.
(a) Suppose that P and Q have densities p and q. Show that
TV(P, Q) = (1/2) ∫ |p − q|.
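For intuition, the two characterizations of total variation can be checked directly on a small discrete space, where the supremum over events can be computed by brute force. A minimal sketch with illustrative probabilities (not values from the assignment):

```python
import numpy as np
from itertools import chain, combinations

# Two distributions on {0, 1, 2, 3} (illustrative values only).
p = np.array([0.1, 0.4, 0.3, 0.2])
q = np.array([0.3, 0.1, 0.4, 0.2])

# Half the L1 distance: TV = (1/2) * sum |p - q|.
tv_l1 = 0.5 * np.abs(p - q).sum()

# Direct supremum over all 2^4 events A (feasible on a tiny space).
support = range(4)
events = chain.from_iterable(combinations(support, r) for r in range(5))
tv_sup = max(abs(p[list(A)].sum() - q[list(A)].sum()) for A in events)

# The two definitions agree.
assert np.isclose(tv_l1, tv_sup)
assert np.isclose(tv_l1, 0.3)
```

The event attaining the supremum is A = {x : p(x) > q(x)}, which is exactly the set used in the standard proof of the identity.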

Assignment 1
10/36-702
Due Friday Jan 23 3:00 pm
1. Let X_1, …, X_n ∼ P and let μ = E[X_i] and σ² = Var(X_i). Define

X̄_n = (1/n) Σ_{i=1}^n X_i,        S_n² = (1/n) Σ_{i=1}^n (X_i − X̄_n)².

(a) Prove that S_n² → σ² in probability.

(b) Prove that

√n (X̄_n − μ) / S_n ⇝ N(0, 1).
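Both claims can be seen in a quick Monte Carlo experiment. A minimal sketch, assuming an Exponential(1) population (an illustrative choice; the problem is stated for any P with finite variance):

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential(1) population: mu = 1, sigma^2 = 1.
mu, sigma2 = 1.0, 1.0
n, reps = 2000, 5000

x = rng.exponential(scale=1.0, size=(reps, n))
xbar = x.mean(axis=1)
s2 = ((x - xbar[:, None]) ** 2).mean(axis=1)   # S_n^2 with 1/n normalization

# (a) S_n^2 -> sigma^2 in probability: nearly all replications land
# close to sigma^2 for large n.
assert np.mean(np.abs(s2 - sigma2) < 0.2) > 0.95

# (b) sqrt(n) (Xbar_n - mu) / S_n is approximately N(0, 1): the
# studentized statistic has mean near 0 and standard deviation near 1.
z = np.sqrt(n) * (xbar - mu) / np.sqrt(s2)
assert abs(z.mean()) < 0.1 and abs(z.std() - 1) < 0.1
```

The proof of (b) combines the ordinary CLT for X̄_n with part (a) and Slutsky's theorem, which is exactly what the simulation reflects.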
2. Let X n = O P (1) and

Assignment 1 - Solutions
10/36-702
Due Friday Jan 23 3:00 pm
1. [10 points = 5+5]
Let X_1, …, X_n ∼ P and let μ = E[X_i] and σ² = Var(X_i). Define

X̄_n = (1/n) Σ_{i=1}^n X_i,        S_n² = (1/n) Σ_{i=1}^n (X_i − X̄_n)².

(a) Prove that S_n² → σ² in probability.

(b) Prove that

√n (X̄_n − μ) / S_n ⇝ N(0, 1).

Assignment 2
10/36-702
Due Friday Feb 13 3:00 pm
1. Recall that the total variation distance between distributions P and Q is
TV(P, Q) = sup_A |P(A) − Q(A)|.
(a) Suppose that P and Q have densities p and q. Show that
TV(P, Q) = (1/2) ∫ |p − q|.
(b) Suppose that X

Assignment 3
10/36-702
Due Friday March 20 by 3:00 pm
1. Consider k-nearest-neighbors regression on i.i.d. pairs (x_1, y_1), …, (x_n, y_n) ∈ R^d × R. In this problem you will derive asymptotic error bounds for the k-nearest-neighbor estimator, under the
ver
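For concreteness, the estimator the problem studies averages the responses of the k training points closest to each query point. A minimal sketch (for intuition only, not an implementation from the assignment):

```python
import numpy as np

def knn_regress(X_train, y_train, X_query, k):
    """k-nearest-neighbors regression: average the responses of the k
    training points closest (in Euclidean distance) to each query."""
    X_train = np.asarray(X_train, dtype=float)
    X_query = np.asarray(X_query, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    # Pairwise squared distances, shape (n_query, n_train).
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    # Indices of the k nearest training points for each query.
    nn = np.argpartition(d2, kth=k - 1, axis=1)[:, :k]
    return y_train[nn].mean(axis=1)

# Noiseless linear data: with k = 1, the estimate at each training
# point reproduces its own response exactly.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 2.0, 4.0, 6.0])
assert np.allclose(knn_regress(X, y, X, k=1), y)

# With k = 2, the estimate at x = 0.4 averages the two nearest
# responses (those at x = 0 and x = 1).
est = knn_regress(X, y, np.array([[0.4]]), k=2)
assert np.allclose(est, 1.0)
```

The bias–variance trade-off driving the asymptotic error bounds is visible here: larger k averages over more neighbors (lower variance) at the cost of using points farther from the query (higher bias).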

10/36-702: Minimax Theory
1 Introduction
When solving a statistical learning problem, there are often many procedures to choose from.
This leads to the following question: how can we tell if one statistical learning procedure is
better than another? One a

Midterm
10/36-702
Wednesday Mar 4
There are 3 questions, each worth the same number of points, and an (optional) bonus question.
1. [33 points] Let p be a bounded continuous density defined on a bounded subset S ⊂ R. Assume
further that p has bounded, continuous

Convexity and Optimization
Statistical Machine Learning, Spring 2015
Ryan Tibshirani (with Larry Wasserman)
1 An entirely too brief motivation

1.1 Why optimization?
Optimization problems are ubiquitous in statistics and machine learning. A huge number of