Let us define the product space (X_n, μ_n) and let

F(x_1, …, x_n) = x_1 + … + x_n − np.

Thus F(x) is the deviation of the number of heads shown by an unfair coin from
the expected number np of heads. Theorem 12.3 tells us that the probability that
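The concentration of F can be illustrated by simulation. Below is a minimal sketch (our illustration, not part of the notes), assuming the coin shows heads with probability p; the particular values of n and p are arbitrary:

```python
import random

# Monte Carlo sketch: the deviation F(x) = x_1 + ... + x_n - np of the number
# of heads from its expectation np is typically of order sqrt(n), so the
# fraction of trials with |F| >= 3*sqrt(n) should be very small.
random.seed(0)
n, p, trials = 1000, 0.3, 2000

def deviation():
    heads = sum(1 for _ in range(n) if random.random() < p)
    return heads - n * p

devs = [deviation() for _ in range(trials)]
frac_large = sum(1 for d in devs if abs(d) >= 3 * n ** 0.5) / trials
print(frac_large)  # fraction of trials with a large deviation
```

The threshold 3√n is several standard deviations of the heads count (the standard deviation is √(np(1−p)) ≤ √n/2), so almost no trials exceed it.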
We are given an ε > 0 and an integer k. We would like to prove that if n is large
enough, there will be a k-dimensional subspace L for which the restriction of ‖·‖_W
onto L is ε-close to the average. Let us choose a k-dimensional subspace A ⊂ R^n
and consider th
Now we apply the multiplicative form of the Brunn–Minkowski inequality (Theorem
20.2) to claim that

vol((A + B)/2) ≥ vol(A)^{1/2} vol(B)^{1/2}.

In particular, if K = B is the unit ball, we get the ineq
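For axis-parallel boxes, the multiplicative inequality reduces to the AM–GM inequality in each coordinate, (a_i + b_i)/2 ≥ √(a_i b_i), which is easy to verify numerically. A minimal sketch (our illustration, assuming the statement reads vol((A+B)/2) ≥ vol(A)^{1/2} vol(B)^{1/2}):

```python
import math
import random

# For boxes A, B with side lengths a_i, b_i, the Minkowski average (A+B)/2 is
# a box with side lengths (a_i + b_i)/2, so the multiplicative Brunn-Minkowski
# inequality follows from AM-GM applied coordinate by coordinate.
random.seed(1)

def vol(sides):
    return math.prod(sides)

violations = 0
for _ in range(100):
    dim = 5
    a = [random.uniform(0.1, 2.0) for _ in range(dim)]
    b = [random.uniform(0.1, 2.0) for _ in range(dim)]
    mid = [(ai + bi) / 2 for ai, bi in zip(a, b)]  # sides of (A+B)/2
    if vol(mid) < math.sqrt(vol(a) * vol(b)) - 1e-12:
        violations += 1
print(violations)  # number of violations of the inequality
```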
In the end, we get

E exp{F − E F} ≤ ∫ exp{ (π²/8) ‖∇f(x₀)‖² } dμ(x₀) = E exp{ (π²/8) ‖∇f‖² }.
(15.2) Corollary. Let f : R^n → R be a 1-Lipschitz function and let a = E f be
the average value of f. Then

μ{ x : |f(x) − a| ≥ t } ≤ 2 exp{ −2t²/π² }  for all t ≥ 0.
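As a sanity check (our illustration, not part of the notes), the corollary can be tested numerically, assuming the underlying measure is the standard Gaussian measure on R^n and reading the bound as 2·exp(−2t²/π²):

```python
import math
import random

# f(x) = max_i x_i is 1-Lipschitz in the Euclidean norm, since
# |max x - max y| <= max_i |x_i - y_i| <= ||x - y||_2.
random.seed(2)
n, trials = 50, 2000

def f(x):
    return max(x)

samples = [f([random.gauss(0.0, 1.0) for _ in range(n)]) for _ in range(trials)]
a = sum(samples) / trials          # empirical average of f
t = 2.0
tail = sum(1 for s in samples if abs(s - a) >= t) / trials
bound = 2 * math.exp(-2 * t * t / math.pi ** 2)  # bound as reconstructed above
print(tail <= bound)
```

The empirical tail is far below the bound; the maximum of n Gaussians concentrates within O(1) of its mean.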
Proof. The (u, v)-th entry of LᵀL is

Σ_e l_{e,u} l_{e,v}.

If u = v then the e-th term is 1 if and only if e is incident to v and 0 otherwise, so
the entry is deg v. If u ≠ v and {u, v} is an edge, then the only non-zero term
corresponds to the edge e =
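The computation can be checked on a small example graph. A sketch (our illustration), assuming L denotes the oriented edge-vertex incidence matrix, with each edge row carrying +1 and −1 on its two endpoints:

```python
# For the oriented incidence matrix L, the product L^T L has deg(v) on the
# diagonal and -1 in entry (u, v) whenever {u, v} is an edge, as in the proof.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]  # a small example graph on 4 vertices
n = 4
L = [[0] * n for _ in edges]
for e, (u, v) in enumerate(edges):
    L[e][u], L[e][v] = 1, -1              # l_{e,u} = 1, l_{e,v} = -1

LtL = [[sum(L[e][u] * L[e][v] for e in range(len(edges))) for v in range(n)]
       for u in range(n)]
deg = [sum(1 for (u, v) in edges if w in (u, v)) for w in range(n)]
print(LtL)
print(deg)
```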
Proof of Theorem 25.2. Without loss of generality, we assume that

∫ f dμ = 1,

so our goal is to prove that

∫ (ln f) dμ ≥ −c

for some absolute constant c.
For t > 0, let

B_t = { x ∈ R^n : f(x) ≤ t }

denote the ball of radius t in the norm f.
Let us choose a
and our goal is to estimate the integral.
We can write
a = f_n
Letting g_i = f_i
Let us denote

E_k(h) = E(h | F_k)

for any function h : X → R. Hence we can write

d = E e
= E E_0 E_1
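The operators E_k can be made concrete on the Boolean cube. A minimal sketch (our illustration), assuming F_k is the σ-algebra generated by the first k coordinates of X = {0,1}^n with the uniform measure:

```python
from itertools import product

# On X = {0,1}^3 with uniform measure, E_k(h) freezes the first k coordinates
# and averages h over the remaining ones; E_0(h) is the plain expectation and
# E_n(h) = h.
n = 3

def E_k(h, k):
    def g(x):
        tails = list(product([0, 1], repeat=n - k))
        return sum(h(x[:k] + t) for t in tails) / len(tails)
    return g

h = lambda x: sum(x)   # number of ones in x
E0 = E_k(h, 0)
E3 = E_k(h, 3)
x = (1, 0, 1)
print(E0(x), E_k(h, 1)(x), E3(x))
```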
U, we get a random y, meaning that the rotation-invariant probability measure ν_n on
S^{n−1} is the push-forward of the invariant probability measure μ_n on O_n under the
map U ↦ U^{−1}x.

μ_n{ U ∈ O_n : ‖x_{U(L_0)}‖ ≥ (1 − ε) … }
= ν_n{ y ∈ S^{n−1} : ‖y_{L_0}‖ ≥ (1 − ε) … }
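The push-forward description suggests a practical way to sample from ν_n. A sketch using numpy (our illustration; the QR-based sampler for Haar measure is a standard construction, not taken from the notes):

```python
import numpy as np

# A Haar-random orthogonal matrix U is the Q factor of a Gaussian matrix,
# with a column-sign fix. Pushing forward under U -> U^{-1} x (here U^T x,
# since U is orthogonal) gives uniform points y on the sphere S^{n-1}.
rng = np.random.default_rng(0)
n = 3
x = np.zeros(n)
x[0] = 1.0  # any fixed unit vector

def haar_orthogonal(n):
    Z = rng.normal(size=(n, n))
    Q, R = np.linalg.qr(Z)
    return Q * np.sign(np.diag(R))  # sign fix makes the distribution Haar

ys = np.array([haar_orthogonal(n).T @ x for _ in range(5000)])
print(np.allclose(np.linalg.norm(ys, axis=1), 1.0))  # y lands on the sphere
```

By rotation invariance, the empirical mean of y over many samples should be near the origin.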
Lecture 1. Wednesday, January 5
Measure concentration is a fairly general phenomenon, which asserts that a reasonable function f : X → R defined on a large probability space X almost
always takes values that are very close to the average value.
10. Concentration and Isoperimetry
We saw in Section 4 how to deduce concentration from isoperimetric inequalities.
We worked with the Boolean cube as an example, but the construction is fairly
general. Namely, if X is a metric space with a probability measure
We can write

F_n(c) = ∫ e^{i⟨c,x⟩} dν_n(x).

Since the measure ν_n on the sphere is rotation-invariant, we can assume that
c = (‖c‖, 0, …, 0). Therefore,

F_n(c) = ∫ e^{i‖c‖x₁} dν_n(x).

Now, using Theorem 13.3, we conclude that as n → +∞ we have
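The limit can be checked by simulation. A sketch (our illustration), assuming, as the reference to Theorem 13.3 suggests, that ν_n is the rotation-invariant probability measure on the sphere of radius √n, so that the coordinate x₁ is approximately standard Gaussian and F_n(c) approaches exp(−‖c‖²/2):

```python
import numpy as np

# Sample uniform points on the sphere of radius sqrt(n) by normalizing
# Gaussian vectors, then compare the empirical characteristic function
# E exp(i*||c||*x_1) with the Gaussian limit exp(-||c||^2 / 2).
rng = np.random.default_rng(1)
n, samples = 100, 20000
g = rng.normal(size=(samples, n))
x = g * (np.sqrt(n) / np.linalg.norm(g, axis=1))[:, None]
for c in (0.5, 1.0, 2.0):
    Fn = np.exp(1j * c * x[:, 0]).mean()
    print(c, abs(Fn - np.exp(-c * c / 2)))  # small for large n
```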
Suppose that n > 1. Let us define the facets I^n_0, I^n_1 ⊂ I^n, the sets A_0, A_1 ⊂ I^{n−1},
and x′ ∈ I^{n−1} for x ∈ I^n as in the proof of Theorem 4.2. Let us also denote
f(x) = dist(x, A), f_0(x) = dist(x, A_0), and f_1(x) = dist(x, A_1). So
dist(x, A) = min dist(x′,
those using the procedure of (27.2). In the end, we represent K as a finite union of
non-overlapping convex bodies K_i, i ∈ I, such that

(1/μ(K_i)) ∫_{K_i} f dμ = ∫ f dμ  for all i ∈ I.

We claim that every K_i is an ε-needle.
Indeed, let us pick a particular C