Explicit Constructions in High-Dimensional Geometry
Piotr Indyk, MIT
“High-level Picture”

Compressed Sensing
• Random Projections
• L1 minimization
• (Uniform) UP
• …

Data Stream / Sublinear Algorithms
• (Pseudo)random Projections
• Isolation/Group Testing

Geometric Functional Analysis (Approximation Theory)
• Concentration of Measure
• Low-distortion embeddings

Pseudorandomness
• Derandomization
• Explicit constructions
• Expanders/extractors
This talk

Two explicit constructions (connecting Compressed Sensing, Data Streams, Geometric Functional Analysis, and Pseudorandomness):
– A “low-distortion” embedding A: R^n → R^m, m = n^{1+o(1)}, such that for any x, ||Ax||_1 = (1 ± ε)||x||_2 (a.k.a. Dvoretzky’s Theorem for l_1)
– A “nice” measurement matrix B: R^n → R^m, m = k·n^{o(1)}, such that for any k-sparse x, one can efficiently reconstruct x from Bx (several matrices with >k^2 measurements were previously known)
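For intuition, here is a minimal numerical sketch of the probabilistic counterpart of the first construction (a random Gaussian matrix with m = O(n/ε^2) rows, as in the table on the next slide), not the explicit matrix this talk is about; the constant 8 in the row count and the function names are illustrative choices.

```python
# Probabilistic l_2 -> l_1 embedding sketch (random Gaussian matrix, NOT the explicit construction).
import numpy as np

def random_l2_to_l1_embedding(n, eps, rng=np.random.default_rng(0)):
    """Return A with m = O(n/eps^2) rows so that, w.h.p. over A,
    ||Ax||_1 = (1 +/- eps) * ||x||_2 for all x in R^n."""
    m = int(np.ceil(8 * n / eps**2))      # the constant 8 is an illustrative choice
    G = rng.standard_normal((m, n))
    # For a standard Gaussian row g, E|<g, x>| = sqrt(2/pi) * ||x||_2, so rescale accordingly.
    return np.sqrt(np.pi / 2) / m * G

n, eps = 100, 0.2
A = random_l2_to_l1_embedding(n, eps)
x = np.random.default_rng(1).standard_normal(n)
ratio = np.linalg.norm(A @ x, 1) / np.linalg.norm(x, 2)
print(f"||Ax||_1 / ||x||_2 = {ratio:.3f}")   # close to 1
```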
Embedding l_2^n into l_1

Reference                                     | Type          | Randomness | Dim. of l_1          | Distortion
[Kashin, Figiel-Lindenstrauss-Milman, Gordon] | Probabilistic |            | O(n/ε^2)             | 1+ε
[Indyk'00]                                    | Probabilistic | n log^2 n  | O(n/ε^2)             | 1+ε
[Artstein-Avidan, Milman'06]                  | Probabilistic | n log n    | O(n/ε^2)             | 1+ε
[Lovett-Sodin'07]                             | Probabilistic | n          | O(n/ε^2)             | 1+ε
[Rudin'60, …]                                 | Explicit      |            | O(n^2)               | O(1)
[Indyk'00] (cf. LLR'94)                       | Explicit      |            | n^{O(log n)}         | 1+1/n
[Indyk'06]                                    | Explicit      |            | n·2^{O(log log n)^2} | 1+1/log n
[Guruswami-Lee-Razborov'07]                   | Explicit'     |            | n·(1+o(1))           | n^{o(1)}
Other implications
• Computing Ax takes time O(n^{1+o(1)}), as opposed to O(n^2)
• A similar phenomenon was discovered for the Johnson-Lindenstrauss dimensionality reduction lemma [Ailon-Chazelle'06]; see the sketch below
  – Applications to the approximate nearest neighbor problem, Singular Value Decomposition, etc. (recall Muthu's talk)
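To illustrate why a structured matrix can be applied fast, here is a minimal sketch in the spirit of [Ailon-Chazelle'06]: random sign flips, a fast Walsh-Hadamard transform, and subsampling. This is a simplified subsampled-Hadamard variant, not their exact construction; the parameter m = 256 and the function names are illustrative.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform in O(n log n); n must be a power of 2."""
    x = x.copy()
    n, h = len(x), 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x

def srht(x, m, rng=np.random.default_rng(0)):
    """Subsampled randomized Hadamard transform: sign flip, FWHT, keep m coordinates.
    Applying it costs O(n log n) instead of the O(mn) of a dense random projection."""
    n = len(x)
    signs = rng.choice([-1.0, 1.0], size=n)      # random diagonal of +/- 1
    y = fwht(signs * x) / np.sqrt(n)             # orthonormal Hadamard applied to the flipped vector
    rows = rng.choice(n, size=m, replace=False)  # uniform subsampling
    return np.sqrt(n / m) * y[rows]

x = np.random.default_rng(1).standard_normal(1024)
y = srht(x, m=256)
print(np.linalg.norm(y) / np.linalg.norm(x))     # roughly 1: the norm is approximately preserved
```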
Techniques (linking Compressed Sensing and Pseudorandomness)
• Uncertainty Principles
• Extractors
Uncertainty principles (UP)

Consider a vector x ∈ R^n and the Fourier matrix F.
UP: either x or Fx must have “many” non-zero entries (for x ≠ 0).
History:
– Physics: Heisenberg principle
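As a concrete check (an illustration, not part of the talk): the Donoho-Stark form of the UP for the discrete Fourier transform says |supp(x)| · |supp(Fx)| ≥ n, and a “Dirac comb” attains it with equality.

```python
# Numerical check of the discrete uncertainty principle: |supp(x)| * |supp(Fx)| >= n.
import numpy as np

def support_size(v, tol=1e-9):
    """Number of entries whose magnitude exceeds a small tolerance."""
    return int(np.sum(np.abs(v) > tol))

n, d = 64, 8
x = np.zeros(n)
x[::d] = 1.0                 # Dirac comb: n/d = 8 non-zero entries
Fx = np.fft.fft(x)           # its DFT is again a comb, with d = 8 non-zero entries

s_time, s_freq = support_size(x), support_size(Fx)
print(s_time, s_freq, s_time * s_freq >= n)   # 8 8 True  (product equals n: the comb is tight)
```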