# neumann - Explicit Constructions in High-Dimensional...


Explicit Constructions in High-Dimensional Geometry. Piotr Indyk, MIT.

"High-level Picture"

Compressed Sensing
- Random projections
- L1 minimization
- (Uniform) UP
- …

Data Stream / Sublinear Algorithms
- (Pseudo)random projections
- Isolation / group testing

Geometric Functional Analysis (Approximation Theory)
- Concentration of measure
- Low-distortion embeddings

Pseudorandomness
- Derandomization
- Explicit constructions
- Expanders / extractors
This talk: two explicit constructions, connecting compressed sensing, data streams, geometric functional analysis, and pseudorandomness:

- A "low-distortion" embedding A: R^n → R^m, m = n^{1+o(1)}, such that for any x, ||Ax||_1 = (1 ± ε)||x||_2 (a.k.a. Dvoretzky's theorem for l_1)
- A "nice" measurement matrix B: R^n → R^m, m = k·n^{o(1)}, such that for any k-sparse x, one can efficiently reconstruct x from Bx (several matrices with >k^2 measurements were previously known)
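The probabilistic counterpart of the first construction (a scaled Gaussian matrix, in the Kashin / Figiel-Lindenstrauss-Milman style) can be sanity-checked numerically. The sketch below is illustrative only: it is the classical random construction, not the explicit one this talk describes, and the dimensions and constants are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64
eps = 0.1
m = int(4 * n / eps**2)  # target l1 dimension, O(n / eps^2)

# Random Gaussian matrix, scaled so that E ||Ax||_1 = ||x||_2:
# for g ~ N(0,1), E|g| = sqrt(2/pi), hence the sqrt(pi/2)/m scaling.
A = rng.standard_normal((m, n)) * np.sqrt(np.pi / 2) / m

x = rng.standard_normal(n)
ratio = np.abs(A @ x).sum() / np.linalg.norm(x)
# With high probability, ratio lies in [1 - eps, 1 + eps].
```

The concentration here follows from measure concentration for Gaussian matrices; the whole point of the talk is achieving a comparable guarantee deterministically.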

Embedding l_2^n into l_1

| Reference | Type | Dim. of l_1 | Distortion |
|---|---|---|---|
| [Kashin, Figiel-Lindenstrauss-Milman, Gordon] | Probabilistic | O(n/ε^2) | 1+ε |
| [Rudin'60, …] | Explicit | O(n^2) | O(1) |
| [Indyk'00] (cf. LLR'94) | Explicit | n^{O(log n)} | 1+1/n |
| [Indyk'00] | Probabilistic, n log^2 n random bits | O(n/ε^2) | 1+ε |
| [Artstein-Avidan, Milman'06] | Probabilistic, n log n random bits | O(n/ε^2) | 1+ε |
| [Lovett-Sodin'07] | Probabilistic, n random bits | O(n/ε^2) | 1+ε |
| [Indyk'06] | Explicit | n·2^{O(log log n)^2} | 1+1/log n |
| [Guruswami-Lee-Razborov'07] | Explicit' | n(1+o(1)) | n^{o(1)} |
Other implications
- Computing Ax takes O(n^{1+o(1)}) time, as opposed to O(n^2).
- A similar phenomenon was discovered for the Johnson-Lindenstrauss dimensionality-reduction lemma [Ailon-Chazelle'06], with applications to the approximate nearest neighbor problem, singular value decomposition, etc. (recall Muthu's talk).
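The Ailon-Chazelle speedup can be illustrated with a toy version of their fast Johnson-Lindenstrauss transform, x → P·H·D·x: a random diagonal sign matrix D, a Hadamard transform H, and a sparse random projection P. This is a minimal sketch under assumed parameters (k, q are illustrative); a real implementation applies H in O(n log n) rather than building it densely as done here.

```python
import numpy as np

rng = np.random.default_rng(1)

def hadamard(n):
    # Sylvester construction; n must be a power of two.
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n = 256   # original dimension (power of two)
k = 64    # target dimension (illustrative)
q = 0.5   # sampling rate of the sparse projection (illustrative)

D = rng.choice([-1.0, 1.0], size=n)      # random signs
H = hadamard(n) / np.sqrt(n)             # orthonormal Hadamard transform
mask = rng.random((k, n)) < q
P = np.where(mask, rng.standard_normal((k, n)), 0.0) / np.sqrt(k * q)

x = rng.standard_normal(n)
y = P @ (H @ (D * x))                    # the FJLT of x
ratio = np.linalg.norm(y) / np.linalg.norm(x)
# The H·D step "spreads out" the mass of x, which is what lets P
# be sparse without destroying the norm-preservation guarantee.
```

The design point is the same as in the talk: structure (here the Hadamard matrix) replaces most of the randomness of a dense Gaussian projection while keeping the matrix-vector product fast.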

Techniques (linking compressed sensing and pseudorandomness):
- Uncertainty principles
- Extractors
Uncertainty principles (UP)

Consider a vector x ∈ R^n and a Fourier matrix F. UP: either x or Fx must have "many" non-zero entries (for x ≠ 0).

History:
- Physics: Heisenberg principle
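A quantitative discrete version of this statement is the Donoho-Stark bound for the DFT: nnz(x)·nnz(Fx) ≥ n for every nonzero x. It can be checked directly with the FFT; the sparse spike vector below is an illustrative example, and the Dirac comb shows the bound is tight.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 128

def support_size(v, tol=1e-9):
    # Number of entries that are nonzero up to numerical noise.
    return int(np.sum(np.abs(v) > tol))

# A vector supported on k spikes: its DFT is a sum of k complex
# exponentials, hence dense, so the support-size product is large.
k = 4
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
fx = np.fft.fft(x)
# support_size(x) * support_size(fx) >= n  (Donoho-Stark)

# A Dirac comb with period 8 attains the bound exactly:
# 16 spikes in time, 8 spikes in frequency, 16 * 8 = 128 = n.
comb = np.zeros(n)
comb[::8] = 1.0
fcomb = np.fft.fft(comb)
```

This trade-off is exactly what compressed sensing exploits: a signal cannot be simultaneously sparse in both the time and Fourier domains.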


## This note was uploaded on 01/15/2012 for the course COT 4930 taught by Professor Sitharam during the Fall '09 term at University of Florida.
