15.083J/6.859J Integer Optimization, Lecture 12: Lattices III
6.4 The approximate nearest vector problem

In this section, we use Algorithm 6.3 to produce a reduced basis of a lattice, and then use this reduced basis to solve approximately the nearest vector problem in $L$ introduced in Eq. (6.2). In particular, let $L$ be a lattice and $x \in \mathbb{Q}^n$ a vector not belonging to $L$. The nearest vector problem is NP-hard. Algorithm 6.4, which we present next, finds a $b \in L$ such that
$$\| b - x \| \;\le\; (2^n - 1)^{1/2} \, \min\{ \| \hat{b} - x \| : \hat{b} \in L \setminus \{0\} \}. \tag{6.12}$$

Algorithm 6.4 (Approximate nearest vector algorithm)

Input: A reduced basis $b_1, \dots, b_n$ of a lattice $L$, its Gram-Schmidt orthogonalization $\tilde{b}_1, \dots, \tilde{b}_n$, and $x \in \mathbb{Q}^n$.

Output: A vector $b$ satisfying Eq. (6.12) and rational multipliers $\lambda_i$, $i = 1, \dots, n$, with $|\lambda_i| \le 1/2$, such that
$$b - x = \sum_{i=1}^{n} \lambda_i \tilde{b}_i.$$

Algorithm:
1. Set $z_{n+1} = 0$ and $x_{n+1} = x$.
2. For $i = n, \dots, 1$:
   (a) Compute $\sigma_{1,i}, \dots, \sigma_{i,i} \in \mathbb{Q}$ such that $x_{i+1} = \sum_{j=1}^{i} \sigma_{j,i} \tilde{b}_j$.
   (b) Set $\lambda_i = \lfloor \sigma_{i,i} \rceil - \sigma_{i,i}$, where $\lfloor \sigma_{i,i} \rceil$ denotes the nearest integer to $\sigma_{i,i}$.
   (c) Set $z_i = z_{i+1} + \lfloor \sigma_{i,i} \rceil \, b_i$.
   (d) Set $x_i = x_{i+1} - \lfloor \sigma_{i,i} \rceil \, b_i + \lambda_i \tilde{b}_i$.
3. Return $b = z_1$ and $\lambda_1, \dots, \lambda_n$.
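Because the Gram-Schmidt vectors $\tilde{b}_j$ are orthogonal, the only coefficient actually needed in Step 2(a) is $\sigma_{i,i} = x_{i+1}' \tilde{b}_i / \| \tilde{b}_i \|^2$, which makes the algorithm straightforward to implement. The following Python sketch is one possible reading of Algorithm 6.4; it is not part of the original notes, the function and variable names are illustrative, exact rational arithmetic is used via fractions.Fraction, and ties at $.5$ in the nearest-integer step are broken downward so that the trace matches Example 6.9 below (any fixed tie-breaking rule still gives $|\lambda_i| \le 1/2$).

```python
from fractions import Fraction
import math


def _dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))


def gram_schmidt(basis):
    """Gram-Schmidt orthogonalization b~_1, ..., b~_n (no normalization)."""
    gso = []
    for b in basis:
        v = list(b)
        for bt in gso:
            mu = _dot(b, bt) / _dot(bt, bt)
            v = [vj - mu * btj for vj, btj in zip(v, bt)]
        gso.append(v)
    return gso


def _nearest_int(s):
    # Nearest integer to s; ties at .5 are broken downward (matches Example 6.9).
    return math.ceil(s - Fraction(1, 2))


def approx_nearest_vector(basis, x):
    """Sketch of Algorithm 6.4: returns (b, lambdas) with b in the lattice and
    b - x = sum_i lambdas[i] * b~_i, where |lambdas[i]| <= 1/2."""
    basis = [[Fraction(c) for c in b] for b in basis]   # exact rational arithmetic
    x = [Fraction(c) for c in x]
    gso = gram_schmidt(basis)
    n = len(basis)
    z = [Fraction(0)] * len(x)        # z_{n+1} = 0
    y = list(x)                       # x_{n+1} = x
    lambdas = [Fraction(0)] * n
    for i in reversed(range(n)):      # i = n, ..., 1 in the notes (0-based here)
        # Step 2(a): by orthogonality of the b~_j, only the leading coefficient
        # sigma_{i,i} = <x_{i+1}, b~_i> / ||b~_i||^2 is needed below.
        sigma = _dot(y, gso[i]) / _dot(gso[i], gso[i])
        r = _nearest_int(sigma)
        lambdas[i] = r - sigma                                   # Step 2(b)
        z = [zj + r * bj for zj, bj in zip(z, basis[i])]         # Step 2(c)
        y = [yj - r * bj + lambdas[i] * gj                       # Step 2(d)
             for yj, bj, gj in zip(y, basis[i], gso[i])]
    return z, lambdas
```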
Example 6.9
We revisit Example 6.6. We have seen that the vectors $b_1 = (1, 1)'$, $b_2 = (2, -1)'$ constitute a reduced basis of the lattice $L(b_1, b_2)$. The corresponding Gram-Schmidt orthogonalization is $\tilde{b}_1 = (1, 1)'$ and $\tilde{b}_2 = (1.5, -1.5)'$. Let $x = (7.5, 3.0)'$. We start the approximate nearest vector algorithm by setting $z_3 = 0$ and $x_3 = x$. We then set $i = 2$ and compute multipliers $\sigma_{1,2}, \sigma_{2,2}$ such that $x = \sigma_{1,2} \tilde{b}_1 + \sigma_{2,2} \tilde{b}_2$. This gives
$$\sigma_{1,2} = \frac{x' \tilde{b}_1}{\| \tilde{b}_1 \|^2} = 5.25, \qquad \sigma_{2,2} = \frac{x' \tilde{b}_2}{\| \tilde{b}_2 \|^2} = 1.5.$$
This leads to
$$\lambda_2 = \lfloor \sigma_{2,2} \rceil - \sigma_{2,2} = 1 - 1.5 = -0.5,$$
$$z_2 = z_3 + \lfloor \sigma_{2,2} \rceil \, b_2 = 0 + b_2 = (2, -1)',$$
$$x_2 = x_3 - \lfloor \sigma_{2,2} \rceil \, b_2 + \lambda_2 \tilde{b}_2 = x - b_2 - 0.5\, \tilde{b}_2 = (4.75, 4.75)'.$$
We continue with $i = 1$. We compute $\sigma_{1,1}$ such that $x_2 = \sigma_{1,1} \tilde{b}_1$, leading to $\sigma_{1,1} = 4.75$, $\lambda_1 = \lfloor \sigma_{1,1} \rceil - \sigma_{1,1} = 0.25$, and $z_1 = z_2 + \lfloor \sigma_{1,1} \rceil \, b_1 = (7, 4)'$. Note that $b = z_1 = (7, 4)' \in L$ and $b - x = \lambda_1 \tilde{b}_1 + \lambda_2 \tilde{b}_2$.

We next show that Algorithm 6.4 correctly finds $b$ satisfying Eq. (6.12).

Theorem 6.10
The vector $b = z_1$ computed by the approximate nearest vector algorithm belongs to $L$ and satisfies
$$\| b - x \| \le (2^n - 1)^{1/2} \, \min\{ \| \hat{b} - x \| : \hat{b} \in L \setminus \{0\} \}.$$

Proof. Let $b_1, \dots, b_n$ be the reduced basis of $L$, which is part of the input of the approximate nearest vector algorithm. Let $\tilde{b}_1, \dots, \tilde{b}_n$ denote its Gram-Schmidt orthogonalization. Solving the recursions in Steps 2(c) and 2(d) of the approximate nearest vector algorithm, we obtain
$$x_1 = x_{n+1} - \sum_{i=1}^{n} \lfloor \sigma_{i,i} \rceil \, b_i + \sum_{i=1}^{n} \lambda_i \tilde{b}_i, \qquad z_1 = z_{n+1} + \sum_{i=1}^{n} \lfloor \sigma_{i,i} \rceil \, b_i.$$
Since
$$x_1 = x_2 - \lfloor \sigma_{1,1} \rceil \, b_1 + \lambda_1 \tilde{b}_1 = \sigma_{1,1} \tilde{b}_1 - \lfloor \sigma_{1,1} \rceil \, b_1 + \lambda_1 \tilde{b}_1$$
and $\tilde{b}_1 = b_1$, it follows that $x_1 = 0$. Moreover, since $x_{n+1} = x$,
$$x = \sum_{i=1}^{n} \lfloor \sigma_{i,i} \rceil \, b_i - \sum_{i=1}^{n} \lambda_i \tilde{b}_i.$$
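As a quick sanity check (again illustrative, not part of the notes), the sketch given after Algorithm 6.4 can be run on the data of Example 6.9; it reproduces $b = (7, 4)'$, $\lambda_1 = 0.25$, and $\lambda_2 = -0.5$.

```python
# Data of Example 6.9: the reduced basis of Example 6.6 and the target vector x.
basis = [[1, 1], [2, -1]]
x = [7.5, 3.0]

b, lams = approx_nearest_vector(basis, x)
print(b)     # [Fraction(7, 1), Fraction(4, 1)]   -> b = z_1 = (7, 4)'
print(lams)  # [Fraction(1, 4), Fraction(-1, 2)]  -> lambda_1 = 0.25, lambda_2 = -0.5
```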