CS 235: Algebraic Algorithms
Assignment 1 Solutions
1 Modular Multiplication
1.1 Proof by induction (10 points)
Let n and a be given; we show by induction on b that ab ≡ (a mod n)(b mod n) (mod n). Base case: b = 0. Then a·0 = 0, so the claim reduces to 0 ≡ (a mod n)(0 mod n) (mod n), which is trivial. Induction step: assume the claim holds for b. Then a(b+1) = ab + a ≡ (a mod n)(b mod n) + (a mod n) ≡ (a mod n)((b+1) mod n) (mod n).
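The identity being proved can also be spot-checked numerically; a minimal sketch in Python (the helper name and test ranges are chosen arbitrarily for illustration):

```python
def mod_mul_matches(a: int, b: int, n: int) -> bool:
    """Check that (a*b) mod n == ((a mod n) * (b mod n)) mod n."""
    return (a * b) % n == ((a % n) * (b % n)) % n

# Spot-check the claim over a small range of values.
assert all(mod_mul_matches(a, b, n)
           for a in range(0, 8)
           for b in range(0, 10)
           for n in range(1, 8))
```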
Lecture 3 Nearest Neighbor Algorithms
Shang-Hua Teng
What is an Algorithm?
A computable set of steps to achieve a desired result from a given input. Example:
Input: an array A of n numbers $a_1, a_2, \ldots, a_n$
Desired result: $\sum_{k=1}^{n} a_k$
Pseudo-code of Algorithm SUM
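The pseudo-code for SUM is cut off in this extract; a minimal runnable Python version of the loop it describes (accumulating $a_1 + \cdots + a_n$):

```python
def array_sum(A):
    """Return the sum a_1 + a_2 + ... + a_n of the array A."""
    total = 0
    for a in A:        # one pass over the n numbers
        total += a
    return total

print(array_sum([3, 1, 4, 1, 5]))  # 14
```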
Lecture 4 Divide and Conquer for Nearest Neighbor Problem
Shang-Hua Teng
Merge-Sort(A,p,r)
A procedure that sorts the elements in the sub-array A[p..r] using divide and conquer:

Merge-Sort(A, p, r)
  if p >= r, do nothing
  if p < r then
    q ← ⌊(p + r) / 2⌋
    Merge-Sort(A, p, q)
    Merge-Sort(A, q+1, r)
    Merge(A, p, q, r)
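The same divide-and-conquer scheme, sketched as runnable Python (using half-open slices rather than the slide's inclusive A[p..r] indices):

```python
def merge_sort(A):
    """Sort A by splitting at the midpoint, recursing, then merging."""
    if len(A) <= 1:
        return A
    q = len(A) // 2
    left, right = merge_sort(A[:q]), merge_sort(A[q:])
    # Merge the two sorted halves into one sorted list.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 7, 1, 3]))  # [1, 2, 3, 4, 5, 7]
```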
Lecture 5 Hyper-planes, Matrices, and Linear Systems
Shang-Hua Teng
Guarding an Art Gallery
Visibility Problem
Art Gallery Problem
To learn more about this problem, search the web for "Art Gallery Problem".
Visibility Problems: Intersections
Lecture 6 Matrix Operations and Gaussian Elimination for Solving Linear Systems
Shang-Hua Teng
Matrix
(Uniform Representation for Any Dimension)
An m by n matrix is a rectangular table of mn numbers
$$A = \begin{pmatrix} a_{1,1} & a_{1,2} & \cdots & a_{1,n} \\ a_{2,1} & a_{2,2} & \cdots & a_{2,n} \\ \vdots & \vdots & & \vdots \\ a_{m,1} & a_{m,2} & \cdots & a_{m,n} \end{pmatrix}$$
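As a concrete instance, a 2-by-3 matrix stored as a row-major nested list in Python (indices shift to 0-based, so entry $a_{i,j}$ of the text is `A[i-1][j-1]` here):

```python
# A 2 x 3 matrix: m = 2 rows, n = 3 columns, mn = 6 entries.
A = [[1, 2, 3],
     [4, 5, 6]]

m, n = len(A), len(A[0])
print(m, n)       # 2 3
print(A[0][1])    # entry a_{1,2} = 2
```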
Lecture 7 Intersection of Hyperplanes and Matrix Inverse
Shang-Hua Teng
Elimination Methods for 2 by 2 Linear Systems
A 2 by 2 linear system can be solved by eliminating the first variable from the second equation by subtracting a proper multiple of the first equation.
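A sketch of that elimination step in Python for a 2-by-2 system (the helper name is made up here; it assumes a nonzero pivot and nonzero determinant):

```python
def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve [a11 a12; a21 a22][x; y] = [b1; b2] by elimination.

    Subtract (a21/a11) times row 1 from row 2 to eliminate x,
    then back-substitute for y and x.
    """
    m = a21 / a11                  # the "proper multiple"
    a22_new = a22 - m * a12        # updated second equation
    b2_new = b2 - m * b1
    y = b2_new / a22_new
    x = (b1 - a12 * y) / a11
    return x, y

print(solve_2x2(2, 1, 1, 3, 5, 10))  # (1.0, 3.0): checks 2x+y=5, x+3y=10
```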
Lecture 8 Matrix Inverse and LU Decomposition
Shang-Hua Teng
Inverse Matrices
In high dimensions
$Ax = b$. Can we write $x = A^{-1} b$? Is there a matrix $A^{-1}$ such that $A^{-1} A = A A^{-1} = I$?
Uniqueness of Inverse Matrices
If BA = I and AC = I, then B = C. Proof: B = BI = B(AC) = (BA)C = IC = C.
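The two-sided inverse property can be checked numerically; a sketch with NumPy on an arbitrary invertible matrix (assuming NumPy is available):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A_inv = np.linalg.inv(A)

# A^{-1} A and A A^{-1} both equal the identity (up to rounding).
I = np.eye(2)
assert np.allclose(A_inv @ A, I)
assert np.allclose(A @ A_inv, I)
```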
Lecture 10 Dimensions, Independence, Basis and Complete Solution of Linear Systems
Shang-Hua Teng
Linear Independence
Linear Combination
of a set of vectors $\{v_1, v_2, \ldots, v_n\}$ is

$$\sum_{i=1}^{n} c_i v_i$$

Linear Independence

A set of vectors $\{v_1, v_2, \ldots, v_n\}$ is linearly independent if $\sum_{i=1}^{n} c_i v_i = 0$ only when $c_1 = c_2 = \cdots = c_n = 0$.
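One practical test of linear independence is to check the rank of the matrix whose columns are the vectors; a sketch with NumPy (the helper name and example vectors are made up):

```python
import numpy as np

def independent(vectors):
    """Vectors are independent iff the rank equals the number of vectors."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
print(independent([v1, v2]))           # True
print(independent([v1, v2, v1 + v2]))  # False: v1 + v2 is a combination
```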
Lecture 11 Fundamental Theorems of Linear Algebra: Orthogonality and Projection
Shang-Hua Teng
The Whole Picture
Rank(A) = m = n: Ax = b has a unique solution; R = [I].
Rank(A) = m < n: Ax = b has an (n − m)-dimensional family of solutions; R = [I F].
Rank(A) = n < m: Ax = b has 0 or 1 solutions.
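The first two rank cases can be illustrated with NumPy (example matrices are arbitrary choices, not from the lecture):

```python
import numpy as np

# Rank = m = n: square and full rank, so a unique solution.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
x = np.linalg.solve(A, np.array([3.0, 4.0]))
assert np.allclose(A @ x, [3.0, 4.0])

# Rank = m < n: full row rank but wide (m=2, n=3), so solutions
# form a 1-dimensional family; lstsq returns one particular solution.
B = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]])
xp, *_ = np.linalg.lstsq(B, np.array([1.0, 1.0]), rcond=None)
assert np.allclose(B @ xp, [1.0, 1.0])
```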
Lecture 2: Geometry vs Linear Algebra Points-Vectors and Distance-Norm
Shang-Hua Teng
2D Geometry: Points
2D Geometry: Cartesian Coordinates
[Figure: the point (a, b) plotted against x and y axes]
2D Linear Algebra: Vectors
[Figure: the vector (a, b) drawn from the origin 0]
2D Geometry and Linear Algebra
Points, Cartesian Coordinates, and Vectors
CS 232 Geometric Algorithms: Lecture 1
Shang-Hua Teng Department of Computer Science, Boston University
Instructors
Main Lectures: Professor Shang-Hua Teng, TR 2:00-3:30 PM (COM 213)
Sections: TF Scott Russell, CAS CS232 A2 648134, Monday 3:00pm-4:00pm in CA
Lecture 19 Singular Value Decomposition
Shang-Hua Teng
Spectral Theorem and Spectral Decomposition
Every symmetric matrix A can be written as
$$A = \begin{pmatrix} x_1 & \cdots & x_n \end{pmatrix} \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix} \begin{pmatrix} x_1^T \\ \vdots \\ x_n^T \end{pmatrix} = \lambda_1 x_1 x_1^T + \cdots + \lambda_n x_n x_n^T$$

where $x_1, \ldots, x_n$ are the n orthonormal eigenvectors of A and $\lambda_1, \ldots, \lambda_n$ are the corresponding eigenvalues.
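The spectral decomposition can be reproduced with NumPy's symmetric eigensolver; a sketch on an arbitrary symmetric matrix (not from the lecture):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])               # symmetric
lam, X = np.linalg.eigh(A)               # eigenvalues, orthonormal eigenvectors

# Rebuild A as lambda_1 x_1 x_1^T + ... + lambda_n x_n x_n^T.
A_rebuilt = sum(lam[i] * np.outer(X[:, i], X[:, i]) for i in range(len(lam)))
assert np.allclose(A, A_rebuilt)
assert np.allclose(X.T @ X, np.eye(2))   # eigenvectors are orthonormal
```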
Lecture 13 Operations in Graphics and Geometric Modeling I: Projection, Rotation, and Reflection
Shang-Hua Teng
Projection
Projection onto an axis
[Figure: projecting the point (a, b) onto the x axis]
x axis is a vector subspace
Projection onto an Arbitrary Line Passing through 0
[Figure: projecting the point (a, b) onto a line through 0]
Projection on
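Projection of a point p onto the line $\{c\,v : c \text{ real}\}$ through the origin is $(p \cdot v / v \cdot v)\, v$; a pure-Python sketch (function name made up for illustration):

```python
def project_onto_line(p, v):
    """Project point p onto the line {c*v : c real} through the origin."""
    dot_pv = p[0] * v[0] + p[1] * v[1]
    dot_vv = v[0] * v[0] + v[1] * v[1]
    c = dot_pv / dot_vv
    return (c * v[0], c * v[1])

# Projecting (3, 4) onto the x axis keeps only the x coordinate.
print(project_onto_line((3.0, 4.0), (1.0, 0.0)))  # (3.0, 0.0)
```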
Lecture 14 Simplex, Hyper-Cube, Convex Hull and their Volumes
Shang-Hua Teng
Linear Combination and Subspaces in m-D
Linear combination of $v_1$ (a line): $\{c\,v_1 : c \text{ is a real number}\}$. Linear combination of $v_1$ and $v_2$ (a plane): $\{c_1 v_1 + c_2 v_2 : c_1, c_2 \text{ are real numbers}\}$.
Lecture 17 Introduction to Eigenvalue Problems
Shang-Hua Teng
Eigenvalue Problems
Eigenvalue problems occur in many areas of science and engineering, e.g., structural analysis.
They are also important for analyzing numerical linear algebra algorithms.
Impact
Lecture 22 SVD, Eigenvector, and Web Search
Shang-Hua Teng
Earlier Search Engines
Hotbot, Yahoo, Alta Vista, Northern Light, Excite, Infoseek, Lycos. Main technique: inverted index.
Conceptually: use a matrix to represent how many times a term appears in each document.
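A minimal sketch of that term-document count matrix in Python (the two-document corpus is a toy example, not from the lecture):

```python
# Toy corpus: each document is a list of terms.
docs = [["matrix", "inverse", "matrix"],
        ["web", "search", "matrix"]]

terms = sorted({t for d in docs for t in d})
# counts[i][j] = how many times terms[i] appears in docs[j].
counts = [[d.count(t) for d in docs] for t in terms]

print(terms)   # ['inverse', 'matrix', 'search', 'web']
print(counts)  # [[1, 0], [2, 1], [0, 1], [0, 1]]
```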
Lecture 21 SVD and Latent Semantic Indexing and Dimensional Reduction
Shang-Hua Teng
Singular Value Decomposition
$$A = \sigma_1 u_1 v_1^T + \sigma_2 u_2 v_2^T + \cdots + \sigma_r u_r v_r^T$$

where $u_1, \ldots, u_r$ are the r orthonormal vectors that form a basis of C(A) and $v_1, \ldots, v_r$ are the r orthonormal vectors that form a basis of $C(A^T)$.
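The rank-r sum above matches what NumPy's SVD returns; a sketch on an arbitrary matrix (not from the lecture):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)

# Rebuild A as sigma_1 u_1 v_1^T + ... + sigma_r u_r v_r^T.
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
assert np.allclose(A, A_rebuilt)
```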
Lecture 20 SVD and Its Applications
Shang-Hua Teng
Spectral Theorem and Spectral Decomposition
Every symmetric matrix A can be written as
$$A = \begin{pmatrix} x_1 & \cdots & x_n \end{pmatrix} \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix} \begin{pmatrix} x_1^T \\ \vdots \\ x_n^T \end{pmatrix} = \lambda_1 x_1 x_1^T + \cdots + \lambda_n x_n x_n^T$$

where $x_1, \ldots, x_n$ are the n orthonormal eigenvectors of A and $\lambda_1, \ldots, \lambda_n$ are the corresponding eigenvalues.
Lecture 18 Eigenvalue Problems II
Shang-Hua Teng
Diagonalizing A Matrix
Suppose the n by n matrix A has n linearly independent eigenvectors $x_1, x_2, \ldots, x_n$. Eigenvector matrix S: $x_1, x_2, \ldots, x_n$ are the columns of S. Then

$$S^{-1} A S = \Lambda = \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}$$

is the eigenvalue matrix.
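The diagonalization $S^{-1} A S = \Lambda$ can be checked with NumPy (the example matrix is arbitrary, chosen to have independent eigenvectors):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, S = np.linalg.eig(A)        # columns of S are eigenvectors

Lambda = np.linalg.inv(S) @ A @ S
assert np.allclose(Lambda, np.diag(lam))   # S^{-1} A S is diagonal
```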
Matr
Lecture 12 Projection and Least Square Approximation
Shang-Hua Teng
Line Fitting and Prediction
Input: Table of paired data values (x, y)
Some connection between x and y. Example: height and weight. Example: revenue and stock price. Example: yesterday's temperature
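Fitting a line to paired (x, y) data is the least-squares problem this lecture develops; a sketch with NumPy (the data values are hypothetical):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])   # exactly y = 2x + 1 here

# Fit y ~ c0 + c1 * x by least squares over the columns [1, x].
M = np.column_stack([np.ones_like(x), x])
(c0, c1), *_ = np.linalg.lstsq(M, y, rcond=None)
print(round(c0, 6), round(c1, 6))  # 1.0 2.0
```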