CS 235: Algebraic Algorithms
Assignment 1 Solutions
1 Modular Multiplication
1.1 Proof by induction (10 points)
Let n and a be given. Base case: b = 0. Then a·0 = 0, so the claim reduces to 0 = (a mod n
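The induction can be checked numerically. Below is a minimal Python sketch (the function name `modmul` and the test values are illustrative, not part of the assignment):

```python
def modmul(a, b, n):
    # Mirrors the induction on b: a*0 = 0 is the base case,
    # and a*(b+1) = a*b + a is the inductive step, reducing mod n throughout.
    result = 0
    for _ in range(b):
        result = (result + a % n) % n
    return result

# Reducing a mod n first does not change the product mod n:
assert modmul(7, 5, 3) == (7 * 5) % 3
assert modmul(12, 9, 5) == (12 * 9) % 5
```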
Lecture 3 Nearest Neighbor Algorithms
Shang-Hua Teng
What is an Algorithm?
A computable set of steps to achieve a desired result from a given input. Example:
Input: An array A of n numbers. Desired res
Lecture 4 Divide and Conquer for Nearest Neighbor Problem
Shang-Hua Teng
Merge-Sort(A,p,r)
A procedure that sorts the elements in the sub-array A[p..r] using divide and conquer Merge-Sort(A,p,r)
if p >= r,
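The pseudocode above translates directly to Python. A sketch under the usual conventions (the `merge` helper follows the standard two-way merge, since the slide text is truncated):

```python
def merge(A, p, q, r):
    """Merge the sorted sub-arrays A[p..q] and A[q+1..r] in place."""
    left, right = A[p:q + 1], A[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            A[k] = left[i]
            i += 1
        else:
            A[k] = right[j]
            j += 1

def merge_sort(A, p, r):
    """Sort A[p..r] in place by divide and conquer."""
    if p >= r:               # zero or one element: already sorted
        return
    q = (p + r) // 2
    merge_sort(A, p, q)      # conquer the left half
    merge_sort(A, q + 1, r)  # conquer the right half
    merge(A, p, q, r)        # combine

A = [5, 2, 4, 6, 1, 3]
merge_sort(A, 0, len(A) - 1)
# A is now [1, 2, 3, 4, 5, 6]
```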
Lecture 5 Hyper-planes, Matrices, and Linear Systems
Shang-Hua Teng
Guarding Art Gallery
Visibility Problem
Art Gallery Problem
To learn more about this problem, you can google Art Gallery Problem or
Lecture 6 Matrix Operations and Gaussian Elimination for Solving Linear Systems
Shang-Hua Teng
Matrix
(Uniform Representation for Any Dimension)
An m by n matrix is a rectangular table of mn numbers
Lecture 7 Intersection of Hyperplanes and Matrix Inverse
Shang-Hua Teng
Elimination Methods for 2 by 2 Linear Systems
A 2 by 2 linear system can be solved by eliminating the first variable from the sec
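The elimination step for the 2 by 2 case can be sketched in Python (the function name and the sample system are illustrative; no pivoting, so a11 ≠ 0 is assumed):

```python
def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve [[a11, a12], [a21, a22]] x = (b1, b2) by eliminating x1 from row 2."""
    m = a21 / a11               # multiplier; assumes a11 != 0
    a22p = a22 - m * a12        # second row after elimination
    b2p = b2 - m * b1
    x2 = b2p / a22p             # solve the reduced 1x1 system
    x1 = (b1 - a12 * x2) / a11  # back-substitute into row 1
    return x1, x2

# 2x + y = 5 and x + 3y = 10 give x = 1, y = 3
assert solve_2x2(2, 1, 1, 3, 5, 10) == (1.0, 3.0)
```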
Lecture 8 Matrix Inverse and LU Decomposition
Shang-Hua Teng
Inverse Matrices
In high dimensions
Ax = b. Can we write x = A^-1 b? Is there a matrix A^-1 such that A^-1 A = A A^-1 = I?
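One way to make the question concrete: when A^-1 exists, it can be computed by Gauss-Jordan elimination on the augmented matrix [A | I]. A sketch, not the lecture's own code:

```python
def invert(A):
    """Invert a square matrix by Gauss-Jordan elimination on [A | I]."""
    n = len(A)
    # build the augmented matrix [A | I]
    M = [list(row) + [float(i == j) for j in range(n)] for i, row in enumerate(A)]
    for col in range(n):
        # partial pivoting: bring the largest remaining pivot into place
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]          # scale pivot row to 1
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]                      # clear the rest of the column
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return [row[n:] for row in M]                  # right half is A^-1

assert invert([[2.0, 0.0], [0.0, 4.0]]) == [[0.5, 0.0], [0.0, 0.25]]
```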
Uniqueness of Inverse M
Lecture 9 Symmetric Matrices Subspaces and Nullspaces
Shang-Hua Teng
Matrix Transpose
Addition: A+B  Multiplication: AB  Inverse: A^-1  Transpose: A^T
(A^T)_ij = A_ji
Transpose
[1 2 3 4; 5 6 7 8]^T = [1 5; 2 6; 3 7; 4 8]
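The transpose example can be checked mechanically. A one-line Python sketch (illustrative):

```python
def transpose(A):
    """Return A^T: entry (i, j) of the result is A[j][i]."""
    return [list(col) for col in zip(*A)]

# a 2x4 matrix becomes 4x2 under transposition
assert transpose([[1, 2, 3, 4], [5, 6, 7, 8]]) == [[1, 5], [2, 6], [3, 7], [4, 8]]
```

Transposing twice returns the original matrix, matching (A^T)_ij = A_ji.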
Lecture 10 Dimensions, Independence, Basis and Complete Solution of Linear Systems
Shang-Hua Teng
Linear Independence
Linear Combination
of a set of vectors {v1, v2, ..., vn} is
{c1 v1 + c2 v2 + ... + cn vn : ci are real numbers}
Linear Independence
Lecture 11 Fundamental Theorems of Linear Algebra Orthogonality and Projection
Shang-Hua Teng
The Whole Picture
Rank(A) = m = n: Ax = b has a unique solution; R = [I]
Rank(A) = m < n: R = [I F]; Ax = b has n-
Lecture 2: Geometry vs Linear Algebra Points-Vectors and Distance-Norm
Shang-Hua Teng
2D Geometry: Points
2D Geometry: Cartesian Coordinates
[Figure: the point (a, b) plotted in the xy-plane]
2D Linear Algebra: Vectors
[Figure: the vector (a, b) drawn from the origin 0 in the xy-plane]
2D Geometr
CS 232 Geometric Algorithms: Lecture 1
Shang-Hua Teng Department of Computer Science, Boston University
Instructors
Main Lectures: Professor Shang-Hua Teng TR 2:00-3:30 PM (COM 213) Sections: TF Scott
Lecture 19 Singular Value Decomposition
Shang-Hua Teng
Spectral Theorem and Spectral Decomposition
Every symmetric matrix A can be written as
A = [x1 ... xn] diag(λ1, ..., λn) [x1 ... xn]^T = λ1 x1 x1^T + ... + λn xn xn^T
where
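For a 2 by 2 symmetric matrix the spectral decomposition has a closed form, which makes the theorem easy to check numerically. A sketch (the closed-form eigenpair formulas assume b ≠ 0; all names here are illustrative):

```python
import math

def spectral_2x2(a, b, c):
    """Spectral decomposition of the symmetric matrix [[a, b], [b, c]]:
    returns (lam1, x1, lam2, x2) with A = lam1*x1 x1^T + lam2*x2 x2^T."""
    d = math.sqrt((a - c) ** 2 + 4 * b * b)
    lam1, lam2 = (a + c + d) / 2, (a + c - d) / 2

    def unit(v):
        n = math.hypot(*v)
        return (v[0] / n, v[1] / n)

    # eigenvector for eigenvalue lam is (b, lam - a), assuming b != 0
    return lam1, unit((b, lam1 - a)), lam2, unit((b, lam2 - a))

def recombine(lam1, x1, lam2, x2):
    """Rebuild A from the rank-one pieces lam_i * x_i x_i^T."""
    return [[lam1 * x1[i] * x1[j] + lam2 * x2[i] * x2[j] for j in range(2)]
            for i in range(2)]
```

For example, [[2, 1], [1, 2]] has eigenvalues 3 and 1, and the two rank-one terms sum back to the original matrix.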
Lecture 13 Operations in Graphics and Geometric Modeling I: Projection, Rotation, and Reflection
Shang-Hua Teng
Projection
Projection onto an axis
(a,b)
x axis is a vector subspace
Projection onto an
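Projection onto an axis, or onto any line through the origin, uses the same formula: scale the direction vector u by (v·u)/(u·u). A minimal 2D sketch (names illustrative):

```python
def project(v, u):
    """Orthogonal projection of v onto the line spanned by u."""
    c = (v[0] * u[0] + v[1] * u[1]) / (u[0] ** 2 + u[1] ** 2)
    return (c * u[0], c * u[1])

# projecting (a, b) onto the x axis keeps a and drops b
assert project((3, 4), (1, 0)) == (3.0, 0.0)
```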
Lecture 14 Simplex, Hyper-Cube, Convex Hull and their Volumes
Shang-Hua Teng
Linear Combination and Subspaces in m-D
Linear combination of v1 (line): {c v1 : c is a real number}. Linear combination o
Lecture 15 Recursive and Iterative Formula for Determinants
Shang-Hua Teng
Pseudo-Hypercube or Pseudo-Box
n-Pseudo-Hypercube
For any n affinely independent vectors
p1, p2, ..., pn
c0 0 + c1 p1 + c2 p2
Lecture 17 Introduction to Eigenvalue Problems
Shang-Hua Teng
Eigenvalue Problems
Eigenvalue problems occur in many areas of science and engineering
E.g., Structure analysis
It is important for ana
Lecture 22 SVD, Eigenvector, and Web Search
Shang-Hua Teng
Earlier Search Engines
HotBot, Yahoo, AltaVista, Northern Light, Excite, Infoseek, Lycos. Main technique: inverted index
Conceptually: use
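A toy version of an inverted index, the main technique named above, fits in a few lines of Python (the documents and query words are illustrative):

```python
def build_inverted_index(docs):
    """Map each word to the set of document ids that contain it."""
    index = {}
    for doc_id, text in enumerate(docs):
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)
    return index

def search(index, *words):
    """Documents containing all query words: intersect the posting sets."""
    sets = [index.get(w, set()) for w in words]
    return set.intersection(*sets) if sets else set()

docs = ["singular value decomposition", "the value of pi", "web search engines"]
index = build_inverted_index(docs)
assert search(index, "value") == {0, 1}
assert search(index, "singular", "value") == {0}
```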
Lecture 21 SVD and Latent Semantic Indexing and Dimensional Reduction
Shang-Hua Teng
Singular Value Decomposition
A = σ1 u1 v1^T + σ2 u2 v2^T + ... + σr ur vr^T
where u1, ..., ur are the r orthonormal vectors
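The rank-one sum in this formula can be evaluated directly. A sketch that rebuilds A from given (σi, ui, vi) triples (the function name and sample triples are illustrative):

```python
def from_svd(triples, m, n):
    """Rebuild an m-by-n matrix from its SVD written as a sum of
    rank-one terms sigma_i * u_i v_i^T, as in the formula above."""
    A = [[0.0] * n for _ in range(m)]
    for sigma, u, v in triples:
        for i in range(m):
            for j in range(n):
                A[i][j] += sigma * u[i] * v[j]   # add one rank-one term
    return A

# two rank-one terms along the standard axes give a diagonal matrix
A = from_svd([(2.0, (1, 0), (1, 0)), (1.0, (0, 1), (0, 1))], 2, 2)
assert A == [[2.0, 0.0], [0.0, 1.0]]
```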
Lecture 20 SVD and Its Applications
Shang-Hua Teng
Spectral Theorem and Spectral Decomposition
Every symmetric matrix A can be written as
A = [x1 ... xn] diag(λ1, ..., λn) [x1 ... xn]^T = λ1 x1 x1^T + ... + λn xn xn^T
where
Lecture 18 Eigenvalue Problems II
Shang-Hua Teng
Diagonalizing A Matrix
Suppose the n by n matrix A has n linearly independent eigenvectors x1, x2, ..., xn. Eigenvector matrix S: x1, x2, ..., xn are the columns of
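With S in hand, the diagonalization A = S Λ S^-1 can be checked on a small example (the 2x2 matrix, its eigenvectors, and the hand-computed S^-1 below are illustrative):

```python
def matmul(A, B):
    """Plain triple-loop matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# A = [[4, 1], [0, 2]] has eigenvalues 4 and 2
# with eigenvectors (1, 0) and (1, -2)
S = [[1, 1], [0, -2]]          # eigenvector matrix: columns x1, x2
Lam = [[4, 0], [0, 2]]         # eigenvalue matrix (diagonal)
S_inv = [[1, 0.5], [0, -0.5]]  # inverse of S, computed by hand
A = matmul(matmul(S, Lam), S_inv)
# A equals [[4, 1], [0, 2]], recovering the original matrix
```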
Lecture 12 Projection and Least Square Approximation
Shang-Hua Teng
Line Fitting and Prediction
Input: Table of paired data values (x, y)
Some connection between x and y. Example: height - weight E
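Fitting a line y ≈ c + m x to paired data reduces to a 2 by 2 linear system (the normal equations). A sketch with made-up data (the function name and sample points are illustrative):

```python
def fit_line(xs, ys):
    """Least-squares line y = c + m*x via the 2x2 normal equations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx          # assumes at least two distinct x values
    c = (sy * sxx - sx * sxy) / det  # intercept
    m = (n * sxy - sx * sy) / det    # slope
    return c, m

# data lying exactly on y = 2x + 1 is recovered exactly
c, m = fit_line([0, 1, 2], [1, 3, 5])
```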