Fundamentals
Basic Operations:
Matrix-vector product b = Ax, where A ∈ R^{m×n} and x ∈ R^n, is
defined by

    b_i = Σ_{j=1}^{n} a_{ij} x_j,    i = 1, ..., m.
Matrix-matrix product B = AC, where A ∈ R^{ℓ×m} and C ∈ R^{m×n},
is defined by

    b_{ij} = Σ_{k=1}^{m} a_{ik} c_{kj},    i = 1, ..., ℓ,  j = 1, ..., n.
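The two sums above translate directly into nested loops; a minimal pure-Python sketch (the function names matvec and matmat are illustrative, not from the notes):

```python
def matvec(A, x):
    # b_i = sum_j a_ij * x_j, for i = 1, ..., m
    m, n = len(A), len(x)
    return [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]

def matmat(A, C):
    # b_ij = sum_k a_ik * c_kj, for i = 1, ..., l and j = 1, ..., n
    ell, m, n = len(A), len(C), len(C[0])
    return [[sum(A[i][k] * C[k][j] for k in range(m)) for j in range(n)]
            for i in range(ell)]
```

In practice one would call an optimized BLAS routine (e.g. via numpy) rather than these loops, but the index pattern is exactly the one in the definitions.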
Gaussian Elimination
Consider the system

    a_{11} x_1 + a_{12} x_2 + ... + a_{1n} x_n = b_1
    a_{21} x_1 + a_{22} x_2 + ... + a_{2n} x_n = b_2
        ...
    a_{n1} x_1 + a_{n2} x_2 + ... + a_{nn} x_n = b_n

The Gauss elimination method consists in
using the first equation (assuming a_{11} ≠ 0) to eliminate all
x_1 terms from the remaining equations, and repeating the process on the reduced system.
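The elimination step, followed by back substitution on the resulting triangular system, can be sketched as follows (no pivoting; this assumes every pivot a_{kk} encountered is nonzero, matching the a_{11} ≠ 0 assumption above):

```python
def gauss_solve(A, b):
    """Solve Ax = b by Gaussian elimination without pivoting."""
    n = len(b)
    A = [row[:] for row in A]   # work on copies
    b = b[:]
    # forward elimination: use equation k to eliminate x_k from rows below
    for k in range(n - 1):
        for i in range(k + 1, n):
            m_ik = A[i][k] / A[k][k]          # multiplier
            for j in range(k, n):
                A[i][j] -= m_ik * A[k][j]
            b[i] -= m_ik * b[k]
    # back substitution on the resulting upper triangular system
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x
```

A production solver would add partial pivoting to avoid division by small or zero pivots.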
Least Squares Problems
Data fitting (or parameter estimation) is an important technique used
for modeling in many disciplines.
Assume a physical phenomenon is modeled by a relationship

    y = f(z; x_1, ..., x_n),                                    (0.1)

where f is a prescribed function depending on the variable z and the
parameters x_1, ..., x_n.
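For instance, when f is linear in the parameters, say the hypothetical model y = x_1 + x_2 z, fitting observed data (z_i, y_i) reduces to a linear least squares problem (the data below are made up for illustration):

```python
import numpy as np

# synthetic data, roughly following y = 1 + 2z
z = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.1, 6.9])

# design matrix: each row is [1, z_i], so A @ [x1, x2] approximates y
A = np.column_stack([np.ones_like(z), z])
params, *_ = np.linalg.lstsq(A, y, rcond=None)
x1, x2 = params
```

The least squares solution minimizes the sum of squared residuals over all parameter choices.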
Error Analysis
Because of the floating-point arithmetic involved in the Gaussian elimination process, the linear system Ax = b cannot be solved exactly. It
can be shown by backward error analysis that the approximate solution
y satisfies a perturbed system (A + E)y = b for some small perturbation E.
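A quick numerical illustration of this point: the computed solution of a random well-conditioned system is not exact, yet its residual is tiny relative to the data, consistent with a small backward error (the dimensions and seed here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
b = rng.standard_normal(50)

x = np.linalg.solve(A, b)             # Gaussian elimination in floating point
residual = np.linalg.norm(A @ x - b)  # small, but generally not exactly zero
```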
Singular Value Decomposition
The singular value decomposition (SVD) is a matrix factorization that
serves both as the first computational step in many numerical algorithms and as the first conceptual step in many theoretical studies.
Given A ∈ R^{m×n} (m ≥ n), the SVD is a factorization A = UΣVᵀ,
where U ∈ R^{m×m} and V ∈ R^{n×n} are orthogonal and Σ ∈ R^{m×n} is
diagonal with entries σ_1 ≥ σ_2 ≥ ... ≥ σ_n ≥ 0.
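The factorization is available directly in numpy; a small check that U Σ Vᵀ reproduces A and that the singular values come out ordered (the matrix values are arbitrary):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A)      # full SVD: U is 3x3, s has 2 entries, Vt is 2x2
Sigma = np.zeros_like(A)         # rebuild the 3x2 diagonal matrix Sigma
Sigma[:2, :2] = np.diag(s)
reconstructed = U @ Sigma @ Vt
```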
QR via Householder Transformation
Let u ∈ R^m be a column unit vector. The associated Householder
matrix is defined to be

    V := I − 2uuᵀ.

The matrix V is an orthogonal matrix.
The transformation

    Vx = x − 2uuᵀx

has a special meaning: it is the reflection of x with respect to the
hyperplane orthogonal to u.
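The reflection and orthogonality properties are easy to verify numerically; a minimal sketch, assuming u is a unit vector (here u = e_1, so V reflects across the plane x_1 = 0):

```python
import numpy as np

def householder(u):
    """V = I - 2 u u^T for a unit column vector u."""
    u = np.asarray(u, dtype=float).reshape(-1, 1)
    return np.eye(len(u)) - 2.0 * (u @ u.T)

u = np.array([1.0, 0.0, 0.0])
V = householder(u)
x = np.array([3.0, 4.0, 5.0])
# V @ x flips the component of x along u and leaves the rest unchanged
```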
Solving Triangular Systems
A matrix L = [ℓ_{ij}] is lower triangular if ℓ_{ij} = 0 whenever i < j.
If the diagonal elements ℓ_{ii} of L are nonzero, then L is nonsingular.
The linear equation Lx = b, where L is a nonsingular lower triangular matrix, can be solved by forward substitution.
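Forward substitution computes x_i = (b_i − Σ_{j<i} ℓ_{ij} x_j) / ℓ_{ii} for i = 1, 2, ..., using each solved component in the equations that follow; a minimal sketch:

```python
def forward_substitution(L, b):
    """Solve Lx = b for a nonsingular lower triangular L."""
    n = len(b)
    x = [0.0] * n
    for i in range(n):
        # all x_j with j < i are already known at this point
        s = sum(L[i][j] * x[j] for j in range(i))
        x[i] = (b[i] - s) / L[i][i]
    return x
```

Solving an upper triangular system by back substitution is the mirror image, proceeding from the last equation upward.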
QR Decomposition
The QR decomposition is perhaps the most important algorithmic idea
in numerical linear algebra.
Suppose A ∈ R^{m×n}, m ≥ n, and suppose rank(A) = n (i.e.,
suppose A has linearly independent columns).
The matrix A can always be decomposed as A = QR, where
Q ∈ R^{m×n} has orthonormal columns and R ∈ R^{n×n} is upper triangular.
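numpy computes this reduced QR factorization directly; a small check that Q has orthonormal columns, R is upper triangular, and QR reproduces A (the matrix values are arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])         # full column rank, m = 3 > n = 2

Q, R = np.linalg.qr(A)             # reduced QR: Q is 3x2, R is 2x2
```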
Geometry behind Linear Least Squares
Let the columns of A ∈ R^{m×n} be denoted as A = [A_1, ..., A_n], where
each A_i ∈ R^m.
The product Ax can be written as

    Ax = Σ_{i=1}^{n} x_i A_i,

i.e., Ax is a linear combination of the columns of A and hence is an
element in the range of A.
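This column-combination identity is easy to check numerically (the matrix and vector values are arbitrary):

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 6.0]])
x = np.array([2.0, -1.0])

# Ax written explicitly as x1*A1 + x2*A2, a combination of the columns
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
```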
LU Decomposition
We have seen that the Gaussian elimination process can be described
in terms of elementary matrix operations:
    E_{n1}(−m_{n1}) ⋯ E_{21}(−m_{21}) A^{(1)} = A^{(2)}
    E_{n2}(−m_{n2}) ⋯ E_{32}(−m_{32}) A^{(2)} = A^{(3)}
        ⋮
    E_{n,n−1}(−m_{n,n−1}) A^{(n−1)} = A^{(n)}.
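Collecting the multipliers m_{ij} into a unit lower triangular L and keeping the final A^{(n)} as U yields A = LU; a Doolittle-style sketch without pivoting (assuming nonzero pivots throughout):

```python
def lu_decompose(A):
    """A = L U with L unit lower triangular (holding the multipliers m_ij)
    and U upper triangular. No pivoting; pivots are assumed nonzero."""
    n = len(A)
    U = [row[:] for row in A]
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]      # multiplier m_ik
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]
    return L, U
```

Once L and U are known, Ax = b reduces to two triangular solves: Ly = b, then Ux = y.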
Nonlinear Least Squares Problem
From the assumed mathematical model y = h(z; x_1, ..., x_n) and the
observed data {(z_i, y_i)}, i = 1, ..., m, we intend to minimize the overall
residual

    R(x_1, ..., x_n) := Σ_{i=1}^{m} ‖r_i‖²₂,

where

    r_i = r_i(x_1, ..., x_n) := y_i − h(z_i; x_1, ..., x_n).
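As a hypothetical illustration, take the model h(z; x_1, x_2) = x_1 e^{x_2 z} and minimize R with a few Gauss–Newton steps, one common approach to such problems; each step linearizes r about the current parameters and solves a linear least squares problem for the update (the data here are synthetic and noise-free):

```python
import numpy as np

def gauss_newton(z, y, x, steps=30):
    """Minimize sum r_i^2 with r_i = y_i - x1 * exp(x2 * z_i)."""
    x1, x2 = x
    for _ in range(steps):
        e = np.exp(x2 * z)
        r = y - x1 * e                          # residual vector
        J = np.column_stack([-e, -x1 * z * e])  # Jacobian of r w.r.t. (x1, x2)
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x1, x2 = x1 + dx[0], x2 + dx[1]
    return x1, x2

z = np.linspace(0.0, 1.0, 8)
y = 2.0 * np.exp(0.5 * z)                       # exact data: x1 = 2, x2 = 0.5
x1, x2 = gauss_newton(z, y, (1.5, 0.3))
```

Robust solvers add damping (Levenberg–Marquardt) to cope with poor starting guesses.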
Matrix Norm
One of the main concerns in matrix algorithms is the sensitivity of a
system to perturbations in its coefficients. To study this sensitivity, we need
to measure the size of errors. The measurement is done via the notion of a norm.
We have already seen some examples of vector norms.
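The common vector norms, and the matrix norms they induce, are all available in numpy; a few hand-checkable values (the vectors and matrix are arbitrary):

```python
import numpy as np

x = np.array([3.0, -4.0])
norm1   = np.linalg.norm(x, 1)        # 1-norm:   |3| + |-4| = 7
norm2   = np.linalg.norm(x)           # 2-norm:   sqrt(9 + 16) = 5
norminf = np.linalg.norm(x, np.inf)   # inf-norm: max(|3|, |4|) = 4

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])
col_sum_norm = np.linalg.norm(A, 1)       # induced 1-norm: max column sum = 6
row_sum_norm = np.linalg.norm(A, np.inf)  # induced inf-norm: max row sum = 7
```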