Lecture 7 Notes: Precision Mathematics
Pseudoinverse of A:
\[
A^+ = (A^T A)^{-1} A^T
\]
Condition number of b = Ax:
\[
\kappa(A) = \|A\| \, \|A^+\| = \frac{\sigma_{\max}}{\sigma_{\min}} \tag{6.1}
\]
Condition number of \(A^T A\) (normal equations):
\[
\kappa(A^T A) = \kappa(A)^2 \tag{6.2}
\]

Floating Point Arithmetic

\(\beta\) = radix (usually 2), \(t\) = precision.
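The squaring of the condition number in the normal equations can be checked numerically. A minimal sketch, assuming numpy; the matrix below is made up for illustration:

```python
import numpy as np

# Hypothetical nearly rank-deficient 3x2 matrix (not from the notes).
A = np.array([[1.0, 1.0],
              [1.0, 1.0001],
              [1.0, 0.9999]])

kappa_A = np.linalg.cond(A)          # sigma_max / sigma_min
kappa_AtA = np.linalg.cond(A.T @ A)  # condition number of the normal-equations matrix

# kappa(A^T A) = kappa(A)^2: forming A^T A squares the condition number.
print(kappa_A, kappa_AtA)
```

This is why solving least squares through \(A^T A\) loses roughly twice as many digits as a QR- or SVD-based solver.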
Lecture 4 Notes: Solving Linear Systems
3.4 QR Factorization

Idea: We have to solve \(Ax = b\). If \(A\) is square: \(x = A^{-1} b\). If \(A\) is rectangular, we want to find \(\min_x \|Ax - b\|_2\).

Figure 3.1: QR Factorization.

What linear systems can we solve easily? Triangular systems.
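Upper-triangular systems are the easy case: each unknown falls out in turn by back substitution. A minimal sketch, assuming numpy (`back_substitute` is an illustrative helper, not a routine from the notes):

```python
import numpy as np

def back_substitute(R, b):
    """Solve Rx = b for upper-triangular, nonsingular R."""
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Remove the contribution of the already-computed unknowns x[i+1:].
        x[i] = (b[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x

R = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])
b = np.array([5.0, 10.0, 8.0])
x = back_substitute(R, b)  # satisfies R @ x = b
```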
Lecture 5 Notes: Graphical Analysis Problems
Givens rotations
\[
\begin{pmatrix} c & s \\ -s & c \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
=
\begin{pmatrix} \sqrt{x^2 + y^2} \\ 0 \end{pmatrix} \tag{3.20}
\]
Therefore,
\[
cx + sy = \sqrt{x^2 + y^2}, \tag{3.21}
\]
\[
sx = cy. \tag{3.22}
\]
Finally,
\[
c = \frac{x}{\sqrt{x^2 + y^2}}, \tag{3.23}
\]
\[
s = \frac{y}{\sqrt{x^2 + y^2}}. \tag{3.24}
\]
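The formulas (3.20)–(3.24) translate directly into code. A minimal sketch, assuming numpy (`givens` is an illustrative name):

```python
import numpy as np

def givens(x, y):
    """Return (c, s) with [[c, s], [-s, c]] @ [x, y] = [sqrt(x^2 + y^2), 0]."""
    r = np.hypot(x, y)  # sqrt(x^2 + y^2), computed without overflow
    if r == 0.0:
        return 1.0, 0.0  # any rotation works on the zero vector
    return x / r, y / r

c, s = givens(3.0, 4.0)
G = np.array([[c, s], [-s, c]])
print(G @ np.array([3.0, 4.0]))  # zeroes the second component
```

Applying such rotations to selected pairs of rows is how QR factorization by Givens rotations introduces zeros below the diagonal one entry at a time.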
Chapter 4
4.1 Householder reflectors

Example: Find
Lecture 6 Notes: Minimization
Least Squares Problems

\(Ax = b\), \(A \in \mathbb{R}^{m \times n}\), \(m \ge n\), \(\operatorname{rank}(A) = n\); minimize \(\|Ax - b\|_2\).

Minimize by setting the gradient to zero. Expanding
\[
\|Ax - b\|_2^2 = (Ax - b)^T (Ax - b) \tag{5.1}
\]
at a perturbed point, \(\|A(x+e) - b\|_2^2 = (A(x+e) - b)^T (A(x+e) - b)\), and the second-order term vanishes relative to \(\|e\|_2\):
\[
\lim_{e \to 0} \frac{e^T A^T A e}{\|e\|_2}
= \lim_{e \to 0} \frac{\|Ae\|_2^2}{\|e\|_2}
\le \lim_{e \to 0} \frac{\|A\|_2^2 \|e\|_2^2}{\|e\|_2}
= \lim_{e \to 0} \|A\|_2^2 \|e\|_2
= 0.
\]
Hence the gradient is \(2A^T(Ax - b)\), and setting it to zero gives the normal equations \(A^T A x = A^T b\).
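The normal equations \(A^T A x = A^T b\) can be solved directly and compared against a library least-squares routine. A sketch assuming numpy; the data are made up for illustration:

```python
import numpy as np

# Hypothetical overdetermined system: m = 4 equations, n = 2 unknowns,
# rank(A) = n, so A^T A is invertible.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Normal equations: A^T A x = A^T b.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

# Reference: numpy's least-squares solver (SVD-based).
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_ne, x_ls)  # both minimize ||Ax - b||_2
```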
Lecture 2 Notes: Matrix Mathematics
1.4 Orthogonal (Unitary) matrices

Tools of the trade:

Definition:
\[
Q^* = Q^{-1}, \tag{1.7}
\]
i.e., \(Q^* Q = I\).

Figure 1.1: Orthogonal Matrices.

\[
\langle q_i, q_j \rangle = q_i^* q_j = \delta_{ij}. \tag{1.8}
\]

Orthogonal matrices preserve the length of a vector:
\[
\|Qx\|_2 = \|x\|_2.
\]
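Length preservation is easy to check numerically. A minimal sketch assuming numpy, with Q taken from the QR factorization of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# The Q factor of any QR factorization is orthogonal.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

x = rng.standard_normal(4)
# Q^T Q = I and ||Qx||_2 = ||x||_2, up to rounding.
print(np.linalg.norm(Q @ x), np.linalg.norm(x))
```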
Lecture 3 Notes: SVD
1.8 SVD

Definition: For \(A \in \mathbb{C}^{m \times n}\), \(m \ge n\), the SVD of \(A\) is
\[
A = U \Sigma V^* \tag{1.26}
\]
where \(U\), \(V\) are unitary and \(\Sigma = \operatorname{diag}(\sigma_1, \dots, \sigma_n)\) with \(\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_n \ge 0\).

Figure 1.2: SVD.

1.9 Existence and Uniqueness

Theorem: Every matrix \(A\) has an SVD. The singular values are uniquely determined; if \(A\) is square and the \(\sigma_j\) are distinct, the singular vectors are unique up to complex signs.

Proof:
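The definition can be verified with a computed SVD. A sketch assuming numpy (reduced SVD, matching the case \(m \ge n\)):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))  # m >= n

# Reduced SVD: U is 5x3, s holds sigma_1 >= ... >= sigma_n >= 0, Vh = V*.
U, s, Vh = np.linalg.svd(A, full_matrices=False)
print(np.allclose(A, U @ np.diag(s) @ Vh))  # A = U Sigma V*
```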
Lecture 9 Notes: Algorithm Mathematics
7.6 Accuracy of a Backward Stable Algorithm
Theorem: Suppose a backward stable algorithm is used to solve a problem \(f : x \mapsto y\) with condition number \(\kappa\) on a computer satisfying
\[
fl(x \circledast y) = (x \circledast y)(1 + \epsilon), \qquad |\epsilon| \le \epsilon_{\text{machine}}, \tag{7.17}
\]
then the relative errors satisfy
\[
\frac{\|\tilde{f}(x) - f(x)\|}{\|f(x)\|} = O(\kappa(x)\, \epsilon_{\text{machine}}).
\]
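The axiom (7.17) can be observed directly in double precision. A minimal sketch, assuming numpy (used only for `finfo`); the values are made up:

```python
import numpy as np

eps = np.finfo(np.float64).eps  # epsilon_machine = 2**-52 for float64

# Each operation is exact up to one relative rounding error:
# fl(x * y) = (x * y)(1 + e), |e| <= eps.
x, y = 1.0 / 3.0, 1.0 / 7.0
rel_err = abs(x * y - 1.0 / 21.0) / (1.0 / 21.0)
print(eps, rel_err)  # rel_err is at most a few eps
```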
Lecture 8 Notes: Stability
7.1 Accuracy

Problem: Compute \(f(x)\) given \(x\); the result in floating point arithmetic is \(\tilde{f}(x)\).

Definition: An algorithm is accurate if
\[
\frac{\|\tilde{f}(x) - f(x)\|}{\|f(x)\|} = O(\epsilon_{\text{machine}}). \tag{7.1}
\]
This is usually too much to ask if \(f\) is ill-conditioned.
7.2 Stability

An algorithm \(\tilde{f}\) is stable if, for each \(x\),
\[
\frac{\|\tilde{f}(x) - f(\tilde{x})\|}{\|f(\tilde{x})\|} = O(\epsilon_{\text{machine}})
\]
for some \(\tilde{x}\) with \(\|\tilde{x} - x\| / \|x\| = O(\epsilon_{\text{machine}})\).
Lecture 10 Notes: Decomposition
9.1 Computing Eigenvalues

\[
Ax = \lambda x, \qquad A = X \Lambda X^{-1}, \qquad AX = X \Lambda.
\]

9.2 Schur Factorization

\[
A = Q T Q^* \tag{9.7}
\]
where \(T\) is upper triangular.

QR Decomposition

\[
Ax = b \;\Longrightarrow\; QRx = b \;\Longrightarrow\; Rx = Q^* b.
\]
Therefore, the computed solution \(\tilde{x}\) satisfies
\[
(R + \delta R)\tilde{x} = (Q + \delta Q)^* b
\]
such that \(\|\delta Q\| = O(\epsilon_{\text{machine}})\) and \(\|\delta R\| / \|R\| = O(\epsilon_{\text{machine}})\).
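The chain \(Ax = b \Rightarrow QRx = b \Rightarrow Rx = Q^*b\) is exactly how a QR-based solver proceeds. A sketch assuming numpy; the system is made up for illustration:

```python
import numpy as np

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])
b = np.array([5.0, -2.0, 9.0])

Q, R = np.linalg.qr(A)           # A = QR: Q orthogonal, R upper triangular
x = np.linalg.solve(R, Q.T @ b)  # Rx = Q^T b (Q^* = Q^T for real Q)
print(A @ x)  # recovers b up to rounding
```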