SPANNING SETS
If S = {v1, v2, . . . , vk} is a set of vectors in R^n, then the set of all linear combinations of v1, v2, . . . , vk is
called the Span of v1, v2, . . . , vk.
It is denoted by span(v1, v2, . . . , vk) or span(S).
If span(S) = R^n, then S is called a spanning set for R^n.
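A quick numerical way to test whether a vector lies in a span is to solve the corresponding linear system and check the residual. The sketch below uses NumPy; the vectors are illustrative values, not taken from the notes.

```python
import numpy as np

# Two vectors spanning a plane in R^3 (example values).
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([v1, v2])  # columns are the spanning vectors

def in_span(A, b, tol=1e-10):
    """b is in span of A's columns iff Ax = b has a solution,
    i.e. the least-squares residual is (numerically) zero."""
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.linalg.norm(A @ x - b) < tol

print(in_span(A, np.array([2.0, 3.0, 5.0])))  # 2*v1 + 3*v2 -> True
print(in_span(A, np.array([0.0, 0.0, 1.0])))  # not in the plane -> False
```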
Test 2 Review MATH 349
This is a general list of topics that you should use while studying for Test 2. This test will
cover the sections below. You should use this list as a guide while you study the suggested HW,
notes, etc. Not everything on this list is guaranteed to be on the test.
6.3
Change of Basis
Definition:
If we change the basis for a vector space V from an old basis B = {u1, u2, . . . , un}
to a new basis C = {v1, v2, . . . , vn}, then for each v in V,
(v)B = P(v)C
where the columns of P are the coordinate vectors of the new basis vectors v1, . . . , vn with respect to the old basis B.
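For bases of R^n the change-of-basis matrix can be computed and checked numerically. In the sketch below the two bases are assumed example values; writing the basis vectors as columns of MB and MC gives P = MB^(-1) MC, since v = MB (v)B = MC (v)C.

```python
import numpy as np

# Old basis B and new basis C for R^2, written in standard coordinates
# (example bases, not from the notes).
MB = np.array([[1.0, 1.0], [0.0, 1.0]])  # columns u1, u2
MC = np.array([[2.0, 0.0], [1.0, 1.0]])  # columns v1, v2

# Column j of P is the B-coordinate vector of the new basis vector vj,
# so P = MB^{-1} MC.
P = np.linalg.solve(MB, MC)

v = np.array([3.0, 5.0])
v_B = np.linalg.solve(MB, v)  # coordinates of v relative to B
v_C = np.linalg.solve(MC, v)  # coordinates of v relative to C

print(np.allclose(v_B, P @ v_C))  # (v)_B = P (v)_C -> True
```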
6.1
Vector Spaces and Subspaces
Definition: Vector Space Axioms
Let V be an arbitrary nonempty set of objects on which two operations are
defined: addition, and multiplication by scalars. By addition we mean a rule for
associating with each pair of objects u and v in V an object u + v, called the sum of u and v.
5.2
Orthogonal Complements and Orthogonal Projections
Definition:
Let W be a subspace of R^n. We say that a vector v in R^n is Orthogonal to W
if v is orthogonal to every vector in W. The set of all vectors that are orthogonal
to W is called the Orthogonal Complement of W, denoted W⊥.
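The orthogonal projection onto a subspace W, and the leftover component in W⊥, can be computed directly. This is a minimal sketch via the normal equations, assuming W is given as the column space of an example matrix:

```python
import numpy as np

# W = column space of A (a plane in R^3; example values, not from the notes).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

def proj(A, v):
    """Orthogonal projection of v onto col(A): A (A^T A)^{-1} A^T v."""
    return A @ np.linalg.solve(A.T @ A, A.T @ v)

v = np.array([1.0, 2.0, 3.0])
p = proj(A, v)
perp = v - p   # component of v lying in the orthogonal complement W_perp

# perp is orthogonal to every column of A, hence to all of W.
print(np.allclose(A.T @ perp, 0))  # True
```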
3.3
The Inverse of a Matrix
Definition: Inverse
If A is a square matrix, and if a matrix B of the same size can be found such
that AB = BA = I, then A is said to be invertible (or nonsingular) and B is
called the inverse of A.
If such a matrix B cannot be found, then A is said to be not invertible (or singular).
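The defining equations AB = BA = I are easy to check numerically. A minimal sketch with an example invertible matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # det = -2, so A is invertible (example matrix)
B = np.linalg.inv(A)

I = np.eye(2)
print(np.allclose(A @ B, I) and np.allclose(B @ A, I))  # True
```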
3.2
Matrix Operations
Properties of Matrix Arithmetic:
1. A + B = B + A (commutative addition)
2. A + (B + C) = (A + B) + C (associative addition)
3. A(BC) = (AB)C (associative multiplication)
4. A(B + C) = AB + AC (left distributive)
5. (B + C)A = BA + CA (right distributive)
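All five properties can be spot-checked on random matrices; note that matrix multiplication is, in general, not commutative, which is why both distributive laws are listed separately. A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

assert np.allclose(A + B, B + A)                # commutative addition
assert np.allclose(A + (B + C), (A + B) + C)    # associative addition
assert np.allclose(A @ (B @ C), (A @ B) @ C)    # associative multiplication
assert np.allclose(A @ (B + C), A @ B + A @ C)  # left distributive
assert np.allclose((B + C) @ A, B @ A + C @ A)  # right distributive
assert not np.allclose(A @ B, B @ A)            # multiplication is NOT commutative in general
print("all properties hold")
```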
3.1
Matrix Operations
Definitions:
A Matrix is a rectangular array of numbers called Entries, or Elements, of
the matrix.
A general m × n matrix has the form:

    | a11  a12  ...  a1n |
    | a21  a22  ...  a2n |
    |  .    .    .    .  |
    | am1  am2  ...  amn |

This can be written more compactly as A = [aij].
6.5
The Kernel and Range of a Linear Transformation
Definitions:
Let T : V → W be a linear transformation.
The Kernel of T, denoted ker(T), is the set of all vectors in V that are mapped
by T to 0 in W. That is,
ker(T) = {v in V : T(v) = 0}
The Range of T, denoted range(T), is the set of all vectors in W that are images
under T of at least one vector in V. That is, range(T) = {T(v) : v in V}.
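For a matrix map T(x) = Ax, the kernel is the null space of A and the range is the column space. One way to get a kernel basis numerically is from the SVD; the matrix below is an assumed example:

```python
import numpy as np

# T: R^3 -> R^2 represented by its standard matrix (example values).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1, so the kernel is 2-dimensional

def null_space(A, tol=1e-10):
    """Orthonormal basis for ker(T) from the SVD: right singular
    vectors whose singular values are (numerically) zero."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T  # columns span the kernel

N = null_space(A)
print(N.shape[1])                              # dim ker(T) = 2
print(np.allclose(A @ N, 0))                   # every kernel vector maps to 0 -> True
# Rank-Nullity: rank(T) + dim ker(T) = dim V
print(np.linalg.matrix_rank(A) + N.shape[1])   # 3
```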
6.4
Linear Transformations
Definition:
If T : V → W is a function from a vector space V to a vector space W, then T
is called a linear transformation from V to W if the following hold for all u and v
in V and all scalars c.
1. T(u + v) = T(u) + T(v)
2. T(cu) = cT(u)
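The two axioms can be spot-checked on concrete maps. The sketch below uses an example matrix map (always linear) and a translation (not linear, since it fails both axioms):

```python
import numpy as np

A = np.array([[2.0, 0.0], [1.0, 3.0]])  # example matrix
T = lambda x: A @ x      # matrix maps T(x) = Ax are linear
S = lambda x: x + 1.0    # translation: not linear

u, v, c = np.array([1.0, 2.0]), np.array([3.0, -1.0]), 2.5

print(np.allclose(T(u + v), T(u) + T(v)))  # axiom 1 holds -> True
print(np.allclose(T(c * u), c * T(u)))     # axiom 2 holds -> True
print(np.allclose(S(u + v), S(u) + S(v)))  # fails for the translation -> False
```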
3.4
The LU Factorization
Definition:
Let A be a square matrix. A factorization of A as A = LU, where L is unit
lower triangular and U is upper triangular, is called an LU Factorization of A.
A Unit Lower Triangular matrix has 1s along the main diagonal and zeros above the main diagonal.
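The factorization comes straight from Gaussian elimination: U is the echelon form and the entries of L below the diagonal are the multipliers. A minimal sketch, assuming A can be reduced without row exchanges (no pivoting):

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization without row exchanges: a sketch that
    assumes every pivot encountered is nonzero."""
    n = A.shape[0]
    L = np.eye(n)                  # unit lower triangular: 1s on the diagonal
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # elimination multiplier
            U[i, k:] -= L[i, k] * U[k, k:]   # row reduce
    return L, U

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])   # example matrix
L, U = lu_no_pivot(A)
print(np.allclose(L @ U, A))                                    # True
print(np.allclose(np.tril(L), L), np.allclose(np.triu(U), U))   # True True
```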
5.3
The Gram-Schmidt Process and the QR Factorization
The Gram-Schmidt Process: Orthogonalization of a basis
To convert any basis {x1, x2, . . . , xk} into an orthogonal basis {v1, v2, . . . , vk}:
Step 1:
v1 = x1
Step 2:
v2 = x2 − (⟨x2, v1⟩ / ⟨v1, v1⟩) v1
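The steps above continue the same way: each vj is xj minus its projections onto the v's already built. A minimal sketch of classical Gram-Schmidt on the columns of a matrix (example values):

```python
import numpy as np

def gram_schmidt(X):
    """Classical Gram-Schmidt on the columns of X:
    v_j = x_j - sum over i<j of (<x_j, v_i>/<v_i, v_i>) v_i."""
    V = []
    for x in X.T:
        v = x.astype(float).copy()
        for w in V:
            v -= (x @ w) / (w @ w) * w   # subtract projection onto w
        V.append(v)
    return np.column_stack(V)

X = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])   # columns are the basis x1, x2, x3
V = gram_schmidt(X)
# V^T V is diagonal: the new columns are mutually orthogonal.
print(np.allclose(V.T @ V, np.diag(np.diag(V.T @ V))))  # True
```

Normalizing each column of V afterwards gives the orthonormal Q of the QR factorization.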
Test 3 Review MATH 349
This is a general list of topics that you should use while studying. This test will cover the
sections below. You should use this list as a guide while you study the suggested HW, notes, etc.
Not everything on this list is guaranteed to be on the test.
3.5
Subspaces, Basis, Dimension, and Rank
Definition:
A subset W of a vector space V is called a subspace of V if W itself is a vector
space under the addition and scalar multiplication defined on V.
a) If u and v are vectors in W, then u + v is in W.
b) If k is any scalar and u is in W, then ku is in W.
DETERMINANTS
If A is an n × n triangular matrix, then the determinant of A is the product of the main diagonal entries:
det(A) = a11 a22 a33 · · · ann
Let A be a square matrix. If A has a row of zeros or a column of zeros, then det(A) = 0.
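Both facts are easy to confirm numerically; the triangular matrix below is an assumed example:

```python
import numpy as np

A = np.array([[2.0, 5.0,  1.0],
              [0.0, 3.0,  4.0],
              [0.0, 0.0, -1.0]])   # upper triangular (example values)

# det of a triangular matrix = product of the diagonal: 2 * 3 * (-1) = -6
print(np.isclose(np.linalg.det(A), np.prod(np.diag(A))))  # True

B = A.copy()
B[1] = 0.0   # introduce a row of zeros
print(np.isclose(np.linalg.det(B), 0.0))  # True
```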
THE MATRIX OF LINEAR TRANSFORMATION
Let V and W be two finite-dimensional vector spaces with bases B and C, respectively, where B = {v1, v2, . . . , vn}. If T : V → W is a linear transformation, then the m × n matrix A defined by
A = [ [T(v1)]C | [T(v2)]C | · · · | [T(vn)]C ]
is the matrix of T with respect to the bases B and C, and it satisfies A[v]B = [T(v)]C for every v in V.
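As a sketch of the construction, take the standard example T = d/dx from P2 (polynomials of degree at most 2) to P1, with bases B = {1, x, x²} and C = {1, x}; this example is assumed here, not taken from the notes. The columns of A are [T(1)]C, [T(x)]C, [T(x²)]C:

```python
import numpy as np

# T = d/dx : P2 -> P1 with bases B = {1, x, x^2} and C = {1, x}.
# T(1) = 0 -> [0, 0];  T(x) = 1 -> [1, 0];  T(x^2) = 2x -> [0, 2].
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

p_B = np.array([3.0, 5.0, 7.0])   # p(x) = 3 + 5x + 7x^2 in B-coordinates
Tp_C = A @ p_B                    # p'(x) = 5 + 14x in C-coordinates
print(Tp_C)                       # [ 5. 14.]
```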
Change of Basis
If we change the basis for a vector space V from an old basis B = {u1, u2, . . . , un} to a new basis C = {v1,
v2, . . . , vn}, then for each v in V, (v)B = P(v)C, where the columns of P are the coordinate vectors of the
new basis vectors with respect to the old basis B.
LINEAR TRANSFORMATIONS
If T : V → W is a function from a vector space V to a vector space W, then T is called a linear
transformation from V to W if the following hold for all u and v in V and all scalars c.
T(u + v) = T(u) + T(v)
T(cu) = cT(u)
SCALAR MULTIPLICATION
Scalar Multiplication: Multiplying a vector v by a scalar k, kv, will stretch or shrink the vector. The
direction will not change unless k < 0, in which case it will point in the opposite direction. Two vectors are
parallel (or collinear) if one is a scalar multiple of the other.
INTRO
The Geometry and Algebra of Vectors Definitions:
1.1 Vectors are line segments with specified direction and length (or magnitude).
1.2 Equivalent or Equal vectors have the same magnitude and direction (parallel) but don't have to be in
the same location.
ORTHOGONAL COMPLEMENTS
Let W be a subspace of R^n. We say that a vector v in R^n is Orthogonal to W if v is orthogonal to every
vector in W.
The set of all vectors that are orthogonal to W is called the Orthogonal Complement of W, denoted W⊥:
W⊥ = {v in R^n : v · w = 0 for all w in W}
LENGTH AND ANGLE
Let u, v, and w be vectors in R^n and let c be a scalar. Then
(a) u · v = v · u
(b) u · (v + w) = u · v + u · w
(c) (cu) · v = c(u · v)
(d) u · u ≥ 0, and u · u = 0 if and only if u = 0
The Length or Norm of a vector v = [v1, v2, . . . , vn] in R^n is the non-negative scalar ||v|| = √(v · v) = √(v1² + v2² + · · · + vn²).
The Triangle Inequality: For all vectors u and v in R^n, ||u + v|| ≤ ||u|| + ||v||.
The Distance between vectors u and v in R^n is defined by d(u, v) = ||u − v||.
For nonzero vectors u and v in R^n, the angle θ between them can be found by cos θ = (u · v) / (||u|| ||v||).
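These formulas map directly onto NumPy; the two vectors below are assumed example values:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])   # example vectors

norm_u = np.linalg.norm(u)    # ||u|| = sqrt(u . u) = sqrt(9 + 16) = 5
dist = np.linalg.norm(u - v)  # d(u, v) = ||u - v||
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(cos_theta)  # angle between u and v, in radians

# Triangle inequality: ||u + v|| <= ||u|| + ||v||
print(np.linalg.norm(u + v) <= norm_u + np.linalg.norm(v))  # True
print(norm_u)      # 5.0
print(cos_theta)   # 3/5 = 0.6
```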
Chapter 6
Vector Spaces
6.1
Vector Spaces and Subspaces
V is the set of all vectors in R^2 whose first and second components are the same, i.e. vectors of the form
[x, x].
We verify all ten axioms of a vector space. For example, closure under addition:
[x, x] + [y, y] = [x + y, x + y]
which is in V since its first and second components are the same.
Chapter 4
Eigenvalues and Eigenvectors
4.1
Introduction to Eigenvalues and Eigenvectors
1. Following Example 4.1, to see that v = [1, 1] is an eigenvector of A = [0 3; 3 0], we show that Av is a multiple of v:
Av = [0 3; 3 0][1; 1] = [0·1 + 3·1; 3·1 + 0·1] = [3; 3] = 3[1; 1] = 3v.
Thus v is an eigenvector of A with corresponding eigenvalue 3.
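The same check is a one-liner in NumPy, assuming the matrix and eigenvector in the example are A = [0 3; 3 0] and v = [1, 1]:

```python
import numpy as np

A = np.array([[0.0, 3.0],
              [3.0, 0.0]])
v = np.array([1.0, 1.0])

print(np.allclose(A @ v, 3 * v))            # Av = 3v -> True, eigenvalue 3
print(np.sort(np.linalg.eigvals(A).real))   # both eigenvalues of A: [-3.  3.]
```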
Chapter 3
Matrices
3.1
Matrix Operations
1. Since A and D have the same shape, the sum A + 2D makes sense: multiply each entry of D by 2, then add corresponding entries of A.
2. Since D and A have the same shape, this operation makes sense.
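An operation like A + 2D is computed entrywise; the matrices below are illustrative stand-ins, not the ones from the worked example:

```python
import numpy as np

# Stand-in matrices of the same shape (assumed values, not from the notes).
A = np.array([[1.0, 4.0],
              [2.0, 5.0]])
D = np.array([[0.0, -3.0],
              [3.0,  0.0]])

result = A + 2 * D   # scalar multiple first, then entrywise sum
print(result)        # A + 2D = [[1, -2], [8, 5]]
```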
Test 4 Review MATH 349
This is a general list of topics that you should use while studying. This test will cover the
sections below. You should use this list as a guide while you study the suggested HW, notes, etc.
Not everything on this list is guaranteed to be on the test.
A linear transformation T : V → W is called One-to-One if T maps distinct vectors in V to distinct vectors
in W.
T : V → W is one-to-one if, for all u and v in V, u ≠ v implies that T(u) ≠ T(v).
Equivalently, T : V → W is one-to-one if, for all u and v in V, T(u) = T(v) implies that u = v.
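For a matrix map T(x) = Ax, being one-to-one is equivalent to ker(A) = {0}, i.e. to the columns of A being linearly independent. A sketch using a rank check on two assumed example matrices:

```python
import numpy as np

A1 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [1.0, 1.0]])   # rank 2 = number of columns -> one-to-one
A2 = np.array([[1.0, 2.0],
               [2.0, 4.0]])   # rank 1 < 2 columns -> not one-to-one

def one_to_one(A):
    """T(x) = Ax is one-to-one iff the columns of A are independent."""
    return np.linalg.matrix_rank(A) == A.shape[1]

print(one_to_one(A1), one_to_one(A2))  # True False
```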
A subset W of a vector space V is called a subspace of V if W itself is a vector space under the addition
and scalar multiplication defined on V.
a) If u and v are vectors in W, then u + v is in W.
b) If k is any scalar and u is in W, then ku is in W.