Section 3.4
LU Factorization
Introduction
In this section we revisit the problem of finding
solutions to a system of linear equations
We develop a new approach using matrices that can
be more efficient.
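The matrix approach previewed here can be sketched as an LU factorization. The following is a minimal illustration (Doolittle style, no pivoting, assuming every pivot is nonzero); the helper names lu, forward_sub, and back_sub are illustrative, not from the text.

```python
# A minimal LU factorization sketch (no pivoting; assumes nonzero pivots).
# Writes A = LU with L unit lower triangular and U upper triangular,
# then solves Ax = b by forward and back substitution.
from fractions import Fraction

def lu(A):
    n = len(A)
    U = [[Fraction(x) for x in row] for row in A]
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]   # multiplier that zeros out U[i][k]
            L[i][k] = m             # record the multiplier in L
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return L, U

def forward_sub(L, b):
    # Solve Ly = b (L is unit lower triangular).
    y = []
    for i in range(len(b)):
        y.append(Fraction(b[i]) - sum(L[i][j] * y[j] for j in range(i)))
    return y

def back_sub(U, y):
    # Solve Ux = y (U is upper triangular).
    n = len(y)
    x = [Fraction(0)] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

A = [[2, 1, 1], [4, 3, 3], [8, 7, 9]]
L, U = lu(A)
x = back_sub(U, forward_sub(L, [4, 10, 24]))   # solves Ax = b
```

Once L and U are known, each new right-hand side b costs only two triangular solves, which is the efficiency gain the section refers to.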
Section 7.3
Basis and Dimension
Basis
Let B be a subset of a vector space V.
Then B is a basis of V if B is linearly independent
and spans V.
Example 1
Is the set {x^2 + 4x - 3, x^2 + 1, x - 2} a basis for P2?
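Treating the set as {x^2 + 4x - 3, x^2 + 1, x - 2} (signs reconstructed from the garbled text), the question can be checked by representing each polynomial p(x) = a2x^2 + a1x + a0 by its coefficient vector (a2, a1, a0): three such vectors form a basis of P2 exactly when the 3x3 determinant is nonzero. The helper det3 is illustrative.

```python
# Basis test for P2 via coefficient vectors: the set is a basis
# iff the determinant of the coefficient matrix is nonzero.
def det3(M):
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

coeffs = [
    [1, 4, -3],  # x^2 + 4x - 3
    [1, 0, 1],   # x^2 + 1
    [0, 1, -2],  # x - 2
]
d = det3(coeffs)        # d = 4
is_basis = (d != 0)     # nonzero determinant: the set is a basis
```

Since the determinant is 4, the three polynomials are linearly independent and span P2.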
Section 9.2
Isomorphisms
Definition
A linear transformation T : V → W is
an isomorphism if T is both one-to-one and onto.
If such an isomorphism exists, then we say
that V and W are isomorphic vector spaces.
Section 6.4
Diagonalization
Introduction
If D is a diagonal matrix, then it is relatively easy to
analyze the behavior of the linear transformation
T(x) = Dx because for x in Rn, Dx simply multiplies each
component of x by the corresponding diagonal entry of D.
We develop a procedure for diagonalizing a matrix.
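The ease of analyzing T(x) = Dx can be sketched directly: for a diagonal D, applying D (or any power of D) to x never requires a matrix multiplication.

```python
# For D = diag(d1, ..., dn), Dx scales each component of x, and
# D^k scales by the k-th powers of the diagonal entries.
def diag_apply(d, x):
    # Compute Dx componentwise for D = diag(d).
    return [di * xi for di, xi in zip(d, x)]

def diag_power_apply(d, k, x):
    # Compute D^k x with no matrix products at all.
    return [di**k * xi for di, xi in zip(d, x)]

d = [2, 3, 5]
x = [1, 1, 1]
y = diag_apply(d, x)            # [2, 3, 5]
z = diag_power_apply(d, 4, x)   # [16, 81, 625]
```

This is why diagonalizing a matrix pays off: once T is expressed through a diagonal matrix, repeated application reduces to powering scalars.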
Section 10.2
The Gram-Schmidt Process
Revisited
The vectors {v1, . . . , vk} in an inner product space V form an
orthogonal set if ⟨vi, vj⟩ = 0 for i ≠ j.
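The defining condition ⟨vi, vj⟩ = 0 for i ≠ j can be checked mechanically. A small sketch, using the standard dot product on Rn as the inner product (the helper names are illustrative):

```python
# Verify that a set of vectors is orthogonal: every pair of distinct
# vectors must have inner product zero.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthogonal_set(vectors):
    return all(dot(vectors[i], vectors[j]) == 0
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

S = [[1, 1, 0], [1, -1, 2], [1, -1, -1]]   # pairwise dot products are 0
```

For a different inner product on V, only the dot function would change; the pairwise-zero test is the same.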
Section 7.2
Span and Linear
Independence
Introduction
In this section we extend the concepts of span and
linear independence from Euclidean space to vector
spaces.
The definitions here are similar to those given earlier for Rn.
Section 6.3
Change of Basis
Introduction
We have seen that there are numerous different bases
for Rn (or a subspace of Rn).
In this section we develop a general procedure for
changing from one basis to another.
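A small change-of-basis sketch in R2, assuming the general procedure amounts to re-solving for coordinates (the basis vectors, coordinates, and helper solve2 below are illustrative examples, not from the text): given the coordinates of v relative to one basis, recover v, then solve a linear system to get its coordinates relative to the other basis.

```python
# Change of basis in R^2: v = 2*b1 + 3*b2 in the old basis B; find the
# coordinates of the same v relative to a new basis {a1, a2}.
from fractions import Fraction

def solve2(M, b):
    # Solve the 2x2 system M x = b by Cramer's rule (assumes det != 0).
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    x1 = b[0] * M[1][1] - M[0][1] * b[1]
    x2 = M[0][0] * b[1] - b[0] * M[1][0]
    return [Fraction(x1, det), Fraction(x2, det)]

b1, b2 = [1, 1], [1, -1]    # old basis B
a1, a2 = [1, 0], [1, 1]     # new basis A
v = [2 * b1[i] + 3 * b2[i] for i in range(2)]   # v = [5, -1]
# Solve [a1 a2] x = v: x is the coordinate vector of v relative to A.
coords_A = solve2([[a1[0], a2[0]], [a1[1], a2[1]]], v)   # [6, -1]
```

Here 6*a1 + (-1)*a2 = (5, -1) = v, confirming the new coordinates describe the same vector.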
Section 1.1
Lines and Linear Equations
Designing Hot Water System
300-liter system
Available solutions
18% glycol
50% glycol
Need 36% glycol solution
Solution
Linear Equation
A linear equation has the form a1x1 + a2x2 + . . . + anxn = b.
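The hot water design above leads to two linear equations: if x liters of 18% glycol are mixed with y liters of 50% glycol, then x + y = 300 (total volume) and 0.18x + 0.50y = 0.36 * 300 (glycol content). A sketch of solving it by substitution, using exact fractions:

```python
# Mixing problem: x liters of 18% solution + y liters of 50% solution
# = 300 liters of 36% solution.
#   x + y = 300
#   0.18 x + 0.50 y = 0.36 * 300
from fractions import Fraction

total = 300
target = Fraction(36, 100) * total   # 108 liters of pure glycol needed
# Substitute x = 300 - y into the glycol equation and solve for y.
y = (target - Fraction(18, 100) * total) / (Fraction(50, 100) - Fraction(18, 100))
x = total - y
# x = 525/4 = 131.25 liters of 18%, y = 675/4 = 168.75 liters of 50%
```

Both equations check out: the volumes sum to 300 liters and the glycol content is exactly 108 liters.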
Section 2.2
Span
VecMobile II
Imagine that you live in the two-dimensional plane R2.
You purchased a new car, the VecMobile II.
The VecMobile II is delivered at the origin (0, 0).
At any given time,
Section 3.3
Inverses
Introduction
We defined the linear transformation and developed
the properties of this type of function.
We consider the problem of reversing a linear
transformation.
An application follows.
Section 3.2
Matrix Algebra
Addition and Scalar
Multiplication
Let c be a scalar, and let A and B
be n × m matrices.
Then addition and scalar multiplication of matrices
are defined as follows:
Example 1
Let
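The entrywise definitions of matrix addition and scalar multiplication can be sketched directly (matrices stored as lists of rows; helper names are illustrative):

```python
# Entrywise matrix addition and scalar multiplication for n x m matrices.
def mat_add(A, B):
    # (A + B)[i][j] = A[i][j] + B[i][j]
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scalar_mult(c, A):
    # (cA)[i][j] = c * A[i][j]
    return [[c * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
S = mat_add(A, B)       # [[6, 8], [10, 12]]
T = scalar_mult(3, A)   # [[3, 6], [9, 12]]
```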
Section 6.5
Complex Eigenvalues
Introduction
Some characteristic polynomials have roots that are
not real numbers.
Example:
|A − λI2| = (3 − λ)^2 + 4
Complex Numbers
Complex plane
Real and imaginary parts
Modulus
Argument
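For the example characteristic polynomial (3 − λ)^2 + 4, setting it to zero gives (3 − λ)^2 = −4, so the eigenvalues are the conjugate pair 3 ± 2i. A sketch of that computation, with the modulus of each root:

```python
# Roots of the characteristic equation (3 - lam)^2 + 4 = 0:
# (3 - lam)^2 = -4  =>  lam = 3 -/+ 2i, a complex conjugate pair.
import cmath

roots = [3 - cmath.sqrt(-4), 3 + cmath.sqrt(-4)]   # [3-2j, 3+2j]
# Modulus of each eigenvalue: sqrt(3^2 + 2^2) = sqrt(13)
moduli = [abs(r) for r in roots]
```

Both roots satisfy the original equation, and complex eigenvalues of a real matrix always occur in such conjugate pairs.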
Section 3.1
Linear Transformations
Definitions
Let T : Rm → Rn denote a function T that takes
vectors in Rm as input and produces vectors in Rn as
output.
The domain of T is Rm and the codomain is Rn.
Section 4.3
Row and Column Spaces
Definitions
Let A be an n × m matrix.
(a) The row space of A is the subspace of Rm spanned by
the row vectors of A and is denoted by row(A).
(b) The column space of A is the subspace of Rn spanned by
the column vectors of A and is denoted by col(A).
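The dimensions of row(A) and col(A) coincide, and both equal the number of nonzero rows in an echelon form of A. A minimal rank sketch using exact fractions (the helper rank is illustrative):

```python
# Rank via row reduction: dim row(A) = dim col(A) = number of pivot rows
# in an echelon form of A.
from fractions import Fraction

def rank(A):
    M = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0                                  # next pivot row to fill
    for c in range(cols):
        # Find a row at or below r with a nonzero entry in column c.
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            m = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= m * M[r][j]
        r += 1
    return r

A = [[1, 2, 3], [2, 4, 6], [1, 0, 1]]   # second row = 2 * first row
```

Here the second row duplicates the first, so the row space (and column space) is only two-dimensional.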
Section 2.1
Vectors
Vectors and Rn
A vector is an ordered list of real numbers u1, u2, . . . , un
expressed as a column vector

u = [u1, u2, . . . , un]^T

or as u = (u1, u2, . . . , un).
The set of all vectors with n entries is denoted by Rn.
Section 4.1
Introduction to Subspaces
Subspace Definition
A subset S of Rn is a subspace if S satisfies the
following three conditions:
(a) S contains 0, the zero vector.
(b) If u and v are in S, then u + v is in S.
(c) If c is a real number and u is in S, then cu is in S.
Section 10.1
Inner Products
Definition of Inner Product
We would like to extend the dot product to a similar product in
other vector spaces.
Let u, v, and w be elements of a vector space V, and let c be a scalar.
Section 1.4
Applications of Linear
Systems
Traffic Flow
Our goal is to find x1, x2, x3, and x4, the volume of
traffic along each side of the plaza.
Planetary Orbital Periods
The planets that are closest to the sun have the shortest orbital periods.
Section 7.1
Vector Spaces and Subspaces
Set of Polynomials
Let P2 be the set of all polynomials with real
coefficients that have degree 2 or less.
A typical element of P2 has the form
p(x) = a2x^2 + a1x + a0.
Section 1.2
Linear Systems and Matrices
Introduction
In practice, linear systems rarely arrive in echelon form.
The primary goal of this section is to develop a
systematic procedure for transforming any linear
system into an equivalent one in echelon form.
Section 5.2
Properties of the Determinant
Introduction
Instead of using cofactor expansion to compute the
determinant, it is typically faster to
1. first convert the matrix to echelon form using row
operations, and then
2. compute the determinant of the echelon form, adjusting
for the effect of each row operation used.
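The row-reduction route to the determinant can be sketched as follows, assuming the standard accounting: a row swap flips the sign, adding a multiple of one row to another leaves the determinant unchanged, and the determinant of the echelon form is the product of its diagonal entries. The helper det is illustrative.

```python
# Determinant by row reduction: eliminate below each pivot (which does
# not change the determinant), track row swaps (each flips the sign),
# then multiply the diagonal entries of the echelon form.
from fractions import Fraction

def det(A):
    M = [[Fraction(x) for x in row] for row in A]
    n = len(M)
    sign = 1
    for c in range(n):
        pivot = next((i for i in range(c, n) if M[i][c] != 0), None)
        if pivot is None:
            return Fraction(0)      # no pivot in this column: det = 0
        if pivot != c:
            M[c], M[pivot] = M[pivot], M[c]
            sign = -sign            # a row swap flips the sign
        for i in range(c + 1, n):
            m = M[i][c] / M[c][c]
            for j in range(c, n):
                M[i][j] -= m * M[c][j]   # row replacement: det unchanged
    p = Fraction(1)
    for i in range(n):
        p *= M[i][i]
    return sign * p

A = [[0, 2], [3, 4]]   # one swap needed; det = -6
```

For an n × n matrix this costs on the order of n^3 operations, versus roughly n! for full cofactor expansion.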
Section 8.2
Projection and
the Gram-Schmidt Process
Projection onto Vectors
Let u and v be vectors in Rn, with v nonzero.
Then the projection of u onto v is given by
proj_v u = ((u · v)/(v · v)) v.
Example 1
Find proj_v u for u = and v =
THEOREM 8.14
Let
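The projection formula proj_v u = ((u · v)/(v · v)) v can be sketched directly; the vectors below are illustrative (Example 1's specific u and v are not recoverable from the text).

```python
# Projection of u onto a nonzero vector v under the standard dot product.
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(v, u):
    # proj_v u = (u.v / v.v) v
    c = Fraction(dot(u, v), dot(v, v))
    return [c * vi for vi in v]

u, v = [1, 2], [3, 0]
p = proj(v, u)   # [1, 0]
```

A defining property worth checking: the residual u − proj_v u is orthogonal to v, which is what makes the projection the closest point to u along v.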
Section 2.3
Linear Independence
Linear Independence
Let {u1, u2, . . . , um} be a set of vectors in Rn.
If the only solution to the vector equation
x1u1 + x2u2 + . . . + xmum = 0
is the trivial solution x1 = x2 = . . . = xm = 0,
then the set is linearly independent.
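Whether the homogeneous equation has only the trivial solution can be decided by row reduction: the set is independent exactly when the matrix with the ui as rows has rank m. A sketch with an illustrative helper:

```python
# Independence test: {u1, ..., um} is linearly independent iff the
# matrix with the ui as rows has rank m (no nonzero combination is 0).
from fractions import Fraction

def independent(vectors):
    M = [[Fraction(x) for x in v] for v in vectors]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            m = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= m * M[r][j]
        r += 1
    return r == rows   # full rank means only the trivial solution

u = [[1, 0, 1], [0, 1, 1], [1, 1, 2]]   # u3 = u1 + u2: dependent
```

Here u3 = u1 + u2, so a nontrivial solution (x1, x2, x3) = (1, 1, −1) exists and the test reports dependence.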
Section 9.1
Linear Transformation
Definitions and Properties
Extended Definition
Extending the definition to allow domains and
codomains that are vector spaces.
Let V and W be vector spaces.
Then T : V → W is a linear transformation if
T(u + v) = T(u) + T(v) and T(cv) = cT(v)
for all u and v in V and all scalars c.
Section 4.2
Basis and Dimension
Introduction
We combine the concepts of linearly independent sets
and spanning sets to learn more about subspaces.
Let S = span{u1, u2, . . . , um} be a subspace of Rn.
Section 6.2
Approximation Methods
Introduction
We saw how to use the characteristic polynomial to
find the eigenvalues (and then the eigenvectors) for
matrices.
Such methods work fine for the small matrices we have
considered, but they are impractical for large matrices.
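One standard approximation technique for large matrices is the power method (assumed here as the section's representative example): repeated multiplication by A pulls a starting vector toward the dominant eigenvector, and a Rayleigh quotient then estimates the dominant eigenvalue. The helper names and the test matrix are illustrative.

```python
# Power method sketch: iterate x -> Ax (rescaled), then estimate the
# dominant eigenvalue with the Rayleigh quotient (x . Ax) / (x . x).
def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def power_method(A, x, steps=50):
    for _ in range(steps):
        y = mat_vec(A, x)
        norm = max(abs(v) for v in y)   # rescale to avoid overflow
        x = [v / norm for v in y]
    y = mat_vec(A, x)
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

A = [[2, 1], [1, 2]]             # eigenvalues 3 and 1
lam = power_method(A, [1.0, 0.0])
```

Each step costs only a matrix-vector product, and convergence is geometric at rate |λ2/λ1|, so no characteristic polynomial is ever formed.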
Section 9.3
The Matrix of a Linear
Transformation
Coordinate Vector
Let V be a vector space with basis G = {g1, . . . , gm}.
For each v = c1g1 + . . . + cmgm in V, we define the
coordinate vector of v with respect to G by [v]_G = (c1, . . . , cm).
Notes
Although
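The coordinate-vector definition can be sketched in R2 (an illustrative special case): for a basis G = {g1, g2} and a vector v, the coordinates (c1, c2) with v = c1g1 + c2g2 come from solving the 2x2 system with g1 and g2 as columns. The helper coordinate_vector is illustrative.

```python
# Coordinate vector in R^2: solve [g1 g2] c = v for c = [v]_G.
from fractions import Fraction

def coordinate_vector(g1, g2, v):
    # Cramer's rule on the matrix with g1, g2 as columns
    # (assumes g1, g2 form a basis, so the determinant is nonzero).
    det = g1[0] * g2[1] - g2[0] * g1[1]
    c1 = Fraction(v[0] * g2[1] - g2[0] * v[1], det)
    c2 = Fraction(g1[0] * v[1] - v[0] * g1[1], det)
    return [c1, c2]

c = coordinate_vector([1, 1], [1, -1], [5, -1])   # [2, 3]
```

Indeed 2·(1, 1) + 3·(1, −1) = (5, −1), so [v]_G = (2, 3) for this basis.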