MAT188 - The Big Theorem
Let {v1 , . . . , vn } be a set of n vectors in Rm, let A = [v1 . . . vn ] be the corresponding m × n matrix, and let T : Rn → Rm be the linear transformation T(x) = Ax.
The following are equivalent:
{v1 , . . . , vn } spans Rm.
{v1 , . . . , vn } is linearly independent.
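As a quick numerical sketch (my own 2 × 2 example, not from the notes): a nonzero determinant certifies the equivalent conditions at once, and Cramer's rule exhibits any b as a combination of the columns.

```python
# Columns v1 = (1, 1), v2 = (1, 2) of a 2x2 matrix A = [v1 v2].
# det(A) != 0 certifies the Big Theorem conditions: the columns
# span R^2 and are linearly independent.

def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a * d - b * c

v1 = (1, 1)
v2 = (1, 2)
d = det2(v1[0], v2[0], v1[1], v2[1])  # columns are v1, v2
assert d != 0

# Spanning: any b = (b1, b2) is a combination x1*v1 + x2*v2,
# with coefficients given explicitly by Cramer's rule.
b = (3, 5)
x1 = det2(b[0], v2[0], b[1], v2[1]) / d
x2 = det2(v1[0], b[0], v1[1], b[1]) / d
assert x1 * v1[0] + x2 * v2[0] == b[0]
assert x1 * v1[1] + x2 * v2[1] == b[1]
```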
MAT188 - Oct 30
1 Determinant function
A square matrix A is invertible if there is another square matrix B such that AB = BA = I. Here

    I = [ 1 0 ... 0 ]
        [ 0 1 ... 0 ]
        [ : :     : ]
        [ 0 0 ... 1 ]

is the identity matrix.
Sometimes we denote by In the n × n identity matrix.
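A minimal sketch of the definition, using plain Python lists of lists for matrices (my own convention, not from the notes): build In and check that it acts as the identity under matrix multiplication.

```python
# Identity matrix I_n and a check that A I = I A = A.

def identity(n):
    """The n x n identity matrix as a list of rows."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    """Ordinary matrix product of two lists-of-rows matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1], [5, 3]]
I = identity(2)
assert matmul(A, I) == A and matmul(I, A) == A
```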
MAT188 - Nov 6
1 Review: Properties of Determinant
The determinant associates to each square matrix A a number, denoted by det A or |A|.
Denote by Mij the submatrix of A obtained by deleting the i-th row and j-th column.
The cofactor of aij is defined by Cij = (−1)^(i+j) det Mij.
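The definitions above can be computed directly; here is a small sketch on a 3 × 3 matrix of my own choosing (not from the notes), finding the minor M11 and cofactor C11.

```python
# Minor M_11 and cofactor C_11 of a 3x3 matrix (0-indexed below,
# so entry a_11 of the notes is A[0][0]).

def submatrix(A, i, j):
    """Delete row i and column j (0-indexed)."""
    return [[A[r][c] for c in range(len(A)) if c != j]
            for r in range(len(A)) if r != i]

def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]

M11 = submatrix(A, 0, 0)            # [[5, 6], [8, 10]]
C11 = (-1) ** (0 + 0) * det2(M11)   # sign (+1) times the minor
assert C11 == 2
```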
MAT188 - Nov 2
1 Review: Determinant
The determinant associates to each square matrix A a number, denoted by det A or |A|.
Denote by Mij the submatrix of A obtained by deleting the i-th row and j-th column.
The minor of aij is defined by det Mij.
The cofactor of aij is defined by Cij = (−1)^(i+j) det Mij.
MAT188 - Oct 28
1 Review
A collection of k vectors {v1 , . . . vk } is a basis of S if it is linearly independent and it spans S.
All bases of S have the same number of vectors. The dimension of S is the number of basis vectors.
Given a linearly independent …
MAT188 - Oct 26
1 Review
Let S ⊆ Rn. We say S is a subspace if (and only if)
0 ∈ S.
If u, v ∈ S, then u + v ∈ S.
If u ∈ S and s ∈ R, then su ∈ S.
A collection of k vectors {v1 , . . . vk } is a basis of S if
{v1 , . . . vk } is linearly independent, and
{v1 , . . . vk } spans S.
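The three subspace conditions can be checked numerically on a concrete line through the origin (my own example, not from the notes): the set S = {t(1, 2) : t ∈ R} in R2.

```python
# S is the line y = 2x in R^2, i.e. span{(1, 2)}.
# Check the three subspace conditions on sample points.

def in_S(v):
    """(x, y) lies on the line through (1, 2) iff y = 2x."""
    return v[1] == 2 * v[0]

assert in_S((0, 0))                      # 0 is in S
u, v = (1, 2), (3, 6)
assert in_S((u[0] + v[0], u[1] + v[1]))  # closed under addition
s = -4
assert in_S((s * u[0], s * u[1]))        # closed under scaling
```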
MAT188 - Oct 23
1 Review
Let S ⊆ Rn. We say S is a subspace if (and only if)
0 ∈ S.
If u, v ∈ S, then u + v ∈ S.
If u ∈ S and s ∈ R, then su ∈ S.
Last time we showed that span{v1 , . . . vk } is a subspace of Rn.
Given a linear transformation T : Rn → Rm such that T(x) = Ax, …
MAT188 - Oct 16
1 Review: Elementary Matrices
Recall that an elementary row operation corresponds to multiplying on the left by an elementary matrix.
The elementary matrix is obtained by performing the same elementary row operation on the identity matrix.
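A small sketch of this correspondence (my own 2 × 2 example, using plain Python lists for matrices): the elementary matrix for "swap row 1 and row 2" is I with those rows swapped, and left-multiplying by it performs the swap.

```python
# E is obtained from I_2 by swapping its two rows; then E A
# swaps the two rows of A.

def matmul(A, B):
    """Ordinary matrix product of two lists-of-rows matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

E = [[0, 1], [1, 0]]   # I with rows 1 and 2 swapped
A = [[1, 2], [3, 4]]
assert matmul(E, A) == [[3, 4], [1, 2]]
```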
MAT188 - Oct 19
1 Review
Let A be an n × n matrix. Recall that A is invertible if there is a matrix B such that AB = BA = I.
B is called the inverse of A and is denoted by A^(-1).
Also recall that a linear transformation T : Rn → Rm is a function from Rn to Rm …
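The 2 × 2 case can be checked by hand against the definition AB = BA = I. This sketch uses the standard 2 × 2 inverse formula (1/det A)[[d, −b], [−c, a]] on an example of my own; `Fraction` keeps the arithmetic exact.

```python
# Check that the 2x2 inverse formula really satisfies AB = BA = I.

from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

a, b, c, d = 2, 1, 5, 3
det = a * d - b * c                       # 2*3 - 1*5 = 1, so A is invertible
A = [[Fraction(a), Fraction(b)], [Fraction(c), Fraction(d)]]
B = [[Fraction(d, det), Fraction(-b, det)],
     [Fraction(-c, det), Fraction(a, det)]]

I = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]
assert matmul(A, B) == I and matmul(B, A) == I
```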
MAT188 - Oct 21
1 Review
Last time we talked about subspaces of Rn. Each one is given as the span of several vectors in Rn.
They are infinite, straight things, except for the zero subspace = span{0}.
2 Kernel and Null space
Last time we briefly talked about …
MAT188 - Oct 14
1 Review: Linear transformations
A linear transformation is a multivariable vector-valued function T : Rn → Rm satisfying the 2 linearity conditions:
For all x, y ∈ Rn , s ∈ R, we have T(x + y) = T(x) + T(y) and T(sx) = sT(x).
A linear transformation …
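The two linearity conditions can be verified numerically for a matrix transformation. A sketch with a matrix of my own choosing (not from the notes), A = [[1, 2], [3, 4]]:

```python
# T(x) = Ax for A = [[1, 2], [3, 4]], written out componentwise.

def T(x):
    return (x[0] + 2 * x[1], 3 * x[0] + 4 * x[1])

x, y, s = (1, -1), (2, 5), 3

# Additivity: T(x + y) = T(x) + T(y)
assert T((x[0] + y[0], x[1] + y[1])) == \
    tuple(a + b for a, b in zip(T(x), T(y)))

# Homogeneity: T(s x) = s T(x)
assert T((s * x[0], s * x[1])) == tuple(s * a for a in T(x))
```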
MAT188 - Sept 30
1 Linear Independence
In Chapter 2.2, we talked about Span. Recall that the following are equivalent:
{v1 , . . . , vn } spans Rm.
For any b ∈ Rm, b ∈ span{v1 , . . . , vn }.
Every row of the matrix [v1 . . . vn ] has a pivot position.
MAT188 - Oct 9
1 Linear transformations
1 variable: A linear transformation f : R → R is a function such that
f(x + y) = f(x) + f(y) and f(sx) = sf(x)
for all x, y, s ∈ R.
Remark: This is DIFFERENT from what you learnt before. Recall a linear function was given by
the function y = mx + b where m is the slope …
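The remark can be made concrete (my own numbers, not from the notes): f(x) = mx + b with b ≠ 0 fails the additivity condition, so it is not linear in this chapter's sense, while g(x) = mx is.

```python
# f(x) = 2x + 1 is "linear" in the high-school sense but fails
# additivity: the constant +1 gets counted twice on the right.

def f(x):
    return 2 * x + 1

def g(x):
    return 2 * x          # b = 0: genuinely linear

x, y = 3, 4
assert f(x + y) != f(x) + f(y)   # 15 != 16
assert g(x + y) == g(x) + g(y)
```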
MAT188 - Nov 4
1 Review: Properties of Determinant
The determinant associates to each square matrix A a number, denoted by det A or |A|.
Denote by Mij the submatrix of A obtained by deleting the i-th row and j-th column.
The minor of aij is defined by det Mij.
The cofactor of aij is defined by Cij = (−1)^(i+j) det Mij.
MAT188 - Nov 13
1 Review
Given a linear transformation T : Rn → Rn, λ is an eigenvalue if there is a nonzero vector v such that
T(v) = λv. Such a v is called an eigenvector corresponding to the eigenvalue λ.
To find λ and v, first solve the characteristic equation det(A − λI) = 0.
MAT188 - Nov 11
1 Eigenvalues and Eigenvectors: Introduction
Consider the linear transformation T : R2 → R2 such that T(x, y) = (2x + y, x + 2y).
Graphically, it sends the unit square to a parallelogram (figure omitted).
Notice that it will distort the unit square in terms of directions …
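The distortion is explained by the eigenvectors of this particular T, which can be checked by direct computation: (1, 1) is stretched by a factor of 3 while (1, −1) is left fixed.

```python
# T(x, y) = (2x + y, x + 2y), the transformation from the notes.

def T(v):
    x, y = v
    return (2 * x + y, x + 2 * y)

assert T((1, 1)) == (3, 3)     # T(v) = 3v: eigenvector for eigenvalue 3
assert T((1, -1)) == (1, -1)   # T(v) = 1v: eigenvector for eigenvalue 1
```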
MAT188 - Ch2.2
1 System of linear equations
A system of linear equations is given by

    a11 x1 + ... + a1n xn = b1
      :           :         :
    am1 x1 + ... + amn xn = bm

Here the aij's and bi's are numbers and the xj's are unknowns.
There are several other forms in which to view a system of linear equations …
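One such form is the matrix form Ax = b. A minimal sketch (my own 2 × 2 system, not from the notes): solve by substitution and check the solution against both equations.

```python
# The system:
#   x1 + 2*x2 = 5
# 3*x1 -   x2 = 1
# From the second equation, x2 = 3*x1 - 1; substituting into the
# first gives x1 + 2*(3*x1 - 1) = 5, i.e. 7*x1 = 7.
x1 = 1
x2 = 3 * x1 - 1

assert x1 + 2 * x2 == 5
assert 3 * x1 - x2 == 1
```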
Review
1 Review
6.1. Let A be a square matrix. Then a nonzero vector v ≠ 0 is an eigenvector of A if there exists a
scalar λ such that Av = λv. Here λ is called an eigenvalue of A.
6.1. The eigenspace of λ is the subspace null(A − λI), consisting of all the eigenvectors of λ together with the zero vector.
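A sketch of the eigenspace definition on an example of my own: for A = [[2, 1], [1, 2]] and λ = 3, the vector (1, 1) lies in null(A − 3I), which is the same as saying Av = 3v.

```python
# A - 3I = [[-1, 1], [1, -1]]; its null space is spanned by (1, 1).

A = [[2, 1], [1, 2]]
lam = 3
B = [[A[0][0] - lam, A[0][1]],
     [A[1][0], A[1][1] - lam]]           # B = A - lam*I

v = (1, 1)
# v is in null(A - lam*I) ...
assert B[0][0] * v[0] + B[0][1] * v[1] == 0
assert B[1][0] * v[0] + B[1][1] * v[1] == 0
# ... equivalently, A v = lam * v.
Av = (A[0][0] * v[0] + A[0][1] * v[1],
      A[1][0] * v[0] + A[1][1] * v[1])
assert Av == (lam * v[0], lam * v[1])
```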
Review (Continued)
1 Short Questions
If A and B are n × n matrices such that AB = 0, then show that rank(B) ≤ nullity(A). In particular, if
A² = 0, then rank(A) ≤ n/2 and dim E0 ≥ n/2. (E0 is the eigenspace of 0.)
Let A be an n × n matrix such that it has n distinct eigenvalues …
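A concrete instance of the A² = 0 claim, with an example of my own for n = 2 (the rank here is read off by inspection, not computed by row reduction):

```python
# A = [[0, 1], [0, 0]] satisfies A^2 = 0; its rank is 1 (one
# nonzero row), so rank(A) <= n/2 and nullity(A) >= n/2 for n = 2.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

n = 2
A = [[0, 1], [0, 0]]
assert matmul(A, A) == [[0, 0], [0, 0]]   # A^2 = 0

rank_A = 1                  # by inspection: one nonzero row
nullity_A = n - rank_A      # rank-nullity theorem
assert rank_A <= n / 2
assert nullity_A >= n / 2

# e1 = (1, 0) spans E_0 = null(A): A e1 = 0.
assert (A[0][0] * 1 + A[0][1] * 0, A[1][0] * 1 + A[1][1] * 0) == (0, 0)
```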
MAT188 - Dec 2
1 Review
Two vectors u and v are orthogonal if their dot product is zero, i.e.
u · v = u1 v1 + u2 v2 + . . . + un vn = 0.
Let S be a subspace in Rn. The orthogonal complement of S, denoted by S⊥, is the set of all the
vectors that are orthogonal to S …
MAT188 - Dec 4
1 Review
Let S be a subspace in Rn. The orthogonal complement of S, denoted by S⊥, is the set of all the
vectors that are orthogonal to S, i.e.
S⊥ = {v ∈ Rn such that v · s = 0 for all s ∈ S}.
Let S be a subspace with orthogonal basis {v1 , …
MAT188 - Nov 30
1 Review
Two vectors u and v are orthogonal if their dot product is zero, i.e.
u · v = u1 v1 + u2 v2 + . . . + un vn = 0.
Let S be a subspace with orthogonal basis {v1 , . . . , vk }. Then for any v ∈ S, we have

    v = (v · v1)/‖v1‖² v1 + . . . + (v · vk)/‖vk‖² vk .
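The expansion formula can be checked on the orthogonal basis v1 = (1, 1), v2 = (1, −1) of R2 (my own example vectors), with `Fraction` keeping the coefficients exact.

```python
# Coefficients c_i = (v . v_i) / ||v_i||^2 in an orthogonal basis.

from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v1, v2 = (1, 1), (1, -1)
assert dot(v1, v2) == 0                  # the basis is orthogonal

v = (3, 5)
c1 = Fraction(dot(v, v1), dot(v1, v1))   # (v . v1)/||v1||^2 = 8/2
c2 = Fraction(dot(v, v2), dot(v2, v2))   # (v . v2)/||v2||^2 = -2/2

# c1 v1 + c2 v2 recovers v.
assert (c1 * v1[0] + c2 * v2[0], c1 * v1[1] + c2 * v2[1]) == v
```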
MAT188 - Nov 27
1 Course evaluations
Please complete the course evaluations! Questions, comments, complaints, anything welcome!
TA evaluations.
2 Preliminaries - Orthogonality
Recall that in R2, two vectors are perpendicular if their dot product is zero …
MAT188 - Nov 23
1 Review
Let T : Rn → Rn be a linear transformation such that T(x) = Ax. A vector v ≠ 0 is an eigenvector corresponding
to an eigenvalue λ if Av = λv.
If A has n eigenvalues (counting multiplicity) and n independent eigenvectors, then A = PDP^(-1), where …
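A diagonalization check on an example of my own: A = [[2, 1], [1, 2]] has eigenvectors (1, 1), (1, −1) with eigenvalues 3, 1, and multiplying out PDP^(-1) recovers A exactly.

```python
# Verify A = P D P^(-1) with P = [eigenvectors], D = diag(eigenvalues).

from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[1, 1], [1, -1]]       # columns: eigenvectors (1,1) and (1,-1)
D = [[3, 0], [0, 1]]        # corresponding eigenvalues 3 and 1
detP = P[0][0] * P[1][1] - P[0][1] * P[1][0]    # -2

# 2x2 inverse formula: P^(-1) = (1/detP) [[d, -b], [-c, a]]
Pinv = [[Fraction(P[1][1], detP), Fraction(-P[0][1], detP)],
        [Fraction(-P[1][0], detP), Fraction(P[0][0], detP)]]

A = matmul(matmul(P, D), Pinv)
assert A == [[2, 1], [1, 2]]
```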
MAT188 - Nov 25
1 Review
Given an n × n matrix A. If you have a system of linear differential equations y′ = Ay, and A is
diagonalizable, then the general solution will be

    y = a1 e^(λ1 x) v1 + . . . + an e^(λn x) vn ,

where the vi 's are the eigenvectors corresponding to the eigenvalues λi …
MAT188 - Nov 20
1 Review
Let B = {v1 , . . . , vn } be a basis for Rn. Then for any v ∈ Rn, v can be written as a linear combination
of the basis vectors,

    v = a1 v1 + . . . + an vn = [v1 . . . vn ] (a1 , . . . , an )^T = P [v]B .

P is called the change of basis matrix …
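A sketch of finding coordinates with respect to a basis (my own example, not from the notes): for B = {(1, 1), (1, −1)}, solve P [v]B = v by Cramer's rule and check the expansion.

```python
# Coordinates of v = (3, 5) in the basis B = {(1, 1), (1, -1)}:
# solve P [v]_B = v, where P = [v1 v2] is the change of basis matrix.

from fractions import Fraction

P = [[1, 1], [1, -1]]
v = (3, 5)
detP = P[0][0] * P[1][1] - P[0][1] * P[1][0]    # -2

# Cramer's rule for the coordinates a1, a2:
a1 = Fraction(v[0] * P[1][1] - P[0][1] * v[1], detP)
a2 = Fraction(P[0][0] * v[1] - v[0] * P[1][0], detP)

# a1 v1 + a2 v2 recovers v (columns of P are the basis vectors).
assert (a1 * P[0][0] + a2 * P[0][1], a1 * P[1][0] + a2 * P[1][1]) == v
```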
MAT188 - Nov 16
1 Review
Given a linear transformation T : Rn → Rn, λ is an eigenvalue if there is a nonzero vector v such that
T(v) = λv. Such a v is called an eigenvector corresponding to the eigenvalue λ.
To find λ and v, first solve the characteristic equation det(A − λI) = 0.
MAT188 - Nov 18
1 Review
Given a linear transformation T : Rn → Rn, λ is an eigenvalue if there is a nonzero vector v such that
T(v) = λv. Such a v is called an eigenvector corresponding to the eigenvalue λ.
To find λ and v, first solve the characteristic equation det(A − λI) = 0.
MAT188 - Nov 9
1 Review: Properties of Determinant
The determinant associates to each square matrix A a number, denoted by det A or |A|.
Denote by Mij the submatrix of A obtained by deleting the i-th row and j-th column.
The cofactor of aij is defined by Cij = (−1)^(i+j) det Mij.