North Hennepin Community College
Course:
Math 2300, Section 91
Fall Semester 2014
CLA 114
Linear Algebra
Tuesday 6:00 - 8:40
3 credits
CLASS WEBSITE:
On D2L (https://www.nhcc.edu/online/)
Use your StarID to log in.
Section 2.2 The Inverse of a Matrix Read pages 102 - 109
Definition: A square matrix A is called invertible if there exists another square matrix C such
that AC = CA = I. If A is invertible, then the matrix C is unique; it is called the inverse of A
and is written A^-1.
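As a quick check of this definition in Python (the matrix A and candidate inverse C below are my own example, not from the text), we can verify that AC = CA = I:

```python
# Verify that C is the inverse of A by checking AC = CA = I.
# A and C are my own illustrative 2x2 example; Fractions keep arithmetic exact.
from fractions import Fraction

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[Fraction(2), Fraction(5)], [Fraction(1), Fraction(3)]]
# Candidate inverse: for [[a, b], [c, d]] it is (1/(ad - bc)) [[d, -b], [-c, a]];
# here ad - bc = 1, so C = [[3, -5], [-1, 2]].
C = [[Fraction(3), Fraction(-5)], [Fraction(-1), Fraction(2)]]
I = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]

print(matmul(A, C) == I and matmul(C, A) == I)  # True: C is A inverse
```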
Section 2.1 Matrix Operations Read pages 92 - 100
Definition: If the product of matrices AB is defined, then the ij entry of the matrix AB is the
sum of the products of the corresponding entries from row i of A and column j of B.
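In code, the ij entry rule looks like this (a small Python sketch with matrices of my own choosing, not from the text):

```python
def entry(A, B, i, j):
    # (AB)_ij = sum of products of row i of A with column j of B
    return sum(A[i][k] * B[k][j] for k in range(len(B)))

A = [[1, 2, 3],
     [4, 5, 6]]        # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]         # 3 x 2

AB = [[entry(A, B, i, j) for j in range(2)] for i in range(2)]
print(AB)  # [[58, 64], [139, 154]]
```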
Section 3.2 Properties of Determinants Read pages 169 - 175
Theorem: Let A be a square matrix.
a) If a multiple of one row of A is added to another row of A, then the new matrix has the
same determinant as A.
Section 3.1 Introduction to Determinants Read pages 164 - 168
Definition: If A is the 2 x 2 matrix [a b; c d], the determinant of A is the number ad - bc.
We saw in chapter 2 that A is invertible if this number is not zero.
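A one-line Python version of the 2 x 2 determinant, with two matrices of my own (one invertible, one not):

```python
def det2(M):
    # det [[a, b], [c, d]] = ad - bc
    (a, b), (c, d) = M
    return a * d - b * c

print(det2([[3, 4], [1, 2]]))  # 3*2 - 4*1 = 2, nonzero, so invertible
print(det2([[2, 4], [1, 2]]))  # 2*2 - 4*1 = 0, so not invertible
```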
Section 6.3
Orthogonal Projections
Read pages 347 - 352
Suppose that U = {u1, u2, ..., un} is an orthogonal basis for R^n. Let W be the span of
{u1, u2, u3} and let y be any vector in R^n. Can we write y as the sum of a vector in W and a
vector orthogonal to W?
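The projection formula answers this: y_hat = sum over the orthogonal basis vectors u of (y.u / u.u) u, and z = y - y_hat is orthogonal to W. A Python sketch with vectors of my own choosing:

```python
# Orthogonal projection of y onto W = Span of an orthogonal set.
# The vectors u1, u2, y are my own example, not from the notes.
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(y, basis):
    """Projection of y onto Span(basis); basis vectors must be orthogonal."""
    proj = [Fraction(0)] * len(y)
    for u in basis:
        c = Fraction(dot(y, u), dot(u, u))   # weight (y.u)/(u.u)
        proj = [p + c * ui for p, ui in zip(proj, u)]
    return proj

u1 = [Fraction(1), Fraction(1), Fraction(0)]
u2 = [Fraction(1), Fraction(-1), Fraction(0)]  # orthogonal to u1
y  = [Fraction(3), Fraction(5), Fraction(7)]

y_hat = project(y, [u1, u2])
z = [a - b for a, b in zip(y, y_hat)]
print(y_hat, z)  # y_hat lies in W; z is orthogonal to both u1 and u2
```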
Section 6.1
Inner Product, Length, Orthogonality
Read pages 330 - 336
Definition:
Let u and v be two vectors in R^n. The inner product (or dot product) of u and v
is u . v = u^T v.
Example:
Compute u . v for given vectors u and v in R^3.
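A Python sketch of the inner product (the particular vectors here are my own example):

```python
def dot(u, v):
    # u . v = u^T v = sum of componentwise products
    return sum(a * b for a, b in zip(u, v))

u = [2, -5, -1]
v = [3, 2, -3]
print(dot(u, v))  # 2*3 + (-5)*2 + (-1)*(-3) = -1
```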
Section 6.2
Orthogonal Sets
Read pages 338 - 344
Definition:
A set of vectors in R n is called an orthogonal set if every vector in the set is
orthogonal to every other vector in the set.
Theorem:
If S = {u1, ..., up} is an orthogonal set of nonzero vectors in R^n, then S is linearly
independent and hence is a basis for the subspace spanned by S.
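Checking orthogonality of a set means checking that every distinct pair has dot product zero. A Python sketch (the set S below is my own example):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal_set(vectors):
    """True if every distinct pair of vectors has dot product zero."""
    return all(dot(vectors[i], vectors[j]) == 0
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

S = [[3, 1, 1], [-1, 2, 1], [-1, -4, 7]]
print(is_orthogonal_set(S))            # True: all three pairs are orthogonal
print(is_orthogonal_set([[1, 0], [1, 1]]))  # False: dot product is 1
```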
Section 5.4
Eigenvectors and Linear Transformations
Read pages 288 - 293
Remember our definition of a linear transformation:
A linear transformation T from a vector space V to a vector space W is a function from V to W
such that T(u + v) = T(u) + T(v) and T(cu) = cT(u) for all u, v in V and all scalars c.
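We can check both conditions numerically for a particular map. The map T below is a hypothetical example of my own, not one from the notes:

```python
def T(x):
    # Hypothetical linear map T: R^2 -> R^2, T(x1, x2) = (x1 + x2, 2*x1)
    return [x[0] + x[1], 2 * x[0]]

def add(u, v):
    return [a + b for a, b in zip(u, v)]

def scale(c, u):
    return [c * a for a in u]

u, v, c = [1, 2], [3, 4], 5
print(T(add(u, v)) == add(T(u), T(v)))   # True: T preserves addition
print(T(scale(c, u)) == scale(c, T(u)))  # True: T preserves scalar multiples
```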
Stuff to know for the final exam
I am not planning to make you state definitions on the final exam.
There may well be a proof or two on the final, though.
Problems you can do:
Solve systems of linear equations.
Section 5.2
The Characteristic Equation
Read pages 273 - 279
Example:
Find the eigenvalues of the matrix A = [7 2; 1 6].
Solution:
Remember that the eigenvalues of the matrix are the scalars λ such that the equation Ax = λx
has a nontrivial solution, which happens exactly when det(A - λI) = 0. Here
det(A - λI) = (7 - λ)(6 - λ) - 2 = λ^2 - 13λ + 40 = (λ - 5)(λ - 8), so the eigenvalues are 5 and 8.
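A numerical check of the characteristic equation, assuming the example matrix is A = [7 2; 1 6] (the entries as best I can read them from the notes):

```python
def det2(M):
    (a, b), (c, d) = M
    return a * d - b * c

A = [[7, 2], [1, 6]]

def char_poly(A, lam):
    # det(A - lambda*I) for a 2x2 matrix
    return det2([[A[0][0] - lam, A[0][1]],
                 [A[1][0], A[1][1] - lam]])

# An eigenvalue is a scalar where the characteristic polynomial vanishes.
print([lam for lam in range(-10, 11) if char_poly(A, lam) == 0])  # [5, 8]
```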
Section 4.3
Bases of Vector Spaces
Read pages 208 - 213
Definition: A basis for a vector space
Let H be a subspace of vector space V. A set of vectors {b1, b2, ..., bp} in V is called
a basis for H if (i) the set is linearly independent and (ii) the set spans H.
Section 5.3
Diagonalization
Read pages 281 - 286
Remember our definition of similar matrices from section 5.2: Two square matrices A and B are
similar if there exists an invertible matrix P such that A = PBP^-1.
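A small Python check of similarity (P, its inverse, and the diagonal B below are my own made-up example): building A = PBP^-1 and then confirming that P^-1 A P recovers B.

```python
from fractions import Fraction

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Made-up example: P and its inverse, and a diagonal matrix B.
P     = [[Fraction(1), Fraction(1)], [Fraction(0), Fraction(1)]]
P_inv = [[Fraction(1), Fraction(-1)], [Fraction(0), Fraction(1)]]
B     = [[Fraction(2), Fraction(0)], [Fraction(0), Fraction(3)]]

A = matmul(matmul(P, B), P_inv)           # A is similar to B by construction
print(A)                                  # [[2, 1], [0, 3]]
print(matmul(matmul(P_inv, A), P) == B)   # True: P^-1 A P = B
```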
Section 5.1
Eigenvalues and Eigenvectors
Read pages 266 - 271
Definition:
An eigenvector of a square matrix A is a nonzero vector x such that Ax = λx for
some scalar λ. A scalar λ is called an eigenvalue of A if there is a nontrivial solution x of
Ax = λx; such an x is called an eigenvector corresponding to λ.
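Checking a candidate eigenvector only takes one matrix-vector product. The matrix and vector below are my own example:

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 6], [5, 2]]
x = [6, -5]

# x is an eigenvector if Ax is a scalar multiple of x.
print(matvec(A, x))  # [-24, 20] = -4 * [6, -5], so x is an eigenvector, eigenvalue -4
```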
Section 2.3 Characteristics of Invertible Matrices Read pages 111 - 114
Suppose that A is an invertible 3 x 3 matrix.
Classify each of the following as i) necessarily true, ii) possibly true, or iii) necessarily false.
Section 2.5 Matrix Factorizations
Definition: A square matrix that has only zeros above its main diagonal is called lower
triangular. A square matrix that has only zeros below its main diagonal is called upper triangular.
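These two shapes come together in the LU factorization A = LU. Here is a sketch of the standard Doolittle elimination without row interchanges (the matrix A is my own example; real implementations also pivot for stability):

```python
from fractions import Fraction

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def lu(A):
    """Doolittle LU factorization without pivoting: returns L (unit lower
    triangular) and U (upper triangular) with A = LU."""
    n = len(A)
    U = [[Fraction(x) for x in row] for row in A]
    L = [[Fraction(1 if i == j else 0) for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]   # multiplier that zeros out U[i][k]
            L[i][k] = m
            U[i] = [uij - m * ukj for uij, ukj in zip(U[i], U[k])]
    return L, U

A = [[2, 4, -2], [4, 9, -3], [-2, -3, 7]]
L, U = lu(A)
print(L)  # unit lower triangular multipliers
print(U)  # upper triangular echelon form
print(matmul(L, U) == [[Fraction(x) for x in row] for row in A])  # True
```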
Section 6.4
The Gram-Schmidt Process
Read pages 354 - 358
Let W be any subspace of R^n. We will develop an algorithm to find an orthonormal basis for W.
Example:
Let x1 and x2 be the given vectors in R^3 and let W = Span{x1, x2}. Find an orthogonal
basis for W.
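The Gram-Schmidt process itself can be sketched in a few lines of Python: each new vector is replaced by itself minus its projections onto the vectors already produced. The vectors x1 and x2 are my own example, since the entries in the notes did not survive legibly:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(xs):
    """Replace each x_k by x_k minus its projections onto the earlier v's;
    the result is an orthogonal basis for Span(xs)."""
    vs = []
    for x in xs:
        v = [Fraction(xi) for xi in x]
        for u in vs:
            c = Fraction(dot(v, u), dot(u, u))   # projection weight
            v = [vi - c * ui for vi, ui in zip(v, u)]
        vs.append(v)
    return vs

x1 = [3, 6, 0]
x2 = [1, 2, 2]
v1, v2 = gram_schmidt([x1, x2])
print(v1, v2)            # orthogonal basis for W = Span{x1, x2}
print(dot(v1, v2) == 0)  # True
```

To get an orthonormal basis, divide each v by its length afterward.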
Section 4.6 Rank Read pages 230 - 236
Find bases for the row space, the column space and the null space of the matrix A by row
reducing A to echelon form. The nonzero rows of the echelon form give a basis for Row A, the
pivot columns of A itself give a basis for Col A, and the general solution of Ax = 0 gives a
basis for Nul A.
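The row reduction step can be sketched in Python; counting the nonzero rows of the echelon form gives the rank, which is the common dimension of Row A and Col A. The matrix A here is my own example:

```python
from fractions import Fraction

def row_echelon(A):
    """Forward elimination to an echelon form, using exact arithmetic."""
    M = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue                     # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]  # move pivot row up
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return M

def rank(A):
    return sum(1 for row in row_echelon(A) if any(x != 0 for x in row))

A = [[1, 2, 1], [2, 4, 2], [1, 3, 2]]
print(rank(A))  # 2: two nonzero rows survive elimination
```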
Find a basis for the subspace of R^4 described here and state the dimension of the subspace.
Writing the general vector of the subspace as a linear combination of fixed vectors, one vector
per free parameter, gives a spanning set; since that set is linearly independent, it is a basis,
and the number of vectors in it is the dimension of the subspace.
Find the dimension of
Section 4.2 Null Space, Column Space and Linear Transformations Read pages 198 - 205
Definition: The null space of a matrix
The null space of an m x n matrix A is the set of all solutions to the homogeneous
equation Ax = 0.
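A concrete check in Python (the matrix A and the spanning vectors u, v are my own example): the vectors obtained from the free variables of Ax = 0 really do land in the null space.

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# For this A, row reduction leaves x2 and x3 free; the general solution of
# Ax = 0 is x2*u + x3*v, so {u, v} spans Nul A.
A = [[1, -3, 2],
     [-2, 6, -4]]   # second row is -2 times the first
u = [3, 1, 0]       # set x2 = 1, x3 = 0
v = [-2, 0, 1]      # set x2 = 0, x3 = 1

print(matvec(A, u), matvec(A, v))  # [0, 0] [0, 0]
```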
Section 4.5 The Dimension of a Vector Space Read pages 225 - 228
Theorem: If a vector space V has a basis B containing n vectors, then any subset of V
containing more than n vectors must be linearly dependent.