North Hennepin Community College
Course:
Math 2300, Section 91
Fall Semester 2014
CLA 114
Linear Algebra
Tuesday 6:00 - 8:40
3 credits
CLASS WEBSITE:
On D2L (https://www.nhcc.edu/online/)
Use your Star ID to log in.
Instructor:
Matt Foss
(763) 424-0840
CLA20
Section 2.2 The Inverse of a Matrix Read pages 102 - 109
Definition: A square matrix A is called invertible if there exists another square matrix C such
that AC = CA = I. If A is invertible, then the matrix C is called the inverse of A and is denoted A^-1.
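A quick numerical check of this definition, sketched in Python with a made-up 2 × 2 pair (the matrices below are illustrative, not from the text):

```python
# Check the definition AC = CA = I for a sample 2x2 matrix.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 5],
     [1, 3]]
C = [[3, -5],
     [-1, 2]]
I = [[1, 0],
     [0, 1]]

print(matmul(A, C) == I and matmul(C, A) == I)  # True: C is the inverse of A
```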
Section 2.1 Matrix Operations Read pages 92 - 100
Definition: If the product of matrices AB is defined, then the (i, j) entry of the matrix AB is the
sum of the products of the corresponding entries from row i of A and column j of B:
(AB)_ij = a_i1 b_1j + a_i2 b_2j + ... + a_in b_nj
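The entry formula can be computed directly; a small sketch with sample matrices (chosen for illustration, not from the text):

```python
# The (i, j) entry of AB: row i of A dotted with column j of B.
# A is 2x3 and B is 3x2, so AB is defined and is 2x2.

A = [[1, 2, 3],
     [4, 5, 6]]
B = [[7, 8],
     [9, 10],
     [11, 12]]

def entry(A, B, i, j):
    """Return the (i, j) entry of AB: sum over k of a_ik * b_kj."""
    return sum(A[i][k] * B[k][j] for k in range(len(B)))

print(entry(A, B, 0, 0))  # 1*7 + 2*9 + 3*11 = 58
```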
Section 3.2 Properties of Determinants Read pages 169 - 175
Theorem: Let A be a square matrix.
a) If a multiple of one row of A is added to another row of A, then the new matrix has the
same determinant as A.
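A sanity check of part a) on a sample 2 × 2 matrix (the numbers are illustrative):

```python
# Adding a multiple of one row to another leaves the determinant unchanged.

def det2(M):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[1, 2],
     [3, 4]]
# Replace row 1 with row 1 + 5 * row 0 (a row-replacement operation).
B = [A[0], [A[1][j] + 5 * A[0][j] for j in range(2)]]

print(det2(A), det2(B))  # -2 -2: same determinant
```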
Section 3.1 Introduction to Determinants Read pages 164 - 168
Definition: If A is the 2 × 2 matrix [a b; c d], the determinant of A is the number ad - bc.
We saw in chapter 2 that A is invertible if this number ad - bc is not zero. We'll expand on the
idea of a determinant
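A sketch tying the determinant to invertibility, using the standard 2 × 2 inverse formula (1/det)·[d -b; -c a] with sample numbers (not the text's example):

```python
from fractions import Fraction

# For a 2x2 matrix, det = ad - bc; when det is nonzero, the inverse is
# (1/det) * [[d, -b], [-c, a]].

a, b, c, d = 3, 7, 1, 4          # det = 3*4 - 7*1 = 5, so A is invertible
det = a * d - b * c
inv = [[Fraction(d, det), Fraction(-b, det)],
       [Fraction(-c, det), Fraction(a, det)]]

# Check A * inv = I with exact rational arithmetic.
A = [[a, b], [c, d]]
prod = [[sum(A[i][k] * inv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
print(prod == [[1, 0], [0, 1]])  # True
```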
Section 6.3
Orthogonal Projections
Read pages 347 - 352
Suppose that U = {u1, u2, ..., un} is an orthogonal basis for R^n. Let W be the span of
{u1, u2, u3} and let y be any vector in R^n. Can we write y as a sum of a vector in W and another
vector in W⊥?
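The answer turns out to be yes; a small sketch of the decomposition y = ŷ + z with sample vectors in R^3 (the vectors are illustrative, not the text's example):

```python
from fractions import Fraction

# Decompose y = y_hat + z, where y_hat lies in W = Span{u1, u2} and z is
# orthogonal to W.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

u1 = [1, 1, 0]
u2 = [1, -1, 0]          # u1 . u2 = 0: an orthogonal basis for W
y = [3, 5, 7]

# Projection of y onto W: sum of (y.u / u.u) * u over the basis vectors.
y_hat = [sum(Fraction(dot(y, u), dot(u, u)) * u[i] for u in (u1, u2))
         for i in range(3)]
z = [y[i] - y_hat[i] for i in range(3)]

print(y_hat == [3, 5, 0], z == [0, 0, 7])      # True True
print(dot(z, u1) == 0 and dot(z, u2) == 0)     # True: z is in W-perp
```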
Section 6.1
Inner Product, Length, Orthogonality
Read pages 330 - 336
Definition:
Let u and v be two vectors in R^n. The inner product (or dot product) of u and v
is u · v = u^T v.
Example: Find the dot product of u = (2, 3, 8) and v = (2, 1, 4).
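A minimal Python sketch of the computation (the entries are chosen for illustration):

```python
# Inner (dot) product u . v = u^T v: multiply matching entries and add.

def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

u = [2, 3, 8]
v = [2, 1, 4]
print(dot(u, v))  # 2*2 + 3*1 + 8*4 = 39
```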
Properties of
Section 6.2
Orthogonal Sets
Read pages 338 - 344
Definition:
A set of vectors in R^n is called an orthogonal set if every vector in the set is
orthogonal to every other vector in the set.
Theorem:
If S is an orthogonal set of nonzero vectors in R^n, then S is
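Checking orthogonality is a pairwise test; a sketch with a sample set (these vectors are illustrative):

```python
from itertools import combinations

# A set is orthogonal when every distinct pair has dot product 0.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal_set(vectors):
    """True if every distinct pair of vectors is orthogonal."""
    return all(dot(u, v) == 0 for u, v in combinations(vectors, 2))

S = [[3, 1, 1], [-1, 2, 1], [-1, -4, 7]]
print(is_orthogonal_set(S))          # True
print(is_orthogonal_set([[1, 0], [1, 1]]))  # False: (1,0).(1,1) = 1
```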
Section 5.4
Eigenvectors and Linear Transformations
Read pages 288 - 293
Remember our definition of a linear transformation:
A linear transformation T from a vector space V to a vector space W is a function from V to W
such that
i) T(u + v) = T(u) + T(v) for all u and v in V
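A quick check that a matrix transformation T(x) = Ax satisfies property i) (sample matrix and vectors, chosen for illustration):

```python
# T(x) = Ax maps R^2 to R^3; verify T(u + v) = T(u) + T(v) numerically.

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

A = [[1, 2], [3, 4], [5, 6]]
u = [1, -1]
v = [2, 5]

uv = [u[i] + v[i] for i in range(2)]
lhs = matvec(A, uv)
rhs = [a + b for a, b in zip(matvec(A, u), matvec(A, v))]
print(lhs == rhs)  # True
```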
Stuff to know for the final exam
I am not planning to make you state definitions on the final exam.
There may well be a proof or two on the final, though.
Problems you can do
Solve systems of linear equations
Determine whether a given vector is in a given sp
Section 5.2
The Characteristic Equation
Read pages 273 - 279
Example: Find the eigenvalues of the matrix A = [7 2; 1 6].
Solution:
Remember that the eigenvalues of the matrix are the scalars λ such that the equation
Ax = λx has nonzero solutions x. Let's take a look at
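Assuming the garbled example matrix is A = [7 2; 1 6] (this reading is an assumption), the characteristic equation det(A - λI) = λ² - (trace A)λ + det A = 0 can be solved directly:

```python
import math

# Eigenvalues of a 2x2 matrix from the characteristic equation
# lambda^2 - trace(A)*lambda + det(A) = 0, via the quadratic formula.

A = [[7, 2],
     [1, 6]]
trace = A[0][0] + A[1][1]                      # 13
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]    # 40
disc = trace ** 2 - 4 * det                    # 9

eigenvalues = sorted([(trace - math.sqrt(disc)) / 2,
                      (trace + math.sqrt(disc)) / 2])
print(eigenvalues)  # [5.0, 8.0]
```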
Section 4.3
Bases of Vector Spaces
Read pages 208 - 213
Definition: Let H be a subspace of a vector space V. A set of vectors B = {b1, b2, ..., bp} in V is called
a basis for H if
i) B is a linearly independent set
ii) Span B = H
Example:
Section 5.3
Diagonalization
Read pages 281 - 286
Remember our definition of similar matrices from section 5.2: Two square matrices A and B are
similar if there exists an invertible matrix P such that A = P^-1 BP.
Now, suppose that the matrix B in the definition
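The similarity relation can be checked numerically: build A = P D P^-1 from a diagonal D and an invertible P, then recover D (all numbers below are illustrative):

```python
# Similar matrices: A = P D P^{-1} means P^{-1} A P = D.

def matmul(X, Y):
    n, m, p = len(X), len(Y), len(Y[0])
    return [[sum(X[i][k] * Y[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

P = [[1, 1],
     [1, 2]]
P_inv = [[2, -1],          # inverse of P (det P = 1)
         [-1, 1]]
D = [[2, 0],
     [0, 3]]

A = matmul(matmul(P, D), P_inv)
print(A)                                 # [[1, 1], [-2, 4]]
print(matmul(matmul(P_inv, A), P) == D)  # True: A is similar to D
```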
Section 5.1
Eigenvalues and Eigenvectors
Read pages 266 - 271
Definition:
An eigenvector of a square matrix A is a nonzero vector x such that Ax = λx for
some scalar λ. A scalar λ is called an eigenvalue of A if there is a nontrivial solution x of
Ax = λx; such an x is called an eigenvector corresponding to λ.
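Verifying a candidate eigenvalue/eigenvector pair takes one matrix-vector product; a sketch with sample data (not the book's example):

```python
# Check Ax = lambda * x for a nonzero x.

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[1, 6],
     [5, 2]]
x = [6, -5]
lam = -4

print(matvec(A, x))                            # [-24, 20]
print(matvec(A, x) == [lam * xi for xi in x])  # True: x is an eigenvector
```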
Section 2.3 Characteristics of Invertible Matrices Read pages 111 - 114
Suppose that A is an invertible 3 × 3 matrix.
Classify each of the following as i) necessarily true, ii) necessarily false, or iii) possibly true or
possibly false
Section 2.5 Matrix Factorizations
Definition: A square matrix that has only zeros above its main diagonal is called lower
triangular. A square matrix that has only zeros below its main diagonal is called upper
triangular.
Verify that if L = [
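The two definitions can be checked mechanically; a sketch with sample matrices (the text's L is garbled, so these entries are illustrative):

```python
# Lower triangular: zeros above the main diagonal (j > i).
# Upper triangular: zeros below the main diagonal (j < i).

def is_lower_triangular(M):
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if j > i)

def is_upper_triangular(M):
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if j < i)

L = [[2, 0, 0],
     [1, 1, 0],
     [1, 2, 2]]
U = [[1, 1, 1],
     [0, 2, 2],
     [0, 0, 3]]

print(is_lower_triangular(L), is_upper_triangular(U))  # True True
```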
Section 6.4
The Gram-Schmidt Process
Read pages 354 - 358
Let W be any subspace of R^n. We will develop an algorithm to find an orthonormal basis for W.
Example: Let x1 and x2 be the given vectors and let W = Span{x1, x2}. Construct an orthogonal
basis for W.
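For two vectors the algorithm is a single projection step; a sketch with sample vectors (the text's entries are garbled, so these are assumptions):

```python
from fractions import Fraction

# Gram-Schmidt for two vectors: keep v1 = x1, then subtract from x2 its
# projection onto v1. The result v2 is orthogonal to v1.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

x1 = [3, 6, 0]
x2 = [1, 2, 2]

v1 = x1
c = Fraction(dot(x2, v1), dot(v1, v1))       # projection coefficient
v2 = [x2[i] - c * v1[i] for i in range(3)]

print(v2 == [0, 0, 2])   # True
print(dot(v1, v2) == 0)  # True: {v1, v2} is an orthogonal basis for Span{x1, x2}
```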
Section 4.6 Rank Read pages 230 - 236
Find bases for the row space, the column space and the null space of the matrix A.
The row space of A is a 3-dimensional subspace of R^5.
Find a basis for the subspace of R^4 described here and state the dimension of the subspace.
Find the dimension of the subspace spanned by the given set of vectors.
Section 4.2 Null Space, Column Space and Linear Transformations Read pages 198 - 205
Definition: The null space of a matrix
The null space of an m × n matrix A is the set of all solutions to the homogeneous
equation Ax = 0.
In set notation: Nul A = {x ∈ R^n : Ax = 0}
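Membership in Nul A is an easy computational test; a sketch with a sample matrix and vectors (chosen for illustration):

```python
# x is in Nul A exactly when Ax = 0.

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def in_null_space(A, x):
    """True if Ax = 0, i.e. x is in Nul A."""
    return all(v == 0 for v in matvec(A, x))

A = [[1, -3, -2],
     [-5, 9, 1]]
u = [5, 3, -2]
w = [1, 1, 1]

print(in_null_space(A, u), in_null_space(A, w))  # True False
```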
Section 4.5 The Dimension of a Vector Space Read pages 225 - 228
Theorem: If a vector space V has a basis β containing n vectors, then any subset of V
containing more than n vectors must be linearly dependent.
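For example, in R^2 (where a basis has n = 2 vectors) any three vectors are dependent; a sketch that exhibits the dependence relation via Cramer's rule (sample vectors; assumes v1 and v2 are independent, which holds here):

```python
from fractions import Fraction

# Solve c1*v1 + c2*v2 = v3 by Cramer's rule; then c1*v1 + c2*v2 - v3 = 0
# is a nontrivial linear dependence relation among the three vectors.

v1 = [1, 2]
v2 = [3, 1]
v3 = [4, -2]

det = v1[0] * v2[1] - v2[0] * v1[1]          # -5, nonzero: v1, v2 independent
c1 = Fraction(v3[0] * v2[1] - v2[0] * v3[1], det)
c2 = Fraction(v1[0] * v3[1] - v3[0] * v1[1], det)

check = [c1 * v1[i] + c2 * v2[i] - v3[i] for i in range(2)]
print(c1, c2)            # -2 2
print(check == [0, 0])   # True: a nontrivial dependence relation
```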