2.8 Consequences of Invertibility
Theorem (The Really Big Theorem on Invertibility):
The following conditions are equivalent for a linear operator T : R^n → R^n, with standard matrix [T] = A:
1. T is an invertible operator.
2. A is an invertible matrix.
3. The rref of A is I_n.
Problem 1 (15 points).
Determine h and k such that the solution set of
x1 + 3x2 = k
4x1 + hx2 = 8
(a) is empty, (b) contains a unique solution, and (c) contains infinitely many solutions. (Give separate answers for each part, and justify them.)
Solution
3. (Procedure may vary; sample answer)
14. Since (A + A^T)^T = A^T + (A^T)^T = A + A^T, A + A^T is symmetric.
Math 10
Solutions to Test 1
9. Since B is obtained from A by doing the following ERO on A:
2R3 + R1 → R1,
we have that the elementary matrix E such that EA = B is obtained from I by doing the same ERO:
E =
[ 1 0 2 ]
[ 0 1 0 ]
[ 0 0 1 ]
1. Not a subspace.
Let W be the set of all vectors of the form (a, b, c), where a^2 + b^2 + c^2 ≤ 1.
Let u = (1/2, 1/2, 1/2) ∈ W. (Note: (1/2)^2 + (1/2)^2 + (1/2)^2 = 3/4 < 1.)
Let k = 2.
Then ku = (1, 1, 1) with 1^2 + 1^2 + 1^2 = 3 > 1.
So ku ∉ W, and hence W is not closed under scalar multiplication, so W is not a subspace.
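The counterexample above can be checked mechanically; a short Python sketch (the helper name `in_W` is illustrative, not from the text):

```python
# Numerical check of the counterexample: W = {(a, b, c) : a^2 + b^2 + c^2 <= 1}
# is not closed under scalar multiplication.

def in_W(v):
    """Return True if v lies in the closed unit ball W."""
    return sum(x**2 for x in v) <= 1

u = (0.5, 0.5, 0.5)
assert in_W(u)                     # 3/4 <= 1, so u is in W

k = 2
ku = tuple(k * x for x in u)       # ku = (1, 1, 1)
assert not in_W(ku)                # 1 + 1 + 1 = 3 > 1, so ku is outside W
```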
2.
[ 2  6 0 | 0 ]        [ 1 3 0 | 0 ]
[ 4 12 0 | 0 ]  RREF  [ 0 0 0 | 0 ]
[ 1  3 0 | 0 ]        [ 0 0 0 | 0 ]
Set x3 = t, x2 = s; then x1 = -3s. Then
[ x1 ]   [ -3s ]     [ -3 ]     [ 0 ]
[ x2 ] = [   s ] = s [  1 ] + t [ 0 ]
[ x3 ]   [   t ]     [  0 ]     [ 1 ]
So, the null space is span{(-3, 1, 0), (0, 0, 1)}.
Nullity(B) = 2
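The hand computation can be double-checked with SymPy's `nullspace()`. Note the third row of B below is an assumption recovered from the garbled scan:

```python
# Verify the null space calculation with SymPy.
from sympy import Matrix

# Matrix from the problem; the third row (1, 3, 0) is assumed from the scan.
B = Matrix([[2, 6, 0], [4, 12, 0], [1, 3, 0]])
basis = B.nullspace()
print([list(v) for v in basis])   # [[-3, 1, 0], [0, 0, 1]]
print(len(basis))                 # nullity = 2
```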
6. The RREF of A is
Math 10
Test 4
Show all work/reasoning to receive full credit.
1. Consider the matrix A =
[ 1 2 1 3 ]
[ 2 4 1 2 ]
[ 3 6 3 7 ]
Find a basis for the row space of A that consists entirely of row vectors of A.
2. Consider the matrix B =
[ 2  6 0 ]
[ 4 12 0 ]
[ 1  3 0 ]
Find the null space of B.
5.1 Permutations and The Determinant Concept
Definition: Let A =
[ a b ]
[ c d ]
be a 2 × 2 matrix. The determinant of A (and variations of its notation) is defined by:
det A = |A| = ad − bc.
Theorem: A 2 × 2 matrix A is invertible if and only if det A ≠ 0.
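The formula and the invertibility test are easy to sanity-check in code; `det2` below is an illustrative helper, not from the text:

```python
# The 2x2 determinant formula ad - bc, and the test det(A) != 0.

def det2(a, b, c, d):
    """Determinant of [[a, b], [c, d]]."""
    return a * d - b * c

print(det2(1, 2, 3, 4))   # 1*4 - 2*3 = -2, nonzero, so invertible
print(det2(1, 2, 2, 4))   # 1*4 - 2*2 = 0, so singular
```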
Groupwork 6 Solutions
Chapter 4 Supplementary Exercises (page 289)
7. Under the given hypotheses in the problem, we claim that if A is invertible, then {Av1, Av2, . . . , Avn} is a linearly independent set.
Proof of the claim:
We want to show that the vectors Av1, Av2, . . . , Avn are linearly independent.
Groupwork 3 Solutions
Section 1.7 #29
If A is an invertible upper triangular or lower triangular matrix, then the diagonal entries of A^-1 are the reciprocals of those of A.
Proof:
Let A be an n × n invertible upper triangular matrix; then A^-1 is also upper triangular.
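The claim can be illustrated numerically with NumPy (the sample matrix below is an arbitrary choice, not from the text):

```python
# Check: for an invertible upper triangular A, the diagonal of A^{-1} is the
# entrywise reciprocal of the diagonal of A, and A^{-1} is upper triangular.
import numpy as np

A = np.array([[2.0, 1.0,  5.0],
              [0.0, 4.0, -3.0],
              [0.0, 0.0,  8.0]])
Ainv = np.linalg.inv(A)

assert np.allclose(np.diag(Ainv), 1.0 / np.diag(A))   # reciprocal diagonals
assert np.allclose(Ainv, np.triu(Ainv))               # still upper triangular
```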
3.1 Axioms for a Vector Space
Definition (The Axioms of an Abstract Vector Space):
A vector space (V, ⊕, ⊙) is a non-empty set V, together with two operations:
⊕, vector addition, and
⊙, scalar multiplication,
such that, for all u, v, and w ∈ V and all r, s ∈ R:
3.2 Linearity Properties for Finite Sets of Vectors
Linear Combinations and Spans of Finite Sets
Definition: Let S = {v1, v2, . . . , vn} be a set of vectors from a vector space (V, ⊕, ⊙), and let r1, r2, . . . , rn ∈ R. Then, a linear combination of the vectors of S is an expression of the form r1 ⊙ v1 ⊕ r2 ⊙ v2 ⊕ · · · ⊕ rn ⊙ vn.
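As a concrete instance in R^3 (the vectors and scalars below are sample choices for illustration):

```python
# A linear combination r1*v1 + r2*v2 + r3*v3 in R^3.
import numpy as np

v1 = np.array([1, 0, 0])
v2 = np.array([0, 1, 0])
v3 = np.array([1, 1, 1])
r = [2, -1, 3]                         # sample scalars

combo = r[0] * v1 + r[1] * v2 + r[2] * v3
print(combo)   # [5 2 3]
```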
3.6 Coordinate Vectors and Matrices for Linear Transformations
Definition: Let B = {w1, w2, . . . , wn} be an ordered basis for a finite dimensional vector space V. If v is any vector in V, we know that v can be expressed uniquely as a linear combination of the basis vectors.
3.5 Linear Transformations on General Vector Spaces
Definition: A linear transformation:
T : (V, ⊕_V, ⊙_V) → (W, ⊕_W, ⊙_W)
is a function that assigns a unique member w ∈ W to every vector v ∈ V, such that T satisfies, for all u, v ∈ V and all scalars k:
The Additivity Property: T(u ⊕_V v) = T(u) ⊕_W T(v), and
The Homogeneity Property: T(k ⊙_V u) = k ⊙_W T(u).
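The two linearity properties can be verified numerically for a sample matrix map on R^2 (the matrix M and vectors below are illustrative choices):

```python
# Check additivity T(u + v) = T(u) + T(v) and homogeneity T(k*u) = k*T(u)
# for the sample linear map T(x) = M @ x.
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def T(x):
    return M @ x

u = np.array([1.0, -2.0])
v = np.array([0.5, 3.0])
k = 7.0

assert np.allclose(T(u + v), T(u) + T(v))   # additivity
assert np.allclose(T(k * u), k * T(u))      # homogeneity
```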
3.4 Subspaces, Basis and Dimension
Subspaces
Definition: A non-empty subset W of a vector space (V, ⊕, ⊙) is called a subspace of V if W is closed under ⊕ and ⊙. In other words, for all w1 and w2 ∈ W, and k ∈ R:
w1 ⊕ w2 ∈ W, and k ⊙ w1 ∈ W.
As before, we write W ≤ V, and w
Section 3.8 Isomorphisms
Definition: We say that T : V W is an isomorphism if T is both
one-to-one and onto. We also say that T is invertible, T is
bijective, and the vector space V is isomorphic to W.
In the case when V = W, we call T an automorphism of V.
3.7 One-to-One and Onto Linear Transformations; Compositions of Linear Transformations
Review:
[Figure: a map T : V → W, showing ker(T) as the subset of V sent to 0_W, and range(T) as the subset of W hit by T.]
Section 3.7:
ker(T) = { v ∈ V | T(v) = 0_W }
range(T) = { w ∈ W | w = T(v) for some v ∈ V }
Groupwork 5 Solutions
Section 4.2 #14
(a) cos 2x ∈ span{f, g}
    ∵ cos 2x = cos^2 x − sin^2 x = (−1)f + 1g
(b) 3 + x^2 ∉ span{f, g}
(c) 1 ∈ span{f, g}
    ∵ 1 = sin^2 x + cos^2 x = 1f + 1g
(d) sin x ∉ span{f, g}
(e) 0 ∈ span{f, g}
    ∵ 0 = 0 sin^2 x + 0 cos^2 x = 0f + 0g
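Parts (a) and (c) can be double-checked symbolically with SymPy, taking f = sin^2 x and g = cos^2 x as the solutions indicate:

```python
# Symbolic check: cos(2x) = -f + g and 1 = f + g, where f = sin^2 x, g = cos^2 x.
from sympy import symbols, sin, cos, simplify

x = symbols('x')
f = sin(x)**2
g = cos(x)**2

assert simplify(cos(2*x) - (-1*f + 1*g)) == 0   # part (a)
assert simplify(1 - (1*f + 1*g)) == 0           # part (c)
```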
Math 10
Groupwork for Chapter 7
Consider the bilinear form on P2:
⟨p(x) | q(x)⟩ = p(−1)q(−1) + p(1)q(1) + p(2)q(2).
a. Warm up: let p(x) = 3x^2 − 5x + 8, and let q(x) = x^2 − x + 2. Find ⟨p(x) | q(x)⟩.
b. Prove that for all p(x) ∈ P2, where p(x) ≠ z(x): ⟨p(x) | p(x)⟩ > 0.
c. On the other hand, find a p(x) ∈ P3, where p(x) ≠ z(x), such that ⟨p(x) | p(x)⟩ = 0.
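Part (a) can be computed in a few lines. Caution: the evaluation points −1, 1, 2 and the signs in p(x) and q(x) are assumptions recovered from the garbled scan, so treat the numerical answer as a sketch under those assumptions:

```python
# Assumed bilinear form: <p|q> = p(-1)q(-1) + p(1)q(1) + p(2)q(2).
# Assumed polynomials (signs were lost in the scan):
#   p(x) = 3x^2 - 5x + 8,  q(x) = x^2 - x + 2.

def form(p, q):
    """Sum of pointwise products at the assumed evaluation points."""
    return sum(p(t) * q(t) for t in (-1, 1, 2))

p = lambda x: 3*x**2 - 5*x + 8
q = lambda x: x**2 - x + 2
print(form(p, q))   # 16*4 + 6*2 + 10*4 = 116
```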
6.3 Diagonalization of Square Matrices
Definition: Let A be an n × n matrix. We say that A is diagonalizable if we can find an invertible matrix C such that:
C^-1 A C = D,
where D = Diag(λ1, λ2, . . . , λn) is a diagonal matrix, or equivalently:
A = C D C^-1 or AC = CD.
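The definition can be exercised with NumPy on a small sample matrix (the matrix A below is an illustrative choice with distinct eigenvalues, so it is diagonalizable):

```python
# Diagonalize A: columns of C are eigenvectors, and C^{-1} A C = D.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # eigenvalues 5 and 2

eigvals, C = np.linalg.eig(A)         # columns of C are eigenvectors
D = np.linalg.inv(C) @ A @ C

assert np.allclose(D, np.diag(eigvals))                            # C^{-1} A C = D
assert np.allclose(A, C @ np.diag(eigvals) @ np.linalg.inv(C))     # A = C D C^{-1}
```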
7.2 Geometric Constructions in Inner Product Spaces
Further Properties of Inner Products
Theorem: Let V be an inner product space under the bilinear form ⟨· | ·⟩. Then the following properties also hold, for all vectors u, v, and w ∈ V:
1. ⟨u | k ⊙ v⟩ = k ⟨u | v⟩
2.
6.2 Computational Techniques for Eigentheory
The Integer Roots Theorem:
Let p(x) = x^n + c_{n−1} x^{n−1} + · · · + c_1 x + c_0 be a polynomial with integer coefficients, and c_0 ≠ 0. Then, all the rational roots of p(x) are in fact integers, and if x = c is an integer root of p(x), then c divides c_0.
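The theorem justifies a finite search: only the divisors of the constant term need to be tested. A sketch (the helper name and sample polynomial are illustrative):

```python
# Find all integer roots of a monic integer polynomial by testing the
# divisors of c0, as licensed by the Integer Roots Theorem.

def integer_roots(coeffs):
    """coeffs = [1, c_{n-1}, ..., c1, c0] for a monic polynomial."""
    c0 = coeffs[-1]
    divisors = [d for d in range(1, abs(c0) + 1) if c0 % d == 0]
    candidates = divisors + [-d for d in divisors]

    def p(x):                      # Horner evaluation
        val = 0
        for c in coeffs:
            val = val * x + c
        return val

    return sorted(c for c in candidates if p(c) == 0)

# p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3)
print(integer_roots([1, -6, 11, -6]))   # [1, 2, 3]
```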
7.3 Orthonormal Sets and The Gram-Schmidt Algorithm
Definition: Let S = {v1, v2, . . . , vk} be a set of vectors in an inner product space V. We say that S is an orthonormal set if:
⟨vi | vj⟩ = 0 if i ≠ j, and
⟨vi | vi⟩ = 1 for i = 1..k.
If we remove the condition that each ⟨vi | vi⟩ = 1, we say that S is an orthogonal set.
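The Gram-Schmidt algorithm named in the section title can be sketched in R^n with the standard dot product (a minimal illustration assuming linearly independent input, not the text's own pseudocode):

```python
# Gram-Schmidt: orthogonalize by subtracting projections, then normalize.
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal list spanning the same space (input assumed independent)."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for u in basis:
            w = w - np.dot(w, u) * u        # subtract the projection onto u
        basis.append(w / np.linalg.norm(w)) # normalize
    return basis

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
u1, u2 = gram_schmidt(vs)
assert abs(np.dot(u1, u2)) < 1e-12                       # orthogonal
assert np.allclose([np.linalg.norm(u1), np.linalg.norm(u2)], 1.0)   # unit length
```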
7.4 Orthogonal Complements and Decompositions
Orthogonal Complements
Definition/Theorem: Let W be any subspace of an inner product space V. We define the orthogonal complement of W, another subspace of V, by:
W⊥ = { v ∈ V | ⟨v | w⟩ = 0 for all w ∈ W }.
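For R^n with the dot product, v is orthogonal to every vector of W exactly when it is orthogonal to a spanning set of W, so W⊥ is the null space of the matrix whose rows span W. The SVD-based helper `perp_basis` below is an illustrative sketch, not a library routine:

```python
# Compute W-perp in R^3 as the null space of a matrix whose rows span W.
import numpy as np

def perp_basis(W_rows, tol=1e-10):
    """Orthonormal basis (as columns) for the null space of W_rows, via the SVD."""
    _, s, vt = np.linalg.svd(W_rows)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

W_rows = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])   # W = the xy-plane in R^3
perp = perp_basis(W_rows)              # W-perp should be the z-axis
assert perp.shape == (3, 1)            # one-dimensional complement
assert np.allclose(W_rows @ perp, 0.0) # orthogonal to every spanning row
```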
7.5 Orthonormality and Projection Operators
Theorem: Let B = {u1, u2, . . . , un} be an orthonormal basis for an inner product space V. Let v and w be arbitrary members of V. If:
[v]_B = ⟨v1, v2, . . . , vn⟩, and
[w]_B = ⟨w1, w2, . . . , wn⟩, then:
⟨v | w⟩ = v1 w1 + v2 w2 + · · · + vn wn.
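The theorem can be illustrated in R^2 with a rotated orthonormal basis (the basis angle and vectors below are sample choices):

```python
# In an orthonormal basis B, <v|w> equals the dot product of [v]_B and [w]_B.
import numpy as np

theta = 0.3                                        # sample rotation angle
u1 = np.array([np.cos(theta), np.sin(theta)])      # orthonormal basis of R^2
u2 = np.array([-np.sin(theta), np.cos(theta)])

v = np.array([2.0, -1.0])
w = np.array([0.5, 3.0])
v_B = np.array([np.dot(v, u1), np.dot(v, u2)])     # coordinates [v]_B
w_B = np.array([np.dot(w, u1), np.dot(w, u2)])     # coordinates [w]_B

assert np.allclose(np.dot(v, w), np.dot(v_B, w_B)) # <v|w> = v1 w1 + ... + vn wn
```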
7.1 Inner Product Spaces
Definition (The Axioms of an Inner Product Space):
Let V be a vector space. An inner product on V is a bilinear form ⟨· | ·⟩ on V, that is, a function that takes two vectors u, v ∈ V, and produces a scalar, denoted ⟨u | v⟩, such that the
8.8 Simultaneous Diagonalization
We know that an n × n matrix A can be diagonalized if and only if there is a basis {v1, v2, . . . , vn} for R^n consisting of eigenvectors for A. These vectors are assembled into the columns of an invertible matrix, C = [ v1 v2 . . . vn ].