Chapter Two. Vector Spaces
2.4 Lemma (Exchange Lemma) Assume that $B = \langle \vec{\beta}_1, \dots, \vec{\beta}_n \rangle$ is a basis for a vector space, and that for the vector $\vec{v}$ the relationship $\vec{v} = c_1\vec{\beta}_1 + c_2\vec{\beta}_2 + \cdots + c_n\vec{\beta}_n$ has $c_i \neq 0$. Then exchanging $\vec{\beta}_i$ for $\vec{v}$ yields another basis for the space.
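The lemma can be spot-checked numerically. Here is a minimal sketch, not from the text: the basis and vector are arbitrary illustrative choices, and a hand-rolled 3×3 determinant serves as the basis test in $\mathbb{R}^3$.

```python
# Numerical sanity check of the Exchange Lemma in R^3 (illustrative only;
# the basis and the vector below are my own choices, not from the text).

def det3(m):
    # Determinant of a 3x3 matrix given as a list of rows.
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

# A basis for R^3, stored as rows beta_1, beta_2, beta_3.
B = [[1, 0, 0], [1, 1, 0], [1, 1, 1]]

# v = 2*beta_1 + 3*beta_2 + 0*beta_3, so c_1, c_2 are nonzero but c_3 = 0.
v = [2*B[0][j] + 3*B[1][j] for j in range(3)]

# Exchanging beta_2 for v (c_2 = 3 != 0) must yield another basis ...
swapped = [B[0], v, B[2]]
assert det3(swapped) != 0

# ... while exchanging beta_3 for v (c_3 = 0) does not: v lies in the
# span of beta_1 and beta_2, so the resulting set is linearly dependent.
degenerate = [B[0], B[1], v]
assert det3(degenerate) == 0
```

Note how the hypothesis $c_i \neq 0$ is exactly what fails in the degenerate swap.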
of the yz-plane; here are two such combinations.
$$\begin{pmatrix} w_1 \\ w_2 \\ w_3 \end{pmatrix} = 1 \begin{pmatrix} w_1 \\ w_2 \\ 0 \end{pmatrix} + 1 \begin{pmatrix} 0 \\ 0 \\ w_3 \end{pmatrix} \qquad \begin{pmatrix} w_1 \\ w_2 \\ w_3 \end{pmatrix} = 1 \begin{pmatrix} w_1 \\ w_2/2 \\ 0 \end{pmatrix} + 1 \begin{pmatrix} 0 \\ w_2/2 \\ w_3 \end{pmatrix}$$
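That the two combinations really produce the same vector, so that the decomposition is not unique, can be confirmed directly. A short Python check; the concrete component values are my own choice, not from the text.

```python
# Check that the two combinations above give the same vector: the sum of
# the xy-plane and the yz-plane covers R^3, but the split is not unique.
# Concrete components chosen only for illustration.
w1, w2, w3 = 5.0, 4.0, 7.0

def add(u, v):
    return [a + b for a, b in zip(u, v)]

first  = add([w1, w2, 0], [0, 0, w3])        # xy-plane part + yz-plane part
second = add([w1, w2/2, 0], [0, w2/2, w3])   # a different split, same sum

assert first == second == [w1, w2, w3]
```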
The above definition gives one way in which we ca
and vice versa (we can move from the bottom to the top by taking each $d_i$ to be 1).
For (1) $\Longrightarrow$ (2), assume that all decompositions are unique. We will show that the concatenation $B_1 \mathbin{\frown} \cdots \mathbin{\frown} B_k$ spans the space and is linearly independent. It spans
3.31 Show that the transpose operation is linear:
$$(rA + sB)^{\mathsf{T}} = rA^{\mathsf{T}} + sB^{\mathsf{T}}$$
for $r, s \in \mathbb{R}$ and $A, B \in \mathcal{M}_{m \times n}$.
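The exercise asks for a proof, but the identity is easy to spot-check numerically first. A sketch with hand-rolled list-of-lists matrices; the matrices and scalars are arbitrary illustrative choices.

```python
# Numeric spot-check of the identity (rA + sB)^T = rA^T + sB^T.
# This is a sanity check, not the requested proof.

def transpose(A):
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def scale_add(r, A, s, B):
    # Entrywise rA + sB for same-sized matrices.
    return [[r*a + s*b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

# Arbitrary 2x3 matrices and scalars, chosen only for illustration.
A = [[1, 2, 3], [4, 5, 6]]
B = [[0, 1, 0], [2, 0, 3]]
r, s = 2, -3

assert transpose(scale_add(r, A, s, B)) == scale_add(r, transpose(A), s, transpose(B))
```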
3.32 In this subsection we have shown that Gaussian reduction finds a basis for the row space.
(a) Show that this b
Section III. Basis and Dimension
the columns containing the leading entries, $\langle \vec{e}_1, \vec{e}_2 \rangle$. Thus, for a reduced echelon form matrix we can find bases for the row and column spaces in essentially the same way, by taking the parts of the matrix, the row
2.13 Corollary Any linearly independent set can be expanded to make a basis.
Proof If a linearly independent set is not already a basis then it must not span
the space. Adding to the set a vector that is not in the span will
Section II. Linear Independence
(c) Find linearly independent sets $S$ and $T$ so that the union of $S - (S \cap T)$ and $T - (S \cap T)$ is linearly independent, but the union $S \cup T$ is not linearly independent.
(d) Characterize when the union of two linearly ind
4.16 Example In $\mathbb{R}^2$ the $x$-axis and the $y$-axis are complements, that is, $\mathbb{R}^2 = x\text{-axis} \oplus y\text{-axis}$. A space can have more than one pair of complementary subspaces; another pair for $\mathbb{R}^2$ are the subspaces consisting of the lines $y = x$ a
III.3 Vector Spaces and Linear Systems
We will now reconsider linear systems and Gauss's Method, aided by the tools and terms of this chapter. We will make three points.
For the first, recall the insight from the Chapter
4.32 Recall that no linearly independent set contains the zero vector. Can an independent set of subspaces contain the trivial subspace?
X 4.33 Does every subspace have a complement?
X 4.34 Let $W_1$, $W_2$ be subspaces of a vector space
1.4 Example The space R2 has many bases. Another one is this.
$$\left\langle \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right\rangle$$
The verification is easy.
1.5 Definition For any $\mathbb{R}^n$,
$$\mathcal{E}_n = \left\langle \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix}, \dots, \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix} \right\rangle$$
is the standard (or natural) basis.
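A small sketch, not from the text: build the standard basis of $\mathbb{R}^n$ as lists and check that any vector equals the sum of its components times the basis vectors, so the components themselves are the coefficients.

```python
# Construct the standard basis E_n of R^n and check the representation
# property on an arbitrary vector (the vector is my own illustrative choice).

def standard_basis(n):
    # e_i has a 1 in position i and 0 elsewhere.
    return [[1 if j == i else 0 for j in range(n)] for i in range(n)]

n = 4
E = standard_basis(n)
v = [3, -1, 0, 7]   # an arbitrary vector in R^4

# Combine the basis vectors with v's components as coefficients.
combo = [sum(v[i] * E[i][j] for i in range(n)) for j in range(n)]
assert combo == v   # v = 3*e_1 - 1*e_2 + 0*e_3 + 7*e_4
```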
(b) $f(x) = \cos(x)$ and $g(x) = \sin(x)$
(c) $f(x) = e^x$ and $g(x) = \ln(x)$
X 1.23 Which of these subsets of the space of real-valued functions of one real variable
is linearly dependent and which is linearly independent? (We ha
Topic: Fields
Computations involving only integers or only rational numbers are much easier than those with real numbers. Could other algebraic structures, such as the integers or the rationals, work in the place of $\mathbb{R}$ in the definition of a vector space?
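The contrast the paragraph draws can be seen in a few lines; a sketch using Python's exact rational arithmetic (my own illustration, not part of the topic):

```python
# Exact rational arithmetic avoids the rounding that real (floating-point)
# computation introduces; the examples are chosen only for illustration.
from fractions import Fraction

# With floats, a familiar identity fails because of rounding.
assert 0.1 + 0.2 != 0.3

# With rationals, the same computation is exact.
assert Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10)

# Back-substitution stays exact over the rationals:
# solve 3x = 2 and then y = x/7 with no rounding at any step.
x = Fraction(2, 3)
y = x / 7
assert y == Fraction(2, 21)
```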
If
Topic: Crystals
Everyone has noticed that table salt comes in little cubes.
This orderly outside arises from an orderly inside: the way the atoms lie is also cubical, these cubes stack in neat rows and columns, and the salt faces tend to be just an outer layer
terms, so that the two sums combine the same $\vec{\beta}$'s in the same order: $\vec{v} = c_1\vec{\beta}_1 + c_2\vec{\beta}_2 + \cdots + c_n\vec{\beta}_n$ and $\vec{v} = d_1\vec{\beta}_1 + d_2\vec{\beta}_2 + \cdots + d_n\vec{\beta}_n$. Now
$$c_1\vec{\beta}_1 + c_2\vec{\beta}_2 + \cdots + c_n\vec{\beta}_n = d_1\vec{\beta}_1 + d_2\vec{\beta}_2 + \cdots + d_n\vec{\beta}_n$$
holds if and only