Section 4
so x1, x2, x3 are linearly dependent. Consider next the vectors x1, x2, x4. If
X = (x1, x2, x4), then

             | 1  2  2 |
    det(X) = | 2  5  7 | = 0
             | 2  4  4 |

so these three vectors are also linearly dependent. Finally, if we use x5 and
form the matrix X = (x1, x2, x5), then

             | 1  2  1 |
    det(X) = | 2  5  1 | = -2
             | 2  4  0 |

so the vectors x1, x2, x5 are linearly independent and hence form a basis
for R^3.
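The two determinants above can be checked numerically; as a sketch, the vectors below are read off as the columns of each matrix X in the solution:

```python
import numpy as np

# Columns of the matrices X above (read off from the determinant entries).
x1 = np.array([1, 2, 2])
x2 = np.array([2, 5, 4])
x4 = np.array([2, 7, 4])
x5 = np.array([1, 1, 0])

# det(x1, x2, x4) should be 0; det(x1, x2, x5) should be -2.
det124 = round(np.linalg.det(np.column_stack([x1, x2, x4])))
det125 = round(np.linalg.det(np.column_stack([x1, x2, x5])))
print(det124)  # 0: x1, x2, x4 are linearly dependent
print(det125)  # -2: x1, x2, x5 are linearly independent
```

A nonzero determinant of the 3 × 3 matrix is exactly the condition for its three columns to be linearly independent, which is why the second computation also certifies a basis for R^3.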
16. dim U = 2. The set {e1, e2} is a basis for U.
    dim V = 2. The set {e2, e3} is a basis for V.
    dim(U ∩ V) = 1. The set {e2} is a basis for U ∩ V.
    dim(U + V) = 3. The set {e1, e2, e3} is a basis for U + V.
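These dimensions can be verified with rank computations; the sketch below assumes, as in the exercise, that U = Span{e1, e2} and V = Span{e2, e3} in R^3:

```python
import numpy as np

# Standard basis vectors of R^3 (rows of the identity matrix).
e1, e2, e3 = np.eye(3)

# Dimension of a span = rank of the matrix whose columns are the spanning set.
dim_U  = np.linalg.matrix_rank(np.column_stack([e1, e2]))
dim_V  = np.linalg.matrix_rank(np.column_stack([e2, e3]))
dim_UV = np.linalg.matrix_rank(np.column_stack([e1, e2, e2, e3]))

print(dim_U, dim_V, dim_UV)  # dim U, dim V, dim(U + V)
# dim(U ∩ V) then follows from dim U + dim V - dim(U + V) = 2 + 2 - 3 = 1.
```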
17. Let {u1, u2} be a basis for U and {v1, v2} be a basis for V. It follows from
Theorem 3.4.1 that u1, u2, v1, v2 are linearly dependent. Thus there exist
scalars c1, c2, c3, c4, not all zero, such that

    c1 u1 + c2 u2 + c3 v1 + c4 v2 = 0

Let

    x = c1 u1 + c2 u2 = -c3 v1 - c4 v2

The vector x is an element of U ∩ V. We claim x ≠ 0, for if x = 0, then

    c1 u1 + c2 u2 = 0 = -c3 v1 - c4 v2

and by the linear independence of u1 and u2 and the linear independence of v1
and v2 we would have c1 = c2 = c3 = c4 = 0, contradicting the definition of
the ci's.
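The argument can be illustrated numerically: four vectors in R^3 are always dependent, and a null vector of the matrix they form supplies the scalars c1, ..., c4. The concrete subspaces below are randomly chosen for illustration, not taken from the exercise:

```python
import numpy as np

# Random bases for two 2-dimensional subspaces of R^3 (illustrative choice).
rng = np.random.default_rng(0)
u1, u2, v1, v2 = rng.standard_normal((4, 3))

A = np.column_stack([u1, u2, v1, v2])  # 3x4 matrix: four vectors in R^3
assert np.linalg.matrix_rank(A) < 4    # they must be linearly dependent

# The right singular vector for the missing singular value is a null vector
# of A, i.e. scalars c with c1 u1 + c2 u2 + c3 v1 + c4 v2 = 0.
c = np.linalg.svd(A)[2][-1]

x = c[0] * u1 + c[1] * u2                        # x lies in U
assert np.allclose(x, -(c[2] * v1 + c[3] * v2))  # x also lies in V
```

Generically the resulting x is nonzero, matching the conclusion of the proof that U ∩ V contains a nonzero vector.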
18. Let U and V be subspaces of R^n with the property that U ∩ V = {0}. If
either U = {0} or V = {0} the result is obvious, so assume that both subspaces
are nontrivial with dim U = k > 0 and dim V = r > 0. Let {u1, . . . , uk} be a
basis for U and let {v1, . . . , vr} be a basis for V. The vectors
u1, . . . , uk, v1, . . . , vr span U + V. We claim that these vectors form a
basis for U + V and hence that dim(U + V) = k + r = dim U + dim V. To show this
we must show that the vectors are linearly independent. Thus we must show that
if

    c1 u1 + · · · + ck uk + ck+1 v1 + · · · + ck+r vr = 0        (2)

then c1 = c2 = · · · = ck+r = 0. If we set

    u = c1 u1 + · · · + ck uk    and    v = ck+1 v1 + · · · + ck+r vr