19. Note that if $\det M = 0$, there exists an eigenvalue of $M$ equal to $0$, which implies $M$ is not invertible. Thus condition 8 is equivalent to conditions 4, 5, 9, and 10. The map $M$ is, by definition, injective if its null space is trivial; however, eigenvectors
Movie Scripts

matrices we can think about
$$e^A = \sum_{n=0}^{\infty} \frac{A^n}{n!}.$$
This means we are going to have an idea of what $A^n$ looks like for any $n$. Let's look at the example of one of the matrices in the problem. Let $A = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$. Let's compute $A^n$ for the first few
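The partial sums of this series are easy to check by machine. Below is a minimal pure-Python sketch; the sample matrix $A$ (for which $A^n = \begin{pmatrix}1 & 0\\ n & 1\end{pmatrix}$, so $e^A = \begin{pmatrix}e & 0\\ e & e\end{pmatrix}$) is an assumed reading of the example, and any $2\times 2$ matrix would work:

```python
import math

def mat_mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm_series(A, terms=25):
    # Partial sum of e^A = sum_{n>=0} A^n / n!
    S = [[0.0, 0.0], [0.0, 0.0]]
    P = [[1.0, 0.0], [0.0, 1.0]]  # A^0 = I
    for n in range(terms):
        for i in range(2):
            for j in range(2):
                S[i][j] += P[i][j] / math.factorial(n)
        P = mat_mul(P, A)
    return S

A = [[1, 0], [1, 1]]
E = expm_series(A)  # converges to [[e, 0], [e, e]]
```

With 25 terms the remainder is far below floating-point precision, so the entries agree with $e \approx 2.71828$ to machine accuracy.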
$$\begin{pmatrix}(ab)x_1\\(ab)x_2\end{pmatrix} = \begin{pmatrix}a(bx_1)\\a(bx_2)\end{pmatrix} = a\begin{pmatrix}bx_1\\bx_2\end{pmatrix} = a\left(b\begin{pmatrix}x_1\\x_2\end{pmatrix}\right),$$
which is what we want. (v) Unity: We need to find a special scalar that acts the way we would expect $1$ to behave, i.e.
$$1\cdot\begin{pmatrix}x_1\\x_2\end{pmatrix} = \begin{pmatrix}x_1\\x_2\end{pmatrix}.$$
There is an obvious choice for this special scalar: just the number $1$ itself.
for $x$. The augmented matrix you get is
$$\left(\begin{array}{ccc|c} 1 & 0 & 5 & 6 \\ 0 & 1 & -1 & 1 \\ 0 & 0 & 1 & 1 \end{array}\right).$$
It should take only a few steps to transform it into
$$\left(\begin{array}{ccc|c} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 2 \\ 0 & 0 & 1 & 1 \end{array}\right),$$
which gives us the answer $x = \begin{pmatrix}1\\2\\1\end{pmatrix}$.

Another
LU Decomposition Example Here we will
perform
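The row reduction used above to solve for $x$ can be checked with a short Gauss-Jordan sketch (the augmented matrix is the one from that example; the sign of its $(2,3)$ entry is inferred from the final answer):

```python
def gauss_jordan(aug):
    # Reduce an augmented matrix [A|b] to reduced row echelon form in place,
    # then read the solution off the last column.
    n = len(aug)
    for col in range(n):
        # find a pivot row, move it into place, and normalize it
        pivot = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        # eliminate this column's entry in every other row
        for r in range(n):
            if r != col and aug[r][col] != 0:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [row[-1] for row in aug]

aug = [[1.0, 0.0, 5.0, 6.0],
       [0.0, 1.0, -1.0, 1.0],
       [0.0, 0.0, 1.0, 1.0]]
x = gauss_jordan(aug)  # -> [1.0, 2.0, 1.0]
```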
This again relies on the underlying real numbers, which for any $x, y \in \mathbb{R}$ obey $x + y = y + x$. This fact underlies the middle step of the following computation:
$$\begin{pmatrix}x_1\\x_2\end{pmatrix} + \begin{pmatrix}y_1\\y_2\end{pmatrix} = \begin{pmatrix}x_1+y_1\\x_2+y_2\end{pmatrix} = \begin{pmatrix}y_1+x_1\\y_2+x_2\end{pmatrix} = \begin{pmatrix}y_1\\y_2\end{pmatrix} + \begin{pmatrix}x_1\\x_2\end{pmatrix},$$
which demonstrates
necessity for complex numbers is easily seen from a polynomial like $z^2 + 1$, whose roots would require us to solve $z^2 = -1$, which is impossible for real numbers $z$. However, introducing the imaginary unit $i$ with $i^2 = -1$, we have $z^2 + 1 = (z - i)(z + i)$. Returning
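This factorization is easy to confirm with Python's built-in complex numbers (written `1j`):

```python
# i^2 = -1, so z^2 + 1 factors as (z - i)(z + i)
i = 1j
assert i**2 == -1

for z in (0.5, -2.0, 3.7):          # a few real sample points
    assert (z - i) * (z + i) == z**2 + 1

# the roots of z^2 + 1 are exactly +i and -i
for root in (i, -i):
    assert root**2 + 1 == 0
```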
eigenvalue equation $Lv = \lambda v$ becomes $Mv = \lambda v$, where $v$ is a column vector and $M$ is an $n \times n$ matrix (both expressed in whatever basis we chose for $V$). The scalar $\lambda$ is called an eigenvalue of $M$, and the job of this video is to show you how to find all the eigenvalues
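Concretely, the eigenvalues are the roots of the characteristic polynomial $\det(M - \lambda I)$; for a $2\times 2$ matrix this is $\lambda^2 - \operatorname{tr}(M)\lambda + \det(M)$, so the quadratic formula finds them. A small sketch (the sample matrix $M$ is our choice for illustration):

```python
import math

def eigenvalues_2x2(M):
    # Roots of det(M - t I) = t^2 - tr(M) t + det(M)
    (a, b), (c, d) = M
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)   # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

M = [[4, 2], [1, 3]]
print(eigenvalues_2x2(M))  # -> (5.0, 2.0)
```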
the last part of the problem. The problem can be solved by considering a non-zero simple polynomial, such as a degree 0 polynomial, and multiplying by $i \in \mathbb{C}$. That is to say, we take a vector $p \in P_3^{\mathbb{R}}$ and then consider $ip$. This will violate one of the vector
implies (a) is the easy direction: just think about what it means for $M$ to be non-singular and for a linear function to be well-defined. Therefore we assume that $M$ is singular, which implies that there exists a nonzero vector $X_0$ such that $MX_0 = 0$. Now assume
the following relationships: Alice and Bob are
friends. Alice and Carl are friends. Carl and
Bob are friends. David and Bob are friends.
Now draw a picture where each person is a
dot, and then draw a line between the dots of
people who are friends. This is
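One way to record the same picture numerically is an adjacency matrix: a symmetric matrix with a $1$ in entry $(i,j)$ when person $i$ and person $j$ are friends. A minimal sketch (the encoding below is our own, not from the text):

```python
people = ["Alice", "Bob", "Carl", "David"]
friends = [("Alice", "Bob"), ("Alice", "Carl"),
           ("Carl", "Bob"), ("David", "Bob")]

n = len(people)
idx = {name: k for k, name in enumerate(people)}
A = [[0] * n for _ in range(n)]
for p, q in friends:
    A[idx[p]][idx[q]] = 1
    A[idx[q]][idx[p]] = 1   # friendship is symmetric

# Row sums count each person's friends; Bob is friends with everyone else.
print(sum(A[idx["Bob"]]))  # -> 3
```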
product of $u$ and $v$ to find $\|u\|\|v\|\cos\theta$:
$$u \cdot v^{\perp} = u \cdot \left(v - \frac{u\cdot v}{u\cdot u}\,u\right) = u\cdot v - \frac{u\cdot v}{u\cdot u}\,(u\cdot u).$$
Now you finish simplifying and see if you can figure out what $u \cdot v^{\perp}$ has to be. (c) Given your solution to the above, how can you find a third vector perpendicular to both $u$ and
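The computation above can be tried out numerically: subtract off the component of $v$ along $u$, check perpendicularity, and use a cross product for part (c). The sample vectors here are our own choice; any non-parallel pair in $\mathbb{R}^3$ works:

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def cross(u, v):
    # cross product in R^3, perpendicular to both inputs
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

u = [1.0, 2.0, 3.0]
v = [4.0, 5.0, 6.0]

# subtract off the component of v along u
v_perp = [vi - (dot(u, v) / dot(u, u)) * ui for vi, ui in zip(v, u)]
assert abs(dot(u, v_perp)) < 1e-9      # v_perp is perpendicular to u

# (c): a third vector perpendicular to both u and v_perp
w = cross(u, v_perp)
assert abs(dot(w, u)) < 1e-9 and abs(dot(w, v_perp)) < 1e-9
```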
$V = \{(x, y, z, w) \mid x, y, z, w \in \mathbb{Z}_5\}$. This is like four-dimensional space $\mathbb{R}^4$, except that the numbers can only be $\{0, 1, 2, 3, 4\}$. This is like bits, but now the rule is $5 = 0$. Thus, for example, $\frac{1}{4} = 4$, because $4 \cdot 4 = 16 = 1 + 3 \cdot 5
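Python's `%` operator does exactly this reduction modulo 5, and three-argument `pow` (Python 3.8+) computes modular inverses, so the claim $\frac{1}{4} = 4$ in $\mathbb{Z}_5$ can be checked directly:

```python
# Arithmetic in Z_5: every result is reduced modulo 5.
assert (1 + 3 * 5) % 5 == 1          # 16 = 1 + 3*5, so 16 "is" 1
assert (4 * 4) % 5 == 1              # hence 1/4 = 4 in Z_5
assert pow(4, -1, 5) == 4            # modular inverse agrees

# Adding vectors in V = Z_5^4 works componentwise, mod 5:
x = (1, 4, 2, 3)
y = (4, 4, 4, 4)
print(tuple((a + b) % 5 for a, b in zip(x, y)))  # -> (0, 3, 1, 2)
```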
the span of two vectors.

G.10 Basis and Dimension

Proof Explanation

Let's walk through the proof of Theorem 11.0.1. We want to show that for $S = \{v_1, \ldots, v_n\}$ a basis for a vector space $V$, every vector $w \in V$ can be
matrices $(M'|V') \sim (M|V)$.)

Block LDU Explanation

This video explains how to do a block LDU decomposition. Firstly, remember some key facts about block matrices: it is important that the blocks fit together properly. For example, if we have matrices
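As a sketch of the idea in the simplest case, take $1\times 1$ blocks, i.e. $M = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ with $a \neq 0$; then $M = LDU$ with the Schur complement $d - ca^{-1}b$ appearing in $D$. (The function below is illustrative, not from the text.)

```python
from fractions import Fraction as F

def block_ldu(a, b, c, d):
    # M = [[a, b], [c, d]] with a invertible factors as M = L * D * U:
    #   L = [[1, 0], [c/a, 1]]            (lower unit triangular)
    #   D = [[a, 0], [0, d - c*b/a]]      (block diagonal; Schur complement)
    #   U = [[1, b/a], [0, 1]]            (upper unit triangular)
    L = [[F(1), F(0)], [F(c, a), F(1)]]
    D = [[F(a), F(0)], [F(0), F(d) - F(c * b, a)]]
    U = [[F(1), F(b, a)], [F(0), F(1)]]
    return L, D, U

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

L, D, U = block_ldu(2, 4, 6, 5)
print(mul(mul(L, D), U))  # -> [[2, 4], [6, 5]] (as Fractions)
```

Exactly the same formulas work when $a$, $b$, $c$, $d$ are themselves matrix blocks, with $c/a$ read as $ca^{-1}$.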
matrix should be $c_n = (n-1)a + n\,m + n\,c_{n-1}$, since $\det(A) = \sum_{i=1}^{n} (-1)^{i} a^1_i\,\mathrm{cofactor}(a^1_i)$ and $\mathrm{cofactor}(a^1_i)$ comes from an $(n-1)\times(n-1)$ matrix. This is one way to prove part (c).

G.8 Subspaces and Spanning Sets

Linear systems a
$x_3$ and $x_4$ are not pivot variables so are arbitrary; we set them to $\mu$ and $\nu$, respectively. Thus
$$x_1 = \frac{71}{25}\mu + \frac{4}{25}\nu, \quad x_2 = \frac{53}{25}\mu - \frac{3}{25}\nu, \quad x_3 = \mu, \quad x_4 = \nu.$$
Thus we have found a relationship among our four vectors:
$$\left(\frac{71}{25}\mu + \frac{4}{25}\nu\right)v_1 + \left(\frac{53}{25}\mu - \frac{3}{25}\nu\right)v_2 + \mu v_3 + \nu v_4 = 0.$$
In fact t
symmetric then.

Do Matrices Commute?

This video shows you a funny property of matrices. Some matrix properties look just like those for numbers. For example, numbers obey $a(bc) = (ab)c$ and so do matrices: $A(BC) = (AB)C$. This says the order of bracketing does
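A quick experiment shows the contrast: bracketing never matters, but the order of the factors can. The sample matrices are our own choice:

```python
def mat_mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
C = [[0, 1], [1, 0]]

# Associativity holds, just like for numbers: A(BC) = (AB)C
assert mat_mul(A, mat_mul(B, C)) == mat_mul(mat_mul(A, B), C)

# But commutativity can fail:
print(mat_mul(A, B))  # -> [[2, 1], [1, 1]]
print(mat_mul(B, A))  # -> [[1, 1], [1, 2]]
```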
point or column vector.

Alternative Proof

Here we will prove more directly that the determinant of a product of matrices is the product of their determinants. First we reference that for a matrix $M$ with rows $r_i$, if $M'$ is the matrix with rows $r'_j = r_j +$
the determinant. Scalar multiplication $R_i(\lambda)$: multiplying a row by $\lambda$ multiplies the determinant by $\lambda$. Row addition $S^i_j(\lambda)$: adding $\lambda$ times one row to another does not change the determinant. The corresponding elementary matrices are obtained by performing
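These rules are easy to verify for a $2\times 2$ determinant, where $\det\begin{pmatrix}a&b\\c&d\end{pmatrix} = ad - bc$. A minimal sketch (the sample matrix and $\lambda$ are our own; the row swap is the standard third operation, included for completeness):

```python
def det2(M):
    # determinant of a 2x2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

M = [[3, 1], [4, 2]]      # det = 2
lam = 5

# R_i(lambda): scaling row 0 by lambda scales the determinant by lambda
scaled = [[lam * x for x in M[0]], M[1]]
assert det2(scaled) == lam * det2(M)

# S^i_j(lambda): adding lambda * (row 1) to row 0 leaves det unchanged
added = [[x + lam * y for x, y in zip(M[0], M[1])], M[1]]
assert det2(added) == det2(M)

# swapping the two rows flips the sign
swapped = [M[1], M[0]]
assert det2(swapped) == -det2(M)
```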
be just like a bank: where money is owed to somebody else, we can use a minus sign. The vector $\vec{x}$ says what is in the barrel and does not depend on which mathematical description is employed. The way nutritionists label $\vec{x}$ is in terms of a pair of basis vectors
vectors in $\mathbb{Z}_2^3$ to where they started from. But above, we see that different vectors in $\mathbb{Z}_2^3$ are mapped to the same vector in $\mathbb{Z}_2^2$ by the linear transformation $L$ with matrix $A$. So $B$ cannot exist. However, a right inverse $C$ obeying $AC = I$ can. It would be
Elementary Matrices

This video will explain some of the ideas behind elementary matrices. First think back to linear systems, for example $n$ equations in $n$ unknowns:
$$a^1_1 x^1 + a^1_2 x^2 + \cdots + a^1_n x^n = v^1$$
$$a^2_1 x^1 + a^2_2 x^2 + \cdots + a^2_n x^n = v^2$$
$$\vdots$$
$$a^n_1 x^1 + a^n_2 x^2 + \cdots + a^n_n x^n = v^n$$
subspace, and all subspaces are vector spaces, we know that the linear combination $v + v' \in U$. Now repeat the same logic for $W$ and you will be nearly done.

G.9 Linear Independence

Worked Example

This video gives some more details behind the example for the
Which means when you add $1 + 1$ you get $0$. It also means when you have a vector $\vec{v} \in B^n$ and you want to multiply it by a scalar, your only choices are $1$ and $0$. This is kind of neat because it means that the possibilities are finite, so we can look at an entire
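Addition mod 2 is just XOR, so bit-vector arithmetic takes only a few lines; the finiteness is also easy to see by enumerating all of $B^4$:

```python
from itertools import product

# In B = Z_2, addition is mod 2 (XOR), and the only scalars are 0 and 1.
assert (1 + 1) % 2 == 0

def add(u, v):
    # componentwise addition of bit vectors in B^n
    return tuple((x + y) % 2 for x, y in zip(u, v))

u = (1, 0, 1, 1)
v = (1, 1, 0, 1)
print(add(u, v))       # -> (0, 1, 1, 0)
assert add(u, u) == (0, 0, 0, 0)   # every vector is its own additive inverse

# The possibilities really are finite: B^4 has only 2^4 = 16 vectors.
assert len(list(product((0, 1), repeat=4))) == 16
```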
there are many eigenvectors. Can you find some? This is an example of how things can change in infinite-dimensional spaces. For a more finite example, consider the space $P^{\mathbb{C}}_3$ of complex polynomials of degree at most 3, and recall that the derivative $D$ can
$M$ from a previous example

Eigenvalues and Eigenvectors: $2 \times 2$ Example

$$M = \begin{pmatrix} 4 & 2 \\ 1 & 3 \end{pmatrix}$$

We found the eigenvalues and eigenvectors of $M$; our solution was
$$\lambda_1 = 5,\; v_1 = \begin{pmatrix}2\\1\end{pmatrix} \quad\text{and}\quad \lambda_2 = 2,\; v_2 = \begin{pmatrix}1\\-1\end{pmatrix}.$$
So we can diagonalize this matrix using the formula
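The diagonalization $M = PDP^{-1}$, with the eigenvectors as the columns of $P$ and the eigenvalues on the diagonal of $D$, can be verified exactly in a few lines (exact arithmetic via `fractions`; the helper names are ours):

```python
from fractions import Fraction as F

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def inv2(P):
    # inverse of a 2x2 matrix via the adjugate formula
    a, b = P[0]
    c, d = P[1]
    det = a * d - b * c
    return [[F(d, det), F(-b, det)], [F(-c, det), F(a, det)]]

M = [[4, 2], [1, 3]]
P = [[2, 1], [1, -1]]          # columns are the eigenvectors v1, v2
D = [[5, 0], [0, 2]]           # eigenvalues on the diagonal

print(mul(mul(P, D), inv2(P)))  # -> [[4, 2], [1, 3]], i.e. M = P D P^{-1}
```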