Math 1b Practical Basic matrices and solutions; pivot operations
January 3, 2011
We use matrices to model systems of linear equations. For example, the system
2x1 + 3x2 - 5x3 - 7x4 + 2x5 = 7
x1 + x2 - 5x3 + x4 - 4x5 = 3
x1 - 4x3 + 3x4 + x5 = 4
(1)
has corresponding
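The fragment breaks off before displaying the matrix, but the standard correspondence can be sketched in code: each equation of (1) becomes a row of an augmented matrix [A | b], and a pivot operation on entry (r, c) scales row r so the pivot is 1, then clears the rest of column c. (The helper name `pivot` and the exact signs read off from (1) are my assumptions.)

```python
def pivot(M, r, c):
    """Pivot on entry (r, c): scale row r so M[r][c] = 1, then
    subtract multiples of row r to zero out column c elsewhere."""
    p = M[r][c]
    M[r] = [x / p for x in M[r]]
    for i in range(len(M)):
        if i != r:
            f = M[i][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
    return M

# Augmented matrix of system (1): each row is [coefficients | right side].
M = [[2, 3, -5, -7, 2, 7],
     [1, 1, -5, 1, -4, 3],
     [1, 0, -4, 3, 1, 4]]
pivot(M, 0, 0)   # make the x1 column a pivot column
```

Repeating with pivots chosen in the remaining rows row-reduces the whole matrix.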
Math 1b Practical Isometries
March 04, 2011
Recall that an orthogonal matrix is a square matrix U so that U^T U = I
(which holds if and only if U U^T = I). In other words, U is orthogonal if and only if
its columns form an orthonormal basis of R^n,
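A quick numerical check of the two characterizations, with a 2 x 2 rotation matrix as the example (the matrix is mine, not from the note):

```python
import math

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

t = math.pi / 6
U = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]   # rotation by 30 degrees
P = matmul(transpose(U), U)         # should be (numerically) the identity
```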
Math 1b Practical March 02, 2011
Real symmetric matrices
A matrix A is symmetric when A^T = A. In the following, A* will denote the conjugate-transpose of A: the transpose of the matrix whose entries are the complex conjugates of the entries of A.
A (complex) matrix is called
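A small sketch of the conjugate-transpose (the example matrix is my own): entry (i, j) of A* is the complex conjugate of entry (j, i) of A.

```python
def conj_transpose(A):
    """Return A*, the conjugate-transpose of a square complex matrix."""
    n = len(A)
    return [[A[j][i].conjugate() for j in range(n)] for i in range(n)]

A = [[1 + 0j, 2 - 1j],
     [2 + 1j, 3 + 0j]]
# For this example A* equals A (it is Hermitian).
```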
Math 1b Practical February 25, 2011 revised February 28, 2011
Probability matrices
A probability vector is a nonnegative vector whose coordinates sum to 1. A square
matrix P is called a probability matrix (or a left-stochastic matrix or a column-stochastic matrix) when each of its columns is a probability vector.
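A hedged sketch (the example matrix is mine): the columns of a probability matrix are probability vectors, and applying P to a probability vector yields another probability vector.

```python
def is_probability_matrix(P):
    """Check: nonnegative entries, and each column sums to 1."""
    n = len(P)
    return all(all(P[i][j] >= 0 for i in range(n)) and
               abs(sum(P[i][j] for i in range(n)) - 1) < 1e-12
               for j in range(n))

P = [[0.9, 0.2],
     [0.1, 0.8]]
x = [0.5, 0.5]    # a probability vector
y = [sum(P[i][j] * x[j] for j in range(2)) for i in range(2)]  # y = Px
```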
Math 1b Practical February 25, 2011
The Leontief input-output model in economics
We are given a square matrix C of nonnegative real numbers called the consumption
matrix. The rows and columns are indexed by industries 1, 2, . . . , n. The entry cij represents
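The fragment stops before the model equation, but in the standard Leontief setup (an assumption here, not stated above) production x meets external demand d when x = Cx + d, that is, (I - C)x = d. A small sketch with invented numbers:

```python
def solve_2x2(A, b):
    """Solve A x = b for a 2x2 matrix by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

C = [[0.2, 0.3],
     [0.1, 0.4]]                              # consumption matrix (example)
d = [50, 30]                                  # external demand (example)
I_minus_C = [[1 - 0.2, -0.3],
             [-0.1, 1 - 0.4]]
x = solve_2x2(I_minus_C, d)                   # production levels
```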
Math 1b Practical February 22, 2011
Digraphs, nonnegative matrices, Google page ranking
Nonnegative matrices
Theorem 1. Let A be a square matrix of nonnegative real numbers. Then A has an
eigenvector e all of whose entries are nonnegative.
Proof: Omitted.
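The theorem is what makes power iteration (the idea behind page ranking) work: repeatedly applying a nonnegative A to a positive vector and normalizing converges, under mild hypotheses, to a nonnegative eigenvector. The example matrix and step count are mine:

```python
def power_iteration(A, steps=50):
    """Repeatedly apply A and renormalize so coordinates sum to 1."""
    n = len(A)
    x = [1.0] * n
    for _ in range(steps):
        x = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        s = sum(x)
        x = [v / s for v in x]
    return x

A = [[0.5, 0.3],
     [0.5, 0.7]]              # nonnegative (in fact column-stochastic)
e = power_iteration(A)        # approximate nonnegative eigenvector
```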
Math 1b Practical Eigenvalues and eigenvectors, IV: The characteristic polynomial
February 16, 2011
We have talked about the geometric multiplicity of eigenvalues. There is another
kind of multiplicity. We know that λ is an eigenvalue of A if and only if p
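For a 2 x 2 matrix, det(A - tI) works out to t^2 - (trace A) t + det A. A small sketch using that standard formula (the formula's use here and the example matrix are my additions, since the note breaks off):

```python
def char_poly_2x2(A):
    """Coefficients (1, b, c) of t^2 + b t + c = det(A - tI)."""
    trace = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return (1, -trace, det)

A = [[2, 1],
     [1, 2]]
coeffs = char_poly_2x2(A)   # t^2 - 4t + 3 = (t - 1)(t - 3)
```

The roots 1 and 3 of this polynomial are exactly the eigenvalues of A.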
Math 1b Practical Eigenvalues and eigenvectors, III: Diagonalization
February 14, 2011
If we can express a vector x as a linear combination of eigenvectors of an n × n matrix
A, then we may compute (find a formula for) A^n x and so understand what happens when
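A concrete sketch of the idea (all numbers below are my example): if x = c1 e1 + c2 e2 with A ei = λi ei, then A^n x = c1 λ1^n e1 + c2 λ2^n e2, with no repeated matrix multiplication.

```python
A = [[2, 1],
     [1, 2]]                    # symmetric, eigenvalues 1 and 3
e1, e2 = [1, -1], [1, 1]        # eigenvectors: A e1 = e1, A e2 = 3 e2
c1, c2 = -1, 2                  # x = (1, 3) = c1*e1 + c2*e2
n = 5
# A^n x = c1 * 1^n * e1 + c2 * 3^n * e2
Anx = [c1 * 1 ** n * e1[i] + c2 * 3 ** n * e2[i] for i in range(2)]
```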
Math 1b Practical Eigenvalues and eigenvectors, II
February 11, 2011
Example 6. Let P be the matrix of the orthogonal projection onto a subspace U of R^n.
If u ∈ U, then Pu = u, and if w ∈ U⊥, then Pw = 0. That is, elements of U are eigenvectors corresponding
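A sketch with U a line in R^2 (the example is mine): for P the orthogonal projection onto the span of (1, 1), vectors in U are eigenvectors with eigenvalue 1, and vectors orthogonal to U are eigenvectors with eigenvalue 0.

```python
P = [[0.5, 0.5],
     [0.5, 0.5]]   # projection onto the line spanned by (1, 1)

def apply(P, v):
    return [sum(P[i][j] * v[j] for j in range(len(v)))
            for i in range(len(P))]

u = [1, 1]    # in U:           P u = u  (eigenvalue 1)
w = [1, -1]   # orthogonal to U: P w = 0  (eigenvalue 0)
```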
Math 1b Practical Eigenvalues and eigenvectors, I
February 9, 2011
Let A be a square matrix. A (right) eigenvector with corresponding eigenvalue λ is a
nonzero (column) vector e so that
Ae = λe.
(1)
We say λ is an eigenvalue of A when (1) holds for at least one nonzero vector e.
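A quick check of definition (1) on a concrete matrix (the matrix, vector, and eigenvalue below are my example):

```python
A = [[2, 1],
     [1, 2]]
e = [1, 1]
lam = 3
Ae = [sum(A[i][j] * e[j] for j in range(2)) for i in range(2)]
# Ae should equal lam * e, confirming e is an eigenvector for lam = 3.
```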
Math 1b Practical Determinants
January 30, 2011
Square matrices have determinants, which are scalars. Determinants can be introduced in several ways; we choose to give a recursive definition. The determinant of a 1 × 1
matrix is the entry of the matrix. Once
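The recursive definition translates directly into code (a sketch, expanding along the first row, with the base case the 1 x 1 matrix):

```python
def det(M):
    """Determinant by recursive cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]  # delete row 0, col j
        total += (-1) ** j * M[0][j] * det(minor)
    return total
```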
Math 1b Practical The Gram-Schmidt process; orthogonal projection
January 24, 2011 extended/revised January 28, 2011
To warm up, we start with
Theorem 1. Nonzero pairwise orthogonal vectors u1 , u2 , . . . , uk are linearly independent.
Proof: Suppose u1 , u2
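The process the title refers to can be sketched as follows (the example vectors are mine): subtract from each vector its projections onto the previously produced vectors, leaving pairwise orthogonal vectors.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gram_schmidt(vectors):
    """Return pairwise orthogonal vectors with the same span."""
    ortho = []
    for v in vectors:
        for u in ortho:
            c = dot(v, u) / dot(u, u)          # projection coefficient
            v = [vi - c * ui for vi, ui in zip(v, u)]
        ortho.append(v)
    return ortho

u1, u2 = gram_schmidt([[1, 1, 0], [1, 0, 1]])
# u1 and u2 are now orthogonal: dot(u1, u2) = 0
```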
Math 1b Practical Bases for row spaces, null spaces; orthogonal spaces
January 18, 2011 slightly revised January 21, 2011
Given a matrix M , the row space of M is the span of its rows (the set of all linear
combinations of its rows). The null space of M is the set of all vectors x with Mx = 0.
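A quick membership check for the null space (the example matrix and vector are mine): v lies in the null space exactly when Mv = 0, that is, when v is orthogonal to every row of M.

```python
M = [[1, 2, 1],
     [2, 4, 2]]
v = [1, 0, -1]   # claim: v is in the null space of M
Mv = [sum(row[j] * v[j] for j in range(3)) for row in M]
# Mv should be the zero vector
```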
Math 1b Practical A theorem on linear dependence
January 12, 2011
Theorem. If vectors v1 , v2 , . . . , vk are linear combinations of vectors u1 , u2 , . . . , ur , and
if k > r, then v1 , v2 , . . . , vk are linearly dependent.
We illustrate with an example.
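The note's own example is cut off; as a hedged stand-in with k = 3 > r = 2: the vectors v1, v2, v3 below are combinations of u1, u2, and indeed v3 = v1 + v2 is a linear dependence among them.

```python
u1, u2 = [1, 0, 0], [0, 1, 0]
v1 = [a + b for a, b in zip(u1, u2)]      # v1 = u1 + u2
v2 = [a - b for a, b in zip(u1, u2)]      # v2 = u1 - u2
v3 = [2 * a for a in u1]                  # v3 = 2 u1
# Three combinations of two vectors must be dependent; here v3 = v1 + v2.
```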
Math 1b Practical Nonnegative solutions
January 10, 2011
We may ask whether a (nonhomogeneous) system of linear equations with real coefficients has a solution in nonnegative real numbers. This question is part of the subject of
linear programming and has ma
Math 1b Row-equivalence; matrix inverses
January 7, 2011
Recall that matrices A and B are row-equivalent when one can be obtained from the
other by a sequence of elementary row operations.
An elementary row operation on a matrix M gives us a matrix whose
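One useful fact this discussion is heading toward (stated here as an assumption, since the fragment ends): an elementary row operation on M is the same as left-multiplying M by the elementary matrix obtained by applying that operation to the identity. The example is mine:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

M = [[1, 2],
     [3, 4]]
E = [[1, 0],
     [-3, 1]]   # identity with (-3) * row 1 added to row 2
EM = matmul(E, M)   # the same row operation applied to M
```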
Math 1b Matrix Multiplication
If A has rows ai and B has columns bj , then AB has, by definition, ai bj as the entry
in row i and column j . The matrix AB is the matrix of dot products of rows of A with
columns of B .
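The definition reads off directly as code (the example matrices are mine): entry (i, j) of AB is the dot product of row i of A with column j of B.

```python
def matmul(A, B):
    cols = list(zip(*B))   # the columns of B
    return [[sum(a * b for a, b in zip(row, col)) for col in cols]
            for row in A]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
AB = matmul(A, B)
```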
Here are some simple properties and facts.
Math 1b Practical March 7, 2011
Singular value decomposition; pseudoinverses
Recall that every real symmetric matrix can be written as U D U^T, where D is a real
diagonal matrix and U is orthogonal. Something similar can be done even if the matrix is not
symmetric
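A numerical check of the recalled factorization on a 2 x 2 example (the matrices are mine): with orthogonal U and diagonal D below, U D U^T comes out symmetric.

```python
import math

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(M):
    return [list(row) for row in zip(*M)]

s = 1 / math.sqrt(2)
U = [[s, s],
     [s, -s]]     # orthogonal: its columns are orthonormal
D = [[3, 0],
     [0, 1]]      # diagonal
A = matmul(matmul(U, D), transpose(U))   # symmetric, eigenvalues 3 and 1
```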