18 Orthogonality and related matters
18.1 Orthogonality
Recall that two vectors x and y are said to be orthogonal if x · y = 0. (This is the Greek
version of perpendicular.)
Example: The two vectors

x = (1, 1, 0)^t and y = (2, -2, 4)^t

are orthogonal, since their dot product is x · y = (1)(2) + (1)(-2) + (0)(4) = 0.
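The orthogonality test is easy to carry out by machine. A minimal sketch, reading the example's two vectors as (1, 1, 0)^t and (2, -2, 4)^t (an assumption about the garbled original):

```python
# Dot product of two vectors given as plain lists.
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

x = [1, 1, 0]
y = [2, -2, 4]
print(dot(x, y))  # 0, so x and y are orthogonal
```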
16 Eigenvalues and eigenvectors
Definition: If a vector x ≠ 0 satisfies the equation Ax = λx, for some real or complex
number λ, then λ is said to be an eigenvalue of the matrix A, and x is said to be an
eigenvector of A corresponding to the eigenvalue λ.
Example:
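The defining equation Ax = λx can be checked directly for a small matrix; here A, x, and λ are illustrative choices, not the text's example:

```python
# Verify Ax = lambda * x for a 2x2 matrix with a known eigenpair.
A = [[2, 1],
     [1, 2]]
x = [1, 1]    # an eigenvector of A
lam = 3       # its eigenvalue

Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
print(Ax == [lam * xi for xi in x])  # True
```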
17 Inner products
Up until now, we have only examined the properties of vectors and matrices in R^n. But
normally, when we think of R^n, we're really thinking of n-dimensional Euclidean space - that
is, R^n together with the dot product. Once we have the dot product
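The dot product is what supplies the Euclidean notions of length and distance. A small sketch (the vector is an illustrative choice):

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

v = [3, 4]
print(math.sqrt(dot(v, v)))  # Euclidean length of v: 5.0
```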
14 Change of basis
When we first set up a problem in mathematics, we normally use the most familiar coordinates. In R^3, this means using the Cartesian coordinates x, y, and z. In vector terms, this
is equivalent to using what we've called the standard basis
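Finding the coordinates of a vector in a non-standard basis amounts to solving a linear system. A sketch in R^2, with an illustrative basis {b1, b2} and vector v:

```python
# Solve c1*b1 + c2*b2 = v for (c1, c2) with the 2x2 Cramer's rule.
b1, b2 = [1, 1], [1, -1]
v = [3, 1]
det = b1[0] * b2[1] - b2[0] * b1[1]       # 1*(-1) - 1*1 = -2
c1 = (v[0] * b2[1] - b2[0] * v[1]) / det  # (3*(-1) - 1*1)/(-2) = 2
c2 = (b1[0] * v[1] - v[0] * b1[1]) / det  # (1*1 - 3*1)/(-2) = 1
print(c1, c2)  # 2.0 1.0, i.e. v = 2*b1 + 1*b2
```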
22 Approximations - the method of least squares (1)
Suppose that for some y, the equation Ax = y has no solutions. It may happen that this
is an important problem and we can't just forget about it. If we can't solve the system
exactly, we can try to find an
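The least-squares idea is to pick the x that makes the total squared error as small as possible. A tiny sketch with an illustrative one-unknown system (x = 1, x = 2, x = 4, which has no exact solution):

```python
# Least squares for the 3x1 system: minimize the sum of squared errors.
A = [1, 1, 1]   # the single column of the 3x1 matrix A
y = [1, 2, 4]
x_hat = sum(a * b for a, b in zip(A, y)) / sum(a * a for a in A)
print(x_hat)  # 7/3, the mean of the data - the least-squares choice
```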
12 Basis and dimension of subspaces
12.1 The concept of basis
Example: Consider the set

S = { (1, 2)^t, (0, 1)^t, (2, 1)^t }.

Then span(S) = R^2. (Exercise). In fact, any two of the elements of S span R^2. (Exercise).
So we can throw out any one of them, for example, the
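The claim that every pair spans R^2 can be verified by checking that each 2x2 determinant is nonzero. A sketch, reading the three elements of S as (1, 2)^t, (0, 1)^t, (2, 1)^t (an assumption about the garbled original):

```python
# Two vectors span R^2 exactly when their 2x2 determinant is nonzero.
S = [[1, 2], [0, 1], [2, 1]]

def det(u, v):
    return u[0] * v[1] - u[1] * v[0]

pairs = [(0, 1), (0, 2), (1, 2)]
print([det(S[i], S[j]) for i, j in pairs])  # [1, -3, -2], all nonzero
```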
23 Least squares approximation - II
23.1 The transpose of A
In the next section we'll develop an equation, known as the normal equation, which is much
easier to solve than Ax = Π(y), and which also gives the correct x. We need a bit of
background first.
The transpose
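The normal equation is A^t A x = A^t y. A sketch of assembling it for a small overdetermined system (the matrix and y are illustrative choices, not the text's example):

```python
# Build A^t, then A^t A and A^t y, for a 3x2 system Ax = y.
A = [[1, 0],
     [0, 1],
     [1, 1]]
y = [1, 2, 4]
At = [[A[r][c] for r in range(3)] for c in range(2)]  # transpose of A
AtA = [[sum(At[i][k] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Aty = [sum(At[i][k] * y[k] for k in range(3)) for i in range(2)]
print(AtA, Aty)  # [[2, 1], [1, 2]] [5, 6]: a solvable 2x2 system
```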
20 Projections onto subspaces and the Gram-Schmidt algorithm
20.1 Construction of an orthonormal basis
It is not obvious that any subspace V of R^n has an orthonormal basis, but it's true. In this
chapter, we give an algorithm for constructing such a basis,
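The Gram-Schmidt step is: subtract from each new vector its component along the directions already built, then normalize. A sketch on two illustrative vectors in R^3:

```python
import math

def dot(u, v): return sum(a * b for a, b in zip(u, v))
def scale(c, v): return [c * a for a in v]
def sub(u, v): return [a - b for a, b in zip(u, v)]
def unit(v): return scale(1 / math.sqrt(dot(v, v)), v)

v1, v2 = [1, 1, 0], [1, 0, 1]
e1 = unit(v1)
e2 = unit(sub(v2, scale(dot(v2, e1), e1)))  # remove the e1 component
print(dot(e1, e2))  # 0.0 up to roundoff: {e1, e2} is orthonormal
```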
13 The rank-nullity (dimension) theorem
13.1 Rank and nullity of a matrix
Definition: The rank of the matrix A is the dimension of the row space of A, and is
denoted R(A).
Examples: The rank of the n × n identity matrix is n; the rank of the m × n zero matrix is 0. The rank of the 3 × 5 matrix
con
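The rank can be computed by row reduction: it is the number of nonzero rows left after elimination. A sketch (an illustrative implementation, not the text's procedure):

```python
# Rank via Gaussian elimination with a small pivot tolerance.
def rank(M):
    M = [row[:] for row in M]  # work on a copy
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if abs(M[i][c]) > 1e-12), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and abs(M[i][c]) > 1e-12:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

print(rank([[1, 0], [0, 1]]), rank([[1, 2], [2, 4]]))  # 2 1
```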
21 Symmetric and skew-symmetric matrices
21.1 Decomposition of a square matrix into symmetric and skew-symmetric matrices
Let C be an n × n square matrix. We can write

C = (1/2)(C + C^t) + (1/2)(C - C^t) = A + B,

where A^t = A is symmetric and B^t = -B is skew-symmetric.
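The decomposition is easy to verify on a concrete matrix (the matrix C below is an illustrative choice):

```python
# Split C into its symmetric part A and skew-symmetric part B.
C = [[1, 2], [5, 3]]
n = len(C)
Ct = [[C[j][i] for j in range(n)] for i in range(n)]
A = [[(C[i][j] + Ct[i][j]) / 2 for j in range(n)] for i in range(n)]
B = [[(C[i][j] - Ct[i][j]) / 2 for j in range(n)] for i in range(n)]
print(A)  # [[1.0, 3.5], [3.5, 3.0]]  -- symmetric
print(B)  # [[0.0, -1.5], [1.5, 0.0]] -- skew-symmetric, and A + B = C
```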
15 Matrices and linear transformations
We have been thinking of matrices in connection with solutions to linear systems of equations
like Ax = y. It is time to broaden our horizons a bit and start thinking of matrices as
functions.
In general, a function
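Viewing a matrix as a function means treating A as the map x ↦ Ax. A minimal sketch (the rotation matrix is an illustrative choice):

```python
# A matrix A acts as the function x -> Ax from R^2 to R^2.
def apply(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[0, -1], [1, 0]]    # rotation by 90 degrees
print(apply(A, [1, 0]))  # [0, 1]
```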
19 Orthogonal projections and orthogonal matrices
19.1 Orthogonal projections
We often want to decompose a given vector, for example, a force, into the sum of two
orthogonal vectors.
Example: Suppose a mass m is at the end of a rigid, massless rod (an ide
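The decomposition into orthogonal components uses the projection formula proj_u(v) = (v · u / u · u) u. A sketch with illustrative vectors u and v:

```python
# Split v into a part p parallel to u and a part q orthogonal to u.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

u, v = [1, 1], [3, 1]
c = dot(v, u) / dot(u, u)          # 4/2 = 2
p = [c * x for x in u]             # the component along u
q = [a - b for a, b in zip(v, p)]  # the leftover, orthogonal to u
print(dot(q, u))  # 0.0, and p + q = v
```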