18
18.1
Orthogonality and related matters
Orthogonality
Recall that two vectors x and y are said to be orthogonal if x · y = 0. ("Orthogonal" is the Greek-derived version of "perpendicular".)
Example: The two vectors …
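The definition above is easy to check numerically. A minimal sketch in NumPy, using two illustrative vectors (not the chapter's own example, which is truncated here):

```python
import numpy as np

# Two illustrative vectors in R^2.
x = np.array([1.0, 2.0])
y = np.array([-2.0, 1.0])

# x and y are orthogonal exactly when their dot product is zero.
print(np.dot(x, y))  # 0.0
```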
16
Eigenvalues and eigenvectors
Definition: If a vector x ≠ 0 satisfies the equation Ax = λx for some real or complex number λ, then λ is said to be an eigenvalue of the matrix A, and x is said to be an eigenvector…
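The defining equation Ax = λx can be verified directly for a small example. A sketch with an illustrative 2 × 2 matrix (not one from the chapter):

```python
import numpy as np

# A has eigenvalues 3 and 1, with eigenvectors (1, 1) and (1, -1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam = 3.0
x = np.array([1.0, 1.0])

# Check the defining equation: A @ x should equal lam * x.
print(np.allclose(A @ x, lam * x))  # True
```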
17
Inner products
Up until now, we have only examined the properties of vectors and matrices in Rn. But normally, when we think of Rn, we're really thinking of n-dimensional Euclidean space - that is…
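The Euclidean structure of Rn comes from the standard inner product, which gives lengths and angles. A minimal sketch with illustrative vectors:

```python
import numpy as np

x = np.array([3.0, 4.0])
y = np.array([4.0, 3.0])

# The standard inner product on R^n yields lengths and angles:
length_x = np.sqrt(np.dot(x, x))                           # ||x|| = 5
cos_theta = np.dot(x, y) / (length_x * np.linalg.norm(y))  # cos of the angle between x and y
print(length_x, cos_theta)  # 5.0 0.96
```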
14
Change of basis
When we first set up a problem in mathematics, we normally use the most familiar coordinates. In R3, this means using the Cartesian coordinates x, y, and z. In vector terms, this is…
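Finding a vector's coordinates relative to a new basis amounts to solving a linear system. A sketch with a hypothetical basis (the matrix P below is illustrative, not from the chapter):

```python
import numpy as np

# Columns of P form a (hypothetical) new basis for R^2.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])  # coordinates of v in the standard basis

# The coordinates c of v relative to the new basis satisfy P @ c = v.
c = np.linalg.solve(P, v)
print(c)  # [1. 2.]
```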
22
Approximations - the method of least squares (1)
Suppose that for some y, the equation Ax = y has no solutions. It may happen that this is an important problem and we can't just forget about it. If…
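Even when Ax = y has no solution, we can look for the x that makes Ax as close to y as possible. A minimal sketch with an illustrative inconsistent system:

```python
import numpy as np

# Three equations in one unknown m: fit y ≈ m*x through three data points.
A = np.array([[1.0], [2.0], [3.0]])
y = np.array([1.0, 2.0, 2.0])

# lstsq returns the x minimizing ||Ax - y||, even though Ax = y has no solution.
m, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(m)  # [0.78571429...], i.e. 11/14
```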
12
12.1
Basis and dimension of subspaces
The concept of basis
Example: Consider the set

    S = { (1, 2), (0, 1), (2, 1) } ⊂ R2.

Then span(S) = R2. (Exercise.) In fact, any two of the elements of S span R2. (Exercise.)
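The claim that any two of the vectors span R2 can be checked numerically: two vectors span R2 exactly when the 2 × 2 matrix with them as columns has nonzero determinant. A sketch, using illustrative vectors with the same property:

```python
import numpy as np

# Three vectors in R^2, any two of which are linearly independent.
v1 = np.array([1.0, 2.0])
v2 = np.array([0.0, 1.0])
v3 = np.array([2.0, 1.0])

# Each pair spans R^2 iff the matrix with that pair as columns is invertible.
for a, b in [(v1, v2), (v1, v3), (v2, v3)]:
    print(np.linalg.det(np.column_stack([a, b])) != 0)  # True each time
```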
23
23.1
Least squares approximation - II
The transpose of A
In the next section we'll develop an equation, known as the normal equation, which is much easier to solve than Ax = y, and which also gives…
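The normal equation mentioned above is AᵗAx = Aᵗy, and its solution is the least-squares solution. A sketch verifying this against NumPy's built-in solver, with an illustrative A and y:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 2.0, 2.0])

# The normal equation: (A^t A) x = A^t y.  Its solution minimizes ||Ax - y||.
x = np.linalg.solve(A.T @ A, A.T @ y)
print(np.allclose(x, np.linalg.lstsq(A, y, rcond=None)[0]))  # True
```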
20
20.1
Projections onto subspaces and the Gram-Schmidt
algorithm
Construction of an orthonormal basis
It is not obvious that any subspace V of Rn has an orthonormal basis, but it's true. In this chapter…
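The standard way to produce such a basis is the Gram-Schmidt algorithm: subtract from each vector its projections onto the basis vectors found so far, then normalize. A minimal sketch (a plain implementation, not the chapter's notation):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: return an orthonormal basis for span(vectors)."""
    basis = []
    for v in vectors:
        # Subtract the projections of v onto the vectors already in the basis.
        w = v - sum(np.dot(v, e) * e for e in basis)
        if np.linalg.norm(w) > 1e-12:      # skip linearly dependent vectors
            basis.append(w / np.linalg.norm(w))
    return basis

e1, e2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
print(np.dot(e1, e2))  # ~0: the output vectors are orthonormal
```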
13
13.1
The rank-nullity (dimension) theorem
Rank and nullity of a matrix
Definition: The rank of the matrix A is the dimension of the row space of A, and is denoted R(A).
Examples: The rank of the n × n identity matrix In×n is n.
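The rank-nullity theorem says that for an m × n matrix, rank plus nullity equals n, the number of columns. A quick numerical sketch with an illustrative matrix:

```python
import numpy as np

# A 2x3 matrix whose second row is twice the first, so its rank is 1.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank  # rank + nullity = number of columns
print(rank, nullity)  # 1 2
```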
21
21.1
Symmetric and skew-symmetric matrices
Decomposition of a square matrix into symmetric and skew-symmetric matrices
Let C be an n × n square matrix. We can write
C = (1/2)(C + C^t) + (1/2)(C − C^t) = …
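The decomposition above is easy to verify numerically: the first term is symmetric, the second is skew-symmetric, and they sum to C. A sketch with an illustrative matrix:

```python
import numpy as np

C = np.array([[1.0, 2.0],
              [3.0, 4.0]])

S = 0.5 * (C + C.T)  # symmetric part:       S^t = S
K = 0.5 * (C - C.T)  # skew-symmetric part:  K^t = -K

print(np.allclose(S, S.T), np.allclose(K, -K.T), np.allclose(S + K, C))
# True True True
```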
15
Matrices and Linear transformations
We have been thinking of matrices in connection with solutions to linear systems of equations like Ax = y. It is time to broaden our horizons a bit and start thinking…
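The broader view is that a matrix A defines a function T(x) = Ax. A minimal sketch, using an illustrative rotation matrix:

```python
import numpy as np

# The matrix A defines a function T(x) = A @ x from R^2 to R^2.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # rotation by 90 degrees counterclockwise

def T(x):
    return A @ x

print(T(np.array([1.0, 0.0])))  # [0. 1.]: e1 is rotated to e2
```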
19
19.1
Orthogonal projections and orthogonal matrices
Orthogonal projections
We often want to decompose a given vector, for example, a force, into the sum of two
orthogonal vectors.
Example: Suppose…
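Such a decomposition is computed by projecting the vector onto a direction and keeping the remainder. A sketch with illustrative vectors (not the chapter's truncated example):

```python
import numpy as np

# Decompose b into a component along a plus a component orthogonal to a.
a = np.array([1.0, 1.0])
b = np.array([3.0, 1.0])

p = (np.dot(b, a) / np.dot(a, a)) * a  # orthogonal projection of b onto a
q = b - p                              # the remainder, orthogonal to a

print(p, q, np.dot(p, q))  # [2. 2.] [ 1. -1.] 0.0
```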