Review of Part II: Methods and Formulas

Basic Matrix Theory:
- Identity matrix: AI = A, IA = A, and Iv = v
- Inverse matrix: AA^-1 = I and A^-1 A = I
- Norm of a matrix: |A| ≡ max_{|v|=1} |Av|
- A matrix may be singular or nonsingular. See Lecture 10.

Solving Process:
- Gaussian Elimination (produces the LU decomposition)
- Row Pivoting
- Back Substitution

Condition number:
    cond(A) ≡ max[ (|δx|/|x|) / (|δA|/|A| + |δb|/|b|) ] = max( relative error of output / relative error of inputs ).
A big condition number is bad; in engineering it usually results from poor design.

LU factorization: PA = LU. Solving steps:
- Multiply by P: b′ = Pb
- Forward-solve: Ly = b′
- Back-solve: Ux = y

Eigenvalues and eigenvectors:
A nonzero vector v is an eigenvector (ev) and a number λ is its eigenvalue (ew) if Av = λv.
- Characteristic equation: det(A − λI) = 0
- Equation of the eigenvector: (A − λI)v = 0

Complex ew's: occur in conjugate pairs, λ_{1,2} = α ± iβ, and the ev's must also come in conjugate pairs, w = u ± iv.

Vibrational modes: eigenvalues are frequencies squared; eigenvectors are modes.

Power Method:
- Repeatedly multiply x by A and divide by the element with the largest absolute value.
- The element of largest absolute value converges to the ew of largest absolute value.
- The vector converges to the corresponding ev.
- Convergence is assured for a real symmetric matrix, but not for an arbitrary matrix, which may not have real eigenvalues at all.
- (A Matlab sketch of this iteration follows the "Special matrices" listing below.)

Inverse Power Method:
- Apply the power method to A^-1.
- Use solving rather than the inverse.
- If λ is an ew of A, then 1/λ is an ew of A^-1.
- The ev's of A and A^-1 are the same.

Symmetric and Positive Definite:
- Symmetric: A = A′.
- If A is symmetric, its ew's are real.
- Positive definite: Ax · x > 0 for all nonzero x.
- If A is positive definite, then its ew's are positive.

QR method:
- Transform A into H, the Hessenberg form of A.
- Decompose H into QR.
- Multiply Q and R together in reverse order to form a new H.
- Repeat.
- The diagonal of H converges to the ew's of A.

Matlab

Matrix arithmetic:
> A = [ 1 3 -2 5 ; -1 -1 5 4 ; 0 1 -9 0]      manually enter a matrix.
> u = [ 1 2 3 4]'                             a column vector (note the transpose).
> A*u                                         multiply A times u.
> B = [3 2 1; 7 6 5; 4 3 2]
> B*A                                         multiply B times A.
> 2*A                                         multiply a matrix by a scalar.
> A + A                                       add matrices.
> A + 3                                       add 3 to every entry of a matrix.
> B.*B                                        component-wise multiplication.
> B.^3                                        component-wise exponentiation.

Special matrices:
> I = eye(3)                                  identity matrix.
> D = ones(5,5)                               matrix of all ones.
> O = zeros(10,10)                            matrix of all zeros.
> C = rand(5,5)                               random matrix, uniform distribution on [0, 1].
> C = randn(5,5)                              random matrix, normal distribution.
> hilb(6)                                     the 6 × 6 Hilbert matrix.
> pascal(5)                                   the 5 × 5 Pascal matrix.
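The commands above (matrix-vector products and randn) are all the Power Method described earlier needs. The following is a minimal sketch, not code from the lectures; the test matrix, starting vector, and iteration count are made up for illustration.

% Power method sketch (illustrative): repeatedly multiply by A and
% rescale by the entry of largest absolute value.
A = randn(5,5);  A = A + A';   % made-up symmetric matrix, so convergence is assured
x = ones(5,1);                 % any nonzero starting vector
for k = 1:100                  % iteration count chosen arbitrarily
    y = A*x;
    [m, i] = max(abs(y));      % locate the entry of largest absolute value
    lambda = y(i);             % converges to the ew of largest absolute value
    x = y / lambda;            % rescaled vector converges to the corresponding ev
end
abs(lambda)                    % compare with max(abs(eig(A)))

Because the rescaling uses the signed entry, lambda keeps the sign of the dominant eigenvalue; its absolute value is what matches max(abs(eig(A))).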
General matrix commands:
> size(C)                                     gives the dimensions (m × n) of C.
> norm(C)                                     gives the norm of the matrix.
> det(C)                                      the determinant of the matrix.
> max(C)                                      the maximum of each column.
> min(C)                                      the minimum of each column.
> sum(C)                                      sums each column.
> mean(C)                                     the average of each column.
> diag(C)                                     just the diagonal elements.
> inv(C)                                      inverse of the matrix.
> C'                                          transpose of the matrix.

Matrix decompositions:
> [L U P] = lu(C)                             LU decomposition with pivoting, P C = L U.
> [Q R] = qr(C)                               QR decomposition, C = Q R.
> [U S V] = svd(C)                            singular value decomposition (important, but we did not use it).
> H = hess(C)                                 Hessenberg form of C (tridiagonal if C is symmetric); H has the same eigenvalues as C.
> [U T] = schur(C)                            Schur decomposition, C = U T U′.
> R = chol(C*C')                              Cholesky decomposition of a symmetric, positive definite matrix, A = R′ R (here A = C*C′).
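As a worked illustration (not part of the original listing), the lu command above can be combined with the LU solving steps summarized earlier (b′ = Pb, forward-solve Ly = b′, back-solve Ux = y). The matrix and right-hand side are made up.

% Solve A x = b by the LU solving steps (illustrative data).
A = [1 3 -2; -1 -1 5; 0 1 -9];
b = [1; 2; 3];
[L, U, P] = lu(A);        % P*A = L*U
bp = P*b;                 % multiply by P: b' = P*b
y  = L \ bp;              % forward-solve L*y = b'
x  = U \ y;               % back-solve U*x = y
norm(A*x - b)             % residual; should be near machine precision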
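Similarly, the QR method summarized above can be sketched with hess and qr. This is the bare, unshifted iteration (the practical algorithm adds shifts and deflation), with a made-up matrix and an arbitrary iteration count.

% Unshifted QR iteration sketch: the diagonal of H approaches the ew's of A.
A = [2 1 0; 1 3 1; 0 1 4];    % made-up symmetric matrix with real eigenvalues
H = hess(A);                  % reduce to Hessenberg (here tridiagonal) form
for k = 1:200                 % iteration count chosen arbitrarily
    [Q, R] = qr(H);           % decompose H into Q*R
    H = R*Q;                  % multiply together in reverse order
end
diag(H)                       % compare with eig(A)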
