Week 3.5 Syllabus: The Metric Structure of $\mathbb{R}^n$, Applications, Matrix Operations

1 The Metric Structure of $\mathbb{R}^n$

$\mathbb{R}^n$ (written as row vectors for ease of typography; similarly for column vectors) has a dot product on vectors defined as $\vec{u} \cdot \vec{v} = \sum_{i=1}^{n} u_i v_i \in \mathbb{R}$, i.e., multiply the coefficients term by term, then add the results to form a single scalar. It satisfies the following basic rules for all vectors $\vec{u}, \vec{u}_i, \vec{v} \in V$ and scalars $c \in \mathbb{R}$:

(1) (symmetry) $\vec{u} \cdot \vec{v} = \vec{v} \cdot \vec{u}$
(2) (additivity) $(\vec{u}_1 + \vec{u}_2) \cdot \vec{v} = \vec{u}_1 \cdot \vec{v} + \vec{u}_2 \cdot \vec{v}$
(3) (homogeneity) $(c\vec{u}) \cdot \vec{v} = c(\vec{u} \cdot \vec{v})$
(4) (positive-definiteness) $\vec{u} \cdot \vec{u} > 0$ for all $\vec{u} \neq \vec{0}$.

These four rules will become the definition of an abstract inner product on any vector space.

1.1 The Norm

From the dot product we can derive the concept of the norm of a vector,
$$\|\vec{v}\| := \sqrt{\vec{v} \cdot \vec{v}} = \sqrt{v_1^2 + \cdots + v_n^2} \geq 0,$$
satisfying $\|\vec{u}\| = 0$ iff $\vec{u} = \vec{0}$, and $\|c\vec{u}\| = |c|\,\|\vec{u}\|$. Any nonzero vector $\vec{v}$ can be normalized to give a (normal) unit vector $\vec{u}$ with $\|\vec{u}\| = 1$, pointing with length 1 in the same direction, via $\vec{u} = \frac{1}{\|\vec{v}\|}\vec{v}$ (sloppily but conveniently written $\frac{\vec{v}}{\|\vec{v}\|}$). Remember that because the norm involves a square root, it is usually more convenient for calculations to deal with $\|\vec{v}\|^2 = \vec{v} \cdot \vec{v}$, since dot products are bilinear but square roots are not.

1.2 The Cauchy-Schwarz Inequality

Expanding out the positive-definiteness $\vec{x} \cdot \vec{x} \geq 0$ for $\vec{x} := \|\vec{u}\|\vec{v} - \|\vec{v}\|\vec{u}$ yields
$$\vec{x} \cdot \vec{x} = \|\vec{u}\|^2\|\vec{v}\|^2 - 2\|\vec{u}\|\|\vec{v}\|\,(\vec{u} \cdot \vec{v}) + \|\vec{v}\|^2\|\vec{u}\|^2 = 2\|\vec{u}\|\|\vec{v}\|\left(\|\vec{u}\|\|\vec{v}\| - \vec{u} \cdot \vec{v}\right) \geq 0,$$
establishing (applying this to both $\vec{u}$ and $-\vec{u}$) the Cauchy-Schwarz Inequality
$$|\vec{u} \cdot \vec{v}| \leq \|\vec{u}\|\,\|\vec{v}\|.$$
This guarantees that the ratio $\frac{\vec{u} \cdot \vec{v}}{\|\vec{u}\|\|\vec{v}\|}$ falls in the interval $[-1, 1]$, hence is uniquely $\cos(\theta)$ for some $\theta$ in the interval $[0, \pi]$; the angle between two vectors $\vec{u}, \vec{v}$ is
$$\theta = \arccos\frac{\vec{u} \cdot \vec{v}}{\|\vec{u}\|\|\vec{v}\|} \in [0, \pi].$$
The most important angle is $\frac{\pi}{2}$ (a.k.a. $90^\circ$): orthogonality (a.k.a. perpendicularity) $\vec{u} \perp \vec{v}$ of two vectors means $\vec{u} \cdot \vec{v} = 0$. The CSI implies the triangle inequality $\|\vec{u} + \vec{v}\| \leq \|\vec{u}\| + \|\vec{v}\|$ (by squaring both sides). This will be very important in measuring distances between vectors (especially functions).

1.3 Distance

We have a metric concept of distance between two vectors, $d(\vec{u}, \vec{v}) := \|\vec{u} - \vec{v}\|$, satisfying $d(\vec{x}, \vec{y}) = 0$ iff $\vec{x} = \vec{y}$, $d(\vec{x}, \vec{y}) = d(\vec{y}, \vec{x})$, and the triangle inequality $d(\vec{x}, \vec{z}) \leq d(\vec{x}, \vec{y}) + d(\vec{y}, \vec{z})$ [this follows from the triangle inequality for vectors with $\vec{u} = \vec{x} - \vec{y}$, $\vec{v} = \vec{y} - \vec{z}$, $\vec{u} + \vec{v} = \vec{x} - \vec{z}$]. The Pythagorean Theorem says that if $\vec{u}, \vec{v}$ are orthogonal, then $\|\vec{u} + \vec{v}\|^2 = \|\vec{u}\|^2 + \|\vec{v}\|^2$. From these we can (in some other course) study calculus in $\mathbb{R}^n$....
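To make the definitions above concrete, here is a minimal Python sketch (not part of the original notes) of the dot product, the norm derived from it, normalization, the Cauchy-Schwarz check, and the angle formula; the example vectors are hypothetical illustrative values.

```python
import math

def dot(u, v):
    """Dot product: multiply coefficients term by term, then sum the results."""
    assert len(u) == len(v), "vectors must live in the same R^n"
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    """Norm derived from the dot product: ||v|| = sqrt(v . v)."""
    return math.sqrt(dot(v, v))

def normalize(v):
    """Unit vector (1/||v||) v pointing in the same direction; v must be nonzero."""
    n = norm(v)
    if n == 0:
        raise ValueError("cannot normalize the zero vector")
    return [vi / n for vi in v]

def angle(u, v):
    """Angle theta = arccos( (u . v) / (||u|| ||v||) ) in [0, pi]."""
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

u, v = [1.0, 2.0, 2.0], [3.0, 0.0, 4.0]       # hypothetical example vectors
print(dot(u, v))                               # 11.0
print(norm(u), norm(v))                        # 3.0 5.0
print(norm(normalize(u)))                      # 1.0 (up to floating point)
print(abs(dot(u, v)) <= norm(u) * norm(v))     # Cauchy-Schwarz holds: True
print(math.degrees(angle(u, v)))               # roughly 42.8 degrees
```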
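Similarly, a short sketch (again an illustration with hypothetical points, not the author's code) of the distance metric from Section 1.3, with numerical checks of the triangle inequality and the Pythagorean Theorem; `dot` and `norm` are repeated so the snippet runs on its own.

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

def dist(u, v):
    """Metric on R^n: d(u, v) := ||u - v||."""
    return norm([ui - vi for ui, vi in zip(u, v)])

# Triangle inequality d(x, z) <= d(x, y) + d(y, z):
x, y, z = [0.0, 0.0], [3.0, 4.0], [6.0, 0.0]
print(dist(x, z) <= dist(x, y) + dist(y, z))       # True (6 <= 5 + 5)

# Pythagorean Theorem: for orthogonal u, v, ||u + v||^2 = ||u||^2 + ||v||^2.
u, v = [3.0, 0.0], [0.0, 4.0]                      # u . v = 0
s = [ui + vi for ui, vi in zip(u, v)]
print(norm(s) ** 2, norm(u) ** 2 + norm(v) ** 2)   # 25.0 25.0
```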