MA2216/ST2131 Probability Notes 9
Review & Examples and Properties of Expectation

§ 1. Linear Transformation.

Let X_1, X_2, ..., X_n be continuous random variables having joint density f, and let the random variables Y_1, Y_2, ..., Y_n be defined by the linear transformation

    Y_i = Σ_{j=1}^{n} a_{ij} X_j,    i = 1, 2, ..., n,

where the matrix A = (a_{ij})_{n×n} has nonzero determinant det A. Thus

    (y_1, ..., y_n)^t = A (x_1, ..., x_n)^t,

where (x_1, ..., x_n)^t denotes the transpose of the row vector (x_1, ..., x_n). Since det A ≠ 0, the linear transformation A is nonsingular and hence admits an inverse A^{-1}, so that

    (x_1, ..., x_n)^t = A^{-1} (y_1, ..., y_n)^t.                         (1.1)

Equivalently, x~ = y~ (A^{-1})^t, where x~ denotes the row vector (x_1, ..., x_n). Note also that the Jacobian of the transformation A is nothing but its determinant:

    J(x_1, x_2, ..., x_n) = det A.

Therefore, Y_1, Y_2, ..., Y_n have joint density f_{Y_1,...,Y_n} given by

    f_{Y_1,...,Y_n}(y_1, ..., y_n) = (1 / |det A|) f(x_1, ..., x_n)
                                   = (1 / |det A|) f(y~ (A^{-1})^t).      (1.2)

§ 2. Cauchy–Schwarz Inequality.

1. For sequences {a_1, a_2, ..., a_n} and {b_1, b_2, ..., b_n}, the classical Cauchy–Schwarz inequality is stated as follows:

    ( Σ_{i=1}^{n} a_i b_i )^2 ≤ ( Σ_{i=1}^{n} a_i^2 ) · ( Σ_{i=1}^{n} b_i^2 ).

If a~ = (a_1, a_2, ..., a_n) and b~ = (b_1, b_2, ..., b_n) are regarded as vectors in R^n, the above inequality becomes

    |a~ · b~| ≤ ||a~|| ||b~||,

where "·" refers to the inner product, and ||a~|| := ( Σ_{i=1}^{n} a_i^2 )^{1/2} refers to the Euclidean norm.

2. Remarks.

(i) The inequality becomes an equality if and only if a~ and b~ are linearly dependent, i.e., there exists a real constant t such that t · a~ = b~ (or a~ = 0).

(ii) The inequality can be generalized to infinite series, provided that Σ_{i=1}^{∞} a_i^2 < ∞ and Σ_{i=1}^{∞} b_i^2 < ∞. In this case, the Cauchy–Schwarz inequality reads

    ( Σ_{i=1}^{∞} a_i b_i )^2 ≤ ( Σ_{i=1}^{∞} a_i^2 ) · ( Σ_{i=1}^{∞} b_i^2 ).
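As a quick numerical sanity check of the change-of-variables formula (1.2), the sketch below (pure Python; the matrix A, the bivariate normal choice for f, and all function names are my own illustration, not from the notes) compares the density of Y = AX computed via (1.2) against the directly known density of Y. When X is standard bivariate normal, Y = AX is bivariate normal with covariance Σ = A A^t, so both routes must agree.

```python
import math

# Illustrative 2x2 example (A and the test points are made up for this check).
A = [[2.0, 1.0],
     [0.0, 1.0]]
det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]          # det A = 2, nonzero

def A_inv(y):
    """Solve A x = y for x using the 2x2 inverse formula."""
    y1, y2 = y
    return ((A[1][1] * y1 - A[0][1] * y2) / det_A,
            (-A[1][0] * y1 + A[0][0] * y2) / det_A)

def f_X(x):
    """Joint density of two independent N(0, 1) variables."""
    return math.exp(-(x[0]**2 + x[1]**2) / 2) / (2 * math.pi)

def f_Y(y):
    """Density of Y = A X via (1.2): (1 / |det A|) * f_X(A^{-1} y)."""
    return f_X(A_inv(y)) / abs(det_A)

# Cross-check: Y ~ N(0, Sigma) with Sigma = A A^t and det Sigma = (det A)^2.
Sigma = [[A[0][0]**2 + A[0][1]**2, A[0][0]*A[1][0] + A[0][1]*A[1][1]],
         [A[1][0]*A[0][0] + A[1][1]*A[0][1], A[1][0]**2 + A[1][1]**2]]
det_S = Sigma[0][0] * Sigma[1][1] - Sigma[0][1] * Sigma[1][0]

def f_Y_direct(y):
    """Bivariate normal density with mean 0 and covariance Sigma."""
    y1, y2 = y
    q = (Sigma[1][1]*y1*y1 - 2*Sigma[0][1]*y1*y2 + Sigma[0][0]*y2*y2) / det_S
    return math.exp(-q / 2) / (2 * math.pi * math.sqrt(det_S))

for y in [(0.3, -1.2), (1.0, 0.5), (-2.0, 0.1)]:
    assert abs(f_Y(y) - f_Y_direct(y)) < 1e-12
```

The factor 1/|det A| in (1.2) is exactly what keeps f_Y integrating to 1: the linear map scales volumes by |det A|, so the density must shrink by the same factor.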
3. Cauchy–Schwarz Inequality for Random Variables. Let X and Y be random variables having joint density f(x, y), each with finite second moment. Then

    ( E[XY] )^2 ≤ E[X^2] E[Y^2].                                          (2.1)

Proof. Unless Y = tX for some constant t, in which case the inequality holds with equality, we have for every real t

    0 < E[(tX + Y)^2] = E[X^2] t^2 + 2 E[XY] t + E[Y^2].

Hence the quadratic equation (in t)

    E[X^2] t^2 + 2 E[XY] t + E[Y^2] = 0

has no real roots, which implies that its discriminant must be negative:

    (2 E[XY])^2 < 4 E[X^2] E[Y^2].

Thus (2.1) is established; it is known as the Cauchy–Schwarz inequality.
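Inequality (2.1) and its equality case Y = tX can be checked numerically on a small discrete joint distribution, where the expectations are finite sums. The sketch below is pure Python; the probability mass functions are made up for illustration and are not from the notes.

```python
# Sanity check of (2.1): (E[XY])^2 <= E[X^2] E[Y^2].
# Made-up joint pmf over a few (x, y) pairs (illustration only).
pmf = {  # (x, y): P(X = x, Y = y)
    (1, 2): 0.2,
    (1, -1): 0.3,
    (-2, 3): 0.1,
    (0, 1): 0.4,
}
assert abs(sum(pmf.values()) - 1.0) < 1e-12   # a valid pmf

E_XY = sum(p * x * y for (x, y), p in pmf.items())
E_X2 = sum(p * x * x for (x, y), p in pmf.items())
E_Y2 = sum(p * y * y for (x, y), p in pmf.items())
assert E_XY**2 <= E_X2 * E_Y2                 # strict here: Y is not a multiple of X

# Equality case: Y = tX (here t = 3) forces (E[XY])^2 = E[X^2] E[Y^2],
# matching Remark (i) on linear dependence.
pmf_eq = {(x, 3 * x): p for x, p in [(1, 0.5), (-2, 0.3), (4, 0.2)]}
E_XY_eq = sum(p * x * y for (x, y), p in pmf_eq.items())
E_X2_eq = sum(p * x * x for (x, y), p in pmf_eq.items())
E_Y2_eq = sum(p * y * y for (x, y), p in pmf_eq.items())
assert abs(E_XY_eq**2 - E_X2_eq * E_Y2_eq) < 1e-9
```

Taking X and Y with mean zero in (2.1) recovers the familiar bound Cov(X, Y)^2 ≤ Var(X) Var(Y), which is why the correlation coefficient always lies in [-1, 1].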
This note was uploaded on 03/19/2012 for the course SCIENCE ST2131 taught by Professor Forgot during the Fall '08 term at National University of Singapore.