Correlation

Note first that the centered deviations have zero mean:

\[
E(x - \mu_x) = \int_{-\infty}^{+\infty} (x - \mu_x) f_x(x)\,dx = 0, \qquad
E(y - \mu_y) = \int_{-\infty}^{+\infty} (y - \mu_y) f_y(y)\,dy = 0.
\]

The joint probability distribution and the joint probability density function can be characterized by means, variances and the covariance (sums apply in the discrete case, integrals in the continuous case):

\[
\mu_x = E(x) = \sum_{x \in X} x\,P(X = x) = \int_{-\infty}^{+\infty} x f_x(x)\,dx,
\]
\[
\mu_y = E(y) = \sum_{y \in Y} y\,P(Y = y) = \int_{-\infty}^{+\infty} y f_y(y)\,dy,
\]
\[
\sigma_x^2 = \mathrm{Var}(x) = E(x - \mu_x)^2 = \sum_{x \in X} (x - \mu_x)^2 P(X = x) = \int_{-\infty}^{+\infty} (x - \mu_x)^2 f_x(x)\,dx,
\]
\[
\sigma_y^2 = \mathrm{Var}(y) = E(y - \mu_y)^2 = \sum_{y \in Y} (y - \mu_y)^2 P(Y = y) = \int_{-\infty}^{+\infty} (y - \mu_y)^2 f_y(y)\,dy,
\]
\[
\sigma_{xy} = \mathrm{Cov}(x, y) = E[(x - \mu_x)(y - \mu_y)]
= \sum_{x \in X} \sum_{y \in Y} (x - \mu_x)(y - \mu_y) P(X = x, Y = y)
= \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} (x - \mu_x)(y - \mu_y) f(x, y)\,dx\,dy.
\]

The covariance matrix is
\[
\Sigma = \begin{pmatrix} \sigma_x^2 & \sigma_{xy} \\ \sigma_{xy} & \sigma_y^2 \end{pmatrix}.
\]

Unbiased estimators are
\[
\hat{\mu}_x = \bar{x}, \qquad \hat{\mu}_y = \bar{y},
\]
\[
\hat{\sigma}_x^2 = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2, \qquad
\hat{\sigma}_y^2 = \frac{1}{n-1} \sum_{i=1}^{n} (y_i - \bar{y})^2,
\]
\[
\hat{\sigma}_{xy} = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}).
\]

Correlation coefficient. The stochastic dependence of X on Y (or of Y on X) is characterized, in some sense, by the correlation coefficient
\[
\rho = \frac{\sigma_{xy}}{\sqrt{\sigma_x^2 \cdot \sigma_y^2}}
= \frac{E[(x - E(x))(y - E(y))]}{\sqrt{\mathrm{Var}(x)\,\mathrm{Var}(y)}}.
\]
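To make the estimators concrete, the following is a minimal numeric sketch in Python. The sample data and variable names are made up for illustration only; they do not come from the notes. The code computes the unbiased sample means, variances, covariance, and the sample correlation coefficient exactly as in the formulas above.

import math

# Made-up paired sample (x_i, y_i); illustrative only.
x = [2.1, 3.4, 1.8, 4.0, 2.9]
y = [1.0, 2.2, 0.9, 2.8, 1.7]
n = len(x)

# Sample means: unbiased estimators of mu_x and mu_y.
mean_x = sum(x) / n
mean_y = sum(y) / n

# Unbiased sample variances (note the n - 1 denominator).
var_x = sum((xi - mean_x) ** 2 for xi in x) / (n - 1)
var_y = sum((yi - mean_y) ** 2 for yi in y) / (n - 1)

# Unbiased sample covariance.
cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / (n - 1)

# Sample correlation coefficient rho = sigma_xy / (sigma_x * sigma_y).
rho = cov_xy / math.sqrt(var_x * var_y)

print(f"mean_x = {mean_x:.3f}, mean_y = {mean_y:.3f}")
print(f"var_x = {var_x:.3f}, var_y = {var_y:.3f}, cov_xy = {cov_xy:.3f}")
print(f"rho = {rho:.3f}")

For this particular sample, rho comes out close to 1 (about 0.996), reflecting a nearly linear relation between the two lists; in general |rho| <= 1, with equality exactly when one variable is a linear function of the other.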