Chapter 5 Multivariate Probability Distributions

Abstract: In this chapter we study the relationships among random variables, which are characterized by the joint probability distribution of random variables. Most insights into multivariate distributions can be gained by focusing on bivariate distributions. We first introduce the joint probability distribution of a bivariate random vector (X, Y) via the characterization of the joint cumulative distribution function, the joint probability mass function (when (X, Y) are discrete), and the joint probability density function (when (X, Y) are continuous). We then characterize various aspects of the relationship between X and Y using conditional distributions, correlation, and conditional expectations. The concept of independence and its implications for joint distributions, conditional distributions, and correlation are also discussed. We also introduce the class of bivariate normal distributions and examine its important properties.

Key words: Bivariate normal distribution, Bivariate transformation, Conditional distribution, Conditional mean, Conditional variance, Correlation, Independence, Joint moment generating function, Joint probability distribution, Law of iterated expectations, Marginal distribution.

5.1 Random Vectors and Joint Probability Distributions

Any economy is a system consisting of different units. These units are generally related to each other, and as a consequence, economic variables are interrelated. The most important goal of economic analysis and econometric analysis is to identify the relationships between economic events or economic variables. As discussed in Chapter 2, given events A and B, their joint probability P(A ∩ B) describes a predictive relationship: such a relationship can be exploited to predict one event using the other.

Definition 5.1 [Random Vector]: An n-dimensional random vector, denoted Z = (Z_1, Z_2, ..., Z_n)′, is a function from a sample space S into R^n, the n-dimensional Euclidean space. For each outcome s ∈ S, Z(s) is an n-dimensional real-valued vector and is called a realization of the random vector Z.

In this chapter, we will mainly focus on bivariate probability distributions, which illustrate most (but not all) essentials of multivariate probability distributions. In most of the subsequent discussion we consider two random variables (X, Y), both defined on the same probability space (S, B, P). A realization of (X, Y) is a pair (x, y) ∈ R^2.

Having defined a bivariate random vector (X, Y), we can now discuss probabilities of events defined in terms of (X, Y). How do we characterize the joint probability distribution of X and Y? As in the univariate case, we can use the CDF, now called the joint CDF of X and Y, to characterize their joint distribution.
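The joint CDF referred to above is F(x, y) = P(X ≤ x, Y ≤ y). As a minimal illustration (not part of the original notes; the function name and the chosen distribution are my own for this sketch), the following Python code estimates the joint CDF from simulated realizations of a perfectly dependent pair with X ~ Uniform(0, 1) and Y = X, for which F(x, y) = min(x, y) on the unit square:

```python
import random

def empirical_joint_cdf(samples, x, y):
    """Fraction of realizations (xi, yi) with xi <= x and yi <= y.

    This is the sample analogue of the joint CDF
    F(x, y) = P(X <= x, Y <= y).
    """
    hits = sum(1 for xi, yi in samples if xi <= x and yi <= y)
    return hits / len(samples)

random.seed(0)
# Perfect dependence: X ~ Uniform(0, 1), Y = X,
# so the true joint CDF is F(x, y) = min(x, y) for (x, y) in [0, 1]^2.
samples = [(u, u) for u in (random.random() for _ in range(100_000))]

estimate = empirical_joint_cdf(samples, 0.5, 0.7)
print(round(estimate, 2))  # close to min(0.5, 0.7) = 0.5
```

With 100,000 draws the estimate is within Monte Carlo error of the true value, which makes concrete the sense in which the joint CDF fully describes the joint behavior of (X, Y).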

This note was uploaded on 01/07/2012 for the course ECON 6190 taught by Professor Hong during the Fall '07 term at Cornell University (Engineering School).
