# ECE340 Spring 2010 Homework 6 Solutions


Problems: 3-4.2, 3-5.1, 3-6.1, 3-6.2, 3-7.1, 3-7.2

## 3-4.2

a) With $U = 3X + 4Y$ and $V = 5X - 2Y$, where $X$ and $Y$ are uncorrelated with $\sigma_X^2 = 9$ and $\sigma_Y^2 = 25$, we find the variances of $U$ and $V$ by expanding the second moments:

$$\mathrm{Var}(U) = 9\sigma_X^2 + 24\,\mathrm{Cov}(X,Y) + 16\sigma_Y^2 = (9)(9) + 0 + (16)(25) = 481$$

$$\mathrm{Var}(V) = 25\sigma_X^2 - 20\,\mathrm{Cov}(X,Y) + 4\sigma_Y^2 = (25)(9) - 0 + (4)(25) = 325$$

b) To find the correlation coefficient of $U$ and $V$, we follow the definition $\rho_{UV} = \mathrm{Cov}(U,V)/\sqrt{\mathrm{Var}(U)\,\mathrm{Var}(V)}$. In our case:

$$\mathrm{Cov}(U,V) = 15\sigma_X^2 + 14\,\mathrm{Cov}(X,Y) - 8\sigma_Y^2 = (15)(9) + 0 - (8)(25) = -65$$

$$\rho_{UV} = \frac{-65}{\sqrt{481 \cdot 325}} \approx -0.1644$$

## 3-5.1

a) Firstly, if we introduce another random variable $R$ such that $R = 2Y$, then the pdf of $R$ is

$$f_R(r) = \tfrac{1}{2} f_Y(r/2) = \tfrac{1}{2}, \quad 0 < r < 2; \qquad 0 \text{ otherwise.}$$

This can be obtained, for example, by first going after the probability distribution function of $R$ (as we always do in class), relating it to that of $Y$ through $F_R(r) = P(2Y \le r) = F_Y(r/2)$, and then taking the derivative with respect to $r$. Now, since $X$ and $Y$ are statistically independent, $X$ and $R$ are also independent, and the pdf of $Z = X + R$ is

$$f_Z(z) = \int_{-\infty}^{\infty} f_R(r)\, f_X(z - r)\, dr,$$

which is a convolution between the pdf of $R$ and that of $X$. Note that we could have come to this expression by recalling that the pdf of the sum of two independent random variables is the convolution of the respective pdfs.
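The 3-4.2 arithmetic can be cross-checked with a short Python sketch, under the assumption that $U = 3X + 4Y$ and $V = 5X - 2Y$ with $X$, $Y$ uncorrelated, $\sigma_X^2 = 9$, $\sigma_Y^2 = 25$:

```python
import math

# Assumed second-order statistics of X and Y
var_x, var_y, cov_xy = 9.0, 25.0, 0.0

# U = a1*X + b1*Y, V = a2*X + b2*Y (coefficients assumed for illustration)
a1, b1 = 3.0, 4.0
a2, b2 = 5.0, -2.0

# Variance/covariance of linear combinations of X and Y
var_u = a1**2 * var_x + 2*a1*b1*cov_xy + b1**2 * var_y
var_v = a2**2 * var_x + 2*a2*b2*cov_xy + b2**2 * var_y
cov_uv = a1*a2*var_x + (a1*b2 + a2*b1)*cov_xy + b1*b2*var_y
rho = cov_uv / math.sqrt(var_u * var_v)

print(var_u, var_v, rho)  # 481.0 325.0 -0.1644...
```

Running it reproduces the variances 481 and 325 and a correlation coefficient of magnitude 0.1644 (negative sign, since $\mathrm{Cov}(U,V) = -65$).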
To evaluate the convolution (with $X$ and $Y$ each uniform on $(0,1)$, so $f_X = 1$ on $(0,1)$ and $f_R = 1/2$ on $(0,2)$), we have the following 5 cases:

Case 1: $z < 0$: $f_Z(z) = 0$.

Case 2: $0 \le z < 1$: $f_Z(z) = \int_0^z (1)\left(\tfrac{1}{2}\right) dr = \tfrac{z}{2}$.

Case 3: $1 \le z < 2$: $f_Z(z) = \int_{z-1}^{z} \tfrac{1}{2}\, dr = \tfrac{1}{2}$.

Case 4: $2 \le z < 3$: $f_Z(z) = \int_{z-1}^{2} \tfrac{1}{2}\, dr = \tfrac{3-z}{2}$.

Case 5: $z \ge 3$: $f_Z(z) = 0$.

So, to sum up, the pdf of the random variable $Z = X + 2Y$ is

$$f_Z(z) = \begin{cases} z/2, & 0 \le z < 1 \\ 1/2, & 1 \le z < 2 \\ (3-z)/2, & 2 \le z < 3 \\ 0, & \text{otherwise.} \end{cases}$$

b) To find the probability that $0 \le Z \le 1$, we have the following:

$$P(0 \le Z \le 1) = \int_0^1 \frac{z}{2}\, dz = \frac{1}{4}.$$

## 3-6.1

Here $X$ and $Y$ are independent with

$$f_X(x) = 2x, \; 0 < x < 1 \ (0 \text{ otherwise}), \qquad f_Y(y) = 2y, \; 0 < y < 1 \ (0 \text{ otherwise}),$$

and $Z = X + Y$. Firstly, we introduce another random variable $W = Y$. Now we lay out the random variables we have in hand and their connections: $Z = X + Y$, $W = Y$, so the inverse transformation is $X = Z - W$, $Y = W$. The Jacobian of the transformation is

$$J = \frac{\partial(z,w)}{\partial(x,y)} = \begin{vmatrix} 1 & 1 \\ 0 & 1 \end{vmatrix} = 1.$$

So the joint pdf of $Z$ and $W$ is

$$f_{ZW}(z,w) = \frac{f_{XY}(z-w,\, w)}{|J|} = 2(z-w)\cdot 2w = 4w(z-w), \quad 0 < w < 1,\ 0 < z-w < 1,$$

and $0$ otherwise. The marginal pdf for $Z$ is then

$$f_Z(z) = \int_{-\infty}^{\infty} f_{ZW}(z,w)\, dw = \int_{\max(0,\,z-1)}^{\min(1,\,z)} 4w(z-w)\, dw.$$

To evaluate the integral, we have cases on the value of $z$:

Case 1: $0 \le z \le 1$: $f_Z(z) = \int_0^z 4w(z-w)\, dw = \tfrac{2}{3} z^3$.

Case 2: $1 \le z \le 2$: $f_Z(z) = \int_{z-1}^{1} 4w(z-w)\, dw = \tfrac{2}{3}\left(-z^3 + 6z - 4\right)$.

To sum up, we have the following:

$$f_Z(z) = \begin{cases} \tfrac{2}{3} z^3, & 0 \le z \le 1 \\ \tfrac{2}{3}\left(-z^3 + 6z - 4\right), & 1 \le z \le 2 \\ 0, & \text{otherwise.} \end{cases}$$

## 3-6.2

Firstly, we are concerned about how to generate random variables $X$, $Y$ in Matlab with the pdf specified in problem 3-6.1:

$$f_X(x) = 2x, \; 0 \le x \le 1; \qquad 0 \text{ otherwise.}$$

Notice that if we have two independent uniformly distributed random variables $P$, $Q$, each with pdf

$$f_P(p) = 1, \; 0 \le p \le 1; \qquad 0 \text{ otherwise}$$

(we can easily generate uniformly distributed random variables with the command `rand` in Matlab), and we define $X = \sqrt{P}$, then we have the following:

$$F_X(x) = P\!\left(\sqrt{P} \le x\right) = P\!\left(P \le x^2\right) = x^2, \quad 0 \le x \le 1.$$

Thus $f_X(x) = dF_X/dx = 2x$ on $[0,1]$, which is exactly the desired pdf. Similarly, $Y = \sqrt{Q}$ has pdf $f_Y(y) = 2y$ on $[0,1]$. What we can see from the above is that if we generate two independent random variables $P$ and $Q$ in Matlab with the command `rand`, and let $X = \sqrt{P}$, $Y = \sqrt{Q}$, we get two independent random variables that have the pdf specified in problem 3-6.1.
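The 3-5.1 b) result can be sanity-checked by simulation. A minimal Python sketch, assuming $X$ and $Y$ are independent and uniform on $(0,1)$ (an assumption of this reconstruction, not stated explicitly in the surviving text):

```python
import random

random.seed(0)  # fixed seed for a reproducible run
N = 200000

# Count samples of Z = X + 2Y falling in [0, 1],
# with X, Y independent uniform(0,1)
count = sum(1 for _ in range(N)
            if random.random() + 2*random.random() <= 1.0)
p_est = count / N

# Analytic value: P(0 <= Z <= 1) = integral of z/2 over [0,1] = 1/4
print(p_est)  # close to 0.25
```

With $N = 200{,}000$ samples the Monte Carlo estimate lands within about $\pm 0.01$ of the analytic value $1/4$.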
The following Matlab code generates a graphical approximation to the pdf of $Z$, where $Z = X + Y$ is the sum of two independent random variables distributed on $[0, 1]$ with pdfs $f_X(x) = 2x$ and $f_Y(y) = 2y$ (and $0$ otherwise):

```matlab
clc
clear all
close all

N = 100000; % number of samples
M = 50;     % number of histogram bins

p = rand(1,N); % N samples of P, uniformly distributed on (0,1)
q = rand(1,N); % N samples of Q, uniformly distributed on (0,1)
x = sqrt(p);   % X = sqrt(P) has pdf 2x on (0,1), per problem 3-6.2
y = sqrt(q);   % Y = sqrt(Q) likewise
z = x + y;     % N samples of random variable Z

% pdf of Z calculated in problem 3-6.1, tabulated on [0, 2]
f_Z = zeros(1,1000);
t = zeros(1,1000);
for i = 1:1000
    t(i) = (i-1)*(2/1000);
    if t(i) <= 1
        f_Z(i) = (2/3)*(t(i)^3);
    else
        f_Z(i) = (2/3)*(-t(i)^3 + 6*t(i) - 4);
    end
end

figure(1)
subplot(2,1,1)
[m,n] = hist(x,M);
bin = n(2) - n(1); % bin width (n holds the bin centers)
pp = m/(N*bin);    % normalize counts to a density
bar(n,pp)          % plot the graph
grid on
title('PDF of X'); xlabel('x'); ylabel('f_x'); xlim([0,1]);

subplot(2,1,2)
[m,n] = hist(y,M);
bin = n(2) - n(1);
pp = m/(N*bin);
bar(n,pp)
grid on
title('PDF of Y'); xlabel('y'); ylabel('f_y'); xlim([0,1]);

figure(2)
[m,n] = hist(z,M);
bin = n(2) - n(1);
pp = m/(N*bin);
bar(n,pp)
grid on
title('PDF of Z'); xlabel('z'); ylabel('f_z'); xlim([0,2]);
hold on
plot(t,f_Z,'r--','LineWidth',4)
legend('approximation of pdf of Z','calculated pdf of Z from problem 3-6.1')
```

The plots are shown below. [Figures: histogram approximations of the pdfs of $X$ and $Y$; histogram approximation of the pdf of $Z$ with the calculated pdf overlaid as a dashed curve.] Notice that the shape of the approximated pdf of $Z$ (blue bars in the second figure) is very close to the pdf we calculated in problem 3-6.1.
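The same check can be run without Matlab. A small Python analogue of the script above, which tests both the $X = \sqrt{P}$ generation trick of 3-6.2 and the pdf derived in 3-6.1 through the probability $P(Z \le 1)$:

```python
import random

random.seed(1)  # fixed seed for a reproducible run
N = 200000

# X = sqrt(P), Y = sqrt(Q) each have pdf 2x on (0,1), per problem 3-6.2
z = [random.random()**0.5 + random.random()**0.5 for _ in range(N)]

# From the 3-6.1 pdf: P(Z <= 1) = integral of (2/3)z^3 over [0,1] = 1/6
p_est = sum(1 for v in z if v <= 1.0) / N
print(p_est)  # close to 1/6 = 0.1667...
```

The Monte Carlo fraction agrees with the analytic value $1/6$ to within about $\pm 0.01$, consistent with the histogram comparison in the Matlab figures.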
## 3-7.1

Firstly, the characteristic functions for random variables $X$ and $Y$ are found. For the exponential pdf $f_X(x) = 3e^{-3x}$, $x \ge 0$ ($0$ otherwise), and likewise for $Y$:

$$\phi_X(\omega) = E\!\left[e^{j\omega X}\right] = \int_0^{\infty} 3e^{-3x}\, e^{j\omega x}\, dx = \frac{3}{3 - j\omega} = \frac{1}{1 - j\omega/3} = \phi_Y(\omega).$$

Now, since $Z = X + Y$, and $X$ and $Y$ are independent, the characteristic function of $Z$ is the product

$$\phi_Z(\omega) = \phi_X(\omega)\,\phi_Y(\omega) = \frac{9}{(3 - j\omega)^2} = \frac{1}{(1 - j\omega/3)^2}.$$

## 3-7.2

a) Firstly we notice that a Gaussian random variable $X$ with zero mean and variance $\sigma^2$ has the following pdf:

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-x^2/2\sigma^2}.$$

The characteristic function of the random variable $X$ is found via

$$\phi_X(\omega) = E\!\left[e^{j\omega X}\right] = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-x^2/2\sigma^2}\, e^{j\omega x}\, dx.$$

Now recall the Fourier-transform pair for a Gaussian: $e^{-t^2/2} \leftrightarrow \sqrt{2\pi}\, e^{-\omega^2/2}$, i.e., a Gaussian transforms into a Gaussian. In our case, scaling the pair by $t = x/\sigma$ (or completing the square in the exponent), the characteristic function can be written as the following:

$$\phi_X(\omega) = e^{-\sigma^2 \omega^2 / 2}.$$

b) On page 72 of the book, we have the equation for the central moments of a Gaussian random variable (equation (2-27)) as the following:

$$E\!\left[(X - \bar{X})^n\right] = \begin{cases} 0, & n \text{ odd} \\ 1 \cdot 3 \cdot 5 \cdots (n-1)\, \sigma^n, & n \text{ even.} \end{cases}$$

To verify this with the characteristic function of $X$, we do the following. Since in our case the mean $E[X] = 0$, the central moments are simply $E[X^n]$. Also, according to equation (3-53) on page 150 of the book, we have

$$E[X^n] = \frac{1}{j^n} \left. \frac{d^n \phi_X(\omega)}{d\omega^n} \right|_{\omega = 0}.$$

So the first central moment of $X$ is

$$E[X] = \frac{1}{j} \left. \frac{d\phi_X}{d\omega} \right|_{\omega=0} = \frac{1}{j}\left( -\sigma^2 \omega\, e^{-\sigma^2\omega^2/2} \right)\!\Big|_{\omega=0} = 0.$$

The second central moment of $X$ is

$$E[X^2] = \frac{1}{j^2} \left. \frac{d^2\phi_X}{d\omega^2} \right|_{\omega=0} = -\left( \sigma^4\omega^2 - \sigma^2 \right) e^{-\sigma^2\omega^2/2}\Big|_{\omega=0} = \sigma^2.$$

Following these steps for $n = 3, 4, 5, \ldots$, we see the $n$th central moment is indeed $0$ for odd $n$ and $1 \cdot 3 \cdot 5 \cdots (n-1)\,\sigma^n$ for even $n$.
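The moment relation used in 3-7.2 b) can also be checked numerically: differentiate the zero-mean Gaussian characteristic function $\phi_X(\omega) = e^{-\sigma^2\omega^2/2}$ at $\omega = 0$ by central finite differences ($\sigma = 2$ below is an arbitrary test value):

```python
import math

sigma = 2.0  # arbitrary test value for the standard deviation

def phi(w):
    """Characteristic function of a zero-mean Gaussian with variance sigma^2."""
    return math.exp(-sigma**2 * w**2 / 2)

h = 1e-4
# phi'(0) by central difference; E[X] = phi'(0)/j, and phi is even, so this is 0
d1 = (phi(h) - phi(-h)) / (2*h)
# phi''(0) by central difference; E[X^2] = phi''(0)/j^2 = -phi''(0)
d2 = (phi(h) - 2*phi(0) + phi(-h)) / h**2
m1 = d1
m2 = -d2

print(m1, m2)  # approximately 0.0 and sigma^2 = 4.0
```

The finite-difference moments match $E[X] = 0$ and $E[X^2] = \sigma^2$, as the derivation with equation (3-53) predicts.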