Bayesian Inference
Last time we discussed the MLE as a systematic way to derive estimators. The MLE is the classic approach to parameter
estimation. While it's nice to have a recipe, and it is possible to derive
some nice properties of the MLE (es
ECE 3077, Spring 2014
Homework #3
Due Friday January 31, 2014, in class
Reading: B&T 1.6 (NOTE: Some problems are starred (*). You DO NOT have to turn
these problems in. They are just extra practice.)
1. Using your class notes, prepare a 1-2 paragraph summary
Homework 5 solutions, Spring 2014
2.
c.
d.
e.
f.
g. Yes. E[XY] = E[X] E[Y].
h. Since X and Y are independent, p_{X|Y}(x | y) = p_X(x) for any value of y.
i. P(A) = 24/28
3.
Independence
Previously, we talked about conditional probability as a means to
incorporate partial information into a probability law about multiple
outcomes. In other words, if we know something about an event B,
how does that change our belief about A?
I. Introduction to Probability
ECE 3077 Notes by M. Davenport, J. Romberg and C. Rozell
Basic probability models
A probability model consists of an experiment which produces exactly one out of several mutually exclusive outcomes. The essential
elements ar
Independence of random variables
We say that random variables X and Y are independent if

    p_{X,Y}(x, y) = p_X(x) p_Y(y)   for all x, y,

that is, if we can factor the joint pmf into a pmf that depends only
on X and a pmf that depends only on Y.
This is the sam
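The factorization test above can be checked mechanically. Here is a minimal Python sketch (the joint pmf values are hypothetical, chosen so that the table factors) that computes both marginals and verifies independence:

```python
# Joint pmf of (X, Y); these values are hypothetical and chosen so that
# the pmf factors, i.e., X and Y are independent.
p_xy = {(0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
        (1, 0): 0.15, (1, 1): 0.30, (1, 2): 0.15}

xs = {x for x, _ in p_xy}
ys = {y for _, y in p_xy}
p_x = {x: sum(p_xy[(x, y)] for y in ys) for x in xs}  # marginal pmf of X
p_y = {y: sum(p_xy[(x, y)] for x in xs) for y in ys}  # marginal pmf of Y

# Independence holds iff p_{X,Y}(x, y) = p_X(x) p_Y(y) for every (x, y).
independent = all(abs(p_xy[(x, y)] - p_x[x] * p_y[y]) < 1e-12
                  for x in xs for y in ys)
print(independent)  # True for this table
```

Changing any single entry of the table (without adjusting the rest) breaks the factorization, and the check prints False.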
Covariance
When discussing a single RV, we used the notion of variance to capture how much that RV could differ from its expected value. With
two RVs we have a similar notion called the covariance, defined as

    cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X] E[Y].
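As a quick sanity check, the two expressions for the covariance can be compared numerically. The joint pmf below is hypothetical, used only to show that the definition and the shortcut formula agree:

```python
# Hypothetical joint pmf of (X, Y) over a small grid of values.
pmf = {(0, 0): 0.2, (0, 1): 0.1, (0, 2): 0.1,
       (1, 0): 0.1, (1, 1): 0.2, (1, 2): 0.3}

EX = sum(x * p for (x, y), p in pmf.items())
EY = sum(y * p for (x, y), p in pmf.items())
EXY = sum(x * y * p for (x, y), p in pmf.items())

# Definition of covariance versus the shortcut formula.
cov_def = sum((x - EX) * (y - EY) * p for (x, y), p in pmf.items())
cov_short = EXY - EX * EY
print(abs(cov_def - cov_short) < 1e-12)  # True: both formulas agree
```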
Bayes rule for random variables
There are many situations where we want to know X, but can only
measure a related random variable Y or observe a related event A.
Bayes gives us a systematic way to update the pdf for X given this
observation.
We will look
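As a minimal illustration of such an update for a discrete X, here is a Python sketch; the prior and the likelihood values are hypothetical, and the posterior is just prior times likelihood, normalized:

```python
# Hypothetical discrete prior on X and likelihoods p(Y = y_obs | X = x).
prior = {0: 0.5, 1: 0.5}
likelihood = {0: 0.9, 1: 0.2}

# Bayes' rule: posterior is proportional to prior * likelihood.
unnorm = {x: prior[x] * likelihood[x] for x in prior}
p_yobs = sum(unnorm.values())             # total probability of the observation
posterior = {x: unnorm[x] / p_yobs for x in unnorm}
print(posterior)                          # posterior now favors X = 0
```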
Now that we have the basic tools, let's take a closer look at some of
the most common distributions for continuous random variables.
Uniform distribution
We say that X is uniform on [a, b] if

    f_X(x) = 1/(b - a)   for x in [a, b],
             0           otherwise.

We write this as X ~ Uniform([a, b]).
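A quick numerical check of the Uniform([a, b]) density, with illustrative endpoints: the constant density times the interval length is 1, and sample averages land near the midpoint (a + b)/2:

```python
import random

a, b = 2.0, 5.0               # illustrative endpoints
density = 1.0 / (b - a)       # value of f_X(x) on [a, b]
print(density * (b - a))      # area under the pdf (should be 1)

random.seed(0)
samples = [random.uniform(a, b) for _ in range(100000)]
sample_mean = sum(samples) / len(samples)
print(sample_mean)            # close to (a + b)/2 = 3.5
```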
III. General Random Variables
ECE 3077 Notes by M. Davenport, J. Romberg and C. Rozell
Continuous random variables
So far, we have only been concerned with random variables that take
a discrete set of values (that is, the possible outcomes are finite or
countable).
Joint probability mass functions of multiple
random variables
We are often interested in multiple random variables resulting
from the same experiment. For example, if you run a manufacturing
facility, X may represent the number of failures in a batch of c
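As a concrete sketch (using two dice rather than the manufacturing example, purely for illustration), two random variables defined on the same experiment have a joint pmf that we can tabulate by enumerating outcomes:

```python
from collections import Counter
from fractions import Fraction

# Experiment: roll two fair dice. Define two RVs on the same outcome.
counts = Counter()
for d1 in range(1, 7):
    for d2 in range(1, 7):
        X = max(d1, d2)          # one random variable
        Y = d1 + d2              # another random variable, same experiment
        counts[(X, Y)] += 1

# Joint pmf: each of the 36 outcomes is equally likely.
p_xy = {xy: Fraction(c, 36) for xy, c in counts.items()}
print(p_xy[(3, 5)])  # P(max = 3, sum = 5): outcomes (2,3) and (3,2), so 1/18
```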
The counting principle
Many counting problems can be naturally broken down into multiple
stages. If the outcomes at one stage do not affect the number of
possibilities at the subsequent stages, we can just multiply the number of possibilities at each stage.
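A tiny sketch of the principle, with a standard (hypothetical) example: license plates with three letters followed by four digits, where each stage's count is fixed regardless of earlier choices:

```python
# Stage counts: 26 choices for each letter, 10 for each digit.
letters, digits = 26, 10
total = letters ** 3 * digits ** 4   # multiply possibilities across stages
print(total)  # 26^3 * 10^4 = 175760000
```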
Expectation of a random variable
Since random variables give us a way to talk quantitatively about
uncertain quantities, they should also give us a way to make predictions about future outcomes. After seeing a lot of trials we could find
their average. Could
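The idea that long-run averages predict future outcomes can be illustrated with a short simulation, here using a fair six-sided die whose expectation is 3.5:

```python
import random

random.seed(0)
trials = [random.randint(1, 6) for _ in range(200000)]  # many repeated rolls
average = sum(trials) / len(trials)
print(average)  # settles near E[X] = 3.5
```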
Homework 8 solutions  Spring 2014
2.
a
b
3.
4.
a
b
c
% ECE 3077 Spring 2014
% HW 8, Problem 4
% Setup Environment
close all; clear all; clc;
% Part (c)
% Define the x-axis of the PDF
p
Homework 11 solutions, Spring 2014
2.
3.
4. (NOTE: This problem was taken from a different book, so the notation is a bit different.
t_{α/2} represents the tail value from the t-distribution table with n - 1 degrees of freedom
ECE 3077, Spring 2014
Homework #11
Due Monday, April 21, at the beginning of class
Reading: 9.1
(NOTE: Some problems are starred (*). You DO NOT have to turn these problems in.
They are just extra practice.)
1. Using your class notes, prepare a 1-2 paragraph summary
II. Discrete Random Variables
ECE 3077 Notes by M. Davenport, J. Romberg and C. Rozell
Discrete Random variables
Until now, we have discussed probability almost entirely in the context of events, in which there are only two possibilities: either they
happen or they do not.
Conditional Probability
Conditional probability gives us a systematic way to reason about
the outcome of an experiment based on partial information.
Examples:
A coin is flipped three times and two of the results are heads.
What's the probability that the first
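Conditional probabilities like the coin example above can be computed by enumerating equally likely outcomes. The sketch below takes one natural completion of the question: given exactly two heads, what is the probability that the first flip was heads?

```python
from itertools import product

outcomes = list(product('HT', repeat=3))        # 8 equally likely sequences
B = [o for o in outcomes if o.count('H') == 2]  # condition: exactly two heads
A_and_B = [o for o in B if o[0] == 'H']         # first flip heads, within B

# P(A | B) = |A and B| / |B| when outcomes are equally likely.
print(len(A_and_B), '/', len(B))  # prints: 2 / 3
```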
V. Intro to Statistical Inference
ECE 3077 Notes by M. Davenport, J. Romberg and C. Rozell
For the last few weeks of this class, we will apply what we have
learned about probability to some classical statistical inference
problems. Many laypeople don't understand
The Weak Law of Large Numbers
In the previous section, we saw that the central limit theorem tells
us that for independent and identically distributed X1, X2, . . . with
mean μ and variance σ²,

    (X1 + X2 + ... + XN)/N  is approximately  Normal(μ, σ²/N)  for large N.
This result is qualitati
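A short simulation illustrates the concentration that the weak law describes; the Bernoulli distribution and the sample sizes here are arbitrary choices:

```python
import random

random.seed(1)
mu = 0.25  # mean of a Bernoulli(0.25) random variable
devs = []
for N in (100, 10000, 1000000):
    mean = sum(1 if random.random() < mu else 0 for _ in range(N)) / N
    devs.append(abs(mean - mu))
    print(N, devs[-1])  # deviation from mu tends to shrink as N grows
```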
Maximum Likelihood Estimation
On previous days, we've talked about specific estimators (e.g., sample
mean, sample variance) that we just sort of picked arbitrarily. We are
interested in dealing with more sophisticated situations where it may
not be so obvious
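As a preview with the simplest case: for n independent Bernoulli(p) observations with k successes, maximizing the likelihood p^k (1 - p)^(n - k) over p gives the sample mean, which the sketch below computes on hypothetical data:

```python
# Hypothetical Bernoulli observations (1 = success, 0 = failure).
data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]

# Setting d/dp [k log p + (n - k) log(1 - p)] = 0 gives p_hat = k / n.
k, n = sum(data), len(data)
p_hat = k / n
print(p_hat)  # 0.7
```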
ECE3077 A
Spring 2016
Prof. Elliot Moore II
HW01
Assigned: Friday, January 15, 2016
Due: Friday, January 22, 2016 (at the start of class)
Instructions:
STAPLE the cover sheet (link on course website
ECE3077 A
Spring 2016
Prof. Elliot Moore II
HW08
Assigned: Monday, March 14, 2016
Due: Monday, March 28, 2016 (at the start of class)
Instructions:
STAPLE the cover sheet (link on course website) t
ECE3077 A
Spring 2016
Prof. Elliot Moore II
HW09
Assigned: Monday, March 28, 2016
Due: Monday, April 4, 2016 (at the start of class)
EXAM 2 is on Wednesday, April 6, 2016 (TWO reference sheets are
ECE 3077, Spring 2016
Homework #0
Due Friday January 16, 2016 in class
1. Go to https://gatech.instructure.com/. Download and read the syllabus and
complete the Introduction survey (link provided in the assignment description).
Also read the discussion top
Ex #8
All the code is given at the end of the exercise.
a) We get the following plot:

[Figure: "Relative frequencies of all characters (Kervazo, Christophe)": bar chart over the characters a through z plus whitespace (ws), with relative frequencies ranging from 0 to 0.2.]
b)