ISyE8843
Brani Vidakovic
Friday 9/10/04 Name:
Quiz 3
Select one problem!
Weighted Squared Error Loss. We have seen that under squared error loss the Bayes rule [minimizer of the
Bayes risk r(π, δ) or, equivalently, conditional on x, minimizer of the posterior expected
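Under the weighted squared error loss L(θ, a) = w(θ)(θ − a)², the Bayes rule is the weighted posterior mean δ(x) = E[w(θ)θ | x] / E[w(θ) | x]. A minimal Python sketch checking this on a toy discrete posterior — the grid, the posterior shape, and the weight w(θ) = 1/θ² are all hypothetical choices, not taken from the quiz:

```python
import numpy as np

# Toy discrete posterior over a grid of theta values (hypothetical numbers).
theta = np.linspace(0.1, 2.0, 200)
post = np.exp(-(theta - 1.0) ** 2 / 0.1)   # unnormalized posterior
post /= post.sum()

w = 1.0 / theta ** 2                        # a weight function w(theta)

# Bayes rule under weighted squared error loss L = w(theta)*(theta - a)^2:
# a* = E[w(theta)*theta | x] / E[w(theta) | x]
a_star = (w * theta * post).sum() / (w * post).sum()

# Check: a_star minimizes the posterior expected loss over a grid of actions.
actions = np.linspace(0.1, 2.0, 400)
risk = [(w * (theta - a) ** 2 * post).sum() for a in actions]
best = actions[int(np.argmin(risk))]
assert abs(a_star - best) < 1e-2
```

Because the weight 1/θ² penalizes errors at small θ more heavily, the weighted rule is pulled below the ordinary posterior mean.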
ISyE8843
Brani Vidakovic
Friday 9/17/04 Name:
Quiz 4
Choose one problem
1. Assume X | θ is exponential E(1/θ) with density f(x|θ) = (1/θ)e^{−x/θ}, x ≥ 0. Let F be the cdf corresponding
to f. Assume a prior on θ, π(θ).
Let m(x) = ∫ f(x|θ)π(θ) dθ be the marginal and M(x)
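The quiz leaves the prior π(θ) general. A Python sketch of computing the marginal m(x) numerically, with the hypothetical choice π(θ) = e^{−θ} (an Exp(1) prior) made only so the integral is concrete:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import k0

# Likelihood: X | theta ~ E(1/theta), f(x|theta) = (1/theta)*exp(-x/theta).
def f(x, theta):
    return np.exp(-x / theta) / theta

# Hypothetical prior pi(theta) = exp(-theta), theta > 0 (Exp(1)).
def prior(theta):
    return np.exp(-theta)

# Marginal m(x) = integral of f(x|theta)*pi(theta) over theta > 0.
def m(x):
    val, _ = quad(lambda th: f(x, th) * prior(th), 0, np.inf)
    return val

# For this particular prior the integral has a known closed form,
# m(x) = 2*K0(2*sqrt(x)) (a Bessel-K identity), used here only as a check.
assert abs(m(1.0) - 2 * k0(2.0)) < 1e-6
```

The same `m` function works for any prior density you substitute in, which is the point of the exercise.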
ISyE8843
Brani Vidakovic
Friday 29/10/04 Name:
Quiz 7
Wallace-Freeman MML Estimator. Recall that the Minimum Message Length (MML) estimate, based
on X1, . . . , Xn ~ f(x|θ), is defined as

θ̂ = argmin_θ [ −log π(θ) − log ∏_{i=1}^n f(x_i|θ) + (1/2) log |I(θ)| ],

where π(θ) is the p
Answer for Quiz 7 (Ni Wang)
Answer: First, let's calculate the Fisher information for θ from the Negative Binomial NB(m, θ)
distribution.

log f(x|θ) = m log θ + x log(1 − θ) + const

∂²/∂θ² log f(x|θ) = −m/θ² − x/(1 − θ)²

I(θ) = E[−∂²/∂θ² log f(X|θ)] = m/θ² + m(1 − θ)/(θ(1 − θ)²) = m/(θ²(1 − θ))

To find the minimum
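The closed form I(θ) = m/(θ²(1 − θ)) can be checked by Monte Carlo, using the identity I(θ) = Var(∂/∂θ log f(X|θ)). A Python sketch with the hypothetical values m = 5, θ = 0.3:

```python
import numpy as np

# Monte Carlo check of I(theta) = m/(theta^2*(1-theta)) for X ~ NB(m, theta),
# via the variance of the score function.
rng = np.random.default_rng(0)
m, theta = 5, 0.3
x = rng.negative_binomial(m, theta, size=400_000)   # numpy counts failures

score = m / theta - x / (1 - theta)   # d/dtheta log f(x|theta)
I_mc = score.var()
I_formula = m / (theta ** 2 * (1 - theta))
assert abs(I_mc / I_formula - 1) < 0.02
```

NumPy's `negative_binomial(n, p)` counts failures before the n-th success, which matches the m log θ + x log(1 − θ) likelihood above with θ = p.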
ISyE8843
Brani Vidakovic
Labor Day Name:
Take Home Quiz 2
Mushrooms. The unhappy outcome of uninformed mushroom-picking is poisoning. In many cases such
poisoning is due to ignorance or a superficial approach to identification. The most dangerous fungi are De
ISyE8843
Brani Vidakovic
Friday, 8/27/04 Name:
1. Lifetime. A lifetime X of a particular machine is modeled by an exponential distribution with unknown
parameter θ > 0.
On the basis of the data:
If the parametrization is f(x|θ) = θe^{−θx}, x > 0, θ > 0, the MLE estimator for θ is θ̂
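For the rate parametrization f(x|θ) = θe^{−θx}, the log-likelihood n log θ − θ Σ xᵢ is maximized at θ̂ = 1/x̄. A quick Python sketch on simulated data — the true rate 2.0 is a hypothetical choice:

```python
import numpy as np

# MLE sketch for the rate parametrization f(x|theta) = theta*exp(-theta*x).
rng = np.random.default_rng(1)
true_theta = 2.0
x = rng.exponential(scale=1 / true_theta, size=100_000)

theta_hat = 1 / x.mean()                      # closed-form MLE, 1/x-bar

# A grid search over the log-likelihood agrees with the closed form.
grid = np.linspace(0.5, 5, 2000)
loglik = x.size * np.log(grid) - grid * x.sum()
assert abs(grid[np.argmax(loglik)] - theta_hat) < 0.01
```

Note the contrast with the mean parametrization f(x|θ) = (1/θ)e^{−x/θ}, whose MLE is x̄ itself.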
MIDTERM EXAM (ISYE8843 FALL 2004)
Chengyuan Ma
School of Electrical and Computer Engineering
Georgia Institute of Technology
Atlanta, GA 30332, USA
[email protected]
1. SOLUTION TO PROBLEM 1
The MATLAB code for this problem:
clear all;
close all;
ISYE 8843 Final Exam
Hongmei Chen
1. Bayesian Wavelet Shrinkage. This open-ended question essentially asks you to select a data set with
noise present in it (a noisy signal, function, or
noisy image), transform the data to the wavelet domain, apply shrin
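The recipe (transform to the wavelet domain, shrink the detail coefficients, reconstruct) can be sketched in a self-contained way with a hand-rolled Haar transform and soft thresholding; the test signal, noise level 0.2, and threshold λ = 0.3 below are hypothetical choices, not from the exam:

```python
import numpy as np

def haar_step(s):
    # one level of the orthonormal Haar transform
    a = (s[0::2] + s[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (s[0::2] - s[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_inv(a, d):
    s = np.empty(2 * len(a))
    s[0::2] = (a + d) / np.sqrt(2)
    s[1::2] = (a - d) / np.sqrt(2)
    return s

def shrink(signal, levels=4, lam=0.3):
    # decompose, soft-threshold the details, reconstruct
    a, details = signal.copy(), []
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(np.sign(d) * np.maximum(np.abs(d) - lam, 0))
    for d in reversed(details):
        a = haar_inv(a, d)
    return a

t = np.linspace(0, 1, 1024)
clean = np.sin(4 * np.pi * t)
rng = np.random.default_rng(2)
noisy = clean + 0.2 * rng.standard_normal(1024)
denoised = shrink(noisy)

# Shrinkage should reduce the mean squared error relative to the noisy input.
assert np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2)
```

Bayesian shrinkage rules replace the fixed soft threshold with a rule derived from a prior on the coefficients, but the transform/shrink/reconstruct pipeline is the same.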
ISYE8843 Final
Jinyu Li
1. Question 1
I use Bayesian wavelet shrinkage to denoise. I choose a simple signal, as follows:
t = linspace(0,1,1024);
sig = (sin(5*pi*t) + 2.0*cos(10*pi*t) + 3.0*sin(15*pi*t)).*exp(-t);
The noise I add has different sizes, such as 0.2 and 0.4
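The same signal and the two stated noise sizes can be reproduced in a short Python sketch (numpy equivalent of the MATLAB expression, with Gaussian noise as an assumption):

```python
import numpy as np

# numpy version of the MATLAB test signal, with additive Gaussian noise
# at the two stated sizes, 0.2 and 0.4.
t = np.linspace(0, 1, 1024)
sig = (np.sin(5*np.pi*t) + 2.0*np.cos(10*np.pi*t)
       + 3.0*np.sin(15*np.pi*t)) * np.exp(-t)

rng = np.random.default_rng(3)
snr = {}
for size in (0.2, 0.4):
    noisy = sig + size * rng.standard_normal(t.size)
    snr[size] = 10 * np.log10(np.mean(sig**2) / np.mean((noisy - sig)**2))

# a larger noise size gives a lower signal-to-noise ratio
assert snr[0.4] < snr[0.2]
```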
FINAL EXAM (ISYE8843 FALL 2004)
Chengyuan Ma
School of Electrical and Computer Engineering
Georgia Institute of Technology
Atlanta, GA 30332, USA
[email protected]
1. BAYESIAN WAVELET SHRINKAGE
The wavelet transform maps the original signal i
Final Exam
Ni Wang
1 Bayesian Wavelet Shrinkage
Figure 1 shows the famous blocky function from Donoho and Johnstone (1994). Figure 2 shows the
plot of the noisy function when random noise is added to it.
The signal is contaminated with additive noise
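The blocky function can be written as a sum of shifted step functions; the jump locations and heights below are the standard "blocks" values from Donoho and Johnstone (1994), and the unit-variance noise is an assumption for the sketch:

```python
import numpy as np

# The 'blocks' test function of Donoho & Johnstone (1994): a piecewise
# constant signal built from shifted step functions K(t) = (1 + sign(t))/2.
tj = np.array([.1, .13, .15, .23, .25, .4, .44, .65, .76, .78, .81])
hj = np.array([4, -5, 3, -4, 5, -4.2, 2.1, 4.3, -3.1, 2.1, -4.2])

t = np.linspace(0, 1, 2048)
blocks = sum(h * (1 + np.sign(t - tc)) / 2 for h, tc in zip(hj, tj))

rng = np.random.default_rng(4)
noisy = blocks + rng.standard_normal(t.size)   # additive N(0,1) noise

# piecewise constant: the clean signal takes only a handful of distinct values
assert len(np.unique(np.round(blocks, 6))) <= 12
```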
FINAL EXAM
ISyE 8843: Bayesian Statistics
Brani Vidakovic
due before noon, Thursday 12/9/2004.
Name
1. Bayesian Wavelet Shrinkage. This open-ended question essentially asks you to select a data set with
noise present in it (a noisy signal, function, or n
ISyE 8843 Bayes Statistics
Fall 2004
Midterm
Version 2.0
James D. Delaney
October 27, 2004
Problem 1: Nematodes
The Nematode data is an example of an unbalanced one-way layout. Assuming the one-way ANOVA model:
y_ij = θ_i + ε_ij,    ε_ij | σ_i² ~ N(0, σ_i²)
The practic
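Data from this unbalanced one-way layout are easy to simulate; the group means and variances below are hypothetical, while the group sizes n = 9, 11, 14 match those quoted in the other midterm write-ups:

```python
import numpy as np

# Simulating the unbalanced one-way layout y_ij = theta_i + eps_ij with
# group-specific variances sigma_i^2 and unequal group sizes.
rng = np.random.default_rng(6)
theta = [1.0, 2.0, 3.0]    # hypothetical group means
sigma = [0.5, 1.0, 1.5]    # hypothetical group standard deviations
n = [9, 11, 14]            # unbalanced group sizes

y = [rng.normal(t, s, size=m) for t, s, m in zip(theta, sigma, n)]
means = [g.mean() for g in y]

assert [len(g) for g in y] == [9, 11, 14]
```

The unequal nᵢ are what make the layout "unbalanced": the group means are estimated with different precisions.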
MIDTERM EXAM
ISyE 8843: Bayesian Statistics
Brani Vidakovic
Friday, 10/15/2004.
Name
1. Nematodes. Some varieties of nematodes (roundworms that live in the soil and are frequently so small
they are invisible to the naked eye) feed on the roots of lawn gra
Midterm Exam
ISyE8843
Abhyuday Mandal
22 October, 2004
1 Problem 1
The model is y_ij = θ_i + ε_ij,
i = 1, . . . , k; j = 1, . . . , n_i, where k = 3, n1 = 9, n2 = 11 and n3 = 14. Among
the nine parameters θ = (θ1, θ2, θ3, μ, σ1², σ2², σ3², σ², τ²), we are interested
Bayesian Data Analysis, Midterm I
Bugra Gedik
[email protected]
October 23, 2004
Q1)
I have used a Gibbs sampler to solve this problem; 5,000 iterations with a burn-in of 1,000 are used. The resulting
histograms representing the posterior distribut
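The mechanics of the sampler (5,000 iterations, 1,000-draw burn-in) can be illustrated on a much smaller model than the exam's; the sketch below uses y_i ~ N(μ, σ²) with a flat prior on μ and a 1/σ² prior on σ², which are hypothetical choices made for brevity:

```python
import numpy as np

# Minimal Gibbs sampler sketch: 5,000 iterations, 1,000-draw burn-in.
# Model: y_i ~ N(mu, sigma^2), flat prior on mu, prior 1/sigma^2 on sigma^2.
# Full conditionals: mu | sigma^2, y ~ N(ybar, sigma^2/n);
#                    sigma^2 | mu, y ~ InvGamma(n/2, sum((y-mu)^2)/2).
rng = np.random.default_rng(5)
y = rng.normal(3.0, 2.0, size=200)          # simulated data
n, ybar = y.size, y.mean()

mu, sig2 = 0.0, 1.0                          # initial values
draws = []
for it in range(5000):
    mu = rng.normal(ybar, np.sqrt(sig2 / n))            # mu | sigma^2, y
    rate = 0.5 * np.sum((y - mu) ** 2)
    sig2 = 1 / rng.gamma(n / 2, 1 / rate)               # sigma^2 | mu, y
    if it >= 1000:                                       # discard burn-in
        draws.append((mu, sig2))

mu_post = np.mean([d[0] for d in draws])
assert abs(mu_post - ybar) < 0.1
```

Histograms of the retained draws approximate the marginal posteriors, exactly as described in the write-up.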
ISyE 8843, Mid-term Exam
Tirthankar Dasgupta
1 Problem 1
We have the model

y_ij = θ_i + ε_ij,    i = 1, . . . , k; j = 1, . . . , n_i,

where k = 3, n1 = 9, n2 = 11, n3 = 14.
We thus have nine parameters of interest, i.e., θ = (θ1, θ2, θ3, σ1², σ2², σ3², μ, σ², τ²). The
f
ISyE8843A, Brani Vidakovic
Handout 5
1 Priors
A prior is both the sword and the Achilles' heel of Bayesian statistics. Priors are carriers of prior information that is
coherently incorporated into the inference via Bayes' theorem. At the same time, parameters are unobse
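The coherent incorporation of prior information is most transparent in a conjugate update; a minimal sketch with the Beta-Binomial pair (the prior Beta(2, 2) and the data 7 successes in 10 trials are hypothetical numbers):

```python
# Conjugate prior sketch: a Beta(a, b) prior on a binomial proportion is
# updated by x successes in n trials to Beta(a + x, b + n - x).
a, b = 2, 2            # hypothetical prior
n, x = 10, 7           # hypothetical data

a_post, b_post = a + x, b + n - x
post_mean = a_post / (a_post + b_post)

# The posterior mean is a weighted average of the prior mean and the
# sample proportion, with weight (a+b)/(a+b+n) on the prior.
prior_mean, phat = a / (a + b), x / n
w = (a + b) / (a + b + n)
assert abs(post_mean - (w * prior_mean + (1 - w) * phat)) < 1e-12
```

The weight w shows the double role the handout describes: the prior genuinely contributes information, and it genuinely pulls the answer toward itself.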
ISyE8843A, Brani Vidakovic
Handout 4
1 Decision Theoretic Setup: Loss, Posterior Risk, Bayes Action
Let A be the action space and a ∈ A be an action. For example, in estimation problems, A is the set of real
numbers and a is a number; say a = 2 is adopted as an
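The Bayes action minimizes the posterior expected loss over A. A small sketch under absolute error loss, where the Bayes action is the posterior median; the four-point posterior below is a hypothetical toy example:

```python
import numpy as np

# Posterior-risk sketch: under absolute error loss L(theta, a) = |theta - a|,
# the Bayes action is the posterior median.
theta = np.array([1.0, 2.0, 3.0, 4.0])
post = np.array([0.1, 0.2, 0.4, 0.3])   # hypothetical posterior probabilities

actions = np.linspace(0, 5, 501)
risk = np.array([(post * np.abs(theta - a)).sum() for a in actions])
bayes_action = actions[np.argmin(risk)]

# The cumulative posterior first reaches 0.5 at theta = 3, the median.
assert abs(bayes_action - 3.0) < 0.02
```

Swapping the loss changes the answer: squared error loss would give the posterior mean 2.9 instead of the median 3.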
ISyE8843A, Brani Vidakovic
Handout 3
1 Ingredients of Bayesian Inference
The model for a typical observation X conditional on the unknown parameter θ is f(x|θ). As a function of θ,
f(x|θ) = ℓ(θ) is called the likelihood. The functional form of f is fully specified up t
ISyE8843A, Brani Vidakovic
Handout 1
1 Probability, Conditional Probability and Bayes Formula
The intuition of chance and probability develops at very early ages.1 However, a formal, precise definition of probability is elusive. If the experiment can
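Bayes' formula itself is a one-line computation; a standard diagnostic-test sketch with hypothetical numbers (95% sensitivity, 90% specificity, 1% prevalence):

```python
# Bayes formula sketch: P(condition | positive test), hypothetical numbers.
prior = 0.01               # prevalence, P(condition)
sens, spec = 0.95, 0.90    # sensitivity and specificity of the test

# total probability of a positive result
p_pos = sens * prior + (1 - spec) * (1 - prior)
# Bayes formula: posterior probability of the condition given a positive test
posterior = sens * prior / p_pos

# despite the accurate test, the posterior is still below 9%
assert 0.087 < posterior < 0.088
```

The low posterior despite an accurate test illustrates how strongly a small prior probability controls the answer.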