
# class03-1-handouts - PSTAT 120B - Probability &...


## Class #03-1: Maximum likelihood estimation

Jarad Niemi, University of California, Santa Barbara. 12 April 2010. (Slide 1 / 24)

## Class overview: Announcements

- Homework 2 due today by 4pm.
- Homework 1 answer key up.
- Today's material also reviews sections 5.1–5.2 and 5.4.
- Society of Actuaries Open House (next slide).
## Class overview: Society of Actuaries open house

Then you should consider a career as an actuary. Join us to learn more about one of today's best careers.

- WHAT: Informational presentation and open house. All interested students are welcome. Dinner will be provided.
- WHEN: Monday, April 12, 2010, 4–6 p.m.
- WHERE: HSSB 1174

Please RSVP at actuary@pstat.ucsb.edu. For more information, contact info@riskisopportunity.net or visit www.RiskisOpportunity.net.

## Class overview: Today's goals

- Joint probability distributions
  - Joint probability mass functions
  - Joint distribution functions
  - Joint probability density functions
- Likelihood
- Maximum likelihood estimation
- Two examples
## Joint probabilities: Discrete random variables

**Definition.** Let $Y_1$ and $Y_2$ be discrete random variables. The joint probability mass function (pmf) for $Y_1$ and $Y_2$ is given by
$$p(y_1, y_2) = P(Y_1 = y_1, Y_2 = y_2).$$

**Theorem.** If $Y_1$ and $Y_2$ are discrete random variables with joint probability mass function $p(y_1, y_2)$, then

1. $p(y_1, y_2) \ge 0$ for all $y_1, y_2$;
2. $\sum_{y_1, y_2} p(y_1, y_2) = \sum_{y_1} \sum_{y_2} p(y_1, y_2) = 1$,

where the sum is over all values $(y_1, y_2)$ that are assigned non-zero probabilities.
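The two conditions in the theorem can be checked mechanically for any joint pmf given as a table. A minimal sketch in Python, using exact rational arithmetic; the particular pmf table here is a made-up example, not one from the slides:

```python
from fractions import Fraction

# A hypothetical joint pmf p(y1, y2) for two discrete random variables,
# stored as a table mapping (y1, y2) pairs to probabilities.
p = {
    (0, 0): Fraction(1, 8),
    (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8),
    (1, 1): Fraction(1, 2),
}

# Theorem condition 1: every probability is non-negative.
assert all(prob >= 0 for prob in p.values())

# Theorem condition 2: summing over all (y1, y2) pairs gives exactly 1.
assert sum(p.values()) == 1
```

Using `Fraction` rather than floats avoids rounding error in the sum-to-one check, which is a convenient habit when verifying small pmf tables by hand.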

## Joint probabilities: Discrete random variables

[Figure: bar plot of the joint pmf $p(y_1, y_2)$ for two fair six-sided dice, with both axes running over $y_1, y_2 \in \{1, \dots, 6\}$ and the vertical axis from 0.00 to 0.10.]
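The data behind a plot like this can be reconstructed directly: for two independent fair six-sided dice, each of the 36 pairs $(y_1, y_2)$ has probability $1/36 \approx 0.028$, which matches the flat bars in the figure. A sketch, assuming fair and independent dice:

```python
from fractions import Fraction

# Joint pmf for two independent fair six-sided dice:
# p(y1, y2) = P(Y1 = y1) * P(Y2 = y2) = (1/6)(1/6) = 1/36 for each pair.
pmf = {(y1, y2): Fraction(1, 36)
       for y1 in range(1, 7) for y2 in range(1, 7)}

# 36 equally likely outcomes whose probabilities sum to 1,
# as the theorem on the previous slide requires.
assert len(pmf) == 36
assert sum(pmf.values()) == 1
```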
## Joint probabilities: Discrete random variables

**Definition.** For any random variables $Y_1$ and $Y_2$, the joint distribution function $F(y_1, y_2)$ is
$$F(y_1, y_2) = P(Y_1 \le y_1, Y_2 \le y_2), \quad -\infty < y_1 < \infty, \ -\infty < y_2 < \infty.$$

**Theorem.**
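For discrete random variables, this definition amounts to summing the joint pmf over all support points at or below $(y_1, y_2)$. A sketch using the two-fair-dice pmf (the 1/36-per-pair table is an assumption, not something computed on the slides):

```python
from fractions import Fraction

# Joint pmf for two independent fair six-sided dice (1/36 per outcome).
pmf = {(a, b): Fraction(1, 36)
       for a in range(1, 7) for b in range(1, 7)}

def F(y1, y2):
    """Joint distribution function F(y1, y2) = P(Y1 <= y1, Y2 <= y2),
    computed by summing the joint pmf over all pairs at or below (y1, y2)."""
    return sum(p for (a, b), p in pmf.items() if a <= y1 and b <= y2)

# F is 0 below the support and reaches 1 once (y1, y2) covers all of it.
assert F(0, 0) == 0
assert F(6, 6) == 1
# By independence, F factors: P(Y1 <= 3) * P(Y2 <= 2) = (3/6)(2/6) = 6/36.
assert F(3, 2) == Fraction(6, 36)
```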


## This note was uploaded on 11/23/2010 for the course MATH 104b taught by Professor Ceniceros,h during the Spring '08 term at UCSB.


