Maximum Likelihood Estimation
Addie Andromeda Evans
San Francisco State University
BIO 710 Advanced Biometry, Spring 2008

Estimation Methods

Estimation of parameters is a fundamental problem in data analysis. This paper is about maximum likelihood estimation, a method that finds the parameter value under which the observed data are most probable (formalized in the sketch following this preview). A handful of estimation methods existed before maximum likelihood, such as least squares, the method of moments, and Bayesian estimation. This paper will discuss the development of maximum likelihood estimation, the mathematical theory and application of the method, as well as its relationship to other methods of estimation. A basic knowledge of statistics, probability theory, and calculus is assumed.

Earlier Methods of Estimation

Estimation is the process of determining approximate values for parameters of different populations or events. How well a parameter is approximated can depend on the method, the type of data, and other factors. Gauss developed the method of least squares around 1794, though Legendre published the first account of it in 1805. The method chooses the parameter values that minimize the sum of squared differences between the observed data and the model's predictions (sketched in code below). However, because squared residuals weight large deviations heavily, outliers can pull a least squares estimate outside the range of desired accuracy.

The method of moments is another way to estimate parameters. The 1st moment is the mean; the 2nd central moment is the variance; the 3rd and 4th standardized moments measure skewness and kurtosis. In complex models with more than one parameter, it can be difficult to compute these moments directly, and so moment generating functions were developed: successive derivatives of the moment generating function, evaluated at zero, yield the corresponding moments. Equating sample moments to their model expressions and solving for the parameters gives moment estimates (a worked example appears below).

Bayesian estimation is based on Bayes' theorem for conditional probability. Bayesian analysis starts from a prior distribution that may encode little to no information about the parameter to be estimated. Each batch of data collected is used to update this distribution, thereby improving the estimate of the parameter. This process of refinement can continue as new data are collected until a satisfactory estimate is found (illustrated below).

Evolution of Maximum Likelihood Estimation

It was none other than R. A. Fisher who developed maximum likelihood estimation. Fisher based his work on that of Karl Pearson, who promoted several estimation methods, in particular the method of moments. While Fisher agreed with Pearson that the method of moments is better than least squares, Fisher had an idea for an even better method. It took many years for him to fully conceptualize his method, which ended up with the name maximum likelihood estimation....
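The maximum likelihood idea described in the introduction can be stated compactly. The following is the standard textbook formulation (not quoted from the paper itself): for independent observations drawn from a density with parameter theta, the likelihood is the joint density viewed as a function of the parameter, and the estimate maximizes it.

```latex
% Standard definition of the likelihood and the maximum likelihood
% estimate for i.i.d. observations x_1, ..., x_n with density f(x; \theta).
L(\theta; x_1, \dots, x_n) = \prod_{i=1}^{n} f(x_i; \theta),
\qquad
\hat{\theta}_{\mathrm{ML}}
= \arg\max_{\theta} L(\theta; x_1, \dots, x_n)
= \arg\max_{\theta} \sum_{i=1}^{n} \log f(x_i; \theta).
```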
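The least squares discussion, including its sensitivity to outliers, can be made concrete with a minimal sketch. The data and variable names below are invented for illustration; the paper itself gives no example. The sketch fits a line via the normal equations and shows how a single outlier shifts the estimate.

```python
import numpy as np

# Minimal least squares sketch: fit y = a + b*x by minimizing
# the sum of squared residuals (solved via the normal equations).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])   # roughly y = 1 + 2x

X = np.column_stack([np.ones_like(x), x])  # design matrix [1, x]
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"clean data:   intercept={a:.2f}, slope={b:.2f}")

# One outlier can pull the estimate far from the true line,
# because squared residuals weight large deviations heavily.
y_out = y.copy()
y_out[-1] = 30.0
a2, b2 = np.linalg.lstsq(X, y_out, rcond=None)[0]
print(f"with outlier: intercept={a2:.2f}, slope={b2:.2f}")
```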
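The method of moments paragraph is easiest to see with a two-parameter example. The gamma distribution used here is my choice, not the paper's: a gamma distribution has mean k*theta and variance k*theta^2, so equating the sample mean and variance to these expressions and solving yields moment estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulate data from a gamma distribution with known shape k and scale theta.
k_true, theta_true = 3.0, 2.0
data = rng.gamma(shape=k_true, scale=theta_true, size=10_000)

# Method of moments: the gamma mean is k*theta, the variance k*theta^2.
# Matching sample moments and solving:
#   theta_hat = var / mean,   k_hat = mean^2 / var
m = data.mean()
v = data.var()
theta_hat = v / m
k_hat = m * m / v
print(f"k_hat={k_hat:.2f} (true {k_true}), "
      f"theta_hat={theta_hat:.2f} (true {theta_true})")
```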
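The Bayesian refinement process, updating a prior as data arrive, is clearest with a conjugate model. The Beta-Binomial setup below is a standard example of my choosing, not one named in the paper: estimating a coin's heads probability from a uniform Beta(1, 1) prior, updated after each batch of flips.

```python
import numpy as np

rng = np.random.default_rng(1)
p_true = 0.7                      # unknown heads probability to estimate

# Uniform Beta(1, 1) prior: little to no information about p.
alpha, beta = 1.0, 1.0

# Bayes' theorem with a Beta prior and binomial data yields another Beta:
# each batch of flips adds its heads to alpha and its tails to beta.
for batch in range(1, 5):
    flips = rng.random(25) < p_true
    alpha += flips.sum()
    beta += (~flips).sum()
    post_mean = alpha / (alpha + beta)
    print(f"after {25 * batch:3d} flips: posterior mean = {post_mean:.3f}")
```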