Stat841f09 - Wiki Course Notes




# Extension

When we are dealing with a problem with more than two classes, we need to generalize logistic regression to a Multinomial Logit model (http://en.wikipedia.org/wiki/Multinomial_logit).

# Limitations of Logistic Regression

1. No assumptions are made about the distributions of the features of the data (i.e., the explanatory variables). However, the features should not be highly correlated with one another, because this can cause problems with estimation.
2. A large number of data points (i.e., a large sample size) is required for logistic regression to provide sufficient numbers in both classes. The more features/dimensions the data has, the larger the required sample size.

# Logistic Regression (2) - October 19, 2009

## Logistic Regression Model

Recall that in the last lecture we learned the logistic regression model:

$$P(Y=1 \mid X=x) = \frac{e^{\beta^T x}}{1 + e^{\beta^T x}}$$

**Find criteria:** find a $\beta$ that maximizes the conditional likelihood of $Y$ given $X$ using the training data. From above, we have the first derivative of the log-likelihood:

$$\frac{\partial \ell(\beta)}{\partial \beta} = \sum_{i=1}^{n} x_i \bigl( y_i - p(x_i;\beta) \bigr), \qquad p(x_i;\beta) = \frac{e^{\beta^T x_i}}{1 + e^{\beta^T x_i}}$$

**Newton-Raphson algorithm:** if we want to maximize or minimize $\ell(\beta)$, we solve $\frac{\partial \ell(\beta)}{\partial \beta} = 0$ for $\beta$ using the iteration

$$\beta^{new} = \beta^{old} - \left( \frac{\partial^2 \ell(\beta)}{\partial \beta \, \partial \beta^T} \right)^{-1} \frac{\partial \ell(\beta)}{\partial \beta}$$

The Newton-Raphson algorithm (http://en.wikipedia.org/wiki/Newton%27s_method) requires the second derivative, or Hessian matrix (http://en.wikipedia.org/wiki/Hessian_matrix). (Note: the Matrix Reference Manual (http://www.ee.ic.ac.uk/hp/staff/dmb/matrix/intro.html) is a very useful website with information about linear algebra and the properties of real and complex matrices.)
$$\frac{\partial^2 \ell(\beta)}{\partial \beta \, \partial \beta^T} = -\sum_{i=1}^{n} x_i x_i^T \, p(x_i;\beta)\bigl(1 - p(x_i;\beta)\bigr)$$

(by cancellation, and since $\frac{\partial p(x_i;\beta)}{\partial \beta} = p(x_i;\beta)\bigl(1 - p(x_i;\beta)\bigr)x_i$). The same second derivative can be obtained by first using an identity to reduce the occurrences of $\beta$ to one, and then solving $\frac{\partial \ell(\beta)}{\partial \beta} = 0$. Starting with $\beta^{old} = 0$, the Newton-Raphson updates are repeated until convergence.
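The Newton-Raphson iteration described above can be sketched numerically. The following is a minimal Python/NumPy implementation, assuming labels in {0, 1} and a design matrix whose first column is all ones (an intercept); the function name and the synthetic data at the bottom are illustrative, not from the notes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_newton(X, y, n_iter=25, tol=1e-8):
    """Fit logistic regression by Newton-Raphson.

    X: (n, d) design matrix (include a column of ones for an intercept).
    y: (n,) labels in {0, 1}.
    """
    n, d = X.shape
    beta = np.zeros(d)               # start with beta_old = 0, as in the notes
    for _ in range(n_iter):
        p = sigmoid(X @ beta)        # p(x_i; beta)
        grad = X.T @ (y - p)         # first derivative of the log-likelihood
        W = p * (1.0 - p)            # per-point weights p(1 - p)
        H = -(X.T * W) @ X           # Hessian: -sum_i x_i x_i^T p_i (1 - p_i)
        step = np.linalg.solve(H, grad)
        beta = beta - step           # beta_new = beta_old - H^{-1} grad
        if np.max(np.abs(step)) < tol:
            break                    # gradient step negligible: converged
    return beta

# Illustrative usage on noisy synthetic data.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = (X[:, 1] + 0.3 * rng.normal(size=100) > 0).astype(float)
beta_hat = logistic_newton(X, y)
```

Because the log-likelihood is concave, this iteration typically converges in a handful of steps; at the solution the gradient $\sum_i x_i (y_i - p(x_i;\beta))$ is (numerically) zero.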