Stat841f09 - Wiki Course Notes

# Newton-Raphson in Matrix Form: Iteratively Reweighted Least Squares


The Newton-Raphson update is

$$ \beta^{new} = \beta^{old} - \left( \frac{\partial^2 \ell}{\partial \beta \, \partial \beta^T} \right)^{-1} \frac{\partial \ell}{\partial \beta}, $$

where the derivatives are evaluated at $\beta^{old}$. The iteration terminates when $\beta^{new}$ is very close to $\beta^{old}$.

The iteration can be described in matrix form:

- Let $\mathbf{y}$ be the column vector of the responses $y_i$.
- Let $\mathbf{X}$ be the $N \times (d+1)$ input matrix.
- Let $\mathbf{p}$ be the $N$-vector with $i$th element $p(x_i; \beta^{old})$.
- Let $\mathbf{W}$ be an $N \times N$ diagonal matrix with $i$th diagonal element $p(x_i; \beta^{old})\,(1 - p(x_i; \beta^{old}))$.

Then

$$ \frac{\partial \ell}{\partial \beta} = \mathbf{X}^T (\mathbf{y} - \mathbf{p}), \qquad \frac{\partial^2 \ell}{\partial \beta \, \partial \beta^T} = -\mathbf{X}^T \mathbf{W} \mathbf{X}, $$

so the Newton-Raphson step is

$$ \beta^{new} = \beta^{old} + (\mathbf{X}^T \mathbf{W} \mathbf{X})^{-1} \mathbf{X}^T (\mathbf{y} - \mathbf{p}). $$

This equation is sufficient for computing the logistic regression model. However, we can simplify it further to uncover an interesting feature:

$$ \beta^{new} = (\mathbf{X}^T \mathbf{W} \mathbf{X})^{-1} \mathbf{X}^T \mathbf{W} \mathbf{z}, \qquad \text{where } \mathbf{z} = \mathbf{X}\beta^{old} + \mathbf{W}^{-1}(\mathbf{y} - \mathbf{p}). $$

Here $\mathbf{z}$ is the adjusted response, and the equation is solved repeatedly as $\mathbf{p}$, $\mathbf{W}$, and $\mathbf{z}$ change from iteration to iteration. This algorithm is called [iteratively reweighted least squares](http://en.wikipedia.org/wiki/Iteratively_reweighted_least_squares) because it solves a weighted least squares problem repeatedly.

Recall that linear regression by least squares finds the minimum

$$ \min_{\beta} \; (\mathbf{y} - \mathbf{X}\beta)^T (\mathbf{y} - \mathbf{X}\beta), $$

for which we have $\hat{\beta} = (\mathbf{X}^T \mathbf{X})^{-1} \mathbf{X}^T \mathbf{y}$. Similarly, we can say that $\beta^{new}$ is the solution of a weighted least squares (WLS) problem:

$$ \beta^{new} \leftarrow \arg\min_{\beta} \; (\mathbf{z} - \mathbf{X}\beta)^T \mathbf{W} (\mathbf{z} - \mathbf{X}\beta). $$

Indeed, the weighted least squares estimator minimizes the weighted sum of squared errors $\sum_{i=1}^{N} w_i \,(z_i - \mathbf{x}_i^T \beta)^2$, where $w_i = W_{ii}$. Hence the WLS estimator is given by

$$ \hat{\beta} = (\mathbf{X}^T \mathbf{W} \mathbf{X})^{-1} \mathbf{X}^T \mathbf{W} \mathbf{z}, $$

so each iteration is a weighted linear regression of the iteratively recomputed response $\mathbf{z}$ on $\mathbf{X}$. (Note: the $\beta$ obtained here is a vector, because we construct the model as $\beta^T x$, similar to the linear regression case.)

Choosing $\beta = 0$ seems to be a suitable starting value for the Newton-Raphson iteration in this case. However, this does not guarantee convergence. The procedure will usually converge, since the log-likelihood function is concave, but overshooting can occur. In the rare cases where the log-likelihood decreases, cutting the step size by half restores convergence.
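The matrix-form derivation above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation: the function name `irls_logistic` and the toy convergence tolerance are my own choices, and the probabilities are clipped away from 0 and 1 to keep the weight matrix invertible.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def irls_logistic(X, y, max_iter=25, tol=1e-8):
    """Fit logistic regression by iteratively reweighted least squares."""
    n, d = X.shape
    beta = np.zeros(d)                       # beta = 0 as the starting value
    for _ in range(max_iter):
        p = sigmoid(X @ beta)                # fitted probabilities p(x_i; beta)
        p = np.clip(p, 1e-10, 1 - 1e-10)     # keep W strictly positive
        W = p * (1.0 - p)                    # diagonal of the weight matrix
        z = X @ beta + (y - p) / W           # adjusted response z
        # Weighted least squares step: beta_new = (X^T W X)^{-1} X^T W z
        beta_new = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```

At convergence the score $\mathbf{X}^T(\mathbf{y} - \mathbf{p})$ is (numerically) zero, which is a convenient check that the WLS iterations really solved the logistic likelihood equations.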
In the case that it does not, we can still prove the local convergence of the algorithm.
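The step-halving safeguard mentioned above can be sketched as a damped Newton step: take the full IRLS step, and halve it until the log-likelihood no longer decreases. The function names here (`damped_newton_step`, `log_likelihood`) are illustrative, not from the notes.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def log_likelihood(beta, X, y):
    """Bernoulli log-likelihood: sum_i [ y_i * x_i'beta - log(1 + exp(x_i'beta)) ]."""
    t = X @ beta
    return np.sum(y * t - np.log1p(np.exp(t)))

def damped_newton_step(beta, X, y):
    """One Newton-Raphson step with step-halving to prevent overshooting."""
    p = sigmoid(X @ beta)
    W = p * (1.0 - p)
    grad = X.T @ (y - p)                                   # score X^T (y - p)
    step = np.linalg.solve(X.T @ (W[:, None] * X), grad)   # full Newton step
    ll_old = log_likelihood(beta, X, y)
    alpha = 1.0
    # Cut the step size by half until the log-likelihood does not decrease.
    while log_likelihood(beta + alpha * step, X, y) < ll_old and alpha > 1e-8:
        alpha *= 0.5
    return beta + alpha * step
```

Because the log-likelihood is concave, the full Newton direction is an ascent direction, so some multiple of it always improves (or at worst maintains) the objective.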

