Least angle regression keeps the correlations monotonically decreasing and tied?

I'm trying to solve a problem on least angle regression (LAR). This is Exercise 3.23 on page 97 of Hastie et al., *The Elements of Statistical Learning*, 2nd ed. (5th printing).

Consider a regression problem with all variables and the response having mean zero and standard deviation one. Suppose also that each variable has identical absolute correlation with the response:

$$\frac{1}{N}\,\bigl|\langle x_j, y\rangle\bigr| = \lambda, \qquad j = 1, \dots, p.$$

Let $\hat\beta$ be the least-squares coefficient of $y$ on $X$, and let $u(\alpha) = \alpha X\hat\beta$ for $\alpha \in [0, 1]$. I am asked to show that

$$\frac{1}{N}\,\bigl|\langle x_j,\, y - u(\alpha)\rangle\bigr| = (1 - \alpha)\,\lambda, \qquad j = 1, \dots, p,$$

and I am having problems with that. Note that this basically says that the correlations of each $x_j$ with the residuals remain equal in magnitude as we progress toward $u$. I also do not know how to show that the correlations are equal to

$$\lambda(\alpha) = \frac{(1 - \alpha)}{\sqrt{(1 - \alpha)^2 + \dfrac{\alpha(2 - \alpha)}{N}\,\mathrm{RSS}}}\;\lambda.$$

Any pointers would be greatly appreciated!

Tags: regression, machine-learning, correlation, homework
asked Feb 2 '11 at 3:46 by Belmont · edited Feb 10 '11 at 7:55 by mpiktas
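Not a full answer, but a quick numerical sanity check may help (a NumPy sketch of my own, not from the book; all variable names are mine). The key observation is that the normal equations give $X^\top X\hat\beta = X^\top y$, so $X^\top(y - u(\alpha)) = (1 - \alpha)\,X^\top y$ holds for *any* data, which is where the tied $(1-\alpha)\lambda$ comes from; the denominator of $\lambda(\alpha)$ is just the norm of the residual $y - u(\alpha)$. The script verifies both algebraic facts on random standardized data:

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 500, 4

# Random design and response, then center/scale to mean 0, sd 1
# (population sd, ddof=0, so ||y||^2 = N under the book's 1/N convention).
X = rng.standard_normal((N, p))
y = X @ rng.standard_normal(p) + rng.standard_normal(N)
X = (X - X.mean(axis=0)) / X.std(axis=0)
y = (y - y.mean()) / y.std()

# Full least-squares fit and its residual sum of squares.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
rss = np.sum((y - X @ beta_hat) ** 2)

for alpha in [0.0, 0.3, 0.7, 1.0]:
    r = y - alpha * (X @ beta_hat)  # residual at u(alpha)
    # (a) every inner product shrinks by exactly (1 - alpha),
    #     because X'X beta_hat = X'y (normal equations)
    assert np.allclose(X.T @ r / N, (1 - alpha) * (X.T @ y) / N)
    # (b) the residual norm matches the denominator of lambda(alpha):
    #     ||y - u(alpha)||^2 / N = (1 - alpha)^2 + alpha(2 - alpha) RSS / N
    assert np.isclose(r @ r / N, (1 - alpha) ** 2 + alpha * (2 - alpha) * rss / N)
```

Under the exercise's extra assumption that the $|\langle x_j, y\rangle|/N$ are all equal to $\lambda$, (a) and (b) together give the stated $\lambda(\alpha)$ formula directly.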
Comments:

@Belmont, what is $u(\alpha)$? Could you provide more context about your problem? A link to an article with standard properties of LAR, for example, would help a lot. – mpiktas Feb 2 '11 at 14:00

@Belmont, this looks like a problem from Hastie et al., *The Elements of Statistical Learning*, 2nd ed. Is this homework? If so, you might add that tag. – cardinal Feb 3 '11 at 1:50

@Belmont, now that @cardinal has given a complete answer, can you specify what LAR really is, for future reference? Judging from the answer, this is standard manipulation of products of least-squares regressions given some initial constraints; there should not be a special name for it without serious reason. – mpiktas Feb 8 '11 at 7:59

@mpiktas, it's a stagewise algorithm, so each time a variable enters or leaves the model on the regularization path, the size (i.e., cardinality/dimension) of $\beta$ grows or shrinks respectively, and a "new" LS estimate is used based on the currently "active" variables. In the case of the lasso, which is a convex optimization problem, the procedure is essentially exploiting special structure in the KKT conditions to obtain a very efficient solution. There are also generalizations to, e.g., logistic regression based on IRLS and Heine–Borel (to prove convergence in a finite no. of steps). – cardinal Feb 8 '11 at 13:19

@Belmont −1: as I recently bought the book of Hastie, I can confirm that this is an exercise from it. So I am giving you a big −1, since you do not even manage to give all
