Least angle regression keeps the correlations monotonically decreasing and tied



I'm trying to solve a problem on least angle regression (LAR). This is Exercise 3.23 on page 97 of Hastie et al., The Elements of Statistical Learning, 2nd ed. (5th printing):

Consider a regression problem with all variables and response having mean zero and standard deviation one. Suppose also that each variable has identical absolute correlation with the response:

$$\frac{1}{N} \left| \langle x_j, y \rangle \right| = \lambda, \qquad j = 1, \ldots, p.$$

Let $\hat{\beta}$ be the least squares coefficient of $y$ on $X$, and let $u(\alpha) = \alpha X \hat{\beta}$ for $\alpha \in [0, 1]$. I am asked to show that

$$\frac{1}{N} \left| \langle x_j, y - u(\alpha) \rangle \right| = (1 - \alpha)\lambda, \qquad j = 1, \ldots, p,$$

and I am having trouble with that. Note that this basically says that the correlation of each $x_j$ with the residual remains equal in magnitude as we progress toward $u$. I also do not know how to show that the correlations are equal to

$$\lambda(\alpha) = \frac{(1 - \alpha)}{\sqrt{(1 - \alpha)^2 + \frac{\alpha(2 - \alpha)}{N} \cdot \mathrm{RSS}}} \cdot \lambda,$$

where RSS is the residual sum of squares of the full least squares fit. Any pointers would be greatly appreciated!

Tags: regression, machine-learning, correlation, homework

asked Feb 2 '11 by Belmont; edited Feb 10 '11 by mpiktas

Comments:

@Belmont, what is $u(\alpha)$? Could you provide more context about your problem? A link to an article with standard properties of LAR, for example, would help a lot. – mpiktas Feb 2 '11

@Belmont, this looks like a problem from Hastie et al., The Elements of Statistical Learning, 2nd ed. Is this homework? If so, you might add that tag. – cardinal Feb 3 '11

@Belmont, now that @cardinal gave a complete answer, can you specify what LAR really is, for future reference?
Judging from the answer, this is standard manipulation of products of least squares regressions given some initial constraints; there should not be a special name for it without serious reason. – mpiktas Feb 8 '11

@mpiktas, it's a stagewise algorithm, so each time a variable enters or leaves the model on the regularization path, the size (i.e., cardinality/dimension) of the active set grows or shrinks, respectively, and a "new" LS estimate is used based on the currently "active" variables. In the case of the lasso, which is a convex optimization problem, the procedure essentially exploits special structure in the KKT conditions to obtain a very efficient solution. There are also generalizations to, e.g., logistic regression based on IRLS and Heine-Borel (to prove convergence in a finite number of steps). – cardinal Feb 8 '11
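Both identities in the exercise can be checked numerically. The sketch below (assuming NumPy; the data construction, seed, and the value $\lambda = 0.3$ are arbitrary choices, not from the exercise) builds standardized predictors with identical correlation $\lambda$ with $y$ and verifies parts (a) and (b) along the path toward the least squares fit:

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, lam = 200, 4, 0.3        # lam is the common correlation (arbitrary choice)

# Build an orthonormal frame whose first column spans the all-ones vector,
# so every remaining column automatically has mean zero.
M = np.column_stack([np.ones(N), rng.standard_normal((N, p + 1))])
Q, _ = np.linalg.qr(M)
y = Q[:, 1] * np.sqrt(N)                   # mean 0, variance 1
E = Q[:, 2:] * np.sqrt(N)                  # mean 0, unit variance, orthogonal to y
X = lam * y[:, None] + np.sqrt(1 - lam**2) * E   # each column: sd 1, corr with y = lam

beta = np.linalg.lstsq(X, y, rcond=None)[0]      # full least squares coefficients
RSS = np.sum((y - X @ beta) ** 2)

for a in (0.0, 0.25, 0.5, 0.75, 1.0):
    r = y - a * (X @ beta)                 # residual at u(alpha)
    inner = X.T @ r / N
    # (a): the inner products all shrink uniformly to (1 - alpha) * lambda
    assert np.allclose(np.abs(inner), (1 - a) * lam)
    # (b): the true correlations all equal lambda(alpha)
    lam_a = (1 - a) * lam / np.sqrt((1 - a)**2 + a * (2 - a) * RSS / N)
    corr = inner / np.sqrt(r @ r / N)
    assert np.allclose(np.abs(corr), lam_a)
```

The key step for (a) is that $X\hat{\beta}$ is the projection of $y$ onto the column space, so $\langle x_j, X\hat{\beta} \rangle = \langle x_j, y \rangle$; for (b), $\|y - u(\alpha)\|^2 = (1-\alpha)^2 N + \alpha(2-\alpha)\,\mathrm{RSS}$ because $\|y\|^2 = N$.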
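The active-set mechanics described in the last comment can also be illustrated directly: a toy implementation of the LAR stepping rule (a sketch on synthetic data; the seed and dimensions are arbitrary) moves the residual toward the least squares fit on the active set, which by the identity above keeps the active correlations tied, until an inactive variable catches up and is added.

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 100, 5

X = rng.standard_normal((N, p))
X -= X.mean(0)
X /= X.std(0)                              # columns: mean 0, sd 1
y = X @ rng.standard_normal(p) + rng.standard_normal(N)
y -= y.mean()

r = y.copy()
active = [int(np.argmax(np.abs(X.T @ r)))]
path_corr = []                             # max |<x_j, r>| / N at each step
for _ in range(p):
    c = X.T @ r / N
    C = np.max(np.abs(c))
    path_corr.append(C)
    # the active correlations stay tied in magnitude
    assert np.allclose(np.abs(c[active]), C)
    # direction: least squares fit of the residual on the active set
    u = X[:, active] @ np.linalg.lstsq(X[:, active], r, rcond=None)[0]
    a = X.T @ u / N                        # rate at which each correlation drops
    inactive = [j for j in range(p) if j not in active]
    if inactive:
        # smallest step at which an inactive variable ties the active ones:
        # solve |c_j - g * a_j| = (1 - g) * C for each inactive j
        cand = [(g, j) for j in inactive
                for g in ((C - c[j]) / (C - a[j]), (C + c[j]) / (C + a[j]))
                if 1e-12 < g <= 1]
        g, j_new = min(cand)
        active.append(j_new)
    else:
        g = 1.0                            # final step: go all the way to the LS fit
    r = r - g * u

# the maximal correlation decreases monotonically along the path
assert all(x > x2 for x, x2 in zip(path_corr, path_corr[1:]))
```

Because the LS direction on the active set is a projection, each step multiplies the active correlations by $(1 - \gamma)$, which is exactly why they remain tied and strictly decrease.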

This note was uploaded on 02/27/2012 for the course STATS 315A, taught by Professor R. Tibshirani during the Spring '10 term at Stanford.
