[Figure: plot with axis ticks 0-35; labels "average usage" and "mean weight"]


Inference

Regression model: Future ALSFRS slope = f(features) + noise
• Goal: infer f from data; the regression function f is unknown
• Bayesian approach: place a prior on f, infer its posterior
• Bonus: uncertainty estimates for each prediction

What prior?
• Flexible and nonparametric: avoid restrictive assumptions about functional form
• Favor simple, sparse models: avoid overfitting to irrelevant features

Bayesian Additive Regression Trees*

f(features) = sum of "simple" decision trees

[Tree diagram: splits such as "Days since onset > 705" and "Past ALSFRS slope > -0.6", with leaf values including -0.5, -0.83, 0.06, -0.08]

• Simplicity = each tree depends on few features, so irrelevant features are seldom selected
• Similar to frequentist ensemble methods: boosted decision trees, random forests

*Chipman, George, and McCulloch (2010)

BART Inference

• Estimating f: Markov chain Monte Carlo
  - R package 'bart' available on CRAN
• 10,000 posterior samples f̂1, f̂2, f̂3, f̂4, …, each f̂i a sum of 100 trees
• About 10 minutes on a MacBook Pro (2.5 GHz CPU, 4 GB RAM)
• Prediction: posterior mean, i.e. the average of f̂1(features), f̂2(features), f̂3(features), …
• Variance reduction: average the predictions of 10 BART models

Accuracy...
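The two key ideas above, modeling f as a sum of shallow trees and predicting with an average over posterior samples, can be sketched as follows. This is an illustrative toy, not the speakers' code: the "posterior samples" are stand-ins produced by refitting boosted stumps on bootstrap resamples, whereas real BART draws them by MCMC (e.g. via the CRAN package mentioned above). All variable names and the toy data are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for (features, future ALSFRS slope):
# y = f(X) + noise, where f depends on only 2 of 3 features.
X = rng.normal(size=(200, 3))
y = 0.8 * (X[:, 0] > 0.5) - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

def fit_stump(X, y):
    """A 'simple' depth-1 tree: one feature, one threshold, two leaf values."""
    best = None
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue
            mu_l, mu_r = y[left].mean(), y[~left].mean()
            sse = ((y - np.where(left, mu_l, mu_r)) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, t, mu_l, mu_r)
    return best[1:]

def predict_stump(stump, X):
    j, t, mu_l, mu_r = stump
    return np.where(X[:, j] <= t, mu_l, mu_r)

def fit_sum_of_trees(X, y, n_trees=20):
    """Fit f as a sum of stumps by boosting on residuals
    (a stand-in for one posterior draw of BART's tree sum)."""
    resid, trees = y.copy(), []
    for _ in range(n_trees):
        s = fit_stump(X, resid)
        trees.append(s)
        resid = resid - predict_stump(s, X)
    return trees

def predict_ensemble(trees, X):
    return sum(predict_stump(s, X) for s in trees)

# "Posterior samples" f1, f2, ...: here, refits on bootstrap resamples.
samples = []
for _ in range(25):
    idx = rng.integers(0, len(y), len(y))
    samples.append(fit_sum_of_trees(X[idx], y[idx]))

preds = np.stack([predict_ensemble(t, X) for t in samples])
posterior_mean = preds.mean(axis=0)  # prediction = average over samples
posterior_sd = preds.std(axis=0)     # per-prediction uncertainty estimate
```

The spread `posterior_sd` is what makes the Bayesian route attractive here: every prediction comes with its own uncertainty estimate, not just a point value.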