For trees, $\gamma$ parameterizes the split variables and split points at the internal nodes, and the predictions at the terminal nodes [2]

© 2019 The Trustees of the Stevens Institute of Technology
Bagging and Bumping · Bagging and Random Forests · Boosting Methods · Boosting Trees

Fitting

Typically these models are fit by minimizing a loss function averaged over the training data:

$$\min_{\{\beta_m, \gamma_m\}_1^M} \sum_{i=1}^{N} L\left(y_i, \sum_{m=1}^{M} \beta_m\, b(x_i; \gamma_m)\right)$$

A simpler alternative is to fit a single basis function:

$$\min_{\beta, \gamma} \sum_{i=1}^{N} L(y_i, \beta\, b(x_i; \gamma))$$ [2]
Algorithm 10.2: Forward Stagewise Additive Modeling

1. Initialize $f_0(x) = 0$.
2. For $m = 1$ to $M$:
   2.1 Compute $(\beta_m, \gamma_m) = \arg\min_{\beta, \gamma} \sum_{i=1}^{N} L(y_i, f_{m-1}(x_i) + \beta\, b(x_i; \gamma))$
   2.2 Set $f_m(x) = f_{m-1}(x) + \beta_m\, b(x; \gamma_m)$

[2]
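As an illustrative sketch (not from the slides), Algorithm 10.2 can be implemented with squared-error loss and a single-split regression stump as the basis function $b(x;\gamma)$; the helper names `fit_stump` and `stagewise_additive` are my own:

```python
import numpy as np

def fit_stump(x, r):
    """Least-squares regression stump: find the split point s and leaf
    values (c1, c2) minimizing the sum of squared residuals."""
    best = None
    for s in np.unique(x):
        left, right = r[x <= s], r[x > s]
        c1 = left.mean() if left.size else 0.0
        c2 = right.mean() if right.size else 0.0
        sse = ((left - c1) ** 2).sum() + ((right - c2) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, s, c1, c2)
    return best[1:]  # (s, c1, c2)

def stagewise_additive(x, y, M=10):
    """Algorithm 10.2 with b(x; gamma) a stump and squared-error loss.
    For this loss, step 2.1 reduces to fitting the current residuals."""
    f = np.zeros_like(y, dtype=float)
    basis = []
    for m in range(M):
        s, c1, c2 = fit_stump(x, y - f)   # step 2.1: fit residuals
        b = np.where(x <= s, c1, c2)
        f = f + b                         # step 2.2: beta_m absorbed into leaf values
        basis.append((s, c1, c2))
    return f, basis
```

Here each stump's $\beta_m$ is absorbed into the fitted leaf values, a common simplification for squared-error boosting.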
Squared Error Loss

With $L(y, f(x)) = (y - f(x))^2$ we have:

$$L(y_i, f_{m-1}(x_i) + \beta\, b(x_i; \gamma)) = (y_i - f_{m-1}(x_i) - \beta\, b(x_i; \gamma))^2 = (r_{im} - \beta\, b(x_i; \gamma))^2$$

where $r_{im} = y_i - f_{m-1}(x_i)$ is the residual of the current model on the $i$th observation
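A quick numeric sanity check of the identity above, on arbitrary made-up values (the random arrays below are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=8)
f_prev = rng.normal(size=8)       # stands in for f_{m-1}(x_i)
b = rng.normal(size=8)            # stands in for b(x_i; gamma)
beta = 0.3

r = y - f_prev                    # r_im, the current residuals
lhs = (y - f_prev - beta * b) ** 2
rhs = (r - beta * b) ** 2
assert np.allclose(lhs, rhs)      # loss at step m == squared residual fit
```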
Exponential Loss

$$L(y, f(x)) = e^{-y f(x)}$$

With this loss, forward stagewise additive modeling is equivalent to AdaBoost.M1 (as will be shown)
Using the individual classifiers as the basis functions, with the exponential loss function we have to solve

$$(\beta_m, G_m) = \arg\min_{\beta, G} \sum_{i=1}^{N} e^{-y_i (f_{m-1}(x_i) + \beta G(x_i))}$$

for the classifier $G_m$ and the coefficient $\beta_m$
We can express this as

$$(\beta_m, G_m) = \arg\min_{\beta, G} \sum_{i=1}^{N} w_i^{(m)} e^{-\beta y_i G(x_i)}$$

with $w_i^{(m)} = e^{-y_i f_{m-1}(x_i)}$. This solution can be obtained in two steps. First, for $\beta > 0$ we have:

$$G_m = \arg\min_{G} \sum_{i=1}^{N} w_i^{(m)} I\{y_i \neq G(x_i)\}$$
Our criterion in $(\beta_m, G_m)$ becomes

$$e^{-\beta} \sum_{y_i = G(x_i)} w_i^{(m)} + e^{\beta} \sum_{y_i \neq G(x_i)} w_i^{(m)}$$

which can be written as

$$(e^{\beta} - e^{-\beta}) \sum_{i=1}^{N} w_i^{(m)} I\{y_i \neq G(x_i)\} + e^{-\beta} \sum_{i=1}^{N} w_i^{(m)}$$
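The rearrangement above can be verified numerically; the weights and misclassification indicators below are random, hypothetical values:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.random(50)                     # w_i^{(m)} > 0
miss = rng.integers(0, 2, 50)          # I{y_i != G(x_i)}
beta = 0.7

# Criterion split over correctly and incorrectly classified points:
lhs = np.exp(-beta) * w[miss == 0].sum() + np.exp(beta) * w[miss == 1].sum()
# Equivalent rearranged form:
rhs = (np.exp(beta) - np.exp(-beta)) * (w * miss).sum() + np.exp(-beta) * w.sum()
assert np.isclose(lhs, rhs)
```

The rearranged form is what makes the two-step solution work: for fixed $\beta > 0$ only the first term depends on $G$, so minimizing over $G$ is exactly minimizing the weighted misclassification rate.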
If we plug this $G_m$ into the earlier equation and solve for $\beta$, we have

$$\beta_m = \frac{1}{2} \log \frac{1 - \mathrm{err}_m}{\mathrm{err}_m}, \qquad \text{where } \mathrm{err}_m = \frac{\sum_{i=1}^{N} w_i^{(m)} I\{y_i \neq G_m(x_i)\}}{\sum_{i=1}^{N} w_i^{(m)}}$$

The approximation is then updated: $f_m(x) = f_{m-1}(x) + \beta_m G_m(x)$
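Putting the pieces together, here is a minimal AdaBoost.M1 sketch following the updates above, assuming $y_i \in \{-1, +1\}$ and using brute-force decision stumps as the weak classifiers $G_m$ (the stump search and the names `adaboost_stumps`/`predict` are illustrative choices, not from the slides):

```python
import numpy as np

def adaboost_stumps(X, y, M=20):
    """AdaBoost.M1 via forward stagewise additive modeling under the
    exponential loss; weak learner = decision stump on one feature."""
    N = len(y)
    w = np.full(N, 1.0 / N)
    learners = []                                 # (feature j, threshold s, sign, beta_m)
    for m in range(M):
        # G_m: minimize the weighted misclassification rate over all stumps
        best = None
        for j in range(X.shape[1]):
            for s in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] <= s, 1, -1)
                    err = w[pred != y].sum() / w.sum()
                    if best is None or err < best[0]:
                        best = (err, j, s, sign)
        err, j, s, sign = best
        err = min(max(err, 1e-12), 1 - 1e-12)     # guard the log endpoints
        beta = 0.5 * np.log((1 - err) / err)      # beta_m = (1/2) log((1-err_m)/err_m)
        pred = sign * np.where(X[:, j] <= s, 1, -1)
        w = w * np.exp(-beta * y * pred)          # w_i^{(m+1)} = w_i^{(m)} e^{-beta_m y_i G_m(x_i)}
        learners.append((j, s, sign, beta))
    return learners

def predict(learners, X):
    """Sign of the additive expansion f_M(x) = sum_m beta_m G_m(x)."""
    F = np.zeros(len(X))
    for j, s, sign, beta in learners:
        F += beta * sign * np.where(X[:, j] <= s, 1, -1)
    return np.sign(F)
```

The exhaustive stump search is $O(N^2 d)$ per round and is only meant to make the weighted-error minimization explicit; practical implementations sort each feature once and scan the candidate splits.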
Boosting Trees

As a reminder, once we have partitioned a space into regions