
•  In AdaBoost, with the ensemble f(x) = ∑_r α_r h_r(x), the value α_r cannot get negative. If it does, the exponential loss will increase instead of decrease.
A)  True
B)  False

•  AdaBoost is
A)  Sensitive to "outliers" (hard-to-classify examples). The reason is that it fits too aggressively.
B)  Sensitive to outliers because the exponential loss penalizes them too harshly.
C)  Insensitive to outliers because boosting fits a very weak classifier at every round and therefore fits very slowly.
D)  Insensitive because the exponential loss effectively ignores outliers.

•  If in round "r" of AdaBoost we use a learner h(x) that separates the dataset perfectly, then this classifier would receive infinite weight in the ensemble.
A)  True
B)  False
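For context on the α_r question and the perfect-separator question above (this formula is not shown in the preview, but it is the standard Freund–Schapire round weight): α_r = ½ ln((1 − ε_r)/ε_r), where ε_r is the weighted training error of h_r. A minimal Python sketch of how that weight behaves as ε_r varies; the function name is just for illustration.

```python
import math

def adaboost_alpha(weighted_error):
    """Standard AdaBoost round weight: alpha_r = 0.5 * ln((1 - eps_r) / eps_r)."""
    return 0.5 * math.log((1 - weighted_error) / weighted_error)

# eps_r < 0.5: the learner is better than chance, so the weight is positive.
print(adaboost_alpha(0.3))   # ~0.42

# eps_r > 0.5: the learner is worse than chance, and the formula yields a negative weight.
print(adaboost_alpha(0.7))   # ~-0.42

# eps_r -> 0: a learner that separates the data perfectly gets a weight that
# grows without bound (alpha_r -> infinity).
print(adaboost_alpha(1e-9))  # ~10.4, increasing as eps_r shrinks further
```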