
4/7/2021 · Submit HW 9 (Electronic Component) | Gradescope

Q2 Naïve Bayes (36 Points)

In this question, we will train a Naive Bayes classifier to predict class labels Y as a function of input features F1, F2, F3. We are given the following 15 training points:

[Training data table not reproduced in this extract.]

Q2.1 (4 Points)

What is the maximum likelihood estimate of the prior P(Y)?
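The MLE of the prior is just the empirical frequency of each label. A minimal sketch, using hypothetical label counts since the actual 15 training points are not reproduced in this extract:

```python
from collections import Counter

# Hypothetical labels for 15 training points; the real table is not shown here.
labels = ["A"] * 6 + ["B"] * 5 + ["C"] * 4

counts = Counter(labels)
n = len(labels)

# MLE of the prior: P(Y = y) = count(y) / N
prior = {y: c / n for y, c in counts.items()}
print(prior)  # with these counts: P(A) = 6/15, P(B) = 5/15, P(C) = 4/15
```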
Q2.2 (5 Points)

What are the maximum likelihood estimates of the conditional probability distributions? Fill in the probability values below (the tables for the second and third features are done for you).

P(F1 = 0 | Y = A): ___
P(F1 = 1 | Y = A): ___
P(F1 = 0 | Y = B): ___
P(F1 = 1 | Y = B): ___
P(F1 = 0 | Y = C): ___
P(F1 = 1 | Y = C): ___

Q2.3 (3 Points)

Now consider a new data point (F1 = 1, F2 = 1, F3 = 1). Use your classifier to determine the joint probability of causes Y and this new data point:

P(Y = A, F1 = 1, F2 = 1, F3 = 1): ___
P(Y = B, F1 = 1, F2 = 1, F3 = 1): ___
P(Y = C, F1 = 1, F2 = 1, F3 = 1): ___

Q2.4 (3 Points)

Along with the posterior probability of Y given the new data:

P(Y = A | F1 = 1, F2 = 1, F3 = 1): ___
P(Y = B | F1 = 1, F2 = 1, F3 = 1): ___
P(Y = C | F1 = 1, F2 = 1, F3 = 1): ___
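Under the Naive Bayes factorization P(Y, F1, F2, F3) = P(Y) · P(F1|Y) · P(F2|Y) · P(F3|Y), the joint and posterior follow directly from the counts. A sketch using a small hypothetical training set (the real table is not included in this extract):

```python
# Hypothetical training set of (F1, F2, F3, Y) rows; not the actual HW data.
data = [
    (1, 1, 0, "A"), (0, 1, 1, "A"), (1, 0, 1, "A"), (1, 1, 1, "A"),
    (0, 0, 1, "B"), (1, 0, 0, "B"), (0, 1, 0, "B"),
    (1, 1, 1, "C"), (0, 0, 0, "C"), (1, 0, 1, "C"),
]
classes = ["A", "B", "C"]
n = len(data)

# MLE prior: P(Y = y) = count(y) / N
prior = {y: sum(1 for r in data if r[3] == y) / n for y in classes}

def cond(i, v, y):
    """MLE of P(F_{i+1} = v | Y = y): count within class y."""
    rows = [r for r in data if r[3] == y]
    return sum(1 for r in rows if r[i] == v) / len(rows)

# Joint P(Y = y, F1 = 1, F2 = 1, F3 = 1) under the NB factorization
x = (1, 1, 1)
joint = {y: prior[y] * cond(0, x[0], y) * cond(1, x[1], y) * cond(2, x[2], y)
         for y in classes}

# Posterior P(Y = y | x) is the joint normalized over all classes
z = sum(joint.values())
posterior = {y: joint[y] / z for y in classes}
print(joint)
print(posterior)
```

The predicted label is the argmax of the posterior (equivalently, of the joint, since the normalizer is shared).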
Q2.5 (3 Points)

What label does your classifier give to the new data point in part 3? (Break ties alphabetically)

Q2.6 (4 Points)

The training data is repeated here for your convenience. Now use Laplace Smoothing with strength k = 2 to estimate the prior P(Y) for the same data.

P(Y = A): ___
P(Y = B): ___
P(Y = C): ___
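Laplace smoothing with strength k adds k pseudo-counts to every class before normalizing: P_LAP(Y = y) = (count(y) + k) / (N + k · |classes|). A sketch with hypothetical counts (the actual per-class counts come from the table, which is not shown here):

```python
k = 2  # smoothing strength, as given in the question

# Hypothetical label counts over N = 15 points; not the actual HW data.
counts = {"A": 6, "B": 5, "C": 4}
n = sum(counts.values())

# Each class gains k pseudo-counts; the denominator gains k per class.
prior_lap = {y: (c + k) / (n + k * len(counts)) for y, c in counts.items()}
print(prior_lap)  # with these counts: 8/21, 7/21, 6/21
```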
Q2.7 (5 Points)

Use Laplace Smoothing with strength k = 2 to estimate the conditional probability distributions below (again, the second two are done for you).

P(F1 = 0 | Y = A): ___
P(F1 = 1 | Y = A): ___
P(F1 = 0 | Y = B): ___
P(F1 = 1 | Y = B): ___
P(F1 = 0 | Y = C): ___
P(F1 = 1 | Y = C): ___
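For the conditionals, the pseudo-counts are added per feature value within each class: for a binary feature, P_LAP(F1 = v | Y = y) = (count(F1 = v, Y = y) + k) / (count(Y = y) + 2k). A sketch, again with hypothetical counts:

```python
k = 2  # smoothing strength

# Hypothetical per-class counts of binary feature F1 (value -> count);
# the actual counts come from the training table, which is not shown here.
f1_counts = {"A": {0: 2, 1: 4}, "B": {0: 3, 1: 2}, "C": {0: 1, 1: 3}}

def cond_lap(v, y):
    """Laplace-smoothed P(F1 = v | Y = y); F1 takes 2 possible values."""
    c = f1_counts[y]
    return (c[v] + k) / (sum(c.values()) + k * 2)

print(cond_lap(0, "A"), cond_lap(1, "A"))  # (2+2)/10 = 0.4, (4+2)/10 = 0.6
```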
Q2.8 (9 Points)

Now consider again the new data point (F1 = 1, F2 = 1, F3 = 1). Use the Laplace-smoothed version of your classifier to determine the joint probability of causes Y and this new data point, along with the posterior probability of Y given the new data:

P(Y = A, F1 = 1, F2 = 1, F3 = 1): ___
P(Y = B, F1 = 1, F2 = 1, F3 = 1): ___
P(Y = C, F1 = 1, F2 = 1, F3 = 1): ___
P(Y = A | F1 = 1, F2 = 1, F3 = 1): ___
P(Y = B | F1 = 1, F2 = 1, F3 = 1): ___
P(Y = C | F1 = 1, F2 = 1, F3 = 1): ___
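The smoothed classifier has the same structure as the MLE version; only the estimates change. A sketch on the same hypothetical training set used earlier (not the actual HW data):

```python
# Hypothetical training set of (F1, F2, F3, Y) rows; not the actual HW data.
data = [
    (1, 1, 0, "A"), (0, 1, 1, "A"), (1, 0, 1, "A"), (1, 1, 1, "A"),
    (0, 0, 1, "B"), (1, 0, 0, "B"), (0, 1, 0, "B"),
    (1, 1, 1, "C"), (0, 0, 0, "C"), (1, 0, 1, "C"),
]
classes = ["A", "B", "C"]
k = 2  # Laplace smoothing strength

def prior_lap(y):
    # (count(y) + k) / (N + k * |classes|)
    return (sum(1 for r in data if r[3] == y) + k) / (len(data) + k * len(classes))

def cond_lap(i, v, y):
    # (count(F_{i+1} = v, Y = y) + k) / (count(Y = y) + 2k), binary feature
    rows = [r for r in data if r[3] == y]
    return (sum(1 for r in rows if r[i] == v) + k) / (len(rows) + k * 2)

# Smoothed joint and posterior for the query point (1, 1, 1)
x = (1, 1, 1)
joint = {y: prior_lap(y) * cond_lap(0, x[0], y) * cond_lap(1, x[1], y) * cond_lap(2, x[2], y)
         for y in classes}
z = sum(joint.values())
posterior = {y: joint[y] / z for y in classes}
print(joint)
print(posterior)
```

Note that smoothing pulls every estimate toward uniform, so the smoothed joints are never exactly zero even for feature values unseen in a class.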
