CS 260: Machine Learning - Homework 4 - Spring 2020


The following questions are from Understanding Machine Learning: From Theory to Algorithms by Shai Shalev-Shwartz and Shai Ben-David. The book is available at http://www.cs.huji.ac.il/~shais/UnderstandingMachineLearning/ courtesy of the authors.

 

  1.  In this problem, you are asked to implement the AdaBoost algorithm, using decision stumps as the weak learner, for binary classification. The base hypothesis class of decision stumps over $\mathbb{R}^d$ is [1]

$\mathcal{H}_{DS} = \{\, x \mapsto \mathrm{sign}(\theta - x_i) : \theta \in \mathbb{R},\ i \in [d] \,\}$

and the sign function is defined as:

$\mathrm{sign}(x) = \begin{cases} 1 & \text{if } x \ge 0 \\ -1 & \text{otherwise.} \end{cases}$
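A hypothesis in this class is therefore specified by a feature index i ∈ [d] and a threshold θ, and it predicts sign(θ − x_i). As a minimal sketch of such a predictor (the function name and array layout are illustrative assumptions, not part of the provided skeleton):

```python
import numpy as np

def stump_predict(X, theta, i):
    """Predict with the decision stump h(x) = sign(theta - x_i).

    X is an (n_samples, d) array, theta a threshold, i a feature index.
    Returns labels in {+1, -1}, with sign(0) = +1 as defined above.
    """
    return np.where(theta - X[:, i] >= 0, 1, -1)
```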

Please report your answers to the following questions on Gradescope, and submit your source code to CCLE. Your answers will NOT be graded if we do not see your source code submission on CCLE. Note that you are allowed to use other programming languages for your implementation; if so, you may need to write a CSV data loader yourself and read the data from ./data/*.csv.
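If you do work outside the provided skeleton, a simple numeric loader along the following lines may suffice. This sketch assumes each file in ./data/ is a plain numeric CSV with one example per row and the label in the last column; verify this against the actual files and adjust if there is a header row:

```python
import numpy as np

def load_csv(path):
    """Load a numeric CSV; assumes the last column holds the +/-1 label."""
    data = np.genfromtxt(path, delimiter=",")
    X, y = data[:, :-1], data[:, -1]
    return X, y

# Paths below are placeholders for the actual files in ./data/:
# X_train, y_train = load_csv("./data/train.csv")
# X_test, y_test = load_csv("./data/test.csv")
```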


  • Run the skeleton code and report the training and testing error obtained by the AdaBoost model that uses the decision stump implemented in scikit-learn.
  • Implement the ERM algorithm for decision stumps. Replace skeleton code lines 52-55 with your implementation. Report the training and testing error of a single decision stump under the uniform distribution (one possible structure is sketched after this list).
  • Implement the AdaBoost algorithm on top of your decision stump classifier. Replace lines 118-120 with your own implementation. Plot the training and testing error as a function of the number of boosting iterations.
  • Report the final training and testing error of AdaBoost.
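For the second and third bullets, the sketch below shows one possible way to organize a weighted-ERM decision stump and the AdaBoost loop. It is not the provided skeleton: the function names, the exhaustive threshold scan, and the flip convention for negated stumps are illustrative assumptions (the textbook also gives a faster variant based on sorting each feature once).

```python
import numpy as np

def erm_decision_stump(X, y, D):
    """Return (theta, i, flip) minimizing the weighted 0-1 error of
    h(x) = flip * sign(theta - x_i) under the distribution D."""
    n, d = X.shape
    best = (0.0, 0, 1, np.inf)               # (theta, i, flip, weighted error)
    for i in range(d):
        vals = np.sort(np.unique(X[:, i]))
        # candidate thresholds: below the smallest value, midpoints, above the largest
        thetas = np.concatenate(([vals[0] - 1.0],
                                 (vals[:-1] + vals[1:]) / 2.0,
                                 [vals[-1] + 1.0]))
        for theta in thetas:
            pred = np.where(theta - X[:, i] >= 0, 1, -1)
            err = D[pred != y].sum()
            for flip, e in ((1, err), (-1, 1.0 - err)):  # negated stump has error 1 - err
                if e < best[3]:
                    best = (theta, i, flip, e)
    return best[0], best[1], best[2]

def adaboost(X, y, T):
    """Run T rounds of AdaBoost with decision stumps; return a list of
    (weight, theta, feature index, flip) tuples describing the ensemble."""
    n = X.shape[0]
    D = np.full(n, 1.0 / n)                   # start from the uniform distribution
    ensemble = []
    for _ in range(T):
        theta, i, flip = erm_decision_stump(X, y, D)
        pred = flip * np.where(theta - X[:, i] >= 0, 1, -1)
        eps = max(D[pred != y].sum(), 1e-12)  # guard against a zero-error stump
        w = 0.5 * np.log((1.0 - eps) / eps)   # weight of this weak hypothesis
        D = D * np.exp(-w * y * pred)         # re-weight the examples
        D = D / D.sum()
        ensemble.append((w, theta, i, flip))
    return ensemble

def adaboost_predict(ensemble, X):
    """Predict with the sign of the weighted vote of the stumps."""
    score = sum(w * flip * np.where(theta - X[:, i] >= 0, 1, -1)
                for w, theta, i, flip in ensemble)
    return np.where(score >= 0, 1, -1)
```

Tracking the training and testing error of `adaboost_predict` after each round (rather than only at the end) gives the curves requested in the plotting question.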


[1] Note that this definition of a decision stump is slightly different from the one in the slides; here we follow the definition in the textbook.
