## Description

# 1 Objective Question

Consider a *K*-class classification problem. The total number of binary classifiers trained in a one-vs-all setting is ____, and in a one-vs-one setting is ____.

# 2 Objective Question

Which of the following methods do we use to best fit the data in Logistic Regression?

- A. Least Squares Error
- B. Maximum Likelihood
- C. Jaccard Distance
- D. Both A and B

# 3 Programming Question

In the tutorial, we saw how to build a multi-class classifier in a one-vs-all setting. In this problem you are required to construct a one-vs-one classifier for the same problem. The starter code is provided in ‘Logistic Regression Excercise 1.ipynb’.
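As a starting point, the one-vs-one scheme can be sketched as follows. This is a minimal illustration using scikit-learn and the Iris data as a stand-in; the starter notebook's dataset and variable names will differ.

```python
# One-vs-one sketch: train one binary logistic regression per pair of
# classes, then predict by majority vote. Iris (K = 3) is used here
# purely as a stand-in dataset.
from itertools import combinations

import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One binary classifier per unordered pair of classes: K(K-1)/2 in total.
classes = np.unique(y_tr)
pair_models = {}
for a, b in combinations(classes, 2):
    mask = np.isin(y_tr, [a, b])
    clf = LogisticRegression(max_iter=1000).fit(X_tr[mask], y_tr[mask])
    pair_models[(a, b)] = clf

def predict_ovo(X):
    """Majority vote over all pairwise classifiers."""
    votes = np.zeros((len(X), len(classes)), dtype=int)
    for (a, b), clf in pair_models.items():
        pred = clf.predict(X)
        for cls in (a, b):
            votes[pred == cls, cls] += 1
    return votes.argmax(axis=1)

accuracy = (predict_ovo(X_te) == y_te).mean()
```

With K = 3 classes this trains 3 pairwise models; for the exercise you would apply the same voting scheme to whatever K-class problem the notebook uses.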

# 4 Subjective Question

Suppose you train a logistic regression classifier and your hypothesis function *h* is

$$h_\theta(x) = g(\theta_0 + \theta_1 x_1 + \theta_2 x_2)$$

where

$$\theta_0 = 6,\quad \theta_1 = 0,\quad \theta_2 = -1.$$

Draw the decision boundary for the given classifier (a rough sketch is sufficient). What would happen to the decision boundary if you swapped the coefficients of $x_1$ and $x_2$? Draw the decision boundary for the second case as well.
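As a recap of the mechanics (the sketch itself is left to you): with the sigmoid $g$, the classifier predicts $y = 1$ exactly when $g(z) \ge 0.5$, i.e. when $z \ge 0$, so the decision boundary is the set of points where the linear term vanishes:

```latex
\theta_0 + \theta_1 x_1 + \theta_2 x_2 = 0
\quad\Longrightarrow\quad
6 + 0 \cdot x_1 + (-1) \cdot x_2 = 0,
```

and the second case follows by the same substitution with the coefficients exchanged.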


# 5 Programming Question

You have been given a dataset of handwritten digits, the MNIST dataset. It consists of 28×28 pixel images, each depicting one of the 10 digits (0, 1, …, 9). Using logistic regression in both one-vs-one and one-vs-all settings, construct a 10-class classifier. Report the test accuracy for the one-vs-one and one-vs-all classifiers. The starter code is provided in ‘Logistic Regression Excercise 2.ipynb’.
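The two settings can be compared with scikit-learn's built-in meta-estimators. This is a minimal sketch, assuming scikit-learn is available; it uses the small 8×8 `load_digits` set as a stand-in for the 28×28 MNIST images in the notebook.

```python
# One-vs-one vs. one-vs-rest logistic regression on a 10-digit problem.
# load_digits (8x8 images) stands in for the exercise's MNIST data.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base = LogisticRegression(max_iter=2000)
ovo = OneVsOneClassifier(base).fit(X_tr, y_tr)   # 10*9/2 = 45 pairwise models
ova = OneVsRestClassifier(base).fit(X_tr, y_tr)  # 10 one-vs-rest models

ovo_acc = ovo.score(X_te, y_te)
ova_acc = ova.score(X_te, y_te)
```

Note the cost trade-off: one-vs-one trains 45 classifiers but each sees only two classes' worth of data, while one-vs-all trains 10 classifiers on the full training set.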
