NPTEL IITKGP Introduction to Machine Learning Assignment 2 Answers

In this post, we discuss the NPTEL IITKGP Introduction to Machine Learning Assignment 2 Answers.

NPTEL IITKGP Introduction to Machine Learning Assignment 2 Answers – all the questions and answers here are provided to help students and NPTEL candidates for reference purposes only. It is mandatory to submit your weekly assignment based on your own understanding.

Are you looking for the answers to NPTEL IITKGP Introduction to Machine Learning Assignment 2? If yes, you are in the right place. This post should help you with the assignment answers for the National Programme on Technology Enhanced Learning (NPTEL) course “Introduction to Machine Learning” offered by IIT Kharagpur.

NPTEL IITKGP Introduction to Machine Learning

ABOUT THE COURSE:

This course provides a concise introduction to the fundamental concepts in machine learning and popular machine learning algorithms. We will cover the standard and most popular supervised learning algorithms including linear regression, logistic regression, decision trees, k-nearest neighbour, an introduction to Bayesian learning and the naïve Bayes algorithm, support vector machines and kernels, and neural networks with an introduction to Deep Learning. We will also cover the basic clustering algorithms. Feature reduction methods will also be discussed. We will introduce the basics of computational learning theory. In the course we will discuss various issues related to the application of machine learning algorithms. We will discuss hypothesis space, overfitting, bias and variance, tradeoffs between representational power and learnability, evaluation strategies and cross-validation. The course will be accompanied by hands-on problem solving with programming in Python and some tutorial sessions.


This course will have an unproctored programming exam in addition to the proctored exam; please check the announcement section for the date and time. The programming exam will carry a weightage of 25% towards the final score.

Final score = Assignment score + Unproctored programming exam score + Proctored Exam score
  • Assignment score = 25% of average of best 8 assignments out of the total 12 assignments given in the course.
  • (All assignments in a particular week – quizzes and programming assignments – will be counted towards the final score.)
  • Unproctored programming exam score = 25% of the average scores obtained as part of Unproctored programming exam – out of 100
  • Proctored Exam score =50% of the proctored certification exam score out of 100
YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF ASSIGNMENT SCORE >=10/25 AND
UNPROCTORED PROGRAMMING EXAM SCORE >=10/25 AND PROCTORED EXAM SCORE >= 20/50. 
If any one of the 3 criteria is not met, you will not be eligible for the certificate even if the Final score >= 40/100 (see the score sketch below).
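
As a quick reference, here is a minimal Python sketch of how the weightages above combine into the final score and the certificate-eligibility check. The function name and the example numbers are illustrative only, not part of the official portal.

```python
def final_score(assignment_avg_best8, unproctored_exam, proctored_exam):
    """Combine the three components using the stated weightages.

    assignment_avg_best8 : average of the best 8 assignment scores (out of 100)
    unproctored_exam     : unproctored programming exam score (out of 100)
    proctored_exam       : proctored certification exam score (out of 100)
    """
    assignment_score = 0.25 * assignment_avg_best8   # 25% weightage, out of 25
    programming_score = 0.25 * unproctored_exam      # 25% weightage, out of 25
    proctored_score = 0.50 * proctored_exam          # 50% weightage, out of 50
    total = assignment_score + programming_score + proctored_score

    eligible = (assignment_score >= 10 and
                programming_score >= 10 and
                proctored_score >= 20 and
                total >= 40)
    return total, eligible

# Illustrative numbers: 80 average on the best 8 assignments, 60 and 50 on the exams
print(final_score(80, 60, 50))   # (60.0, True)
```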

CHECK HERE FOR OTHER NPTEL ASSIGNMENT ANSWERS

BELOW YOU CAN GET YOUR NPTEL IITKGP Introduction to Machine Learning Assignment 2 Answers 2022:

 

Answers will be uploaded shortly and will be notified on Telegram, so JOIN NOW
Join SciShow Engineer on Telegram

1. In a binary classification problem, out of 30 data points 12 belong to class I and 18 belong to class II. What is the entropy of the data set?

A. 0.97
B. 0
C. 1
D. 0.67

Answer:- a
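
The entropy follows directly from the class proportions 12/30 and 18/30, which works out to about 0.97 (option A). A quick check, using only the Python standard library:

```python
from math import log2

p1 = 12 / 30   # proportion of class I
p2 = 18 / 30   # proportion of class II

# Binary entropy: H = -p1*log2(p1) - p2*log2(p2)
entropy = -p1 * log2(p1) - p2 * log2(p2)
print(round(entropy, 2))   # 0.97
```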

2. Decision trees can be used for the problems where

A. the attributes are categorical.
B. the attributes are numeric valued.
C. the attributes are discrete valued.
D. In all the above cases.

Answer:- d
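
As a small illustration of option D, a decision tree consumes numeric attributes directly, while categorical ones are typically one-hot encoded first. This is a minimal scikit-learn sketch; the toy data is made up for illustration.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder
from sklearn.tree import DecisionTreeClassifier

# Made-up data mixing a categorical and a numeric attribute
X = pd.DataFrame({
    "colour": ["red", "green", "red", "blue"],   # categorical attribute
    "height": [1.2, 0.4, 0.9, 1.5],              # numeric attribute
})
y = [1, 0, 1, 0]

model = Pipeline([
    ("encode", ColumnTransformer([("onehot", OneHotEncoder(), ["colour"])],
                                 remainder="passthrough")),  # keep the numeric column
    ("tree", DecisionTreeClassifier()),
])
model.fit(X, y)
print(model.predict(X))
```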

3. Which of the following is false?

A. Variance is the error of the trained classifier with respect to the best classifier in the concept class.
B. Variance depends on the training set size.
C. Variance increases with more training data.
D. Variance increases with more complicated classifiers.

Answer:- c
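
The reason option C is the false statement can be checked empirically: the spread of a model's predictions across independent training sets of the same size shrinks as that size grows. A rough sketch (the data-generating process and the model here are made up purely for illustration):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
x_test = np.linspace(0, 1, 50).reshape(-1, 1)   # fixed evaluation points

def prediction_variance(n_train, n_repeats=200):
    """Average variance of the learned predictions over many training sets of size n_train."""
    preds = []
    for _ in range(n_repeats):
        x = rng.uniform(0, 1, size=(n_train, 1))
        y = np.sin(2 * np.pi * x).ravel() + rng.normal(0, 0.3, n_train)
        model = DecisionTreeRegressor(max_depth=4).fit(x, y)
        preds.append(model.predict(x_test))
    return np.mean(np.var(preds, axis=0))

for n in (20, 80, 320):
    print(n, round(prediction_variance(n), 4))   # the variance shrinks as n grows
```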

4. In linear regression, our hypothesis is hθ(x) = θ0 + θ1x, and the training data is given in the table.

What is the value of J(θ) when θ = (1, 1)?

A. 0
B. 1
C. 2
D. 0.5

Answer:- b
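
The training-data table referred to in the question did not survive extraction here, but the quantity asked for is the usual squared-error cost. A minimal sketch of evaluating J(θ) at θ = (1, 1), with placeholder data standing in for the missing table and the common 1/(2m) convention assumed:

```python
import numpy as np

def J(theta0, theta1, x, y):
    """Squared-error cost: J(theta) = 1/(2m) * sum((theta0 + theta1*x - y)^2)."""
    m = len(x)
    predictions = theta0 + theta1 * x
    return np.sum((predictions - y) ** 2) / (2 * m)

# Placeholder data -- the original table from the question is not reproduced here
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])
print(J(1.0, 1.0, x, y))   # cost at theta = (1, 1)
```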

5. The value of information gain in the following decision tree is:

A. 0.380
B. 0.620
C. 0.190
D. 0.477

Answer:- a
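
The decision tree in the question is not reproduced here, but information gain is always the entropy of the parent node minus the weighted average entropy of its children. A small self-contained sketch (the split counts are hypothetical):

```python
from math import log2

def entropy(counts):
    """Entropy of a node given its class counts."""
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c > 0)

def information_gain(parent_counts, children_counts):
    """IG = H(parent) - sum_k (n_k / n) * H(child_k)."""
    n = sum(parent_counts)
    weighted = sum(sum(child) / n * entropy(child) for child in children_counts)
    return entropy(parent_counts) - weighted

# Hypothetical split: a 10/10 parent split into (8, 2) and (2, 8) children
print(round(information_gain([10, 10], [[8, 2], [2, 8]]), 3))
```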

6. What is true for Stochastic Gradient Descent?

A. In every iteration, model parameters are updated for multiple training samples
B. In every iteration, model parameters are updated for one training sample
C. In every iteration, model parameters are updated for all training samples
D. None of the above

Answer:- b
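
To make the "one training sample per update" point concrete, here is a minimal stochastic gradient descent sketch for simple linear regression. The data, learning rate, and epoch count are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = 3 * x + 2 + rng.normal(0, 0.1, 100)   # made-up data: y is roughly 3x + 2

theta0, theta1, lr = 0.0, 0.0, 0.1

for epoch in range(50):
    for i in rng.permutation(len(x)):        # visit the samples one at a time
        error = (theta0 + theta1 * x[i]) - y[i]
        theta0 -= lr * error                 # this update uses only sample i
        theta1 -= lr * error * x[i]

print(round(theta0, 2), round(theta1, 2))    # roughly 2 and 3
```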

7. The entropy of the entire dataset is

A. 0.5
B. 1
C. 0
D. 0.1

Answer:- c

8. Which attribute will be the root of the decision tree?

A. Green
B. Legs
C. Height
D. Smelly

Answer:- b
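
The dataset for this question is not reproduced here, but the root is simply the attribute with the highest information gain on the full dataset. A self-contained sketch; the rows, labels, and attribute values below are hypothetical, and only the attribute names are taken from the options:

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attribute):
    """Gain from splitting the rows (a list of dicts) on one attribute."""
    gain = entropy(labels)
    for value in set(row[attribute] for row in rows):
        subset = [lbl for row, lbl in zip(rows, labels) if row[attribute] == value]
        gain -= len(subset) / len(labels) * entropy(subset)
    return gain

# Hypothetical rows using the attribute names from the options
rows = [
    {"Green": "yes", "Legs": 2, "Height": "tall",  "Smelly": "no"},
    {"Green": "no",  "Legs": 3, "Height": "short", "Smelly": "no"},
    {"Green": "yes", "Legs": 3, "Height": "tall",  "Smelly": "yes"},
    {"Green": "no",  "Legs": 2, "Height": "short", "Smelly": "yes"},
]
labels = ["M", "H", "H", "M"]

root = max(("Green", "Legs", "Height", "Smelly"),
           key=lambda a: information_gain(rows, labels, a))
print(root)   # the attribute with the highest gain becomes the root
```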

9. In Linear Regression the output is:

A. Discrete
B. Continuous and always lies in a finite range
C. Continuous
D. May be discrete or continuous

Answer:- c

10. Identify whether the following statement is true or false: Overfitting is more likely when the set of training data is small.

A. True
B. False

Answer:- a
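
A quick way to see why this is true: fit the same flexible model on a small and on a large sample from the same made-up process and compare the train/test error gap, which is the usual symptom of overfitting. A rough sketch:

```python
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

def train_test_gap(n_train):
    """Test-minus-train error for an unpruned tree fit on n_train points."""
    x = rng.uniform(0, 1, (n_train, 1))
    y = np.sin(2 * np.pi * x).ravel() + rng.normal(0, 0.3, n_train)
    x_te = rng.uniform(0, 1, (500, 1))
    y_te = np.sin(2 * np.pi * x_te).ravel() + rng.normal(0, 0.3, 500)

    model = DecisionTreeRegressor().fit(x, y)   # fully grown tree, no pruning
    return (mean_squared_error(y_te, model.predict(x_te))
            - mean_squared_error(y, model.predict(x)))

print(round(train_test_gap(10), 3), round(train_test_gap(1000), 3))
# the gap is typically much larger for the small training set
```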

You have done it, but what next? If you want the answers to other NPTEL IITKGP Introduction to Machine Learning assignments, then Follow US HERE and Join Telegram.
