Supervised Machine Learning, Regression and Classification
NithyasriA2
147 slides
Aug 31, 2025
About This Presentation
Size: 12.17 MB
Language: en
Slide Content
Subject: Machine Learning, Module II: Supervised Learning

Unit I: Supervised Learning (Regression/Classification). Basic methods: distance-based methods, nearest neighbors, decision trees, Naive Bayes. Linear models: linear regression, logistic regression, support vector machines. Nonlinearity and kernel methods. Beyond binary classification: multi-class.

Unit II: Unsupervised Learning. Clustering: K-means. Dimensionality reduction: PCA and kernel PCA. Generative models (Gaussian mixture models and hidden Markov models).

Unit III: Evaluating machine learning algorithms, model selection, ensemble methods (boosting, bagging, random forests).

Unit IV: Modeling sequence/time-series data, deep learning (deep generative models, deep Boltzmann machines, deep autoencoders, applications of deep networks), and feature representation learning.

Unit V: Scalable machine learning (online and distributed learning), semi-supervised learning, active learning, reinforcement learning, inference in graphical models, introduction to Bayesian learning and inference.
Text Book(s):
1. Kevin Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012.
2. Trevor Hastie, Robert Tibshirani, Jerome Friedman, The Elements of Statistical Learning, Springer, 2017.
3. Jiawei Han, Micheline Kamber, Jian Pei, Data Mining: Concepts and Techniques, 3/e, Morgan Kaufmann, 2016.
4. Christopher Bishop, Pattern Recognition and Machine Learning, Springer, 2016.
19EEC334A: MACHINE LEARNING, 9 January 2023
Artificial Intelligence (AI): making machines think, analyze, and make decisions.
Machine Learning (ML)
9 January 2023, Department of EECE, GIT, 19EEC334A: MACHINE LEARNING
A Typical Supervised Learning Workflow (for Classification)
A Typical Unsupervised Learning Workflow (for Clustering)
A Typical Reinforcement Learning Workflow
Linear Regression: fitting the regression line using the least-squares method.

Goodness of fit (performance metrics): the R-squared value and the mean squared error (MSE), where $y_i$ is the actual value, $\hat{y}_i$ the predicted value, and $\bar{y}$ the mean of the actual values:

$$R^2 = \frac{\sum_i (\hat{y}_i - \bar{y})^2}{\sum_i (y_i - \bar{y})^2}, \qquad \mathrm{MSE} = \frac{1}{n}\sum_i (y_i - \hat{y}_i)^2$$

On board: solve an example and compute R-squared; MSE is left as an assignment.
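The least-squares fit and both metrics above can be sketched in a few lines of NumPy. The x/y data here is hypothetical (made up for illustration), not taken from the slides:

```python
import numpy as np

# Hypothetical toy data: x = hours studied, y = exam score
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least-squares estimates of slope and intercept
x_mean, y_mean = x.mean(), y.mean()
slope = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
intercept = y_mean - slope * x_mean
y_pred = intercept + slope * x  # predicted values (y-hat)

# Goodness of fit: R^2 (explained-variance form, as on the slide) and MSE
r_squared = np.sum((y_pred - y_mean) ** 2) / np.sum((y - y_mean) ** 2)
mse = np.mean((y - y_pred) ** 2)

print(f"slope={slope:.3f}, intercept={intercept:.3f}")
print(f"R^2={r_squared:.4f}, MSE={mse:.4f}")
```

For a least-squares linear fit, this explained-variance form of R-squared equals the more common 1 - SS_res/SS_tot form.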
Notation: Y^ (y-hat) = predicted value, Y = actual value.
Applications of Linear Regression
Some popular applications of linear regression are:
- Analyzing trends and sales estimates
- Salary forecasting
- Real estate price prediction
- Arriving at ETAs in traffic
Dividing the data set into two subsets:
- Training set: a subset used to train the model.
- Test set: a subset used to test the trained model.
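A minimal sketch of such a split, using a hypothetical train_test_split helper over plain Python lists (in practice a library routine such as scikit-learn's train_test_split is commonly used instead):

```python
import random

def train_test_split(data, test_fraction=0.2, seed=42):
    """Randomly split a list of samples into (train, test) subsets."""
    rng = random.Random(seed)   # fixed seed for a reproducible split
    shuffled = data[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]

samples = list(range(10))
train, test = train_test_split(samples, test_fraction=0.2)
print(len(train), len(test))  # 8 2
```

Shuffling before splitting matters: if the data is ordered (e.g. by class), a plain head/tail split would give the model an unrepresentative training set.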
The sigmoid function is a mathematical function used to map predicted values to probabilities. It maps any real value to a value in the range 0 to 1. The output of logistic regression must lie between 0 and 1 and cannot go beyond this limit, so it forms an "S"-shaped curve; this S-shaped curve is called the sigmoid (or logistic) function. In logistic regression we use a threshold value, which decides between the classes 0 and 1: values above the threshold tend to 1, and values below the threshold tend to 0.
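A minimal sketch of the sigmoid and the threshold rule described above (the 0.5 threshold used here is a common default, not specified on the slide):

```python
import math

def sigmoid(z):
    """Logistic function: maps any real z into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def classify(z, threshold=0.5):
    """Apply the decision threshold to the predicted probability."""
    return 1 if sigmoid(z) >= threshold else 0

print(sigmoid(0.0))    # 0.5
print(classify(2.0))   # 1
print(classify(-2.0))  # 0
```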
The Math Behind Logistic Regression
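The body of this slide is an image and not recoverable here. As a sketch of the standard formulation (the symbols $w$, $x$, $b$ are assumed notation and may differ from the slides): the linear model's score is passed through the sigmoid to obtain a probability, and taking the log-odds (the logit) recovers the linear model:

```latex
% Sigmoid applied to the linear score (assumed notation: weights w, input x, bias b)
p = \sigma(z) = \frac{1}{1 + e^{-z}}, \qquad z = w^{\top} x + b
% The logit (log-odds) inverts the sigmoid, giving back the linear model
\log \frac{p}{1 - p} = w^{\top} x + b
```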
Types of Logistic Regression
On the basis of the categories, logistic regression can be classified into three types:
- Binomial: the dependent variable has only two possible types, such as 0 or 1, or Pass or Fail.
- Multinomial: the dependent variable has three or more possible unordered types, such as "cat", "dog", or "sheep".
- Ordinal: the dependent variable has three or more possible ordered types, such as "low", "medium", or "high".
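Multinomial logistic regression generalizes the sigmoid to the softmax function, which turns one real-valued score per class into a probability distribution over the classes. A minimal sketch with hypothetical class scores:

```python
import math

def softmax(scores):
    """Map a list of real-valued class scores to probabilities that sum to 1."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

classes = ["cat", "dog", "sheep"]
probs = softmax([2.0, 1.0, 0.1])            # hypothetical per-class scores
predicted = classes[probs.index(max(probs))]  # pick the most probable class
print(predicted)  # cat
```

With two classes, softmax reduces to the sigmoid of the score difference, so the binomial case is a special case of the multinomial one.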