Ensemble Methods (Bagging & Boosting)

About This Presentation

One of the first ensemble methods was the bagging technique, which was developed to overcome the instability of decision trees. A well-known example of bagging is the random forest algorithm: a random forest is an ensemble of multiple decision trees. Decision trees tend t...


Slide Content

Ensemble Methods (Bagging & Boosting)

Agenda: Ensemble Learning, Bagging, Boosting, Conclusion

Ensemble Learning: What is ensemble learning? How does ensemble learning work? How did ensemble learning come into existence?

Ensemble Learning Ensemble learning in machine learning refers to the technique of combining predictions from multiple models to improve overall performance and accuracy. By aggregating diverse models such as decision trees or neural networks, ensemble methods like bagging and boosting enhance robustness, reduce overfitting, and yield more accurate and stable predictions for various tasks.


Bagging

What is Bagging? Bagging (bootstrap aggregating) trains each model on a different bootstrap sample of the dataset and then aggregates the models' predictions, as the diagram and sketch below show.

[Diagram: the original Dataset is resampled into Dataset 1, Dataset 2, ..., Dataset n; Model 1, Model 2, ..., Model n are each trained on one sample, and their outputs are combined into an Ensemble Model.]
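
A minimal sketch of this diagram, assuming scikit-learn (the n_estimators value and toy dataset are illustrative choices): each of the n models is a decision tree fit on its own bootstrap sample.

    # Bagging: fit many decision trees, each on a bootstrap sample of the data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=1000, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    bagging = BaggingClassifier(
        estimator=DecisionTreeClassifier(),  # keyword is "estimator" in
                                             # scikit-learn >= 1.2; older
                                             # versions call it "base_estimator"
        n_estimators=50,   # number of bootstrap datasets / models
        bootstrap=True,    # sample the training set with replacement
        random_state=42,
    )
    bagging.fit(X_train, y_train)
    print("bagging accuracy:", bagging.score(X_test, y_test))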

Advantages of Bagging

What is Boosting? Boosting trains a sequence of weak learners, each one focusing on the examples the previous learners misclassified, and combines them into a single strong predictor, as the diagram and sketch below show.

[Diagram: the Training Set is divided into Subset 1, Subset 2, ..., Subset n; weak learners are trained one after another, with each subsequent subset weighted toward the false predictions of the previous weak learner, and the weak learners' outputs are combined into the Overall Prediction.]
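
A minimal sketch of this boosting loop, assuming scikit-learn's AdaBoost (the stump depth and n_estimators are illustrative): weak learners are trained sequentially, and each round reweights the training examples so the next learner concentrates on the previous round's false predictions.

    # AdaBoost: train weak learners (decision stumps) sequentially; misclassified
    # examples get higher weight before the next learner is fit.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=1000, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    boosting = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=1),  # a weak learner (stump);
                                                        # "base_estimator" before
                                                        # scikit-learn 1.2
        n_estimators=100,
        random_state=42,
    )
    boosting.fit(X_train, y_train)
    print("boosting accuracy:", boosting.score(X_test, y_test))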


Advantages of Boosting

Differences Between Bagging and Boosting

- Predictions data type: Bagging is the simplest way of combining predictions that belong to the same type; boosting is a way of combining predictions that belong to different types.
- Focus area: Bagging aims to decrease variance, not bias; boosting aims to decrease bias, not variance.
- Weights: In bagging, each model receives equal weight; in boosting, models are weighted according to their performance.
- How models are built: In bagging, each model is built independently; in boosting, new models are influenced by the performance of previously built models.
- Classifier training: In bagging, base classifiers are trained in parallel; in boosting, base classifiers are trained sequentially.
- Example: The random forest model uses bagging; AdaBoost uses boosting.
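
The Example row can be demonstrated directly. A short sketch, assuming scikit-learn and a toy dataset, that trains the two named algorithms side by side:

    # Random forest (a bagging method) vs. AdaBoost (a boosting method).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for name, model in [
        ("random forest (bagging)", RandomForestClassifier(random_state=0)),
        ("AdaBoost (boosting)", AdaBoostClassifier(random_state=0)),
    ]:
        model.fit(X_train, y_train)
        print(name, "accuracy:", model.score(X_test, y_test))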

Real-World Applications of Bagging

Real-World Applications of Boosting

Summary: Enhanced accuracy and stability; robustness to data challenges; continuous improvement and exploration.

Thank you

References:
- https://www.simplilearn.com/tutorials/machine-learning-tutorial/bagging-in-machine-learning
- https://www.analyticsvidhya.com/blog/2023/01/ensemble-learning-methods-bagging-boosting-and-stacking/
- https://www.simplilearn.com/tutorials/machine-learning-tutorial/what-is-boosting
- https://www.geeksforgeeks.org/bagging-vs-boosting-in-machine-learning/
- https://olympus.mygreatlearning.com/courses/61356