
Understanding Ensemble Methods in Machine Learning
Discover the world of ensemble learning through this comprehensive guide by Md. Azizul Hakim. Learn about the importance of the bias-variance tradeoff, various ensemble techniques, and the need for ensemble learning in predictive modeling projects. Explore the concept of weak learners and how ensemble methods combine multiple algorithms for enhanced predictive performance.
Presentation Transcript
Ensemble Methods Presented By: Md. Azizul Hakim
Outline
What is Ensemble Learning?
Need for Ensemble Learning
Bias-Variance Tradeoff
Ensemble Learning Techniques
Case Study
Disadvantages of Ensemble Methods
Random Data Samples: Random sampling is a sampling technique in which each sample has an equal probability of being chosen. A randomly chosen sample is therefore an unbiased representation of the total dataset.
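To make the sampling idea concrete, here is a minimal sketch of drawing a bootstrap sample (random sampling with replacement) using NumPy; the toy array and the seed are assumptions made purely for illustration, not part of the original slides.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Toy dataset: 10 observations with 3 features each (illustrative only).
X = rng.normal(size=(10, 3))

# Draw a bootstrap sample: every row has an equal probability of being chosen,
# and sampling is done with replacement, so some rows may appear more than once.
indices = rng.choice(len(X), size=len(X), replace=True)
X_bootstrap = X[indices]

print("Chosen row indices:", indices)
```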
Weak Learners: A weak learner is one that performs relatively poorly; its accuracy is above chance, but only barely. There is often, but not always, the added implication that it is computationally simple. The term also suggests that many instances of the algorithm are pooled together to create a strong ensemble classifier.
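As one concrete illustration of a weak learner, the sketch below trains a depth-1 decision tree (a decision stump) with scikit-learn; the synthetic dataset and parameter choices are assumptions for demonstration only.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data (illustrative).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A decision stump: a tree restricted to a single split.
# It is computationally simple and typically only somewhat better than chance.
stump = DecisionTreeClassifier(max_depth=1, random_state=0)
stump.fit(X_train, y_train)
print("Stump accuracy:", stump.score(X_test, y_test))
```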
Final Model: In machine learning, ensemble methods use multiple learning algorithms to obtain better predictive performance than could be obtained from any of the constituent learning algorithms alone.
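One simple way such a final model can combine multiple learning algorithms is majority voting. The sketch below uses scikit-learn's VotingClassifier with arbitrarily chosen base models; the dataset and estimators are illustrative assumptions, not the presenter's setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Three different learning algorithms whose predictions are combined by majority vote.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("nb", GaussianNB()),
    ],
    voting="hard",
)
print("Ensemble CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```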
Need for Ensemble Learning: Ensembles are predictive models that combine the predictions of two or more other models. Ensemble learning methods are popular and are the go-to technique when the best possible performance on a predictive modeling project is the most important outcome.
Bias: Quantifies how much, on average, the predicted values differ from the actual values. Image source: G. M. Tina, C. Ventura, S. Ferlito, and S. De Vito, "A State-of-Art-Review on Machine-Learning Based Methods for PV," Applied Sciences, vol. 11, no. 16, p. 7550, Aug. 2021, doi: 10.3390/app11167550.
Variance: Quantifies how much the predictions made for the same observation differ from each other. High variance can cause an algorithm to model the random noise in the training data rather than the intended outputs. Image source: G. M. Tina, C. Ventura, S. Ferlito, and S. De Vito, "A State-of-Art-Review on Machine-Learning Based Methods for PV," Applied Sciences, vol. 11, no. 16, p. 7550, Aug. 2021, doi: 10.3390/app11167550.
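A rough sketch of how bias and variance can be estimated empirically: fit the same model on many training sets drawn from the same distribution, then compare the averaged predictions with the true function. The data-generating function, noise level, and model below are assumptions chosen only to illustrate the decomposition.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(x)  # assumed ground-truth function for this toy example

x_test = np.linspace(0, 2 * np.pi, 50).reshape(-1, 1)
predictions = []

# Fit the same model on many noisy training sets drawn from the same distribution.
for _ in range(200):
    x_train = rng.uniform(0, 2 * np.pi, size=(40, 1))
    y_train = true_f(x_train).ravel() + rng.normal(scale=0.3, size=40)
    model = DecisionTreeRegressor(max_depth=3).fit(x_train, y_train)
    predictions.append(model.predict(x_test))

predictions = np.array(predictions)   # shape: (200 runs, 50 test points)
avg_pred = predictions.mean(axis=0)

# Bias: how far the average prediction is from the true value.
bias_sq = np.mean((avg_pred - true_f(x_test).ravel()) ** 2)
# Variance: how much predictions for the same point differ across runs.
variance = np.mean(predictions.var(axis=0))

print(f"squared bias ~ {bias_sq:.4f}, variance ~ {variance:.4f}")
```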
[Figure: High Bias (Under-fitting), Low Bias and Low Variance, High Variance (Over-fitting). Image source: https://vitalflux.com/bagging-classifier-python-code-example/]
[Figure. Image source: https://corporatefinanceinstitute.com/resources/data-science/ensemble-methods/]
Bagging [Figure. Image source: https://vitalflux.com/bagging-classifier-python-code-example/]
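A minimal bagging sketch using scikit-learn's BaggingClassifier (whose default base learner is a decision tree); the dataset and hyperparameters are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Bagging: train many base learners (decision trees by default) on bootstrap
# samples of the training data and aggregate their predictions by voting.
bagging = BaggingClassifier(n_estimators=50, bootstrap=True, random_state=0)
print("Bagging CV accuracy:", cross_val_score(bagging, X, y, cv=5).mean())
```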
Stacking [Figure. Image source: https://www.analyticsvidhya.com/blog/2022/11/stacking-algorithms-in-machine-learning/]
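A minimal stacking sketch using scikit-learn's StackingClassifier, where a logistic regression meta-model learns how to combine the base models' predictions; the particular estimators and dataset are assumptions for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Stacking: base models make predictions, and a meta-model (here logistic
# regression) is trained on those predictions to produce the final output.
stacking = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
print("Stacking CV accuracy:", cross_val_score(stacking, X, y, cv=5).mean())
```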
Boosting [Figures. Image source: https://www.pluralsight.com/guides/ensemble-methods:-bagging-versus-boosting]
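A minimal boosting sketch using scikit-learn's AdaBoostClassifier, which trains weak learners sequentially and re-weights the examples that earlier learners misclassified; the dataset and hyperparameters are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Boosting: weak learners are trained one after another, each one paying more
# attention to the examples the previous learners got wrong.
boosting = AdaBoostClassifier(n_estimators=100, random_state=0)
print("Boosting CV accuracy:", cross_val_score(boosting, X, y, cv=5).mean())
```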
Case Study: An experimental evaluation of ensemble methods for EEG signal classification. The goal is to improve classification performance by applying ensemble methods.
Table 1: Test set accuracy rates (%) of ensembles with base classifier SVM (columns are datasets 1-9)
Method        | 1     | 2     | 3     | 4     | 5     | 6     | 7     | 8     | 9
SVM           | 64.92 | 72.81 | 68.44 | 50.06 | 57.60 | 63.80 | 48.71 | 36.08 | 44.87
SVM+Bagging   | 64.98 | 72.68 | 68.42 | 50.09 | 57.75 | 63.91 | 48.88 | 36.15 | 45.33
SVM+Boosting  | 69.32 | 73.52 | 69.40 | 50.39 | 57.60 | 63.80 | 48.70 | 36.59 | 45.53
Source: S. Sun, C. Zhang, and D. Zhang, "An experimental evaluation of ensemble methods for EEG signal classification," Pattern Recognition Letters, vol. 28, no. 15, pp. 2157-2163, 2007.
Disadvantages of Ensemble Methods: Model ensembles are not always better. Ensembles can be more difficult to interpret; the output of the ensembled model is harder to explain. They also impose a higher computational burden, especially for classifier training.
Conclusion: In this presentation I have discussed the definition of ensemble learning, why we need ensemble learning, how bias and variance affect machine learning, the different types of ensemble learning techniques, a case study on how ensemble learning improves classification performance, and some drawbacks of ensemble learning.
References
G. M. Tina, C. Ventura, S. Ferlito, and S. De Vito, "A State-of-Art-Review on Machine-Learning Based Methods for PV," Applied Sciences, vol. 11, no. 16, p. 7550, Aug. 2021, doi: 10.3390/app11167550.
S. Sun, C. Zhang, and D. Zhang, "An experimental evaluation of ensemble methods for EEG signal classification," Pattern Recognition Letters, vol. 28, no. 15, pp. 2157-2163, 2007.
C. Makhijani, "Advanced Ensemble Learning Techniques," Medium, 04-Apr-2023. [Online]. Available: https://towardsdatascience.com/advanced-ensemble-learning-techniques-bf755e38cbfb. [Accessed: 02-May-2023].
Thank You