ML 101: Ensemble Modelling — Random Forests & Gradient Boosted Trees

About this listen

In this Machine Learning 101 episode, we explain ensemble modelling—how combining multiple models can create one stronger predictor. You’ll learn the difference between bagging and boosting, then dive into two of the most popular tree-based ensembles: Random Forests (many randomised decision trees voting or averaging together to reduce overfitting) and Gradient Boosted Trees (trees built sequentially, each one correcting the previous model’s mistakes). We use simple, real-world examples, then add an advanced section on key concepts such as out-of-bag (OOB) error. We finish with evaluation tips, common pitfalls, and a quick note on bias and responsible use.
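The voting intuition behind bagging-style ensembles like Random Forests can be sketched in a few lines of plain Python (a toy simulation, not the episode's own material): if each weak learner is independently right about 65% of the time, a majority vote over many of them is right far more often. All names and numbers below are illustrative.

```python
import random

random.seed(0)

def weak_vote(true_label: int, accuracy: float = 0.65) -> int:
    """One weak learner: predicts the true label with probability `accuracy`."""
    return true_label if random.random() < accuracy else 1 - true_label

def majority_vote(true_label: int, n_learners: int) -> int:
    """Ensemble prediction: majority vote over independent weak learners."""
    votes = sum(weak_vote(true_label) for _ in range(n_learners))
    return 1 if votes > n_learners / 2 else 0

def accuracy(n_learners: int, trials: int = 2000) -> float:
    """Estimate how often the ensemble's vote matches the true label."""
    correct = sum(majority_vote(1, n_learners) == 1 for _ in range(trials))
    return correct / trials

single = accuracy(1)      # one weak learner: roughly its base rate (~0.65)
ensemble = accuracy(51)   # 51 independent voters: much closer to 1.0
print(f"single learner: {single:.2f}, 51-learner vote: {ensemble:.2f}")
```

Real Random Forests add a twist the toy above skips: the trees are made (partially) independent by training each on a bootstrap sample of the data and a random subset of features, which is what makes the averaging actually reduce variance.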
