ML 101: Ensemble Modelling — Random Forests & Gradient Boosted Trees
About this listen
In this Machine Learning 101 episode, we explain ensemble modelling—how combining multiple models can create one powerful predictor. You’ll learn the difference between bagging and boosting, then dive into two of the most popular tree-based ensembles: Random Forests (many “randomised” decision trees voting or averaging together to reduce overfitting) and Gradient Boosted Trees (trees built sequentially, each correcting the previous model’s mistakes). We use simple, real-world examples, then add an advanced section on key concepts such as out-of-bag (OOB) error. We finish with evaluation tips, common pitfalls, and a quick note on bias and responsible use.
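For readers who want to try the ideas from the episode themselves, here is a minimal sketch (assuming scikit-learn is installed; the dataset is synthetic and for illustration only) contrasting the two ensembles: a bagged Random Forest with its built-in OOB estimate, and a sequentially trained Gradient Boosted model.

```python
# Hypothetical example, not from the episode: contrasting bagging (Random
# Forest) with boosting (Gradient Boosted Trees) on a synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data, standing in for a real-world problem.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: many randomised trees trained on bootstrap samples and averaged.
# oob_score=True scores each tree on the rows its bootstrap sample left out,
# giving the out-of-bag (OOB) error estimate mentioned in the episode.
rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X_tr, y_tr)
print(f"Random Forest OOB accuracy:  {rf.oob_score_:.3f}")
print(f"Random Forest test accuracy: {rf.score(X_te, y_te):.3f}")

# Boosting: shallow trees built one after another, each one fitted to
# correct the residual mistakes of the ensemble so far.
gb = GradientBoostingClassifier(n_estimators=200, max_depth=3, random_state=0)
gb.fit(X_tr, y_tr)
print(f"Gradient Boosting test accuracy: {gb.score(X_te, y_te):.3f}")
```

Note that the OOB accuracy comes "for free" from the bootstrap sampling, which is why it is a handy sanity check on a Random Forest without holding out extra validation data.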