771: Gradient Boosting: XGBoost, LightGBM and CatBoost, with Kirill Eremenko
Super Data Science: ML & AI Podcast with Jon Krohn, published 2024-04-02

Kirill Eremenko joins Jon Krohn for another exclusive, in-depth teaser for a new course just released on the SuperDataScience platform, "Machine Learning Level 2". Kirill walks listeners through why decision trees and random forests are fruitful for businesses, and he offers hands-on walkthroughs of the three leading gradient-boosting algorithms today: XGBoost, LightGBM, and CatBoost.

This episode is brought to you by Ready Tensor, where innovation meets reproducibility (https://www.readytensor.ai/), and by Data Universe, the out-of-this-world data conference (https://datauniverse2024.com). Interested in sponsoring a SuperDataScience Podcast episode? Visit passionfroot.me/superdatascience for sponsorship information.

In this episode you will learn:
• All about decision trees [09:28]
• All about ensemble models [22:03]
• All about AdaBoost [38:46]
• All about gradient boosting [46:51]
• Gradient boosting for classification problems [1:01:26]
• All about XGBoost, LightGBM and CatBoost [1:04:12]

Additional materials: www.superdatascience.com/771
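
The core idea behind the gradient boosting covered in the episode can be sketched in a few lines: each round fits a weak learner to the residuals of the ensemble so far (the negative gradient of squared error for regression) and adds a shrunken copy of it to the model. This is a minimal from-scratch illustration using one-split "stumps"; the data, learning rate, and round count are illustrative assumptions, and libraries like XGBoost, LightGBM, and CatBoost implement far more refined versions of this loop.

```python
import numpy as np

def fit_stump(x, r):
    """Find the single threshold split on x that minimizes squared error on r."""
    best = None
    for t in np.unique(x)[:-1]:
        left, right = r[x <= t], r[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return t, left_val, right_val

def predict_stump(stump, x):
    t, left_val, right_val = stump
    return np.where(x <= t, left_val, right_val)

# Toy 1-D regression problem (assumed for illustration).
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + rng.normal(scale=0.1, size=200)

learning_rate, n_rounds = 0.1, 100
pred = np.full_like(y, y.mean())  # start from the constant (mean) prediction
stumps = []
for _ in range(n_rounds):
    residual = y - pred                     # negative gradient of squared error
    stump = fit_stump(x, residual)          # weak learner fit to the residuals
    pred += learning_rate * predict_stump(stump, x)
    stumps.append(stump)

mse = float(np.mean((y - pred) ** 2))       # training error shrinks each round
```

The shrinkage factor (`learning_rate`) is what distinguishes gradient boosting from simply stacking trees: small steps along the residuals trade more rounds for better generalization, which is also why the production libraries expose it as a key tuning parameter.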