Hi, this week you'll learn about Scaling Kaggle Competitions Using XGBoost: Part 2.

In the previous tutorial of this series, you learned how easy it is to use something as powerful as XGBoost in your standard machine learning projects. So you might wonder why the next step is to dive into the complete underlying math behind AdaBoost. The answer is simple: when you play Super Mario, you don't start the game right inside Bowser's castle, do you? You go through many levels of slowly increasing difficulty, honing the necessary skills before you finally face the game's main villain.

[Image: Mario and Bowser]

Understanding the math behind AdaBoost will act as a very relevant precursor when we finally tackle the math of XGBoost.

The big picture: Adaptive Boosting (AdaBoost), a prerequisite for XGBoost, is an ensemble learning technique that combines many weak learners into one strong model.

How it works: Here, we use decision trees with AdaBoost. Each stump (a decision tree of depth 1) is created sequentially and gets a weighted say in the final output. Each stump's errors determine how the next stump is trained.

Our thoughts: Understanding AdaBoost will bolster your foundations in decision trees and ensemble learners and, in turn, help you understand XGBoost much better.

Yes, but: AdaBoost and XGBoost differ in many aspects, as you will learn in later posts of this series.

Stay smart: Keep trying out the code and concepts you learn in practice; otherwise, you will definitely forget them!

Click here to read the full tutorial

Do You Have an OpenCV Project in Mind?

You can instantly access all of the code for Scaling Kaggle Competitions Using XGBoost: Part 2, along with courses on TensorFlow, PyTorch, Keras, and OpenCV, by joining PyImageSearch University.

Guaranteed Results: If you haven't accomplished your Computer Vision/Deep Learning goals, let us know within 30 days of purchase and get a full refund.
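The "How it works" idea above, a sequence of depth-1 stumps where each stump earns a weighted say and misclassified points are up-weighted for the next round, can be sketched in a few lines of NumPy. This is our own minimal illustration for 1D data with labels in {-1, +1} (function names like `fit_stump` are ours, not from the tutorial), not the tutorial's code:

```python
import numpy as np

def fit_stump(X, y, w):
    # Exhaustively pick the threshold/polarity minimizing the weighted error.
    best = None
    for thresh in np.unique(X):
        for polarity in (1, -1):
            pred = np.where(polarity * (X - thresh) >= 0, 1, -1)
            err = np.sum(w[pred != y])
            if best is None or err < best[0]:
                best = (err, thresh, polarity)
    return best

def adaboost(X, y, n_rounds=5):
    w = np.full(len(X), 1.0 / len(X))  # start with uniform sample weights
    stumps = []
    for _ in range(n_rounds):
        err, thresh, polarity = fit_stump(X, y, w)
        err = max(err, 1e-10)  # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # this stump's "say" in the vote
        pred = np.where(polarity * (X - thresh) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)  # up-weight misclassified points
        w /= w.sum()
        stumps.append((alpha, thresh, polarity))
    return stumps

def predict(stumps, X):
    # Final output: sign of the alpha-weighted vote of all stumps.
    scores = sum(a * np.where(p * (X - t) >= 0, 1, -1) for a, t, p in stumps)
    return np.sign(scores)

# Toy example: no single stump separates these labels, but three do.
X = np.array([1.0, 2, 3, 4, 5, 6, 7, 8])
y = np.array([1, 1, 1, -1, -1, -1, 1, 1])
stumps = adaboost(X, y, n_rounds=3)
print(predict(stumps, X))  # matches y on this toy set
```

Note how the ensemble recovers the non-linear label pattern even though each individual stump is only a single threshold; that is exactly the weak-learners-into-one-strong-unit idea from the big picture above.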
Your PyImageSearch Team