Gradient Boosting
Classic gradient boosting algorithm from scikit-learn.
When to use:
- Similar to XGBoost but simpler
- Don't need cutting-edge performance
- Want more straightforward hyperparameters
Strengths: Good accuracy, interpretable feature importance, built into scikit-learn
Weaknesses: Slower than XGBoost/LightGBM, fewer features
Model Parameters
Similar to XGBoost: `n_estimators`, `max_depth`, `learning_rate`, `subsample`, `min_samples_split`, `min_samples_leaf`
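A minimal sketch of how these parameters map onto scikit-learn's `GradientBoostingClassifier`; the dataset and the specific parameter values are illustrative assumptions, not tuned recommendations.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic toy dataset, just to make the example runnable.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(
    n_estimators=100,      # number of boosting stages (trees)
    max_depth=3,           # depth of each individual tree
    learning_rate=0.1,     # shrinks each tree's contribution
    subsample=0.8,         # fraction of samples per tree (stochastic boosting)
    min_samples_split=2,   # min samples needed to split an internal node
    min_samples_leaf=1,    # min samples required at a leaf node
    random_state=0,
)
model.fit(X_train, y_train)

accuracy = model.score(X_test, y_test)
importances = model.feature_importances_  # one score per input feature
```

`feature_importances_` gives the interpretable feature-importance scores mentioned above, with one value per feature summing to 1.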