Documentation (English)

AdaBoost

Adaptive boosting that focuses on examples with large errors.

When to use:

  • Have weak base learners
  • Want classic ensemble method
  • Simpler than gradient boosting

Strengths: Simple, less prone to overfitting, interpretable.

Weaknesses: Sensitive to noise and outliers; slower than modern boosting methods.

Model Parameters

N Estimators (default: 50) Number of weak learners.

Learning Rate (default: 1.0) Shrinks the contribution of each learner.

Loss

  • linear: Linear loss (default)
  • square: Square loss
  • exponential: Exponential loss

Random State (default: 42) Seed for reproducibility.
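The parameters above map directly onto scikit-learn's AdaBoost estimators. A minimal sketch, assuming the `AdaBoostRegressor` API (the `linear`/`square`/`exponential` loss options listed above match its regression variant); the synthetic dataset and train/test split are illustrative, not part of this documentation:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import train_test_split

# Illustrative data; any numeric regression dataset works the same way.
X, y = make_regression(n_samples=200, n_features=4, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = AdaBoostRegressor(
    n_estimators=50,      # N Estimators: number of weak learners
    learning_rate=1.0,    # shrinks each learner's contribution
    loss="linear",        # one of "linear", "square", "exponential"
    random_state=42,      # seed for reproducibility
)
model.fit(X_train, y_train)
print(f"R^2 on test set: {model.score(X_test, y_test):.3f}")
```

Lowering `learning_rate` below 1.0 usually calls for more estimators, since each weak learner then contributes less to the final ensemble.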
