Documentation (English)

LightGBM

Microsoft's gradient boosting framework optimized for speed and memory efficiency.

When to use:

  • Large datasets (>10k rows)
  • Many features
  • Need fast training
  • Limited memory

Strengths: Very fast, low memory, handles large datasets, accurate.
Weaknesses: Can overfit small datasets, many hyperparameters.

Model Parameters

Num Leaves (default: 31) Maximum number of leaves in one tree. More leaves allow more complex trees but increase the risk of overfitting.

Learning Rate (default: 0.1) Shrinkage applied to each tree's contribution; smaller values usually need more iterations.

N Estimators (default: 100) Number of boosting iterations.

Max Depth (default: -1) Maximum tree depth (-1 = unlimited).

Feature Fraction (default: 1.0) Fraction of features to use per iteration.

Bagging Fraction (default: 1.0) Fraction of data to sample per iteration (in core LightGBM this only takes effect when bagging_freq > 0).

Min Data in Leaf (default: 20) Minimum samples in one leaf.

Reg Alpha, Reg Lambda (default: 0.0) L1 and L2 regularization terms on the leaf weights.

Random State (default: 42) Seed for reproducibility.

