
XGBoost Time Series

XGBoost trained on lag features for time series forecasting

XGBoost Time Series transforms the forecasting problem into supervised learning by engineering lag features, rolling statistics, and calendar features from the time index. The trained XGBoost model then predicts future values from these derived features.
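The transformation into supervised learning can be sketched as follows. This is an illustrative example, not the tool's actual preprocessing code: the column name `y`, the specific lags, and the rolling window are assumptions.

```python
import pandas as pd

def make_features(series: pd.Series, lags=(1, 7), roll_windows=(14,)) -> pd.DataFrame:
    """Build lag, rolling-statistic, and calendar features from a time series.

    Hypothetical sketch of the feature engineering described above;
    the chosen lags and windows are example values, not defaults.
    """
    df = pd.DataFrame({"y": series})
    for lag in lags:
        df[f"lag_{lag}"] = series.shift(lag)              # lag features
    for w in roll_windows:
        # shift(1) so the rolling mean only sees past values (no leakage)
        df[f"roll_mean_{w}"] = series.shift(1).rolling(w).mean()
    idx = series.index
    df["dayofweek"] = idx.dayofweek                       # calendar features
    df["month"] = idx.month
    return df.dropna()

idx = pd.date_range("2024-01-01", periods=60, freq="D")
y = pd.Series(range(60), index=idx, dtype=float)
X = make_features(y)
```

The resulting frame (target column plus derived features) is what the XGBoost regressor is trained on.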

When to use:

  • Non-linear patterns in the time series that statistical models miss
  • Multiple related time series (XGBoost can use cross-series features)
  • When rich feature engineering (external variables, calendar features) is available

Input:

  • Trained model checkpoint — exported XGBoost model
  • Preprocessing config — lag feature engineering settings
  • Training tail — last N observations for lag feature computation
  • Steps — forecast horizon

Output: Forecasted values for the specified horizon
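The inputs above suggest a recursive forecasting loop: lag features are computed from the training tail, one step is predicted, and the prediction is appended to the history so the next step's features can be formed. A minimal sketch, where `predict_one` is a hypothetical placeholder standing in for the trained XGBoost model's predict call:

```python
import numpy as np

def forecast(tail, steps, lags=(1, 7), predict_one=None):
    """Recursive multi-step forecast from the last N observations.

    `predict_one` is a placeholder for the trained model; here it
    defaults to the mean of the lag features purely for illustration.
    """
    if predict_one is None:
        predict_one = lambda feats: float(np.mean(feats))  # placeholder model
    history = list(tail)                          # training tail (last N obs)
    out = []
    for _ in range(steps):
        feats = [history[-lag] for lag in lags]   # lag features from history
        y_hat = predict_one(feats)
        out.append(y_hat)
        history.append(y_hat)                     # feed prediction back in
    return out

preds = forecast(tail=[1.0] * 10, steps=3)
```

Note that with recursive forecasting, errors compound over the horizon, since later steps consume earlier predictions as features.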

Model Settings (set during training, used at inference)

The lag features used (e.g., lag-1, lag-7, rolling-mean-14) are defined during training. Key XGBoost parameters:

N Estimators (default: 100) Number of boosting rounds, i.e. trees in the ensemble.

Max Depth (default: 6) Maximum depth of each tree; deeper trees capture more feature interactions but overfit more easily.

Learning Rate (default: 0.1) Shrinkage applied to each tree's contribution per boosting step.

Lags (set during training) Which historical lag steps (e.g., t-1, t-7) to include as features.
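The settings above might be serialized roughly as follows. The model keys (`n_estimators`, `max_depth`, `learning_rate`) follow XGBoost's scikit-learn API naming; the preprocessing keys (`lags`, `rolling`) and their example values are assumptions about the exported config, not the tool's actual schema.

```python
# Hypothetical serialized configuration (example values).
model_params = {
    "n_estimators": 100,   # boosting rounds
    "max_depth": 6,        # per-tree depth
    "learning_rate": 0.1,  # shrinkage per boosting step
}
preprocessing_config = {
    "lags": [1, 7],                              # lag steps (example)
    "rolling": [{"window": 14, "stat": "mean"}], # rolling features (example)
}
```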

Inference Settings

There are no dedicated inference-time settings. The lag features defined during training are recomputed from the provided training tail.

