LightGBM Time Series
LightGBM trained on lag and calendar features for fast time series forecasting
LightGBM Time Series applies the same lag-feature transformation as XGBoost Time Series but uses LightGBM as the learner. This typically yields faster training and better native handling of high-cardinality categorical calendar features.
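The lag-feature transformation turns a series into a supervised table: each row holds earlier values of the series (lags) plus calendar features derived from the timestamp, and the target is the current value. A minimal sketch, using only the standard library (the function name, lag choices, and calendar features here are illustrative, not the tool's exact configuration):

```python
from datetime import date, timedelta

def make_features(series, dates, lags=(1, 7)):
    """Build one training row per timestamp that has all requested lags.

    series: observed values; dates: matching date objects.
    Each row = lag values + simple calendar features (weekday, month).
    """
    X, y = [], []
    max_lag = max(lags)
    for t in range(max_lag, len(series)):
        row = [series[t - k] for k in lags]           # lag features
        row += [dates[t].weekday(), dates[t].month]   # calendar features
        X.append(row)
        y.append(series[t])
    return X, y

start = date(2024, 1, 1)
dates = [start + timedelta(days=i) for i in range(10)]
series = [float(i) for i in range(10)]
X, y = make_features(series, dates)
# With lags (1, 7), the first usable row is at index 7, so 3 rows result.
```

The resulting `(X, y)` table is what the gradient-boosted trees are trained on; the same transformation must be applied at inference time.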
When to use:
- Large-scale time series forecasting where training speed matters
- Datasets with many calendar and categorical features
- Multiple simultaneous series where LightGBM's speed provides an advantage
Input:
- Trained model checkpoint — exported LightGBM model
- Preprocessing config — lag feature engineering settings
- Training tail — the most recent observations from training, enough to cover the largest lag
- Steps — forecast horizon
Output: Forecasted values for the specified horizon
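The training tail and forecast horizon combine in a recursive loop: each predicted value is appended to the history so later steps can use it as a lag. A minimal sketch, where `predict` is a stand-in callable for the trained LightGBM model (a hypothetical placeholder, not the tool's API):

```python
def recursive_forecast(tail, steps, predict, lags=(1, 7)):
    """Roll the forecast forward one step at a time.

    tail: last max(lags) training observations, oldest first.
    predict: maps a lag-feature row to the next value
             (stands in for the trained model).
    """
    history = list(tail)
    out = []
    for _ in range(steps):
        row = [history[-k] for k in lags]  # lags drawn from history
        yhat = predict(row)
        history.append(yhat)               # prediction feeds later lags
        out.append(yhat)
    return out

# Stand-in "model": next value = most recent lag + 1.
preds = recursive_forecast([float(i) for i in range(7)], steps=3,
                           predict=lambda row: row[0] + 1.0)
# preds == [7.0, 8.0, 9.0]
```

Because later steps consume earlier predictions as lag inputs, forecast error can compound over long horizons.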
Model Settings (set during training, used at inference)
N Estimators (default: 100) Number of boosting rounds.
Num Leaves (default: 31) Maximum leaves per tree.
Learning Rate (default: 0.1) Shrinkage applied to each boosting round.
Lags (set during training) Historical lag steps included as features.
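These settings map directly onto LightGBM's scikit-learn-style parameter names. A hedged sketch of that mapping, with the training call shown as a comment since it assumes `lightgbm` is installed and training data `(X, y)` exists:

```python
# Settings above expressed as LightGBM parameters (defaults as documented).
params = {
    "n_estimators": 100,   # N Estimators: number of boosting rounds
    "num_leaves": 31,      # Num Leaves: maximum leaves per tree
    "learning_rate": 0.1,  # Learning Rate: shrinkage per round
}

# With lightgbm installed and (X, y) from the lag transformation:
# from lightgbm import LGBMRegressor
# model = LGBMRegressor(**params).fit(X, y)
```

`num_leaves` is LightGBM's main complexity control (it grows trees leaf-wise rather than by depth), which is one reason it handles many features efficiently.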
Inference Settings
No dedicated inference-time settings.