Time Series
Forecast future values using trained time series models
Time series inference generates forecasts for future time steps using a trained model. The model applies the seasonal patterns, trends, and lag relationships learned during training to produce predictions for the specified forecast horizon.
Inputs required at inference time:
- Trained model checkpoint – exported from the training run
- Preprocessing config – normalization/scaling settings from training
- Training tail – the last N observations from the training data (needed for lag-based features)
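The role of the training tail is easiest to see with a lag-based model: each forecast step needs the most recent N values as features, and each prediction is fed back in as a lag for the next step. The sketch below illustrates that recursive loop under stated assumptions; the least-squares lag model is a dependency-free stand-in for a real checkpoint (e.g. a fitted gradient-boosted regressor), and all names here are illustrative, not part of any specific API.

```python
import numpy as np

N_LAGS = 3    # number of lag features the model was trained on (assumed)
HORIZON = 5   # forecast steps to produce

# Toy training series; a least-squares lag model stands in for a checkpoint.
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=100))  # random-walk style series
X = np.column_stack([series[i:len(series) - N_LAGS + i] for i in range(N_LAGS)])
y = series[N_LAGS:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # "trained model" parameters

# Inference: the training tail supplies the initial lag window.
history = list(series[-N_LAGS:])
forecasts = []
for _ in range(HORIZON):
    x = np.array(history[-N_LAGS:])
    yhat = float(x @ coef)     # one-step-ahead prediction
    forecasts.append(yhat)
    history.append(yhat)       # feed the prediction back as a lag feature

print(forecasts)
```

Note that any normalization from the preprocessing config would be applied to the lag window before prediction and inverted on the output; that step is omitted here to keep the loop visible.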
Available Models
- ARIMA – Autoregressive integrated moving average; differencing handles non-stationary trends
- SARIMA – ARIMA with seasonal components
- SARIMAX – SARIMA with external regressors
- Auto-ARIMA – Automatically selects ARIMA order parameters
- Prophet – Facebook's decomposable time series model with holiday support
- Vector Autoregression – Multivariate autoregressive model
- TBATS – Handles multiple seasonal periods with trigonometric seasonality
- Theta Method – Decomposition-based model, notably competitive in the M-competition benchmarks
- Exponential Smoothing – Exponentially weighted averaging of past observations, with optional trend and seasonal terms
- XGBoost Time Series – XGBoost trained on lag features
- LightGBM Time Series – LightGBM trained on lag features
- CatBoost Time Series – CatBoost trained on lag and calendar features
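As a concrete reference for the simplest model in the list, here is a minimal, dependency-free sketch of simple exponential smoothing: the level is an exponentially weighted average of past observations, and the forecast projects the last level flat across the horizon. The function name and signature are illustrative, not taken from any particular library.

```python
def ses_forecast(series, alpha=0.3, horizon=4):
    """Simple exponential smoothing: each new level is a weighted
    average of the latest observation and the previous level."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    # SES has no trend or seasonal term, so the forecast is flat:
    # the last smoothed level repeated for every future step.
    return [level] * horizon

print(ses_forecast([10, 12, 11, 13], alpha=0.5))  # → [12.0, 12.0, 12.0, 12.0]
```

The seasonal and trend-aware models above (SARIMA, TBATS, Holt-Winters variants) extend this same idea with additional smoothed components rather than a single level.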